OpenAI
Describes how to track OpenAI LLM calls using Opik
This guide explains how to integrate Opik with the OpenAI Python SDK. By wrapping your client with the track_openai function provided by opik, you can easily track and evaluate your OpenAI API calls within your Opik projects: Opik will automatically log the input prompt, the model used, token usage, and the generated response.
Getting started
First, ensure you have both the opik and openai packages installed:
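Both packages are available on PyPI and can be installed with pip:

```shell
# Install (or upgrade) the Opik SDK and the OpenAI Python SDK
pip install --upgrade opik openai
```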
In addition, you can configure Opik using the opik configure command, which will prompt you for the correct local server address or, if you are using the Cloud platform, your API key:
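Run the command from the same environment where opik is installed; it walks you through the setup interactively:

```shell
# Prompts for your deployment type and, for Opik Cloud, your API key
opik configure
```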
Tracking OpenAI API calls
The track_openai wrapper will automatically track and log each API call, including the input prompt, the model used, and the generated response. You can view these logs in your Opik project dashboard.
Using Azure OpenAI
The OpenAI integration also supports the Azure OpenAI Service. To use Azure OpenAI, initialize your client with your Azure configuration and pass it to track_openai just like the standard OpenAI client:
Supported OpenAI methods
The track_openai
wrapper supports the following OpenAI methods:
openai_client.chat.completions.create()
openai_client.beta.chat.completions.parse()
If you would like to track another OpenAI method, please let us know by opening an issue on GitHub.