Using Opik with aisuite
Opik integrates with aisuite to provide a simple way to log traces for all aisuite LLM calls.
Creating an account on Comet.com
Comet provides a hosted version of the Opik platform; simply create an account and grab your API key.
You can also run the Opik platform locally, see the installation guide for more information.
%pip install --upgrade opik "aisuite[openai]"
import opik
opik.configure(use_local=False)
Preparing our environment
First, we will set our OpenAI API key.
import os
import getpass
if "OPENAI_API_KEY" not in os.environ:
    os.environ["OPENAI_API_KEY"] = getpass.getpass("Enter your OpenAI API key: ")
Logging traces
In order to log traces to Opik, we need to wrap our aisuite calls with the track_aisuite
function:
from opik.integrations.aisuite import track_aisuite
import aisuite as ai
client = track_aisuite(ai.Client(), project_name="aisuite-integration-demo")
messages = [
    {"role": "user", "content": "Write a short two sentence story about Opik."},
]

response = client.chat.completions.create(
    model="openai:gpt-4o", messages=messages, temperature=0.75
)
print(response.choices[0].message.content)
The prompt and response messages are automatically logged to Opik and can be viewed in the UI.
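Note that aisuite routes each request by the provider prefix in the model string ("openai:gpt-4o" above). As a quick illustration of this convention, the identifier splits on the first colon into a provider and a model name (split_model_id below is a hypothetical helper for illustration, not part of aisuite):

```python
def split_model_id(model_id: str) -> tuple[str, str]:
    # aisuite-style identifiers look like "provider:model";
    # partition on the first colon so model names containing
    # colons are left intact.
    provider, _, model = model_id.partition(":")
    return provider, model


print(split_model_id("openai:gpt-4o"))        # ('openai', 'gpt-4o')
print(split_model_id("anthropic:claude-3-5-sonnet-20240620"))
```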
Using it with the track decorator
If you have multiple steps in your LLM pipeline, you can use the track
decorator to log the traces for each step. If an aisuite call is made within one of these steps, the LLM call will be associated with that corresponding step:
from opik import track
from opik.integrations.aisuite import track_aisuite
import aisuite as ai
client = track_aisuite(ai.Client(), project_name="aisuite-integration-demo")
@track
def generate_story(prompt):
    res = client.chat.completions.create(
        model="openai:gpt-3.5-turbo", messages=[{"role": "user", "content": prompt}]
    )
    return res.choices[0].message.content


@track
def generate_topic():
    prompt = "Generate a topic for a story about Opik."
    res = client.chat.completions.create(
        model="openai:gpt-3.5-turbo", messages=[{"role": "user", "content": prompt}]
    )
    return res.choices[0].message.content


@track(project_name="aisuite-integration-demo")
def generate_opik_story():
    topic = generate_topic()
    story = generate_story(topic)
    return story
generate_opik_story()
The trace can now be viewed in the UI: