Using Opik with aisuite

Opik integrates with aisuite to provide a simple way to log traces for all aisuite LLM calls.

Creating an account on Comet.com

Comet provides a hosted version of the Opik platform. Simply create an account and grab your API key.

You can also run the Opik platform locally, see the installation guide for more information.

```python
%pip install --upgrade opik "aisuite[openai]"
```
```python
import opik

opik.configure(use_local=False)
```

Preparing our environment

First, we will set up our OpenAI API key.

```python
import os
import getpass

if "OPENAI_API_KEY" not in os.environ:
    os.environ["OPENAI_API_KEY"] = getpass.getpass("Enter your OpenAI API key: ")
```

Logging traces

In order to log traces to Opik, we need to wrap our aisuite client with the track_aisuite function:

```python
from opik.integrations.aisuite import track_aisuite
import aisuite as ai

client = track_aisuite(ai.Client(), project_name="aisuite-integration-demo")

messages = [
    {"role": "user", "content": "Write a short two sentence story about Opik."},
]

response = client.chat.completions.create(
    model="openai:gpt-4o", messages=messages, temperature=0.75
)
print(response.choices[0].message.content)
```

The prompt and response messages are automatically logged to Opik and can be viewed in the UI.
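Note that aisuite selects the backend from the `provider:model` prefix in the model string, so the same tracked client can route calls to other providers, assuming the corresponding API key is configured. A minimal sketch of the naming convention:

```python
# aisuite model strings use the form "provider:model".
# Splitting on the first colon recovers the two parts.
model = "openai:gpt-4o"
provider, model_name = model.split(":", 1)
print(provider, model_name)  # openai gpt-4o
```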

aisuite Integration

Using it with the track decorator

If you have multiple steps in your LLM pipeline, you can use the track decorator to log a trace for each step. If an LLM call is made within one of these steps, it will be associated with the corresponding step:

```python
from opik import track
from opik.integrations.aisuite import track_aisuite
import aisuite as ai

client = track_aisuite(ai.Client(), project_name="aisuite-integration-demo")


@track
def generate_story(prompt):
    res = client.chat.completions.create(
        model="openai:gpt-3.5-turbo", messages=[{"role": "user", "content": prompt}]
    )
    return res.choices[0].message.content


@track
def generate_topic():
    prompt = "Generate a topic for a story about Opik."
    res = client.chat.completions.create(
        model="openai:gpt-3.5-turbo", messages=[{"role": "user", "content": prompt}]
    )
    return res.choices[0].message.content


@track(project_name="aisuite-integration-demo")
def generate_opik_story():
    topic = generate_topic()
    story = generate_story(topic)
    return story


generate_opik_story()
```

The trace can now be viewed in the UI:

aisuite Integration
