Using Opik with Instructor

Instructor is a Python library, built on top of Pydantic, for working with structured LLM outputs. It provides a simple way to manage schema validation, retries, and streaming responses.
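The key idea is that a Pydantic model doubles as both the schema requested from the LLM and the validator for its reply; when validation fails, Instructor can re-ask the model with the error message (via its max_retries parameter). As a quick illustration, here is a minimal sketch with a hypothetical Person model (not part of the cookbook itself):

from pydantic import BaseModel, field_validator


class Person(BaseModel):
    name: str
    age: int

    @field_validator("age")
    @classmethod
    def age_must_be_non_negative(cls, v: int) -> int:
        if v < 0:
            raise ValueError("age must be non-negative")
        return v


# Passing response_model=Person (plus e.g. max_retries=2) to an
# Instructor-patched client makes Instructor re-prompt the LLM whenever
# the reply fails this validation.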

Creating an account on Comet.com

Comet provides a hosted version of the Opik platform; simply create an account and grab your API key.

You can also run the Opik platform locally; see the installation guide for more information.
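If you go the self-hosted route, point the SDK at your local deployment instead; a minimal sketch, assuming a default local installation:

import opik

opik.configure(use_local=True)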

%pip install --upgrade --quiet opik instructor anthropic google-generativeai google-genai
import os
import getpass

Opik Config

Configure your development environment. (If you click the key icon on the left side, you can set API keys that are reusable across notebooks.)

import opik

opik.configure(use_local=False)

os.environ["OPIK_PROJECT_NAME"] = "opik-cookbook-instructor"
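Setting the OPIK_PROJECT_NAME environment variable groups every trace from this notebook under a single project in the Opik UI, which makes the runs below easy to find.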

For this demo we are going to use OpenAI, Anthropic, and Gemini models, so we need to configure our API keys:

1if "OPENAI_API_KEY" not in os.environ:
2 os.environ["OPENAI_API_KEY"] = getpass.getpass("Enter your OpenAI API key: ")
3
4if "ANTHROPIC_API_KEY" not in os.environ:
5 os.environ["ANTHROPIC_API_KEY"] = getpass.getpass("Enter your Anthropic API key: ")
6
7if "GOOGLE_API_KEY" not in os.environ:
8 os.environ["GOOGLE_API_KEY"] = getpass.getpass("Enter your Google API key: ")

Using Opik with the Instructor library

In order to log traces from Instructor into Opik, we are going to wrap the LLM client with Opik's tracking integration. This will log each LLM call to the Opik platform.

For all the integrations, we will first add tracking to the LLM client and then pass it to the Instructor library:

from opik.integrations.openai import track_openai
import instructor
from pydantic import BaseModel
from openai import OpenAI


# We will first create the OpenAI client and wrap it with `track_openai`
# so that every call is logged to Opik
openai_client = track_openai(OpenAI())

# Patch the OpenAI client for Instructor
client = instructor.from_openai(openai_client)


# Define your desired output structure
class UserInfo(BaseModel):
    name: str
    age: int


user_info = client.chat.completions.create(
    model="gpt-4o-mini",
    response_model=UserInfo,
    messages=[{"role": "user", "content": "John Doe is 30 years old."}],
)

print(user_info)

Thanks to the track_openai wrapper, all calls made to OpenAI will be logged to the Opik platform. This approach also works well if you are using the opik.track decorator, as the LLM call made with Instructor is automatically attached to the relevant trace (see the sketch below).

Trace screenshot
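For instance, here is a minimal sketch of that pattern, using a hypothetical extract_user helper and reusing the client and UserInfo defined above:

from opik import track


@track
def extract_user(text: str) -> UserInfo:
    # The Instructor call below is logged as a span inside the extract_user trace
    return client.chat.completions.create(
        model="gpt-4o-mini",
        response_model=UserInfo,
        messages=[{"role": "user", "content": text}],
    )


extract_user("Jane Doe is 25 years old.")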

Integrating with other LLM providers

The Instructor library supports many LLM providers beyond OpenAI, including Anthropic, AWS Bedrock, and Gemini. Opik supports the majority of these providers as well.

Below are the additional code snippets needed to integrate with Anthropic and Gemini.

Anthropic

from opik.integrations.anthropic import track_anthropic
import instructor
from anthropic import Anthropic

# Add Opik tracking
anthropic_client = track_anthropic(Anthropic())

# Patch the Anthropic client for Instructor
client = instructor.from_anthropic(
    anthropic_client, mode=instructor.Mode.ANTHROPIC_JSON
)

user_info = client.chat.completions.create(
    model="claude-3-5-sonnet-20241022",
    response_model=UserInfo,
    messages=[{"role": "user", "content": "John Doe is 30 years old."}],
    max_tokens=1000,
)

print(user_info)
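The mode argument controls how Instructor extracts the structured data: ANTHROPIC_JSON asks the model to return the schema as JSON in the message body. Instructor also offers a tool-calling mode (instructor.Mode.ANTHROPIC_TOOLS) if you prefer that route.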

Gemini

from opik.integrations.genai import track_genai
import instructor
from google import genai

# Add Opik tracking
gemini_client = track_genai(genai.Client())

# Patch the GenAI client for Instructor
client = instructor.from_genai(
    gemini_client, mode=instructor.Mode.GENAI_STRUCTURED_OUTPUTS
)

user_info = client.chat.completions.create(
    model="gemini-2.0-flash-001",
    response_model=UserInfo,
    messages=[{"role": "user", "content": "John Doe is 30 years old."}],
)

print(user_info)
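As with the OpenAI example, each of these calls is logged as a trace in the opik-cookbook-instructor project, so you can inspect and compare the structured outputs across providers in the Opik UI.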