Using Opik with Anthropic

Opik integrates with Anthropic to provide a simple way to log traces for all Anthropic LLM calls. This works for all supported models, including when using the streaming API.

Creating an account on Comet.com

Comet provides a hosted version of the Opik platform; simply create an account and grab your API key.

You can also run the Opik platform locally, see the installation guide for more information.

%pip install --upgrade opik anthropic
import opik

opik.configure(use_local=False)

Preparing our environment

First, we will set up our Anthropic client. You can find or create your Anthropic API key on this page and paste it below:

import os
import getpass
import anthropic

if "ANTHROPIC_API_KEY" not in os.environ:
    os.environ["ANTHROPIC_API_KEY"] = getpass.getpass("Enter your Anthropic API key: ")

Logging traces

In order to log traces to Opik, we need to wrap our Anthropic calls with the track_anthropic function:

import os

from opik.integrations.anthropic import track_anthropic

anthropic_client = anthropic.Anthropic()
anthropic_client = track_anthropic(anthropic_client, project_name="anthropic-integration-demo")
PROMPT = "Why is it important to use an LLM monitoring tool like Comet Opik that allows you to log traces and spans when working with Anthropic LLM models?"

response = anthropic_client.messages.create(
    model="claude-3-5-sonnet-20241022",
    max_tokens=1024,
    messages=[{"role": "user", "content": PROMPT}],
)
print("Response", response.content[0].text)

The prompt and response messages are automatically logged to Opik and can be viewed in the UI.
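The tracked client also logs streaming calls. The sketch below assumes `opik` and `anthropic` are installed and an `ANTHROPIC_API_KEY` is available; it guards the API call so it can be skipped when no key is set, and it accumulates the streamed text chunks locally:

```python
import os

# Hedged sketch: stream a response through the tracked client so Opik
# captures the trace. The API call is skipped when no key is configured.
chunks = []
if os.environ.get("ANTHROPIC_API_KEY"):
    import anthropic
    from opik.integrations.anthropic import track_anthropic

    client = track_anthropic(anthropic.Anthropic(), project_name="anthropic-integration-demo")
    with client.messages.stream(
        model="claude-3-5-sonnet-20241022",
        max_tokens=256,
        messages=[{"role": "user", "content": "Say hello in one sentence."}],
    ) as stream:
        # text_stream yields incremental text deltas as they arrive
        for text in stream.text_stream:
            chunks.append(text)
print("".join(chunks))
```

Because `track_anthropic` wraps the client itself, no other code changes are needed for streaming calls to appear in the Opik UI.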

Anthropic Integration

Using it with the track decorator

If you have multiple steps in your LLM pipeline, you can use the track decorator to log the traces for each step. If Anthropic is called within one of these steps, the LLM call will be associated with the corresponding step:

import os

import anthropic
from opik import track
from opik.integrations.anthropic import track_anthropic

os.environ["OPIK_PROJECT_NAME"] = "anthropic-integration-demo"

anthropic_client = anthropic.Anthropic()
anthropic_client = track_anthropic(anthropic_client)


@track
def generate_story(prompt):
    res = anthropic_client.messages.create(
        model="claude-3-5-sonnet-20241022",
        max_tokens=1024,
        messages=[{"role": "user", "content": prompt}],
    )
    return res.content[0].text


@track
def generate_topic():
    prompt = "Generate a topic for a story about Opik."
    res = anthropic_client.messages.create(
        model="claude-3-5-sonnet-20241022",
        max_tokens=1024,
        messages=[{"role": "user", "content": prompt}],
    )
    return res.content[0].text


@track
def generate_opik_story():
    topic = generate_topic()
    story = generate_story(topic)
    return story


generate_opik_story()

The trace can now be viewed in the UI:

Anthropic Integration
