Anthropic

Anthropic is an AI safety and research company that's working to build reliable, interpretable, and steerable AI systems.

This guide explains how to integrate Opik with the Anthropic Python SDK. By wrapping your client with the track_anthropic function provided by Opik, you can easily track and evaluate your Anthropic API calls within your Opik projects: Opik automatically logs the input prompt, the model used, token usage, and the generated response.

You can check out the Colab Notebook if you'd like to jump straight to the code.

Getting Started

Configuring Opik

To start tracking your Anthropic LLM calls, you'll need to have both the opik and anthropic packages installed. You can install them using pip:

pip install opik anthropic

In addition, you can configure Opik using the opik configure command, which will prompt you for your local server address or, if you are using the Cloud platform, your API key:

opik configure

Configuring Anthropic

To configure Anthropic, you will need to have your Anthropic API key set; see this section for how to pass your Anthropic API key.
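For example, you can set the key as an environment variable before creating the client. This is a minimal sketch; the placeholder value below is illustrative, not a real key:

```python
import os

# Set the Anthropic API key only if it is not already present in the
# environment; replace the placeholder with your real key.
os.environ.setdefault("ANTHROPIC_API_KEY", "sk-ant-...")
```

The Anthropic client will pick up this variable automatically when it is instantiated.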

Once you have it, you can create your Anthropic client:

import anthropic

anthropic_client = anthropic.Anthropic()

Logging LLM calls

To log LLM calls to Opik, you will need to wrap the Anthropic client with track_anthropic. All calls made with the wrapped client will then be logged to Opik:

from opik.integrations.anthropic import track_anthropic

anthropic_client = track_anthropic(anthropic_client, project_name="anthropic-integration-demo")

PROMPT = "Why is it important to use an LLM monitoring tool like CometML Opik that allows you to log traces and spans when working with Anthropic LLM models?"

response = anthropic_client.messages.create(
    model="claude-3-5-sonnet-20241022",
    max_tokens=1024,
    messages=[
        {"role": "user", "content": PROMPT}
    ],
)
print("Response:", response.content[0].text)
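Opik logs token usage automatically, but you can also inspect it locally. As a sketch, the hypothetical helper below (not part of Opik or the Anthropic SDK) reads the input_tokens and output_tokens fields that the Anthropic SDK exposes on response.usage:

```python
# Hypothetical helper: summarize the token usage reported on an
# Anthropic Messages response object.
def summarize_usage(response) -> str:
    usage = response.usage  # carries input_tokens and output_tokens
    return f"input={usage.input_tokens} output={usage.output_tokens}"
```

Calling summarize_usage(response) on the response from the example above returns a short string with the prompt and completion token counts, matching what Opik records for the trace.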

Anthropic Integration