Anthropic
Anthropic is an AI safety and research company that’s working to build reliable, interpretable, and steerable AI systems.
This guide explains how to integrate Opik with the Anthropic Python SDK. By wrapping your client with the track_anthropic
function provided by opik, you can easily track and evaluate your Anthropic API calls within your Opik projects: Opik automatically logs the input prompt, the model used, token usage, and the generated response.
Getting Started
Configuring Opik
To start tracking your Anthropic LLM calls, you'll need to have both the opik
and anthropic
packages installed. You can install them using pip:
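```bash
pip install opik anthropic
```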
In addition, you can configure Opik using the opik configure
command, which will prompt you for your API key if you are using the Opik Cloud platform, or for the address of your local server if you are self-hosting:
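```bash
opik configure
```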
Configuring Anthropic
In order to configure Anthropic, you will need to have your Anthropic API key; see this section for how to obtain and pass your Anthropic API key.
Once you have it, you can set it as an environment variable:
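```bash
export ANTHROPIC_API_KEY="your-api-key"
```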
Logging LLM calls
In order to log the LLM calls to Opik, you will need to wrap the Anthropic client with track_anthropic
. All calls made with the wrapped client will then be logged to Opik.
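Below is a minimal sketch, assuming track_anthropic is importable from opik.integrations.anthropic and that ANTHROPIC_API_KEY is set in your environment; the model name is only illustrative:

```python
import anthropic
from opik.integrations.anthropic import track_anthropic

# Create the Anthropic client; it reads ANTHROPIC_API_KEY from the environment.
anthropic_client = anthropic.Anthropic()

# Wrap the client so that every call made through it is traced in Opik.
anthropic_client = track_anthropic(anthropic_client)

# This call is logged to Opik with its prompt, model, token usage, and response.
response = anthropic_client.messages.create(
    model="claude-3-5-sonnet-20241022",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Write a haiku about tracing LLM calls."}],
)
print(response.content[0].text)
```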
