DeepSeek is an open-source LLM that rivals OpenAI's o1. You can learn more about DeepSeek on GitHub or at deepseek.com.

In this guide, we will showcase how to track DeepSeek calls using Opik. As DeepSeek is open-source, there are many ways to run and call the model. We will focus on how to integrate Opik with the following hosting options:

  1. DeepSeek API
  2. Fireworks AI API
  3. Together AI API

Getting started

Configuring your hosting provider

Before you can start tracking DeepSeek calls, you need to get the API key from your hosting provider.

In order to use the DeepSeek API, you will need an API key. You can register for an account on DeepSeek.com and, once you have signed up, create an API key from your account.

Configuring Opik

pip install --upgrade --quiet opik

opik configure

Opik is fully open-source and can be run locally or through the Opik Cloud platform. You can learn more about hosting Opik on your own infrastructure in the self-hosting guide.
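If you are pointing the SDK at a self-hosted deployment rather than Opik Cloud, you can also configure it from Python. A minimal sketch, assuming the default local deployment (check your own deployment's URL):

import opik

# Point the SDK at a self-hosted Opik instance instead of Opik Cloud
opik.configure(use_local=True)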

Tracking DeepSeek calls

The easiest way to call DeepSeek with Opik is to use the OpenAI Python SDK and the track_openai decorator. This approach is compatible with the DeepSeek API, the Fireworks AI API, and the Together AI API:

from opik.integrations.openai import track_openai
from openai import OpenAI

# Create the OpenAI client that points to the DeepSeek API
client = OpenAI(api_key="<DeepSeek API Key>", base_url="https://api.deepseek.com")

# Wrap your OpenAI client to track all calls to Opik
client = track_openai(client)

# Call the API
response = client.chat.completions.create(
    model="deepseek-chat",
    messages=[
        {"role": "system", "content": "You are a helpful assistant"},
        {"role": "user", "content": "Hello"},
    ],
    stream=False,
)

print(response.choices[0].message.content)
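The same pattern works for the Fireworks AI and Together AI endpoints: only the base_url, API key, and model identifier change. Below is a minimal sketch; the base URLs and model names are assumptions, so check your provider's documentation for the exact values.

from opik.integrations.openai import track_openai
from openai import OpenAI

# Fireworks AI: OpenAI-compatible endpoint (base_url and model name are assumptions)
fireworks_client = track_openai(
    OpenAI(
        api_key="<Fireworks AI API Key>",
        base_url="https://api.fireworks.ai/inference/v1",
    )
)
response = fireworks_client.chat.completions.create(
    model="accounts/fireworks/models/deepseek-v3",
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)

# Together AI: OpenAI-compatible endpoint (base_url and model name are assumptions)
together_client = track_openai(
    OpenAI(
        api_key="<Together AI API Key>",
        base_url="https://api.together.xyz/v1",
    )
)
response = together_client.chat.completions.create(
    model="deepseek-ai/DeepSeek-V3",
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)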