DeepSeek

DeepSeek is an open-source LLM that rivals OpenAI's o1. You can learn more about DeepSeek on GitHub or at deepseek.com.

In this guide, we will showcase how to track DeepSeek calls using Opik. As DeepSeek is open-source, there are many ways to run and call the model. We will focus on how to integrate Opik with the following hosting options:

  1. DeepSeek API
  2. Fireworks AI API
  3. Together AI API

Getting started

Configuring your hosting provider

Before you can start tracking DeepSeek calls, you need to get the API key from your hosting provider.

In order to use the DeepSeek API, you will need an API key. You can register for an account on DeepSeek.com and, once you have signed up, create an API key.
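
Rather than hard-coding the key in your scripts, you can export it as an environment variable and read it from your code. A minimal sketch is shown below; the DEEPSEEK_API_KEY variable name is just a convention used here, not something required by Opik or DeepSeek:

import os

# Read the DeepSeek API key from an environment variable
# (set it beforehand, e.g. export DEEPSEEK_API_KEY="your-key")
deepseek_api_key = os.environ["DEEPSEEK_API_KEY"]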

Configuring Opik

pip install --upgrade --quiet opik

opik configure
tip

Opik is fully open-source and can be run locally or through the Opik Cloud platform. You can learn more about hosting Opik on your own infrastructure in the self-hosting guide.
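
If you prefer to configure the SDK from code rather than the CLI, the Python SDK also exposes opik.configure(). A minimal sketch, assuming you are using the Opik Cloud platform rather than a local deployment:

import opik

# Configure the Opik SDK programmatically instead of running `opik configure`.
# use_local=False points the SDK at the Opik Cloud platform; set it to True
# for a self-hosted Opik instance.
opik.configure(use_local=False)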

Tracking DeepSeek calls

The easiest way to call DeepSeek with Opik is to use the OpenAI Python SDK and the track_openai decorator. This approach is compatible with the DeepSeek API, the Fireworks AI API, and the Together AI API:

from opik.integrations.openai import track_openai
from openai import OpenAI

# Create the OpenAI client that points to DeepSeek API
client = OpenAI(api_key="<DeepSeek API Key>", base_url="https://api.deepseek.com")

# Wrap your OpenAI client to track all calls to Opik
client = track_openai(client)

# Call the API
response = client.chat.completions.create(
    model="deepseek-chat",
    messages=[
        {"role": "system", "content": "You are a helpful assistant"},
        {"role": "user", "content": "Hello"},
    ],
    stream=False,
)

print(response.choices[0].message.content)
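
To call DeepSeek through Fireworks AI or Together AI instead, the same pattern applies: only the api_key, base_url, and model name change. The base URLs and model identifiers below are illustrative examples and may differ depending on your account and the DeepSeek variant you want to call, so check your provider's documentation for the exact values:

from opik.integrations.openai import track_openai
from openai import OpenAI

# Fireworks AI: OpenAI-compatible endpoint (the model identifier is an
# example; check the Fireworks model catalog for the exact DeepSeek name)
fireworks_client = track_openai(OpenAI(
    api_key="<Fireworks AI API Key>",
    base_url="https://api.fireworks.ai/inference/v1",
))
response = fireworks_client.chat.completions.create(
    model="accounts/fireworks/models/deepseek-v3",
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)

# Together AI: OpenAI-compatible endpoint (the model identifier is an
# example; check the Together model catalog for the exact DeepSeek name)
together_client = track_openai(OpenAI(
    api_key="<Together AI API Key>",
    base_url="https://api.together.xyz/v1",
))
response = together_client.chat.completions.create(
    model="deepseek-ai/DeepSeek-V3",
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)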