Describes how to track OpenRouter LLM calls using Opik

This guide explains how to integrate Opik with OpenRouter using the OpenAI SDK. OpenRouter provides a unified API for accessing hundreds of AI models through a single OpenAI-compatible interface.

Getting started

First, ensure you have both opik and openai packages installed:

pip install opik openai

You’ll also need an OpenRouter API key, which you can get from OpenRouter.
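As a sketch, the key can be kept out of source code by reading it from an environment variable (the variable name `OPENROUTER_API_KEY` used here is a common convention, not a requirement of either library):

```python
import os

# Read the OpenRouter API key from the environment rather than hard-coding it;
# fall back to a placeholder so the snippet stays self-contained
api_key = os.environ.get("OPENROUTER_API_KEY", "YOUR_OPENROUTER_API_KEY")
```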

Tracking OpenRouter API calls

from opik.integrations.openai import track_openai
from openai import OpenAI

# Initialize the OpenAI client with the OpenRouter base URL
client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key="YOUR_OPENROUTER_API_KEY",
)
client = track_openai(client)

# Optional headers for the OpenRouter leaderboard
headers = {
    "HTTP-Referer": "YOUR_SITE_URL",  # Optional. Site URL for rankings
    "X-Title": "YOUR_SITE_NAME",      # Optional. Site title for rankings
}

response = client.chat.completions.create(
    model="openai/gpt-4",  # You can use any model available on OpenRouter
    extra_headers=headers,
    messages=[
        {"role": "user", "content": "Hello, world!"}
    ],
    temperature=0.7,
    max_tokens=100,
)

print(response.choices[0].message.content)

Available Models

OpenRouter provides access to a wide variety of models, including many open-source models from different providers.

You can find the complete list of available models in the OpenRouter documentation.

Supported Methods

OpenRouter supports the following methods:

Chat Completions

  • client.chat.completions.create(): Works with all models
  • Provides standard chat completion functionality
  • Compatible with the OpenAI SDK interface

Structured Outputs
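As a hedged sketch, a structured-output request adds a `response_format` parameter to the same `client.chat.completions.create()` call shown above; support varies by model on OpenRouter, and the `weather_info` schema here is purely hypothetical:

```python
# Hypothetical JSON schema for illustration; pass as response_format=... to
# client.chat.completions.create() on a model that supports structured outputs
response_format = {
    "type": "json_schema",
    "json_schema": {
        "name": "weather_info",
        "strict": True,
        "schema": {
            "type": "object",
            "properties": {
                "city": {"type": "string"},
                "conditions": {"type": "string"},
            },
            "required": ["city", "conditions"],
            "additionalProperties": False,
        },
    },
}
```

Calls made this way are traced by Opik just like plain chat completions, since they go through the same wrapped client.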

For detailed information about available methods, parameters, and best practices, refer to the OpenRouter API documentation.
