OpenRouter
Describes how to track OpenRouter LLM calls using Opik
This guide explains how to integrate Opik with OpenRouter using the OpenAI SDK. OpenRouter provides a unified API for accessing hundreds of AI models through a single OpenAI-compatible interface.
Getting started
First, ensure you have both the `opik` and `openai` packages installed:
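```bash
pip install opik openai
```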
You’ll also need an OpenRouter API key, which you can get from OpenRouter.
Tracking OpenRouter API calls
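The sketch below shows one way to wire this up, assuming you use Opik’s OpenAI integration (`track_openai`) together with OpenRouter’s OpenAI-compatible endpoint; the API key placeholder is illustrative.

```python
from openai import OpenAI
from opik.integrations.openai import track_openai

# Point the standard OpenAI client at OpenRouter's OpenAI-compatible endpoint
# and authenticate with your OpenRouter API key.
client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key="YOUR_OPENROUTER_API_KEY",
)

# Wrap the client so that calls made through it are logged as traces in Opik.
client = track_openai(client)
```

Once wrapped, the client is used exactly like a regular OpenAI client; the examples below reuse this tracked `client`.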
Available Models
OpenRouter provides access to a wide variety of models, including many open source models from different providers.
- OpenAI models (GPT-4o, o1, o3-mini)
- Anthropic models (Opus, Sonnet, Haiku)
- Google models (Gemini Pro, Flash, Flash Thinking)
- And many open source models
You can find the complete list of available models in the OpenRouter documentation.
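Because every model sits behind the same interface, switching providers is just a matter of changing the `model` slug. The loop below is a sketch; the slugs are illustrative examples of OpenRouter’s `provider/model` naming, so check the model list for current identifiers.

```python
# Reusing the tracked `client` from the setup above; each call is traced in Opik.
for model in ["openai/gpt-4o-mini", "anthropic/claude-3.5-haiku"]:  # example slugs
    response = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": "Say hello in five words."}],
    )
    print(model, "->", response.choices[0].message.content)
```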
Supported Methods
OpenRouter supports the following methods:
Chat Completions
`client.chat.completions.create()`: Works with all models (see the example after this list)
- Provides standard chat completion functionality
- Compatible with the OpenAI SDK interface
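A minimal sketch of a standard chat completion call through the tracked client; the model slug and parameter values are examples.

```python
response = client.chat.completions.create(
    model="openai/gpt-4o",  # example model slug
    messages=[
        {"role": "system", "content": "You are a concise assistant."},
        {"role": "user", "content": "What does OpenRouter do?"},
    ],
    temperature=0.2,  # standard OpenAI-compatible parameters pass through unchanged
    max_tokens=200,
)
print(response.choices[0].message.content)
```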
Structured Outputs
`client.beta.chat.completions.parse()`: Only compatible with OpenAI models
- For non-OpenAI models, see OpenRouter’s Structured Outputs documentation
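As a sketch, a structured-output call with an OpenAI model through the tracked client might look like the following; the Pydantic schema and model slug are illustrative.

```python
from pydantic import BaseModel

class CalendarEvent(BaseModel):
    name: str
    date: str
    participants: list[str]

# parse() validates the model's response against the Pydantic schema.
completion = client.beta.chat.completions.parse(
    model="openai/gpt-4o",  # structured outputs via parse() require an OpenAI model
    messages=[
        {"role": "system", "content": "Extract the event information."},
        {"role": "user", "content": "Alice and Bob are going to a science fair on Friday."},
    ],
    response_format=CalendarEvent,
)
print(completion.choices[0].message.parsed)
```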
For detailed information about available methods, parameters, and best practices, refer to the OpenRouter API documentation.