Overview
Opik aims to make it as easy as possible to log, view and evaluate your LLM traces. We do this by providing a set of integrations:
| Integration | Description | Documentation | Try in Colab |
|---|---|---|---|
| OpenAI | Log traces for all OpenAI LLM calls | Documentation | |
| LiteLLM | Call any LLM model using the OpenAI format | Documentation | |
| LangChain | Log traces for all LangChain LLM calls | Documentation | |
| Haystack | Log traces for all Haystack pipelines | Documentation | |
| aisuite | Log traces for all aisuite LLM calls | Documentation | |
| Anthropic | Log traces for all Anthropic LLM calls | Documentation | |
| Bedrock | Log traces for all AWS Bedrock LLM calls | Documentation | |
| LangGraph | Log traces for all LangGraph executions | Documentation | |
| LlamaIndex | Log traces for all LlamaIndex LLM calls | Documentation | |
| Ollama | Log traces for all Ollama LLM calls | Documentation | |
| Predibase | Fine-tune and serve open-source LLMs | Documentation | |
| Ragas | Evaluation framework for your Retrieval Augmented Generation (RAG) pipelines | Documentation | |
| watsonx | Log traces for all watsonx LLM calls | Documentation | |
| Dify | Log traces and LLM calls for your Dify Apps | Documentation | |
If you would like to see more integrations, please open an issue on our GitHub repository.
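Conceptually, each of these integrations wraps your LLM client so that every call is recorded as a trace (inputs, outputs, timing). The following stdlib-only sketch illustrates that wrapping pattern in miniature; all names here (`track_calls`, `fake_llm`) are hypothetical and this is not Opik's actual API — see each integration's documentation for the real usage.

```python
import time

# Illustrative sketch only: a toy trace-logging wrapper showing the general
# pattern these integrations follow. Not Opik's actual API.
def track_calls(fn, trace_log):
    """Wrap a callable so each call is appended to trace_log as a trace entry."""
    def wrapper(*args, **kwargs):
        start = time.time()
        output = fn(*args, **kwargs)
        trace_log.append({
            "input": {"args": args, "kwargs": kwargs},
            "output": output,
            "duration_s": time.time() - start,
        })
        return output
    return wrapper

# Stand-in for a real LLM client call.
def fake_llm(prompt):
    return f"echo: {prompt}"

traces = []
traced_llm = track_calls(fake_llm, traces)
traced_llm("hello")  # the call goes through, and a trace entry is logged
```

In the real integrations, the wrapping step is typically a one-liner (for example, passing your client to a tracking helper), after which traces appear in the Opik UI automatically.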