Overview

Opik aims to make it as easy as possible to log, view, and evaluate your LLM traces. To do this, we provide a set of integrations:

| Integration | Description                                                                   | Documentation | Try in Colab              |
| ----------- | ----------------------------------------------------------------------------- | ------------- | ------------------------- |
| OpenAI      | Log traces for all OpenAI LLM calls                                           | Documentation | Open Quickstart In Colab  |
| LiteLLM     | Call any LLM model using the OpenAI format                                    | Documentation | Open Quickstart In Colab  |
| LangChain   | Log traces for all LangChain LLM calls                                        | Documentation | Open Quickstart In Colab  |
| Haystack    | Log traces for all Haystack pipelines                                         | Documentation | Open Quickstart In Colab  |
| aisuite     | Log traces for all aisuite LLM calls                                          | Documentation | Open Quickstart In Colab  |
| Anthropic   | Log traces for all Anthropic LLM calls                                        | Documentation | Open Quickstart In Colab  |
| Bedrock     | Log traces for all AWS Bedrock LLM calls                                      | Documentation | Open Quickstart In Colab  |
| LangGraph   | Log traces for all LangGraph executions                                       | Documentation | Open Quickstart In Colab  |
| LlamaIndex  | Log traces for all LlamaIndex LLM calls                                       | Documentation | Open Quickstart In Colab  |
| Ollama      | Log traces for all Ollama LLM calls                                           | Documentation | Open Quickstart In Colab  |
| Predibase   | Fine-tune and serve open-source LLMs                                          | Documentation | Open Quickstart In Colab  |
| Ragas       | Evaluation framework for your Retrieval Augmented Generation (RAG) pipelines  | Documentation | Open Quickstart In Colab  |
| watsonx     | Log traces for all watsonx LLM calls                                          | Documentation | Open Quickstart In Colab  |
| Dify        | Log traces and LLM calls for your Dify apps                                   | Documentation |                           |

If you would like to see an integration that is not listed here, please open an issue on our GitHub repository.