Using Opik with LiteLLM
LiteLLM allows you to call all LLM APIs using the OpenAI format (Bedrock, Hugging Face, VertexAI, TogetherAI, Azure, OpenAI, Groq, etc.). You can learn more about LiteLLM here.
There are two main approaches to using LiteLLM: the `litellm` Python library, which queries the LLM API for you, or the LiteLLM proxy server. In this cookbook we will focus on the first approach, but you can learn more about using Opik with the LiteLLM proxy server in our documentation.
Creating an account on Comet.com
Comet provides a hosted version of the Opik platform. Simply create an account and grab your API key.
You can also run the Opik platform locally, see the installation guide for more information.
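Assuming you have installed the `opik` package (for example with `pip install opik`), a minimal sketch of the configuration step looks like this:

```python
import opik

# Prompts for your Comet API key on first run;
# pass use_local=True instead if you are running Opik locally.
opik.configure(use_local=False)
```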
Preparing our environment
In order to use LiteLLM, we will configure the OpenAI API key; if you are using another provider, you can replace this with the API key it requires:
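For example, one way to set the key only when it is not already present in the environment, using the standard library:

```python
import os
import getpass

# Prompt for the key if it isn't already set; replace OPENAI_API_KEY
# with your provider's variable if you are not using OpenAI.
if "OPENAI_API_KEY" not in os.environ:
    os.environ["OPENAI_API_KEY"] = getpass.getpass("Enter your OpenAI API key: ")
```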
Logging traces
In order to log traces to Opik, you will need to set the `opik` callback:
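A minimal sketch, assuming a recent `litellm` release that ships the Opik integration (the import path below may differ between versions):

```python
import litellm
from litellm.integrations.opik.opik import OpikLogger

# Register the Opik logger so LiteLLM reports every call to Opik
opik_logger = OpikLogger()
litellm.callbacks = [opik_logger]
```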
Every LiteLLM call will now be logged to Opik:
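For example, a standard completion call; the model name and prompt here are illustrative:

```python
response = litellm.completion(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "user", "content": "Why is tracking and evaluation of LLMs important?"}
    ],
)
print(response.choices[0].message.content)
```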
The trace will now be viewable in the Opik platform.
Logging LLM calls within a tracked function
If you are using LiteLLM within a function tracked with the `@track` decorator, you will need to pass the `current_span_data` as metadata to the `litellm.completion` call:
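A sketch of the pattern, using the `get_current_span_data` helper from `opik.opik_context`; the function name, prompt, and model are placeholders:

```python
import litellm
from opik import track
from opik.opik_context import get_current_span_data

@track
def generate_story(prompt):
    # Passing the current span data links this LLM call to the
    # surrounding tracked function's trace in Opik
    response = litellm.completion(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": prompt}],
        metadata={
            "opik": {
                "current_span_data": get_current_span_data(),
            },
        },
    )
    return response.choices[0].message.content

print(generate_story("Write a one-sentence story about observability."))
```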