Using Opik with Gemini

Opik integrates with Gemini to provide a simple way to log traces for all Gemini LLM calls, and it works across all Gemini models. In this guide, the calls are routed through LiteLLM, which ships an Opik callback for logging.

Creating an account on Comet.com

Comet provides a hosted version of the Opik platform: simply create an account and grab your API key.

You can also run the Opik platform locally, see the installation guide for more information.

%pip install --upgrade opik google-generativeai litellm
import opik

opik.configure(use_local=False)
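
If you are running the Opik platform locally instead, point the SDK at your local deployment; a minimal sketch, assuming a default local install:

# Use a locally running Opik deployment instead of Comet.com
opik.configure(use_local=True)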

Preparing our environment

First, we will set up our Gemini API key.

import os
import getpass

# LiteLLM reads the key from the GEMINI_API_KEY environment variable
if "GEMINI_API_KEY" not in os.environ:
    os.environ["GEMINI_API_KEY"] = getpass.getpass("Enter your Gemini API key: ")

Configure LiteLLM

Add the LiteLLM OpikLogger callback to log traces and steps to Opik:

import litellm
import os
from litellm.integrations.opik.opik import OpikLogger
from opik import track
from opik.opik_context import get_current_span_data

# Send all traces to a dedicated Opik project
os.environ["OPIK_PROJECT_NAME"] = "gemini-integration-demo"
opik_logger = OpikLogger()
litellm.callbacks = [opik_logger]
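
LiteLLM can also register the callback by name rather than by instance. As a sketch, assuming your LiteLLM version ships the built-in "opik" callback:

# Equivalent setup using the callback's registered name
litellm.callbacks = ["opik"]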

Logging traces

Each completion will now log a separate trace to Opik:

prompt = """
Write a short two sentence story about Opik.
"""

response = litellm.completion(
    model="gemini/gemini-pro",
    messages=[{"role": "user", "content": prompt}],
)

print(response.choices[0].message.content)

The prompt and response messages are automatically logged to Opik and can be viewed in the UI.
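
If you want to enrich a trace beyond the messages, the same metadata mechanism used later in this guide accepts extra fields under the "opik" key. A minimal sketch, assuming the integration supports a tags field (the tag values here are illustrative, and the prompt is reused from above):

response = litellm.completion(
    model="gemini/gemini-pro",
    messages=[{"role": "user", "content": prompt}],
    # Extra fields under the "opik" key are attached to the logged trace
    metadata={
        "opik": {
            "tags": ["gemini", "demo"],
        },
    },
)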


Using it with the track decorator

If you have multiple steps in your LLM pipeline, you can use the track decorator to log the traces for each step. If Gemini is called within one of these steps, the LLM call will be associated with the corresponding step:

@track
def generate_story(prompt):
    # Passing the current span links this LLM call to the enclosing step
    response = litellm.completion(
        model="gemini/gemini-pro",
        messages=[{"role": "user", "content": prompt}],
        metadata={
            "opik": {
                "current_span_data": get_current_span_data(),
            },
        },
    )
    return response.choices[0].message.content


@track
def generate_topic():
    prompt = "Generate a topic for a story about Opik."
    response = litellm.completion(
        model="gemini/gemini-pro",
        messages=[{"role": "user", "content": prompt}],
        metadata={
            "opik": {
                "current_span_data": get_current_span_data(),
            },
        },
    )
    return response.choices[0].message.content


@track
def generate_opik_story():
    topic = generate_topic()
    story = generate_story(topic)
    return story


generate_opik_story()
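
Passing get_current_span_data() through the LiteLLM metadata is what ties each completion to the enclosing tracked function, so generate_topic and generate_story appear as nested steps inside the single generate_opik_story trace.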

The trace can now be viewed in the UI.
