
Groq

Groq provides fast AI inference for LLMs.

You can check out the Colab notebook if you'd like to jump straight to the code.

Getting Started

Configuring Opik

To start tracking your Groq LLM calls, you can use our LiteLLM integration. You'll need to have both the opik and litellm packages installed. You can install them using pip:

pip install opik litellm

In addition, you can configure Opik using the opik configure command, which will prompt you for your local server address or, if you are using the Cloud platform, your API key:

opik configure
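If you prefer to stay in Python (for example, in a notebook), the SDK exposes an equivalent configure helper. A minimal sketch:

import opik

# Prompts for your API key (Opik Cloud) or local server address,
# just like the `opik configure` CLI command
opik.configure()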
Info: If you're unable to use our LiteLLM integration with Groq, please open an issue.

Configuring Groq

In order to configure Groq, you will need to have:

  • Your Groq API key: you can create and manage your Groq API keys in the Groq console.

Once you have it, you can set it as an environment variable:

import os

os.environ["GROQ_API_KEY"] = ""  # Your Groq API key
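If you are working in a shared notebook, you may prefer not to hard-code the key. A small sketch using Python's standard getpass module:

import os
from getpass import getpass

# Prompt for the key only if it isn't already set in the environment
if "GROQ_API_KEY" not in os.environ:
    os.environ["GROQ_API_KEY"] = getpass("Enter your Groq API key: ")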

Logging LLM calls

To log LLM calls to Opik, create the OpikLogger callback and add it to LiteLLM's callbacks list. You can then call LiteLLM as you normally would:

from litellm.integrations.opik.opik import OpikLogger
import litellm

opik_logger = OpikLogger()
litellm.callbacks = [opik_logger]

response = litellm.completion(
    model="groq/llama3-8b-8192",
    messages=[
        {"role": "user", "content": "Why is tracking and evaluation of LLMs important?"}
    ],
)
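Because LiteLLM returns OpenAI-compatible response objects, the generated text is available on the first choice:

print(response.choices[0].message.content)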


Logging LLM calls within a tracked function

If you are using LiteLLM within a function tracked with the @track decorator, you will need to pass the current_span_data as metadata to the litellm.completion call:

from opik import track
from opik.opik_context import get_current_span_data
import litellm


@track
def generate_story(prompt):
    # Pass the current span so this call is nested under the tracked function
    response = litellm.completion(
        model="groq/llama3-8b-8192",
        messages=[{"role": "user", "content": prompt}],
        metadata={
            "opik": {
                "current_span_data": get_current_span_data(),
            },
        },
    )
    return response.choices[0].message.content


@track
def generate_topic():
    prompt = "Generate a topic for a story about Opik."
    response = litellm.completion(
        model="groq/llama3-8b-8192",
        messages=[{"role": "user", "content": prompt}],
        metadata={
            "opik": {
                "current_span_data": get_current_span_data(),
            },
        },
    )
    return response.choices[0].message.content


@track
def generate_opik_story():
    topic = generate_topic()
    story = generate_story(topic)
    return story


generate_opik_story()
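LiteLLM also supports streaming. A hedged sketch, assuming your litellm version supports stream=True for Groq models; the OpikLogger callback set up earlier is expected to log the call once the stream completes:

stream = litellm.completion(
    model="groq/llama3-8b-8192",
    messages=[{"role": "user", "content": "Summarize Opik in one sentence."}],
    stream=True,
)
for chunk in stream:
    # Chunks arrive in the OpenAI-compatible delta format; content may be None
    print(chunk.choices[0].delta.content or "", end="")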
