Using Opik with Ollama

Ollama allows users to run, interact with, and deploy AI models locally on their machines without the need for complex infrastructure or cloud dependencies.

In this notebook, we will showcase how to log Ollama LLM calls to Opik using either the OpenAI or LangChain libraries.

Getting started

Configure Ollama

In order to interact with Ollama from Python, we will need to have Ollama running on our machine. You can learn more about how to install and run Ollama in the quickstart guide.
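
Before continuing, you can sanity-check that the server is reachable. A minimal sketch, assuming Ollama's default local endpoint (http://localhost:11434):

import urllib.request

# Ollama's root URL replies with "Ollama is running" when the server is up
# (assumes the default local endpoint and port)
with urllib.request.urlopen("http://localhost:11434") as response:
    print(response.read().decode())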

Configuring Opik

Opik is available as a fully open source local installation or as a hosted solution on Comet.com. The easiest way to get started with Opik is by creating a free Comet account at comet.com.

If you’d like to self-host Opik, you can learn more about the self-hosting options here.

In addition, you will need to install and configure the Opik Python package:

%pip install --upgrade --quiet opik

import opik

opik.configure()
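
If you are self-hosting instead, the SDK can be pointed at your local deployment. A minimal sketch, assuming a default local install:

import opik

# Point the SDK at a self-hosted Opik instance rather than Comet.com
opik.configure(use_local=True)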

Tracking Ollama calls made with OpenAI

Ollama is compatible with the OpenAI API format and can be used with the OpenAI Python library. You can therefore leverage the Opik integration for OpenAI to trace your Ollama calls:

from openai import OpenAI
from opik.integrations.openai import track_openai

import os

os.environ["OPIK_PROJECT_NAME"] = "ollama-integration"

# Create an OpenAI client pointed at the local Ollama server
client = OpenAI(
    base_url="http://localhost:11434/v1/",
    # required but ignored by Ollama
    api_key="ollama",
)

# Log all traces made with the OpenAI client to Opik
client = track_openai(client)

# Call the local Ollama model using the OpenAI client
chat_completion = client.chat.completions.create(
    messages=[
        {
            "role": "user",
            "content": "Say this is a test",
        }
    ],
    model="llama3.1",
)

print(chat_completion.choices[0].message.content)

Your LLM call is now traced and logged to the Opik platform.
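
If your application makes several calls that you would like grouped under a single trace, you can wrap them in a function decorated with Opik's track decorator. A minimal sketch reusing the tracked client from above (the translate function and its prompt are illustrative, not part of the integration):

from opik import track

@track
def translate(text: str) -> str:
    # The tracked client call below is logged as a span nested under
    # the trace created by the decorator
    response = client.chat.completions.create(
        model="llama3.1",
        messages=[{"role": "user", "content": f"Translate to French: {text}"}],
    )
    return response.choices[0].message.content

print(translate("I love programming."))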

Tracking Ollama calls made with LangChain

In order to trace Ollama calls made with LangChain, you will first need to install the langchain-ollama package:

%pip install --quiet --upgrade langchain-ollama

You will now be able to use the OpikTracer class to log all your Ollama calls made with LangChain to Opik:

from langchain_ollama import ChatOllama
from opik.integrations.langchain import OpikTracer

# Create the Opik tracer
opik_tracer = OpikTracer(tags=["langchain", "ollama"])

# Create the Ollama model and configure it to use the Opik tracer
llm = ChatOllama(
    model="llama3.1",
    temperature=0,
).with_config({"callbacks": [opik_tracer]})

# Call the Ollama model
messages = [
    (
        "system",
        "You are a helpful assistant that translates English to French. Translate the user sentence.",
    ),
    (
        "human",
        "I love programming.",
    ),
]
ai_msg = llm.invoke(messages)
ai_msg
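
The tracer also works for composed chains. A minimal sketch, assuming a simple prompt-to-model chain built with langchain_core (the prompt and its variables are illustrative); passing the tracer in the invoke config logs the whole chain run, prompt formatting plus model call, as a single trace:

from langchain_ollama import ChatOllama
from langchain_core.prompts import ChatPromptTemplate

prompt = ChatPromptTemplate.from_messages(
    [
        ("system", "You are a helpful assistant that translates English to {language}."),
        ("human", "{text}"),
    ]
)
chain = prompt | ChatOllama(model="llama3.1", temperature=0)

# Passing the tracer at invoke time traces every step of the chain
result = chain.invoke(
    {"language": "Spanish", "text": "I love programming."},
    config={"callbacks": [opik_tracer]},
)
print(result.content)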

You can now go to the Opik app to see the trace:

[Screenshot: Ollama trace in Opik]
