Using Opik with Predibase

This notebook demonstrates how to use Predibase as an LLM provider with LangChain, and how to integrate Opik for tracking and logging.

Setup

First, let’s install the necessary packages and set up our environment variables.

```python
%pip install --upgrade --quiet predibase opik
```

We will now configure Opik and Predibase:

```python
# Configure Opik
import opik
import os
import getpass

opik.configure(use_local=False)

# Configure Predibase
os.environ["PREDIBASE_API_TOKEN"] = getpass.getpass("Enter your Predibase API token")
```
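An empty `getpass` entry would silently leave the token blank and cause confusing failures later. As an optional guard, you can fail fast before any Predibase call; the `require_env` helper below is our own sketch, not part of the Opik or Predibase SDKs:

```python
import os

def require_env(name: str) -> str:
    """Return the named environment variable, failing fast if it is unset or empty."""
    value = os.environ.get(name)
    if not value:
        raise RuntimeError(f"{name} is not set; export it before running this notebook")
    return value

# Example: validate the token right after entering it.
# token = require_env("PREDIBASE_API_TOKEN")
```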

Creating the Opik Tracer

To log traces to Opik, we will use the OpikTracer from Opik's LangChain integration.

```python
# Import the Opik tracer
from opik.integrations.langchain import OpikTracer

# Initialize the Opik tracer
opik_tracer = OpikTracer(
    tags=["predibase", "langchain"],
)
```
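Under the hood, the tracer is a LangChain callback handler that records a trace for each run it observes; those traces are what `created_traces()` returns later in this notebook. The toy `RecordingTracer` below is a hypothetical, dependency-free illustration of that bookkeeping, not the real OpikTracer implementation:

```python
import uuid
from typing import List

class RecordingTracer:
    """Toy stand-in for OpikTracer: remembers an ID per observed run."""

    def __init__(self, tags: List[str]):
        self.tags = tags
        self._trace_ids: List[str] = []

    def on_chain_start(self) -> str:
        # The real tracer receives rich callback payloads; we only mint an ID here.
        trace_id = str(uuid.uuid4())
        self._trace_ids.append(trace_id)
        return trace_id

    def created_traces(self) -> List[str]:
        return list(self._trace_ids)

tracer = RecordingTracer(tags=["predibase", "langchain"])
tracer.on_chain_start()
print(len(tracer.created_traces()))  # one trace recorded so far
```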

Initial Call

Let’s set up our Predibase model and make an initial call.

```python
from langchain_community.llms import Predibase
import os

model = Predibase(
    model="mistral-7b",
    predibase_api_key=os.environ.get("PREDIBASE_API_TOKEN"),
)

# Test the model with Opik tracing
response = model.invoke(
    "Can you recommend me a nice dry wine?",
    config={"temperature": 0.5, "max_new_tokens": 1024, "callbacks": [opik_tracer]},
)
print(response)
```

In addition to passing the OpikTracer to the invoke method, you can also attach it when creating the Predibase object:

```python
model = Predibase(
    model="mistral-7b",
    predibase_api_key=os.environ.get("PREDIBASE_API_TOKEN"),
).with_config({"callbacks": [opik_tracer]})
```

SequentialChain

Now, let’s create a more complex chain and run it with Opik tracing.

```python
from langchain.chains import LLMChain, SimpleSequentialChain
from langchain_core.prompts import PromptTemplate

# Synopsis chain
template = """You are a playwright. Given the title of a play, it is your job to write a synopsis for that title.

Title: {title}
Playwright: This is a synopsis for the above play:"""
prompt_template = PromptTemplate(input_variables=["title"], template=template)
synopsis_chain = LLMChain(llm=model, prompt=prompt_template)

# Review chain
template = """You are a play critic from the New York Times. Given the synopsis of a play, it is your job to write a review for that play.

Play Synopsis:
{synopsis}
Review from a New York Times play critic of the above play:"""
prompt_template = PromptTemplate(input_variables=["synopsis"], template=template)
review_chain = LLMChain(llm=model, prompt=prompt_template)

# Overall chain
overall_chain = SimpleSequentialChain(
    chains=[synopsis_chain, review_chain], verbose=True
)

# Run the chain with Opik tracing
review = overall_chain.run("Tragedy at sunset on the beach", callbacks=[opik_tracer])
print(review)
```
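Conceptually, SimpleSequentialChain just feeds each step's output into the next step's input. The plain-Python sketch below (a hypothetical `run_sequential` helper, no LangChain involved) shows that control flow:

```python
from typing import Callable, List

def run_sequential(steps: List[Callable[[str], str]], text: str) -> str:
    # Each step receives the previous step's output, as in SimpleSequentialChain.
    for step in steps:
        text = step(text)
    return text

synopsis_step = lambda title: f"Synopsis of '{title}'"
review_step = lambda synopsis: f"Review of: {synopsis}"
print(run_sequential([synopsis_step, review_step], "Tragedy at sunset on the beach"))
# → Review of: Synopsis of 'Tragedy at sunset on the beach'
```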

Accessing Logged Traces

We can access the trace IDs collected by the Opik tracer.

```python
traces = opik_tracer.created_traces()
print("Collected trace IDs:", [trace.id for trace in traces])

# Flush traces to ensure all data is logged
opik_tracer.flush()
```

Fine-tuned LLM Example

Finally, let’s use a fine-tuned model with Opik tracing.

Note: To use a fine-tuned model, you will need access to the model and the correct model ID. The code below will raise a NotFoundError unless the model and adapter_id are updated.

```python
fine_tuned_model = Predibase(
    model="my-base-LLM",
    predibase_api_key=os.environ.get("PREDIBASE_API_TOKEN"),
    predibase_sdk_version=None,
    adapter_id="my-finetuned-adapter-id",
    adapter_version=1,
    **{
        "api_token": os.environ.get("HUGGING_FACE_HUB_TOKEN"),
        "max_new_tokens": 5,
    },
)

# Configure the Opik tracer
fine_tuned_model = fine_tuned_model.with_config({"callbacks": [opik_tracer]})

# Invoke the fine-tuned model
response = fine_tuned_model.invoke(
    "Can you help categorize the following emails into positive, negative, and neutral?",
    **{"temperature": 0.5, "max_new_tokens": 1024},
)
print(response)

# Final flush to ensure all traces are logged
opik_tracer.flush()
```