
Integrate with LangChain¶

To monitor your LLM applications, you can use the Opik platform by Comet.

[Image: Opik screenshot]

Using the LangChain integration¶

You can learn more about Opik's integration with LangChain here.

Logging all your LangChain calls to the platform is as simple as enabling the OpikTracer:

from langchain.chains import LLMChain
from langchain_openai import OpenAI
from langchain.prompts import PromptTemplate
from opik.integrations.langchain import OpikTracer

# Initialize the tracer
opik_tracer = OpikTracer()

# Create the LLM Chain using LangChain
llm = OpenAI(temperature=0)

prompt_template = PromptTemplate(
    input_variables=["input"],
    template="Translate the following text to French: {input}"
)

llm_chain = LLMChain(llm=llm, prompt=prompt_template).with_config({"callbacks": [opik_tracer]})

# Generate the translation (with_config returns a Runnable, so use invoke)
translation = llm_chain.invoke({"input": "Hello, how are you?"})
print(translation["text"])
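
Because the OpikTracer is a standard LangChain callback handler, it can also be attached to LangChain Expression Language (LCEL) runnables by passing it through the runnable config. The following is a minimal sketch that reuses the prompt, model, and tracer defined above; treat it as illustrative rather than the canonical Opik usage.

# Compose the same prompt and model into an LCEL pipeline
chain = prompt_template | llm

# Pass the OpikTracer in the config so this run is logged to Opik
result = chain.invoke(
    {"input": "Hello, how are you?"},
    config={"callbacks": [opik_tracer]},
)
print(result)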