Track Agents
When working with agents, it can become challenging to track the flow of the agent and its interactions with the environment. Opik provides a way to track both the agent definition and its flow.
Opik includes integrations with many popular agent frameworks (LangGraph, LlamaIndex) and can also be used to log agents manually using the @track decorator.
We are working on improving Opik's support for agent workflows. If you have any ideas or suggestions for the roadmap, you can create a new Feature Request issue in the Opik GitHub repo or book a call with the Opik team.
Track agent execution
You can track the agent execution by using either one of Opik's integrations or the @track decorator:
- LangGraph
- Haystack
- LlamaIndex
- Manual Tracking
You can log the agent execution by using the OpikTracer callback:

```python
from langchain_core.messages import HumanMessage
from opik.integrations.langchain import OpikTracer

# Create your LangGraph graph
graph = ...
app = graph.compile(...)

opik_tracer = OpikTracer(graph=app.get_graph(xray=True))

# Pass the OpikTracer callback to the Graph.stream function
for s in app.stream({"messages": [HumanMessage(content=QUESTION)]},
                    config={"callbacks": [opik_tracer]}):
    print(s)

# Pass the OpikTracer callback to the Graph.invoke function
result = app.invoke({"messages": [HumanMessage(content=QUESTION)]},
                    config={"callbacks": [opik_tracer]})
```
The OpikTracer callback can be passed to both the stream and invoke methods.
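To make the callback mechanism concrete, here is a toy pure-Python sketch of how a tracer registered in `config["callbacks"]` gets to observe every streamed step. This is illustrative only; it is not LangGraph's or Opik's actual implementation, and the names `ToyTracer` and `toy_stream` are hypothetical:

```python
# Illustrative only: a toy tracer and stream loop showing how a callback
# registered in config["callbacks"] sees each step of a graph run.
class ToyTracer:
    def __init__(self):
        self.steps = []

    def on_step(self, step):
        # A real tracer would forward this step to the Opik platform;
        # here we simply record it in memory.
        self.steps.append(step)


def toy_stream(steps, config):
    # Notify every registered callback before yielding each step.
    for step in steps:
        for callback in config.get("callbacks", []):
            callback.on_step(step)
        yield step


tracer = ToyTracer()
outputs = list(toy_stream(["plan", "search", "answer"],
                          config={"callbacks": [tracer]}))
print(tracer.steps)  # ['plan', 'search', 'answer']
```

Because the tracer sees each step as it happens, it can log the run without the graph code needing any Opik-specific changes.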
To log a Haystack pipeline run, you can use the OpikConnector. This connector will log the pipeline run to the Opik platform and add a tracer key to the pipeline run response with the trace ID:
```python
import os

os.environ["HAYSTACK_CONTENT_TRACING_ENABLED"] = "true"

from haystack import Pipeline
from haystack.components.builders import ChatPromptBuilder
from haystack.components.generators.chat import OpenAIChatGenerator
from haystack.dataclasses import ChatMessage
from opik.integrations.haystack import OpikConnector

pipe = Pipeline()

# Add the OpikConnector component to the pipeline
pipe.add_component("tracer", OpikConnector("Chat example"))

# Add other pipeline components

# Run the pipeline
response = pipe.run(...)
print(response)
```
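Once the pipeline has run, the trace ID can be read from the tracer key of the response. The response shape below is a simplified stand-in for illustration; check the object your pipeline actually returns:

```python
# Simulated pipeline response: a real Haystack run returns component
# outputs plus the "tracer" entry added by OpikConnector. The exact
# nesting of the other keys here is an assumption for illustration.
response = {
    "llm": {"replies": ["Hello!"]},
    "tracer": {"trace_id": "trace-123"},
}

# Extract the trace ID that OpikConnector attached to the run
trace_id = response["tracer"]["trace_id"]
print(trace_id)  # trace-123
```

The trace ID is useful if you want to attach feedback scores or look up the trace later in the Opik UI.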
Opik has a built-in integration with LlamaIndex that makes it easy to track the agent execution:
```python
from llama_index.core import global_handler, set_global_handler

# Configure the Opik integration
set_global_handler("opik")
opik_callback_handler = global_handler
```
If you are not using any of the above integrations, you can track the agent execution manually using the @track decorator:
```python
import opik


@opik.track
def calculator_tool(input):
    pass


@opik.track
def search_tool(input):
    pass


@opik.track
def agent_graph(user_question):
    calculator_tool(user_question)
    search_tool(user_question)


agent_graph("What is Opik?")
```
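To see why nesting decorated functions is enough to reconstruct the flow, here is a toy decorator that records parent/child relationships the same way nested @opik.track calls build a trace tree. This is illustrative only, not Opik's implementation; `toy_track` and the module-level `trace` list are hypothetical:

```python
import functools

# Illustrative only: a toy tracking decorator showing how nested calls
# form a trace tree. Opik's real @track decorator reports spans to the
# Opik platform instead of appending to an in-memory list.
_stack = []
trace = []


def toy_track(fn):
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        # The currently executing decorated function is the parent span
        parent = _stack[-1] if _stack else None
        trace.append((fn.__name__, parent))
        _stack.append(fn.__name__)
        try:
            return fn(*args, **kwargs)
        finally:
            _stack.pop()
    return wrapper


@toy_track
def calculator_tool(q):
    pass


@toy_track
def agent_graph(q):
    calculator_tool(q)


agent_graph("What is Opik?")
print(trace)  # [('agent_graph', None), ('calculator_tool', 'agent_graph')]
```

Each entry records which function ran and which decorated function called it, which is exactly the parent/child structure rendered in the trace sidebar.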
Once the agent has executed, you will be able to view the execution flow in the Opik dashboard. The trace sidebar shows each executed step in chronological order:
Track the agent definition
If you are using our LangGraph integration, you can also track the agent definition by passing the graph argument to the OpikTracer callback:
```python
from opik.integrations.langchain import OpikTracer

# Graph definition
graph = ...
app = graph.compile(...)

opik_tracer = OpikTracer(graph=app.get_graph(xray=True))
```
This allows you to view the agent definition in the Opik dashboard: