OpikTracer

class opik.integrations.langchain.OpikTracer(tags: List[str] | None = None, metadata: Dict[str, Any] | None = None, graph: Graph | None = None, project_name: str | None = None, distributed_headers: DistributedTraceHeadersDict | None = None, **kwargs: Any)

Bases: BaseTracer

Opik tracer for LangChain. Logs LangChain runs as traces in Opik.

Parameters:
  • tags – List of tags to be applied to each trace logged by the tracer.

  • metadata – Additional metadata for each trace logged by the tracer.

  • graph – A LangGraph Graph object to track the Graph Definition in Opik.

  • project_name – The name of the project to which data is logged.

flush() → None

Flush to ensure all data is sent to the Opik server.

created_traces() → List[Trace]

Get a list of traces created by OpikTracer.

Returns:

A list of traces.

Return type:

List[Trace]

on_chat_model_start(serialized: Dict[str, Any], messages: List[List[BaseMessage]], *, run_id: UUID, tags: List[str] | None = None, parent_run_id: UUID | None = None, metadata: Dict[str, Any] | None = None, name: str | None = None, **kwargs: Any) → Run

Start a trace for an LLM run.

This method is duplicated from the LangChain tracer because it is disabled by default in all tracers; see https://github.com/langchain-ai/langchain/blob/fdda1aaea14b257845a19023e8af5e20140ec9fe/libs/core/langchain_core/callbacks/manager.py#L270-L289 and https://github.com/langchain-ai/langchain/blob/fdda1aaea14b257845a19023e8af5e20140ec9fe/libs/core/langchain_core/tracers/core.py#L168-L180

Parameters:
  • serialized – The serialized model.

  • messages – The messages.

  • run_id – The run ID.

  • tags – The tags. Defaults to None.

  • parent_run_id – The parent run ID. Defaults to None.

  • metadata – The metadata. Defaults to None.

  • name – The name. Defaults to None.

  • kwargs – Additional keyword arguments.

Returns:

The run.

Return type:

Run