WatsonX
watsonx is a next-generation enterprise studio for AI builders to train, validate, tune and deploy AI models.
Getting Started
Configuring Opik
To start tracking your watsonx LLM calls, you can use our LiteLLM integration. You'll need to have both the `opik` and `litellm` packages installed. You can install them using pip:
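Something like the following should work (both packages are published on PyPI under these names):

```bash
pip install opik litellm
```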
In addition, you can configure Opik using the `opik configure` command, which will prompt you for the correct local server address or, if you are using the Cloud platform, for your API key:
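Run it once from your terminal and follow the prompts:

```bash
opik configure
```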
If you're unable to use our LiteLLM integration with watsonx, please open an issue.
Configuring watsonx
In order to configure watsonx, you will need to have:
- The endpoint URL: Documentation for this parameter can be found here
- Watsonx API Key: Documentation for this parameter can be found here
- Watsonx Token: Documentation for this parameter can be found here
- (Optional) Watsonx Project ID: Can be found in the Manage section of your project.
Once you have these, you can set them as environment variables:
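A minimal sketch, assuming LiteLLM's standard watsonx environment variable names (`WATSONX_URL`, `WATSONX_APIKEY`, `WATSONX_TOKEN` and, optionally, `WATSONX_PROJECT_ID`); fill in the values you collected above:

```python
import os

# Credentials read by LiteLLM's watsonx provider (variable names assumed here)
os.environ["WATSONX_URL"] = ""         # your watsonx endpoint URL
os.environ["WATSONX_APIKEY"] = ""      # your watsonx API key
os.environ["WATSONX_TOKEN"] = ""       # or an IAM token, as an alternative to the API key
os.environ["WATSONX_PROJECT_ID"] = ""  # optional: your watsonx project ID
```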
Logging LLM calls
In order to log the LLM calls to Opik, you will need to create the OpikLogger callback and add it to LiteLLM. Once that is done, you can make calls to LiteLLM as you normally would:
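A minimal sketch; the import path follows LiteLLM's Opik integration module, and the model name is only an example, so substitute any watsonx model you have access to:

```python
import litellm
from litellm.integrations.opik.opik import OpikLogger

# Create the OpikLogger callback and register it with LiteLLM
opik_logger = OpikLogger()
litellm.callbacks = [opik_logger]

# Calls made through LiteLLM are now logged to Opik
response = litellm.completion(
    model="watsonx/ibm/granite-13b-chat-v2",  # example model name
    messages=[
        {"role": "user", "content": "Why is tracking and evaluation of LLMs important?"}
    ],
)
print(response.choices[0].message.content)
```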

Logging LLM calls within a tracked function
If you are using LiteLLM within a function tracked with the `@track` decorator, you will need to pass the `current_span_data` as metadata to the `litellm.completion` call:
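A sketch of how this might look; the function name and model are illustrative, and `get_current_span_data` from `opik.opik_context` supplies the span that the LiteLLM call is attached to:

```python
import litellm
from litellm.integrations.opik.opik import OpikLogger
from opik import track
from opik.opik_context import get_current_span_data

opik_logger = OpikLogger()
litellm.callbacks = [opik_logger]

@track
def generate_story(prompt: str) -> str:
    # Pass the current span so the LiteLLM call is logged under this function's trace
    response = litellm.completion(
        model="watsonx/ibm/granite-13b-chat-v2",  # example model name
        messages=[{"role": "user", "content": prompt}],
        metadata={
            "opik": {
                "current_span_data": get_current_span_data(),
            },
        },
    )
    return response.choices[0].message.content

generate_story("Write a short story about tracing LLM calls.")
```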
