LiteLLMChatModel

class opik.evaluation.models.LiteLLMChatModel(model_name: str = 'gpt-3.5-turbo', must_support_arguments: List[str] | None = None, **completion_kwargs: Any)

Bases: OpikBaseModel

__init__(model_name: str = 'gpt-3.5-turbo', must_support_arguments: List[str] | None = None, **completion_kwargs: Any) None

Initializes the model with a given model name. Wraps the litellm.completion function. All supported completion_kwargs parameters are listed here: https://docs.litellm.ai/docs/completion/input.

Parameters:
  • model_name – The name of the LLM model to be used.

  • must_support_arguments – A list of arguments that the provider must support. The supported arguments are obtained via litellm.get_supported_openai_params(model_name); if any required argument is missing, a ValueError is raised.

  • **completion_kwargs – Keyword arguments that are always passed through to the litellm.completion function.

property supported_params: Set[str]
generate_string(input: str, **kwargs: Any) str

Simplified interface to generate a string output from the model. All supported completion_kwargs parameters are listed here: https://docs.litellm.ai/docs/completion/input.

Parameters:
  • input – The input string based on which the model will generate the output.

  • kwargs – Additional arguments that may be used by the model for string generation.

Returns:

The generated string output.

Return type:

str

generate_provider_response(**kwargs: Any) ModelResponse

Generate a provider-specific response. Can be used to interface with the underlying model provider (e.g., OpenAI, Anthropic) and get raw output. You can find all possible input parameters here: https://docs.litellm.ai/docs/completion/input

Parameters:

kwargs – Arguments required by the provider to generate a response.

Returns:

The litellm ModelResponse object returned by the provider.

Return type:

ModelResponse

async agenerate_string(input: str, **kwargs: Any) str

Simplified interface to generate a string output from the model. Async version of generate_string. All supported input parameters are listed here: https://docs.litellm.ai/docs/completion/input.

Parameters:
  • input – The input string based on which the model will generate the output.

  • kwargs – Additional arguments that may be used by the model for string generation.

Returns:

The generated string output.

Return type:

str

async agenerate_provider_response(**kwargs: Any) ModelResponse

Generate a provider-specific response. Can be used to interface with the underlying model provider (e.g., OpenAI, Anthropic) and get raw output. Async version of generate_provider_response. All supported input parameters are listed here: https://docs.litellm.ai/docs/completion/input.

Parameters:

kwargs – Arguments required by the provider to generate a response.

Returns:

The litellm ModelResponse object returned by the provider.

Return type:

ModelResponse