# LiteLLM Dependencies

The LiteLLM package provides dependency resolvers that integrate large language models (LLMs) into Rustic AI agents through the LiteLLM library, giving agents consistent access to models from many providers.
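
For context, that consistency comes from LiteLLM itself, which exposes one completion call across providers. The sketch below uses the `litellm.completion` API directly; the model names are illustrative, and the corresponding provider keys (e.g. `OPENAI_API_KEY`, `ANTHROPIC_API_KEY`) are expected in the environment.

```python
from litellm import completion

messages = [{"role": "user", "content": "Say hello in one short sentence."}]

# The call shape stays the same across providers; only the model string changes.
openai_reply = completion(model="gpt-4o-mini", messages=messages)
anthropic_reply = completion(model="anthropic/claude-3-5-sonnet-20240620", messages=messages)

# LiteLLM normalizes responses to an OpenAI-style shape.
print(openai_reply.choices[0].message.content)
print(anthropic_reply.choices[0].message.content)
```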

## Available Resolvers

- `LiteLLMResolver` - Unified access to LLMs from OpenAI, Anthropic, Google, and other providers

## Usage

LiteLLM dependencies simplify working with large language models by exposing a consistent interface across LLM providers, so agents can use and switch between models without changing their code.
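
As an illustration of how the resolver might be wired into an agent, the snippet below shows a plain dependency map. This is a minimal sketch, not the canonical Rustic AI API: the dependency key, class path, and property names are assumptions made for illustration; consult the resolver documentation for the exact spelling.

```python
# Hypothetical dependency declaration for an agent or guild spec.
# The key ("llm"), the class path, and the "model" property are assumptions.
agent_dependencies = {
    "llm": {
        "class_name": "rustic_ai.litellm.LiteLLMResolver",  # assumed module path
        "properties": {"model": "gpt-4o-mini"},              # assumed property name
    }
}

# Swapping providers would then be a configuration-only change, e.g.:
agent_dependencies["llm"]["properties"]["model"] = "anthropic/claude-3-5-sonnet-20240620"
```

Because the resolver sits behind a single configuration entry, changing the model string is enough to switch providers; the agent code that consumes the dependency stays the same.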

For details on how to configure and use each resolver, refer to the documentation linked above.