LiteLLM
Use Confident AI for LLM observability and evals with LiteLLM
Overview
LiteLLM lets developers connect to and call many different large language models (LLMs), including those from OpenAI, Anthropic, and Hugging Face, through a single unified interface.
Tracing Quickstart
Confident AI integrates with LiteLLM to trace your LLM calls in the Observatory, whether you call models through the LiteLLM SDK or the LiteLLM Proxy.
Setup via SDK
Install the LiteLLM SDK:
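```bash
pip install litellm
```

Once the SDK is installed, tracing is configured on the client side. Below is a minimal sketch, assuming the Confident AI logger is registered through LiteLLM's success/failure callback mechanism under the name "deepeval" and that your Confident AI API key is available as the CONFIDENT_API_KEY environment variable; confirm both against the latest LiteLLM and Confident AI documentation.

```python
import litellm

# Assumes CONFIDENT_API_KEY and OPENAI_API_KEY are already set in your environment.
# "deepeval" is the assumed name of the Confident AI callback -- confirm the exact
# string in the Confident AI / LiteLLM docs.
litellm.success_callback = ["deepeval"]
litellm.failure_callback = ["deepeval"]

# Every litellm.completion() call is now traced to the Observatory.
response = litellm.completion(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "What is LiteLLM?"}],
)
print(response.choices[0].message.content)
```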
Setup via Proxy
There are two ways to start the LiteLLM proxy server from the CLI: with the pip package or with the Docker container; both are shown in the sketch below. Refer to the LiteLLM documentation for more information.
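The sketch below assumes a config.yaml that lists your models and registers the Confident AI callback under litellm_settings; the Docker image tag and the CONFIDENT_API_KEY environment variable name are assumptions to verify against the LiteLLM documentation.

```bash
# Option 1: start the proxy with the pip package
pip install 'litellm[proxy]'
litellm --config config.yaml --port 4000

# Option 2: start the proxy with the Docker container
# (image tag assumed; check the LiteLLM docs for the current one)
docker run -p 4000:4000 \
  -v "$(pwd)/config.yaml:/app/config.yaml" \
  -e CONFIDENT_API_KEY="$CONFIDENT_API_KEY" \
  ghcr.io/berriai/litellm:main-latest --config /app/config.yaml
```

In both cases, LLM calls routed through the proxy (port 4000 by default) are traced to the Observatory.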