Integrations
Confident AI supports a wide range of integrations to seamlessly fit into your existing workflow. Whether you’re using native SDKs or third-party frameworks, we’ve got you covered.
Overview
Integrations come in handy for two different workflows:
- Tracing your LLM app, and
- Running evals in development
All integrations support tracing. For running evals in development, check each integration's documentation page to see whether it supports end-to-end, component-level, and multi-turn evals.
OpenTelemetry (OTEL)
OpenTelemetry is an open-source, vendor-neutral observability framework. Confident AI natively accepts OTEL traces at https://otel.confident-ai.com, making it the best option for teams that:
- Use any programming language (not just Python/TypeScript)
- Already have an OTEL-based observability stack
- Want a standards-based approach to tracing with no vendor lock-in
- Need distributed tracing across multiple services
Full quickstart and attribute reference for exporting OTEL traces to Confident AI.
Correlate traces across multiple services in a distributed system.
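If you already emit OTEL traces, exporting them to Confident AI is typically a matter of pointing your OTLP exporter at the endpoint above. A minimal sketch using the standard OpenTelemetry exporter environment variables — the authentication header name here is an assumption, so check the quickstart for the exact value:

```shell
# Point any OTLP-compatible exporter at Confident AI's OTEL endpoint.
export OTEL_EXPORTER_OTLP_ENDPOINT="https://otel.confident-ai.com"

# Authentication header is illustrative; see the quickstart for the real name.
export OTEL_EXPORTER_OTLP_HEADERS="x-confident-api-key=<YOUR_API_KEY>"
```

Because these variables are part of the OTEL specification, this works regardless of which language SDK or collector you export from.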
Third-Party Integrations
Auto-instrument your LLM application with one-line integrations for popular frameworks and providers.
Third-party integrations are currently Python-only, with TypeScript support coming soon.
Chat Completions and Responses APIs.
Framework for building AI applications.
Type-safe agent framework for Python.
Graph-based framework for stateful AI applications.
OpenAI’s agent framework for intelligent assistants.
Data framework for RAG systems and knowledge agents.
Multi-agent orchestration for collaborative AI workflows.
The TypeScript AI SDK by Vercel for all AI-based applications.
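To make "one-line integration" concrete, here is a self-contained sketch of the mechanism such integrations generally rely on: patching a client method so every call through it records a span. Everything below (`FakeClient`, `instrument`, the span fields) is a hypothetical illustration of the pattern, not Confident AI's actual API:

```python
import functools
import time

# Spans captured by the instrumentation; a real integration would
# export these to a tracing backend instead of a list.
captured_spans = []


class FakeClient:
    """Stand-in for an LLM client (hypothetical, for illustration only)."""

    def complete(self, prompt: str) -> str:
        return f"echo: {prompt}"


def instrument(cls):
    """Wrap the client's completion method so each call records a span."""
    original = cls.complete

    @functools.wraps(original)
    def traced(self, prompt):
        start = time.time()
        output = original(self, prompt)
        captured_spans.append({
            "name": "complete",
            "input": prompt,
            "output": output,
            "duration_s": time.time() - start,
        })
        return output

    cls.complete = traced


instrument(FakeClient)  # the "one line" an integration asks you to add

client = FakeClient()
client.complete("hello")  # normal usage, now traced transparently
```

The key design point is that your application code does not change: the integration swaps in a wrapped method, so inputs, outputs, and latency are captured on every call without manual span management.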
LLM Gateways
If you use an LLM gateway or proxy, these integrations automatically capture traces across all the providers you route through.
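As a rough illustration of the pattern: most gateways expose an OpenAI-compatible endpoint, so you point your SDK's base URL at the gateway instead of the provider. The URL below is a placeholder, and whether a given gateway forwards traces to Confident AI depends on that integration's documentation:

```shell
# Route OpenAI SDK traffic through your gateway (placeholder URL).
export OPENAI_BASE_URL="https://your-gateway.example.com/v1"
export OPENAI_API_KEY="<key issued by your gateway>"
```

Because the gateway sits between your app and every provider it routes to, a single integration point captures traces for all downstream models.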