Integrations

Confident AI supports a wide range of integrations to seamlessly fit into your existing workflow. Whether you’re using native SDKs or third-party frameworks, we’ve got you covered.

Overview

Integrations come in handy for two different workflows:

  1. When you wish to trace your LLM app, and
  2. When you wish to run evals in development

All integrations support the former. For running evals in development, check each integration's documentation page to see whether it supports end-to-end, component-level, and multi-turn evals.

OpenTelemetry (OTEL)

OpenTelemetry is an open-source, vendor-neutral observability framework. Confident AI natively accepts OTEL traces at https://otel.confident-ai.com, making it the best option for teams that:

  • Use any programming language (not just Python/TypeScript)
  • Already have an OTEL-based observability stack
  • Want a standards-based approach to tracing with no vendor lock-in
  • Need distributed tracing across multiple services
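Because the collector endpoint speaks standard OTLP, any OTEL SDK can be pointed at it through the standard exporter environment variables rather than vendor-specific code. A minimal sketch (the API-key header name is an assumption for illustration; check your project settings for the exact header):

```shell
# Point any OTEL SDK's OTLP exporter at Confident AI's collector.
export OTEL_EXPORTER_OTLP_ENDPOINT="https://otel.confident-ai.com"

# Authenticate with your project's API key. The header name below is an
# assumption -- consult your integration's docs for the exact header to use.
export OTEL_EXPORTER_OTLP_HEADERS="x-confident-api-key=<YOUR_API_KEY>"
```

With these variables set, most OTEL SDKs will export traces to the endpoint automatically, with no language-specific configuration required.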

Third-Party Integrations

Auto-instrument your LLM application with one-line integrations for popular frameworks and providers.

Only Python is currently supported for third-party integrations; TypeScript support is coming soon.

LLM Gateways

If you use an LLM gateway or proxy, these integrations automatically capture traces across all the providers you route through.