Integrations

Confident AI supports a wide range of integrations to fit seamlessly into your existing workflow. Whether you’re using native SDKs or third-party frameworks, we’ve got you covered.

Overview

Integrations come in handy for two different workflows:

  1. When you wish to trace your LLM app, and
  2. When you wish to run evals in development.

All integrations support tracing. To run evals in development, check each integration’s documentation page to see whether it supports end-to-end, component-level, and multi-turn evals.
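
As a rough illustration of the second workflow, here is a minimal sketch of running an end-to-end eval in development with the deepeval SDK; the metric, test inputs, and threshold below are illustrative assumptions, not requirements, and the exact setup depends on the integration you use:

```python
from deepeval import evaluate
from deepeval.metrics import AnswerRelevancyMetric
from deepeval.test_case import LLMTestCase

# Hypothetical output from your LLM app -- swap in a real call to your pipeline.
test_case = LLMTestCase(
    input="What is the capital of France?",
    actual_output="Paris is the capital of France.",
)

# Run an end-to-end eval locally against the chosen metric.
evaluate(test_cases=[test_case], metrics=[AnswerRelevancyMetric(threshold=0.7)])
```

If you’re logged in via `deepeval login`, results from runs like this also appear on Confident AI as a test run.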

Native Integrations

Third-Party Integrations

LLM Gateways