Introduction to LLM Tracing
Overview
Confident AI offers LLM tracing for teams to trace and monitor LLM applications. Think Datadog for LLM apps, but with an additional suite of 30+ evaluation metrics to continuously track performance over time.
Get Started
Get LLM tracing for your LLM app with best-in-class evals.
Start tracing your LLM applications now by following this short quickstart.
Run online and offline evaluations for your LLM application’s traces, spans and threads.
Track your LLM application’s cost and latency during execution.
Advanced Features
You can configure tracing on Confident AI in virtually any way you wish:
Set different trace environments for your app.
Configure the trace sampling rate.
Log any custom metadata with your traces.
Log any tags for better trace organization.
Log entire conversations (threads) for multi-turn tracing.
Mask PII for traces to protect sensitive data.
Track user identities or sessions for trace attribution.
Set span and component types for detailed tracing.
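Most of the options above can be driven by configuration at startup. The sketch below shows one common pattern: reading tracing options from environment variables into a config object. Note that the variable names (`TRACE_ENVIRONMENT`, `TRACE_SAMPLE_RATE`, `TRACE_MASK_PII`) are hypothetical placeholders for illustration, not Confident AI's documented settings — consult the relevant configuration pages for the exact names.

```python
import os

def load_tracing_config() -> dict:
    """Illustrative only: gather tracing options from environment
    variables. All variable names here are hypothetical placeholders,
    not Confident AI's actual settings."""
    return {
        # Trace environment, e.g. "production", "staging", "development"
        "environment": os.getenv("TRACE_ENVIRONMENT", "development"),
        # Sampling rate: fraction of traces to keep, between 0.0 and 1.0
        "sample_rate": float(os.getenv("TRACE_SAMPLE_RATE", "1.0")),
        # Whether to redact PII before traces leave the process
        "mask_pii": os.getenv("TRACE_MASK_PII", "NO") == "YES",
    }

config = load_tracing_config()
print(config["environment"])  # "development" unless overridden
```

Keeping these options in the environment (rather than hard-coded) lets you vary sampling and environments per deployment without touching application code.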
Integrations
You can also set up tracing via 1-line integrations.
Only Python is supported for integrations, with TypeScript support coming very soon.
Industry-standard observability framework for LLM monitoring.
Chat completion and responses APIs.
Framework for building AI applications.
Type-safe agent framework for Python.
Graph-based framework for stateful AI applications.
OpenAI’s agent framework for intelligent assistants.
Data framework for RAG systems and knowledge agents.
Unified API for 100+ LLM providers.
Multi-agent orchestration for collaborative AI workflows.
FAQs
What evals are offered by Confident AI LLM tracing?
You can run evaluations using metrics for RAG, agents, and chatbots on:
- Traces (end-to-end)
- Spans (individual components)
- Threads (multi-turn conversations)
These evals can be run either online (as traces are being ingested into the platform) or offline (retrospectively, on traces already stored).
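To make the online/offline distinction concrete, here is a minimal plain-Python sketch — not the Confident AI API — where an online metric scores a trace at ingestion time, while an offline eval scores stored traces after the fact:

```python
from typing import Callable, Optional

trace_store: list = []  # stands in for the platform's trace storage

def ingest_trace(trace: dict,
                 online_metric: Optional[Callable[[dict], float]] = None) -> None:
    """Online evaluation: score the trace as it is ingested, then store it."""
    if online_metric is not None:
        trace["score"] = online_metric(trace)
    trace_store.append(trace)

def run_offline_evals(metric: Callable[[dict], float]) -> list:
    """Offline evaluation: score previously stored traces retrospectively."""
    return [metric(t) for t in trace_store]

# A toy metric: did the output mention the expected keyword?
keyword_metric = lambda t: 1.0 if "paris" in t["output"].lower() else 0.0

ingest_trace({"input": "Capital of France?", "output": "Paris"},
             online_metric=keyword_metric)
ingest_trace({"input": "Capital of Japan?", "output": "Kyoto"})  # no online eval
scores = run_offline_evals(keyword_metric)
print(scores)  # [1.0, 0.0]
```

The trade-off mirrors the real feature: online evals give immediate feedback per trace, while offline evals let you apply new or updated metrics to historical data.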
How will tracing affect my app?
Confident AI tracing is designed to be completely non-intrusive to your application. It:
- Can be disabled/enabled at any time through the `CONFIDENT_TRACING_ENABLED="YES"/"NO"` environment variable.
- Requires no rewrite of your existing code - just add the `@observe` decorator.
- Runs asynchronously in the background with zero impact on latency.
- Fails silently if there are any issues, ensuring your app keeps running.
- Works with any function signature - you can set input/output at runtime.
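The properties above can be illustrated with a plain-Python stand-in for an observe-style decorator. This is a sketch only, not deepeval's actual `@observe` implementation: it honors an environment-variable kill switch, never alters the wrapped function's behavior, and swallows any tracing errors so the app keeps running.

```python
import functools
import os

def observe_sketch(fn):
    """Illustrative stand-in for a non-intrusive tracing decorator.
    NOT the real @observe implementation."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        # Honor the kill switch: tracing can be toggled via an env var.
        enabled = os.getenv("CONFIDENT_TRACING_ENABLED", "YES") == "YES"
        result = fn(*args, **kwargs)  # the wrapped function always runs
        if enabled:
            try:
                # Real tracing would serialize the span and export it on a
                # background thread to avoid adding latency; a dict stands
                # in for that here.
                span = {"name": fn.__name__, "output": result}
                _ = span  # placeholder for an async export
            except Exception:
                pass  # fail silently: tracing errors never break the app
        return result
    return wrapper

@observe_sketch
def generate(prompt: str) -> str:
    return f"echo: {prompt}"

print(generate("hi"))  # "echo: hi" -- behavior unchanged by tracing
```

Because the wrapper forwards `*args, **kwargs` and returns the original result unchanged, it works with any function signature, which is the same property the real decorator relies on.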
