Dropping Traces

Conditionally dropping traces before they are sent to Confident AI

Overview

Dropping lets you silently discard a trace based on runtime conditions. Unlike sampling, which randomly drops a percentage of traces, this gives you full programmatic control over which traces are sent.

This is useful when you want to conditionally exclude traces — for example, skipping health checks, internal test requests, or traces that don’t meet certain criteria.
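The difference between the two can be made concrete: a rate-based sampler keeps a random fraction of traces without ever inspecting them, while a conditional drop is a deterministic decision per trace. The sketch below is generic, not deepeval's sampler implementation:

```python
import random

def keep_by_sampling(rate: float, rng: random.Random) -> bool:
    # Sampling: keep a trace with probability `rate`; content is never inspected.
    return rng.random() < rate

def keep_by_condition(query: str) -> bool:
    # Conditional dropping: the same query always yields the same decision.
    return "health" not in query.lower()
```

With sampling, two identical health-check requests may be kept or dropped at random; with a condition, both are always dropped.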

Drop a Trace

To drop the current trace, call update_current_trace (Python) or updateCurrentTrace (TypeScript) with drop set to True/true. The trace will be silently discarded and never sent to the observatory.

main.py
from deepeval.tracing import observe, update_current_trace
from openai import OpenAI

client = OpenAI()

@observe()
def llm_app(query: str):
    if "health" in query.lower():
        update_current_trace(drop=True)
        return "OK"

    return client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": query}]
    ).choices[0].message.content

llm_app("/health")  # this trace is dropped
llm_app("Write me a poem.")  # this trace is sent
Dropped traces are discarded entirely — they will not appear in the observatory or count towards usage.
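When several kinds of requests should be dropped (health checks, readiness probes, internal smoke tests), it can help to centralize the conditions in one predicate and pass its result to drop. The helper below is a hypothetical pattern; the prefixes and function name are illustrative, not part of deepeval's API:

```python
# Hypothetical helper: one place to list request patterns that
# should never produce a trace.
DROP_PREFIXES = ("/health", "/ready", "/internal")

def should_drop(query: str) -> bool:
    """Return True for requests whose traces should be discarded."""
    q = query.strip().lower()
    return q.startswith(DROP_PREFIXES) or "smoke-test" in q
```

Inside an @observe-decorated function you would then call update_current_trace(drop=should_drop(query)) once, instead of repeating string checks at each call site.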