Merged
95 commits
All commits authored by alexander-alderman-webb.

2be94ca  feat: Send GenAI spans as V2 envelope items  (Apr 15, 2026)
01f479a  .  (Apr 15, 2026)
80e6a10  .  (Apr 15, 2026)
0622cf4  .  (Apr 15, 2026)
7c75da1  .  (Apr 15, 2026)
54a9b07  update  (Apr 15, 2026)
d1aa07c  .  (Apr 15, 2026)
117a6c9  .  (Apr 15, 2026)
83c36b5  .  (Apr 15, 2026)
f71e0ce  openai tests  (Apr 16, 2026)
1fab632  anthropic tests  (Apr 16, 2026)
f44316d  google-genai tests  (Apr 16, 2026)
ff9c5ec  test litellm  (Apr 17, 2026)
b92ae36  test huggingface_hub  (Apr 17, 2026)
907ca1d  test langchain  (Apr 17, 2026)
b254297  test langgraph  (Apr 17, 2026)
6f7a054  accept any as sdk version  (Apr 17, 2026)
4f871a4  pydantic-ai tests  (Apr 17, 2026)
7befc7d  .  (Apr 17, 2026)
fb348bb  openai-agents tests  (Apr 17, 2026)
41e409d  fix openai-agents tests  (Apr 17, 2026)
8bf77f0  fix common tests  (Apr 17, 2026)
7c3da4f  client handle None  (Apr 17, 2026)
06c2a40  fix item_count  (Apr 17, 2026)
204b980  fix common tests  (Apr 17, 2026)
00733f9  fix common tests  (Apr 17, 2026)
a54cab4  common tests  (Apr 17, 2026)
4b0c47b  tests  (Apr 17, 2026)
6c5c812  add experimental v2 option  (Apr 17, 2026)
51a07ff  push experiment  (Apr 17, 2026)
bab7567  fix tests  (Apr 17, 2026)
3e55795  client changes  (Apr 17, 2026)
6d1d7ed  simplify client logic  (Apr 17, 2026)
6bf4006  Revert "add experimental v2 option"  (Apr 17, 2026)
700e8a1  retry adding experimental option to tests  (Apr 17, 2026)
9b20bd2  add experimental option to langgraph tests  (Apr 17, 2026)
88fc76e  cleanup  (Apr 20, 2026)
08af4b4  remove experimental option  (Apr 20, 2026)
7bd12ae  add constant again  (Apr 20, 2026)
ef843a0  add name fallback  (Apr 20, 2026)
4e3e2d0  remove remaining experimental option references  (Apr 20, 2026)
44b2c2d  update test with hardcoded version  (Apr 21, 2026)
240f123  merge master  (May 5, 2026)
137930c  merge master  (May 11, 2026)
307db73  merge fixes  (May 11, 2026)
efc37e1  adapt new test  (May 11, 2026)
bee6320  add parameter  (May 12, 2026)
ab47783  cleanup anthropic  (May 12, 2026)
75f4d3a  cleanup google-genai  (May 12, 2026)
8ba3d94  cleanup huggingface-hub  (May 12, 2026)
f156e92  cleanup langgraph  (May 12, 2026)
3b03ddf  cleanup litellm  (May 12, 2026)
261b9f0  cleanup openai  (May 12, 2026)
4f8a4c8  cleanup openai_agents  (May 12, 2026)
dcc3fbe  Merge branch 'master' into webb/gen-ai-v2  (May 12, 2026)
14e379f  fix pydantic-ai test  (May 12, 2026)
596db31  fix tracing tests  (May 12, 2026)
a2adf96  fix tests  (May 12, 2026)
401109a  feat: Remove truncation when stream_gen_ai_spans is enabled  (May 12, 2026)
5e8c254  add pytest mark asyncio  (May 12, 2026)
449457b  do not leak new option and use event_opt  (May 12, 2026)
96f86e3  send version field in json  (May 12, 2026)
aba2cf1  fix op fallback  (May 12, 2026)
a48d701  fix logic  (May 12, 2026)
dcce855  simplify logic  (May 12, 2026)
43920b5  promote to top level option  (May 12, 2026)
ffb339f  merge  (May 12, 2026)
9b4ad4b  add parameter  (May 12, 2026)
5889ad9  update tracing  (May 12, 2026)
cdcdf4d  Merge branch 'webb/gen-ai-v2' into webb/remove-truncation  (May 12, 2026)
c948c14  update to non-experimental option  (May 12, 2026)
cf04adb  update more tests  (May 12, 2026)
4f52f2d  Merge branch 'webb/gen-ai-v2' into webb/remove-truncation  (May 12, 2026)
398559b  restore legitimate test  (May 12, 2026)
ec57859  test(langchain): Inline global state  (May 13, 2026)
854c9af  merge  (May 13, 2026)
7886629  add parameterization  (May 13, 2026)
680649c  Merge branch 'webb/gen-ai-v2' into webb/remove-truncation  (May 13, 2026)
b781172  restore langgraph test  (May 13, 2026)
b618cc8  update test  (May 13, 2026)
e85dffe  remove None conversion  (May 13, 2026)
f8f98c1  update test with None attribute assertion  (May 13, 2026)
b46fd5f  mostly whitespace test cleanup  (May 13, 2026)
dde7bf4  restore type annotations in huggingface_hub tests  (May 13, 2026)
7773a27  merge  (May 13, 2026)
913ec9a  litellm test  (May 13, 2026)
f2bdff5  remove whitespace changes  (May 13, 2026)
ec26b90  one more whitespace removal  (May 13, 2026)
4ec3ff7  remove truncation per integration instead  (May 13, 2026)
962fd65  update tests with to have more than one input message  (May 13, 2026)
425ae27  cleanup one openai test  (May 13, 2026)
eda7fd5  merge  (May 13, 2026)
47680da  add message in openai_agents tests  (May 13, 2026)
ac6a343  merge master  (May 13, 2026)
7f01f96  merge follow up  (May 13, 2026)
8 changes: 6 additions & 2 deletions sentry_sdk/integrations/anthropic.py
@@ -438,9 +438,13 @@ def _set_common_input_data(
             normalized_messages.append(transformed_message)
 
     role_normalized_messages = normalize_message_roles(normalized_messages)
+
+    client = sentry_sdk.get_client()
     scope = sentry_sdk.get_current_scope()
-    messages_data = truncate_and_annotate_messages(
-        role_normalized_messages, span, scope
+    messages_data = (
+        role_normalized_messages
+        if client.options.get("stream_gen_ai_spans", False)
+        else truncate_and_annotate_messages(role_normalized_messages, span, scope)
     )
     if messages_data is not None:
         set_data_normalized(
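Every integration touched by this PR gains the same gate: when the `stream_gen_ai_spans` client option is set, the message list is attached to the span untruncated; otherwise the existing truncation helper runs. A self-contained sketch of that control flow, with a stand-in truncation helper (`MAX_MESSAGES` and the helper body are illustrative only, not the actual sentry_sdk implementation):

```python
# Sketch of the gating pattern repeated across the diffs in this PR.
# MAX_MESSAGES and truncate_and_annotate_messages below are stand-ins;
# the real SDK helper truncates by payload size and annotates the span.
from typing import Any, Dict, List, Optional

MAX_MESSAGES = 2  # toy limit for illustration


def truncate_and_annotate_messages(
    messages: List[Dict[str, Any]], span: Dict[str, Any], scope: Any
) -> Optional[List[Dict[str, Any]]]:
    # Stand-in: keep the newest messages and record how many were dropped.
    if not messages:
        return None
    if len(messages) <= MAX_MESSAGES:
        return messages
    span["_truncated_count"] = len(messages) - MAX_MESSAGES
    return messages[-MAX_MESSAGES:]


def messages_for_span(
    messages: List[Dict[str, Any]],
    span: Dict[str, Any],
    scope: Any,
    options: Dict[str, Any],
) -> Optional[List[Dict[str, Any]]]:
    # The conditional each integration gained: when GenAI spans are
    # streamed as V2 envelope items, send the full message list.
    return (
        messages
        if options.get("stream_gen_ai_spans", False)
        else truncate_and_annotate_messages(messages, span, scope)
    )
```

The same ternary appears verbatim (modulo variable names and the embeddings variant of the helper) in every file below, which is why the diffs look so uniform.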
7 changes: 5 additions & 2 deletions sentry_sdk/integrations/google_genai/utils.py
@@ -892,9 +892,12 @@ def set_span_data_for_request(
 
     if messages:
         normalized_messages = normalize_message_roles(messages)
+        client = sentry_sdk.get_client()
         scope = sentry_sdk.get_current_scope()
-        messages_data = truncate_and_annotate_messages(
-            normalized_messages, span, scope
+        messages_data = (
+            normalized_messages
+            if client.options.get("stream_gen_ai_spans", False)
+            else truncate_and_annotate_messages(normalized_messages, span, scope)
         )
         if messages_data is not None:
             set_data_normalized(
38 changes: 30 additions & 8 deletions sentry_sdk/integrations/langchain.py
@@ -374,9 +374,15 @@ def on_llm_start(
                 }
                 for prompt in prompts
             ]
+
+            client = sentry_sdk.get_client()
             scope = sentry_sdk.get_current_scope()
-            messages_data = truncate_and_annotate_messages(
-                normalized_messages, span, scope
+            messages_data = (
+                normalized_messages
+                if client.options.get("stream_gen_ai_spans", False)
+                else truncate_and_annotate_messages(
+                    normalized_messages, span, scope
+                )
             )
             if messages_data is not None:
                 set_data_normalized(
@@ -463,9 +469,15 @@ def on_chat_model_start(
                     self._normalize_langchain_message(message)
                 )
             normalized_messages = normalize_message_roles(normalized_messages)
+
+            client = sentry_sdk.get_client()
             scope = sentry_sdk.get_current_scope()
-            messages_data = truncate_and_annotate_messages(
-                normalized_messages, span, scope
+            messages_data = (
+                normalized_messages
+                if client.options.get("stream_gen_ai_spans", False)
+                else truncate_and_annotate_messages(
+                    normalized_messages, span, scope
+                )
             )
             if messages_data is not None:
                 set_data_normalized(
@@ -992,9 +1004,15 @@ def new_invoke(self: "Any", *args: "Any", **kwargs: "Any") -> "Any":
                 and integration.include_prompts
             ):
                 normalized_messages = normalize_message_roles([input])
+
+                client = sentry_sdk.get_client()
                 scope = sentry_sdk.get_current_scope()
-                messages_data = truncate_and_annotate_messages(
-                    normalized_messages, span, scope
+                messages_data = (
+                    normalized_messages
+                    if client.options.get("stream_gen_ai_spans", False)
+                    else truncate_and_annotate_messages(
+                        normalized_messages, span, scope
+                    )
                 )
                 if messages_data is not None:
                     set_data_normalized(
@@ -1049,9 +1067,13 @@ def new_stream(self: "Any", *args: "Any", **kwargs: "Any") -> "Any":
                 and integration.include_prompts
             ):
                 normalized_messages = normalize_message_roles([input])
+
+                client = sentry_sdk.get_client()
                 scope = sentry_sdk.get_current_scope()
-                messages_data = truncate_and_annotate_messages(
-                    normalized_messages, span, scope
+                messages_data = (
+                    normalized_messages
+                    if client.options.get("stream_gen_ai_spans", False)
+                    else truncate_and_annotate_messages(normalized_messages, span, scope)
                 )
                 if messages_data is not None:
                     set_data_normalized(
20 changes: 16 additions & 4 deletions sentry_sdk/integrations/langgraph.py
@@ -181,9 +181,15 @@ def new_invoke(self: "Any", *args: "Any", **kwargs: "Any") -> "Any":
             input_messages = _parse_langgraph_messages(args[0])
             if input_messages:
                 normalized_input_messages = normalize_message_roles(input_messages)
+
+                client = sentry_sdk.get_client()
                 scope = sentry_sdk.get_current_scope()
-                messages_data = truncate_and_annotate_messages(
-                    normalized_input_messages, span, scope
+                messages_data = (
+                    normalized_input_messages
+                    if client.options.get("stream_gen_ai_spans", False)
+                    else truncate_and_annotate_messages(
+                        normalized_input_messages, span, scope
+                    )
                 )
                 if messages_data is not None:
                     set_data_normalized(
@@ -234,9 +240,15 @@ async def new_ainvoke(self: "Any", *args: "Any", **kwargs: "Any") -> "Any":
             input_messages = _parse_langgraph_messages(args[0])
             if input_messages:
                 normalized_input_messages = normalize_message_roles(input_messages)
+
+                client = sentry_sdk.get_client()
                 scope = sentry_sdk.get_current_scope()
-                messages_data = truncate_and_annotate_messages(
-                    normalized_input_messages, span, scope
+                messages_data = (
+                    normalized_input_messages
+                    if client.options.get("stream_gen_ai_spans", False)
+                    else truncate_and_annotate_messages(
+                        normalized_input_messages, span, scope
+                    )
                 )
                 if messages_data is not None:
                     set_data_normalized(
14 changes: 11 additions & 3 deletions sentry_sdk/integrations/litellm.py
@@ -119,8 +119,11 @@ def _input_callback(kwargs: "Dict[str, Any]") -> None:
                 if isinstance(embedding_input, list)
                 else [embedding_input]
             )
-            messages_data = truncate_and_annotate_embedding_inputs(
-                input_list, span, scope
+            client = sentry_sdk.get_client()
+            messages_data = (
+                input_list
+                if client.options.get("stream_gen_ai_spans", False)
+                else truncate_and_annotate_embedding_inputs(input_list, span, scope)
             )
             if messages_data is not None:
                 set_data_normalized(
@@ -133,9 +136,14 @@ def _input_callback(kwargs: "Dict[str, Any]") -> None:
         # For chat, look for the 'messages' parameter
         messages = kwargs.get("messages", [])
         if messages:
+            client = sentry_sdk.get_client()
             scope = sentry_sdk.get_current_scope()
             messages = _convert_message_parts(messages)
-            messages_data = truncate_and_annotate_messages(messages, span, scope)
+            messages_data = (
+                messages
+                if client.options.get("stream_gen_ai_spans", False)
+                else truncate_and_annotate_messages(messages, span, scope)
+            )
             if messages_data is not None:
                 set_data_normalized(
                     span,
46 changes: 38 additions & 8 deletions sentry_sdk/integrations/openai.py
@@ -398,8 +398,13 @@ def _set_responses_api_input_data(
 
     if isinstance(messages, str):
         normalized_messages = normalize_message_roles([messages])  # type: ignore
+        client = sentry_sdk.get_client()
         scope = sentry_sdk.get_current_scope()
-        messages_data = truncate_and_annotate_messages(normalized_messages, span, scope)
+        messages_data = (
+            normalized_messages
+            if client.options.get("stream_gen_ai_spans", False)
+            else truncate_and_annotate_messages(normalized_messages, span, scope)
+        )
         if messages_data is not None:
             set_data_normalized(
                 span, SPANDATA.GEN_AI_REQUEST_MESSAGES, messages_data, unpack=False
@@ -413,8 +418,13 @@ def _set_responses_api_input_data(
         ]
         if len(non_system_messages) > 0:
             normalized_messages = normalize_message_roles(non_system_messages)
+            client = sentry_sdk.get_client()
             scope = sentry_sdk.get_current_scope()
-            messages_data = truncate_and_annotate_messages(normalized_messages, span, scope)
+            messages_data = (
+                normalized_messages
+                if client.options.get("stream_gen_ai_spans", False)
+                else truncate_and_annotate_messages(normalized_messages, span, scope)
+            )
            if messages_data is not None:
                 set_data_normalized(
                     span, SPANDATA.GEN_AI_REQUEST_MESSAGES, messages_data, unpack=False
@@ -472,8 +482,13 @@ def _set_completions_api_input_data(
 
     if isinstance(messages, str):
         normalized_messages = normalize_message_roles([messages])  # type: ignore
+        client = sentry_sdk.get_client()
         scope = sentry_sdk.get_current_scope()
-        messages_data = truncate_and_annotate_messages(normalized_messages, span, scope)
+        messages_data = (
+            normalized_messages
+            if client.options.get("stream_gen_ai_spans", False)
+            else truncate_and_annotate_messages(normalized_messages, span, scope)
+        )
         if messages_data is not None:
             set_data_normalized(
                 span, SPANDATA.GEN_AI_REQUEST_MESSAGES, messages_data, unpack=False
@@ -503,8 +518,13 @@ def _set_completions_api_input_data(
         ]
         if len(non_system_messages) > 0:
             normalized_messages = normalize_message_roles(non_system_messages)
+            client = sentry_sdk.get_client()
             scope = sentry_sdk.get_current_scope()
-            messages_data = truncate_and_annotate_messages(normalized_messages, span, scope)
+            messages_data = (
+                normalized_messages
+                if client.options.get("stream_gen_ai_spans", False)
+                else truncate_and_annotate_messages(normalized_messages, span, scope)
+            )
             if messages_data is not None:
                 set_data_normalized(
                     span, SPANDATA.GEN_AI_REQUEST_MESSAGES, messages_data, unpack=False
@@ -539,9 +559,14 @@ def _set_embeddings_input_data(
         set_data_normalized(span, SPANDATA.GEN_AI_OPERATION_NAME, "embeddings")
 
         normalized_messages = normalize_message_roles([messages])  # type: ignore
+        client = sentry_sdk.get_client()
         scope = sentry_sdk.get_current_scope()
-        messages_data = truncate_and_annotate_embedding_inputs(
-            normalized_messages, span, scope
+        messages_data = (
+            normalized_messages
+            if client.options.get("stream_gen_ai_spans", False)
+            else truncate_and_annotate_embedding_inputs(
+                normalized_messages, span, scope
+            )
         )
         if messages_data is not None:
             set_data_normalized(
@@ -560,9 +585,14 @@ def _set_embeddings_input_data(
 
     if len(messages) > 0:
         normalized_messages = normalize_message_roles(messages)
+        client = sentry_sdk.get_client()
        scope = sentry_sdk.get_current_scope()
-        messages_data = truncate_and_annotate_embedding_inputs(
-            normalized_messages, span, scope
+        messages_data = (
+            normalized_messages
+            if client.options.get("stream_gen_ai_spans", False)
+            else truncate_and_annotate_embedding_inputs(
+                normalized_messages, span, scope
+            )
         )
         if messages_data is not None:
             set_data_normalized(
7 changes: 5 additions & 2 deletions sentry_sdk/integrations/openai_agents/spans/invoke_agent.py
@@ -63,9 +63,12 @@ def invoke_agent_span(
 
     if len(messages) > 0:
         normalized_messages = normalize_message_roles(messages)
+        client = sentry_sdk.get_client()
         scope = sentry_sdk.get_current_scope()
-        messages_data = truncate_and_annotate_messages(
-            normalized_messages, span, scope
+        messages_data = (
+            normalized_messages
+            if client.options.get("stream_gen_ai_spans", False)
+            else truncate_and_annotate_messages(normalized_messages, span, scope)
        )
         if messages_data is not None:
             set_data_normalized(
7 changes: 6 additions & 1 deletion sentry_sdk/integrations/openai_agents/utils.py
@@ -173,8 +173,13 @@ def _set_input_data(
     )
 
     normalized_messages = normalize_message_roles(request_messages)
+    client = sentry_sdk.get_client()
     scope = sentry_sdk.get_current_scope()
-    messages_data = truncate_and_annotate_messages(normalized_messages, span, scope)
+    messages_data = (
+        normalized_messages
+        if client.options.get("stream_gen_ai_spans", False)
+        else truncate_and_annotate_messages(normalized_messages, span, scope)
+    )
     if messages_data is not None:
         set_data_normalized(
             span,
7 changes: 5 additions & 2 deletions sentry_sdk/integrations/pydantic_ai/spans/ai_client.py
@@ -182,9 +182,12 @@ def _set_input_messages(span: "sentry_sdk.tracing.Span", messages: "Any") -> None:
 
     if formatted_messages:
         normalized_messages = normalize_message_roles(formatted_messages)
+        client = sentry_sdk.get_client()
         scope = sentry_sdk.get_current_scope()
-        messages_data = truncate_and_annotate_messages(
-            normalized_messages, span, scope
+        messages_data = (
+            normalized_messages
+            if client.options.get("stream_gen_ai_spans", False)
+            else truncate_and_annotate_messages(normalized_messages, span, scope)
         )
         set_data_normalized(
             span, SPANDATA.GEN_AI_REQUEST_MESSAGES, messages_data, unpack=False
7 changes: 5 additions & 2 deletions sentry_sdk/integrations/pydantic_ai/spans/invoke_agent.py
@@ -122,9 +122,12 @@ def invoke_agent_span(
 
     if messages:
         normalized_messages = normalize_message_roles(messages)
+        client = sentry_sdk.get_client()
         scope = sentry_sdk.get_current_scope()
-        messages_data = truncate_and_annotate_messages(
-            normalized_messages, span, scope
+        messages_data = (
+            normalized_messages
+            if client.options.get("stream_gen_ai_spans", False)
+            else truncate_and_annotate_messages(normalized_messages, span, scope)
         )
         set_data_normalized(
             span, SPANDATA.GEN_AI_REQUEST_MESSAGES, messages_data, unpack=False
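Assuming the option promoted to top level in commit 43920b5 keeps the name read by the `client.options.get(...)` checks above (not verified against a released SDK), enabling the behavior would be a one-line addition to init; the DSN below is a placeholder:

```python
import sentry_sdk

sentry_sdk.init(
    dsn="https://examplePublicKey@o0.ingest.sentry.io/0",  # placeholder DSN
    traces_sample_rate=1.0,
    # Assumed option name, taken from the diffs in this PR: skip message
    # truncation and send GenAI spans as V2 envelope items.
    stream_gen_ai_spans=True,
)
```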