Releases: DataDog/dd-trace-py
2.13.1
Bug Fixes
- Code Security (IAST)
  - Always report a telemetry log error when an IAST propagation error is raised, regardless of whether the `_DD_IAST_DEBUG` environment variable is enabled.
  - Fixes a potential memory leak in IAST exception handling.
- Profiling
  - Updates filenames for all files with platform-dependent code to reflect the platform they are for. This fixes issues where the wrong file would be used on a given platform.
  - Enables endpoint profiling for stack v2 when `DD_PROFILING_STACK_V2_ENABLED` is set.
  - Fixes endpoint profiling when using the libdatadog exporter, enabled with either `DD_PROFILING_EXPORT_LIBDD_ENABLED` or `DD_PROFILING_TIMELINE_ENABLED`.
  - Enables code provenance when using the libdatadog exporter, enabled with `DD_PROFILING_EXPORT_LIBDD_ENABLED`, `DD_PROFILING_STACK_V2_ENABLED`, or `DD_PROFILING_TIMELINE_ENABLED`.
  - Fixes an issue where the flamegraph was upside down for stack v2 when `DD_PROFILING_STACK_V2_ENABLED` is enabled.
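The profiling flags above are opt-in environment variables. A minimal sketch of enabling them (the variable names come from the notes above; accepting `"true"` and needing to set them before `ddtrace` is imported are assumptions based on how most `DD_*` flags behave):

```python
import os

# Sketch: opt in to the stack v2 / libdatadog exporter code paths described
# above. Assumption: these must be set before ddtrace is imported.
os.environ["DD_PROFILING_STACK_V2_ENABLED"] = "true"      # stack v2 + endpoint profiling
os.environ["DD_PROFILING_EXPORT_LIBDD_ENABLED"] = "true"  # libdatadog exporter
os.environ["DD_PROFILING_TIMELINE_ENABLED"] = "true"      # timeline view

# import ddtrace.profiling.auto  # would start the profiler (requires ddtrace)
```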
- Tracing
  - Fixes an issue where `celery.apply` spans didn't close if the `after_task_publish` or `task_postrun` signals didn't get sent when using `apply_async`, which can happen if there is an internal exception during the handling of the task. This update also marks the span as an error if an exception occurs.
  - Fixes an issue where `celery.apply` spans using task_protocol 1 didn't close, by improving the check for the task id in the body.
  - Removes a reference cycle that caused unnecessary garbage collection for top-level spans.
2.12.3
Bug Fixes
- Code Security
  - Resolves an issue where exploit prevention was not properly blocking requests with custom redirection actions.
  - Ensures the `Initializer` object is always reset and freed before the Python runtime.
- LLM Observability
  - Fixes an issue where the OpenAI and LangChain integrations would still submit integration metrics even in agentless mode. Integration metrics are now disabled when using agentless mode via `LLMObs.enable(agentless_enabled=True)` or by setting `DD_LLMOBS_AGENTLESS_ENABLED=1`.
  - Resolves an issue in the `LLMObs.annotate()` method where non-JSON-serializable arguments were discarded entirely. The method now safely handles non-JSON-serializable arguments by defaulting to a placeholder text.
  - Resolves an issue where attempting to tag non-JSON-serializable request/response parameters resulted in a `TypeError` in the OpenAI, LangChain, Bedrock, and Anthropic integrations.
  - Resolves an issue where attempting to tag non-JSON-serializable request arguments caused a `TypeError`. The Anthropic integration now safely tags such arguments with a default placeholder text.
  - Resolves an issue where attempting to tag non-JSON-serializable tool config arguments resulted in a `TypeError`. The LangChain integration now safely tags such arguments with a default placeholder text.
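The "default placeholder text" behavior these notes describe can be pictured with a small sketch (the helper name and the placeholder string are hypothetical, not ddtrace's actual internals):

```python
import json

def safe_json(value: object) -> str:
    """Serialize to JSON, falling back to a placeholder instead of raising."""
    try:
        return json.dumps(value)
    except (TypeError, ValueError):
        # Placeholder text is an assumption for illustration.
        return "[Non-JSON-serializable object]"

print(safe_json({"a": 1}))   # → {"a": 1}
print(safe_json(object()))   # → [Non-JSON-serializable object]
```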
- Profiling
  - Updates filenames for all files with platform-dependent code to reflect the platform they are for. This fixes issues where the wrong file would be used on a given platform.
  - Improves the error message when the native exporter fails to load, and stops profiling from starting if ddtrace is also being injected.
  - Enables endpoint profiling for stack v2 when `DD_PROFILING_STACK_V2_ENABLED` is set.
  - Fixes endpoint profiling when using the libdatadog exporter, enabled with either `DD_PROFILING_EXPORT_LIBDD_ENABLED` or `DD_PROFILING_TIMELINE_ENABLED`.
  - Enables code provenance when using the libdatadog exporter, enabled with `DD_PROFILING_EXPORT_LIBDD_ENABLED`, `DD_PROFILING_STACK_V2_ENABLED`, or `DD_PROFILING_TIMELINE_ENABLED`.
  - Fixes an issue where the flamegraph was upside down for stack v2 when `DD_PROFILING_STACK_V2_ENABLED` is enabled.
- Tracing
  - Fixes an issue where `celery.apply` spans didn't close if the `after_task_publish` or `task_postrun` signals didn't get sent when using `apply_async`, which can happen if there is an internal exception during the handling of the task. This update also marks the span as an error if an exception occurs.
  - Fixes an issue where `celery.apply` spans using task_protocol 1 didn't close, by improving the check for the task id in the body.
  - Fixes circular imports raised when psycopg automatic instrumentation is enabled.
  - Removes a reference cycle that caused unnecessary garbage collection for top-level spans.
  - Fixes an issue where a `TypeError` exception would be raised if the first message's `topic()` returned `None` during consumption.
  - Kinesis: Resolves an issue where unparsable data in a Kinesis record would cause a `NoneType` error.
2.14.2
Bug Fixes
- Tracing
  - celery: Fixes an issue where `celery.apply` spans didn't close if the `after_task_publish` or `task_postrun` signals didn't get sent when using `apply_async`, which can happen if there is an internal exception during the handling of the task. This update also marks the span as an error if an exception occurs.
  - celery: Fixes an issue where `celery.apply` spans using task_protocol 1 didn't close, by improving the check for the task id in the body.
- Profiling
  - All files with platform-dependent code have had their filenames updated to reflect the platform they are for. This fixes issues where the wrong file would be used on a given platform.
  - Enables code provenance when using the libdatadog exporter, enabled with `DD_PROFILING_EXPORT_LIBDD_ENABLED`, `DD_PROFILING_STACK_V2_ENABLED`, or `DD_PROFILING_TIMELINE_ENABLED`.
  - Fixes an issue where the flamegraph was upside down for stack v2 when `DD_PROFILING_STACK_V2_ENABLED` is enabled.
2.14.1
New Features
- Code Security (IAST): Always report a telemetry log error when an IAST propagation error is raised, regardless of whether the `_DD_IAST_DEBUG` environment variable is enabled.
Bug Fixes
- tracing: Removes a reference cycle that caused unnecessary garbage collection for top-level spans.
- Code Security: Fixes a potential memory leak in IAST exception handling.
- profiling: Fixes endpoint profiling when using the libdatadog exporter, enabled with either `DD_PROFILING_EXPORT_LIBDD_ENABLED` or `DD_PROFILING_TIMELINE_ENABLED`.
2.13.0
New Features
- Datastreams Monitoring (DSM): Adds support for schema tracking.
- Exception Replay will capture any exceptions that are manually attached to a span with a call to `set_exc_info`.
- LLM Observability: The LangChain integration now submits vectorstore `similarity_search` spans to LLM Observability as retrieval spans.
- langchain: Adds support for tracing tool invocations.
- LLM Observability: Adds support for capturing tool calls returned from LangChain chat completions.
- LLM Observability: Introduces the ability to set `ml_app` and `timestamp_ms` fields in `LLMObs.submit_evaluation`.
- openai: Introduces a `model` tag for OpenAI integration metrics, for consistency with the OpenAI SaaS Integration. It has the same value as `openai.request.model`.
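`set_exc_info` takes the standard `(type, value, traceback)` triple. A sketch of wiring it up from a handled exception (the span lines are commented out since they require ddtrace, and the span name is a placeholder; the `sys.exc_info()` plumbing is standard library):

```python
import sys

# from ddtrace import tracer                       # requires ddtrace
# with tracer.trace("process-order") as span:      # span name is a placeholder
try:
    raise ValueError("bad order")
except ValueError:
    # Capture the triple that set_exc_info expects.
    exc_type, exc_val, exc_tb = sys.exc_info()
    # span.set_exc_info(exc_type, exc_val, exc_tb)  # Exception Replay captures this

print(exc_type.__name__)  # → ValueError
```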
Deprecation Notes
- tracing: All public patch modules are deprecated. The non-deprecated methods are included in the `__all__` attribute.
- yaaredis: The yaaredis integration is deprecated and will be removed in a future version. As an alternative to the yaaredis integration, the redis integration should be used.
- tracing: Deprecates the `priority_sampling` argument in `ddtrace.tracer.Tracer.configure(...)`.
Bug Fixes
- library injection: Resolves an issue where the version of `attrs` installed by default on some Ubuntu installations was treated as incompatible with library injection.
- anthropic: Resolves an issue where attempting to tag non-JSON-serializable request arguments caused a `TypeError`. The Anthropic integration now safely tags such arguments with a default placeholder text.
- postgres: Fixes circular imports raised when psycopg automatic instrumentation is enabled.
- ASM: Resolves an issue where exploit prevention was not properly blocking requests with custom redirection actions.
- CI Visibility: Resolves an issue where exceptions other than timeouts and connection errors raised while fetching the list of skippable tests for ITR were not being handled correctly and caused the tracer to crash.
- CI Visibility: Fixes a bug where `.git` was incorrectly being stripped from repository URLs when extracting service names, resulting in `g`, `i`, or `t` being removed (e.g. `test-environment.git` incorrectly becoming `test-environmen`).
- botocore: Resolves a regression where trace context was not being injected into the input of Step Functions `start_execution` commands. This re-enables distributed tracing when a Python service invokes a properly instrumented Step Function.
- LLM Observability: Resolves an issue where custom trace filters were being overwritten in forked processes.
- LLM Observability: Resolves an issue where LLM Observability spans were not being submitted in forked processes, such as when using `celery` or `gunicorn` workers. The LLM Observability writer thread now automatically restarts when a forked process is detected.
- tracing: Fixes a side-effect issue with module import callbacks that could cause a runtime exception.
- tracing: Fixes an issue with some module imports with native specs that don't support attribute assignments, resulting in a `TypeError` exception at runtime.
- tracing: Improves the accuracy of the `X-Datadog-Trace-Count` payload header.
- tracing: Resolves an issue where `ddtrace` package files were published with incorrect file attributes.
- tracing: Resolves an issue where Django DB instrumentation could fail.
- LLM Observability: Resolves an issue where `session_id` was being defaulted to `trace_id`, which was causing unexpected UI behavior.
- openai: Fixes a bug where `asyncio.TimeoutError`s were not being propagated correctly from canceled OpenAI API requests.
- profiling: Propagates tags in `DD_PROFILING_TAGS` and `DD_TAGS` to the libdatadog exporter, a new exporter code path that is enabled when any of `DD_PROFILING_STACK_V2_ENABLED`, `DD_PROFILING_EXPORT_LIBDD_ENABLED`, or `DD_PROFILING_TIMELINE_ENABLED` is set, or when dd-trace-py is running in an injected environment.
- ASM: Fixes a memory leak on the native slice aspect.
Other Changes
- tracing: Removes the `DD_PRIORITY_SAMPLING` configuration option. This option is not used in any `ddtrace>=2.0` releases.
2.14.0
Deprecation Notes
- Tracing
  - Deprecates the `DD_TRACE_SPAN_AGGREGATOR_RLOCK` environment variable. It will be removed in v3.0.0.
  - Deprecates support for APM Legacy App Analytics. This feature and its associated configuration options are deprecated and will be removed in v3.0.0.
  - The `DD_HTTP_CLIENT_TAG_QUERY_STRING` configuration is deprecated and will be removed in v3.0.0. Use `DD_TRACE_HTTP_CLIENT_TAG_QUERY_STRING` instead.
New Features
- DSM
  - Introduces new tracing and datastreams monitoring functionality for Avro Schemas.
  - Introduces new tracing and datastreams monitoring functionality for Google Protobuf.
- LLM Observability
  - Adds support to automatically submit Gemini Python SDK calls to LLM Observability.
  - The OpenAI integration now captures tool calls returned from streamed responses when making calls to the chat completions endpoint.
  - The LangChain integration now submits tool spans to LLM Observability.
  - LLM Observability spans generated by the OpenAI integration now have updated span name and `model_provider` values. Span names are now prefixed with the OpenAI client name (possible values: `OpenAI`/`AzureOpenAI`) instead of the default `openai` prefix to better differentiate whether the request was made to Azure OpenAI or OpenAI. The `model_provider` field now also corresponds to `openai` or `azure_openai` based on the OpenAI client.
  - The OpenAI integration now ensures accurate token data from streamed OpenAI completions and chat completions, if provided in the streamed response. To ensure accurate token data in the traced streamed operation, ensure that the `stream_options={"include_usage": True}` option is set on the completion or chat completion call.
  - Introduces the `LLMObs.annotation_context()` context manager method, which allows modifying the tags of integration-generated LLM Observability spans created while the context manager is active.
  - Introduces prompt template annotation, which can be passed as an argument to `LLMObs.annotate(prompt={...})` for LLM span kinds. For more information on prompt annotations, see the docs.
  - google_generativeai: Introduces tracing support for Google Gemini API `generate_content` calls. See the docs for more information.
  - openai: The OpenAI integration now includes a new `openai.request.client` tag with the possible values `OpenAI`/`AzureOpenAI` to help differentiate whether the request was made to Azure OpenAI or OpenAI.
  - openai: The OpenAI integration now captures token data from streamed completions and chat completions, if provided in the streamed response. To ensure accurate token data in the traced streamed operation, ensure that the `stream_options={"include_usage": True}` option is set on the completion or chat completion call.
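As a concrete illustration of the `stream_options` note above: `stream_options={"include_usage": True}` is a real OpenAI API parameter that adds a final usage chunk to the stream. The sketch below only builds the request options; the actual call is commented out since it needs the `openai` package and an API key, and the model name is a placeholder:

```python
# Request options for a streamed chat completion with token usage included.
request_kwargs = {
    "model": "gpt-4o-mini",  # placeholder model name
    "messages": [{"role": "user", "content": "Hello"}],
    "stream": True,
    "stream_options": {"include_usage": True},  # final stream chunk carries usage
}

# from openai import OpenAI
# client = OpenAI()
# stream = client.chat.completions.create(**request_kwargs)
# for chunk in stream:
#     ...  # the integration reads token counts from the usage chunk

print(request_kwargs["stream_options"])  # → {'include_usage': True}
```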
- Profiling
  - Captures `asyncio.Lock` usages with `with` context managers.
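The lock usage the profiler now captures looks like this; note that `asyncio.Lock` uses the `async with` form of the context-manager protocol. A self-contained sketch (with profiling enabled, wait time on the acquisition would be attributed to the lock):

```python
import asyncio

async def worker(lock: asyncio.Lock, results: list) -> None:
    # Context-manager acquisition: the pattern the profiler captures.
    async with lock:
        results.append(len(results))

async def main() -> list:
    lock = asyncio.Lock()
    results: list = []
    await asyncio.gather(*(worker(lock, results) for _ in range(3)))
    return results

print(asyncio.run(main()))  # → [0, 1, 2]
```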
- Other
  - botocore: Adds span pointers to some successful AWS botocore spans. Currently only supports S3 PutObject.
  - pymongo: Adds support for pymongo>=4.9.0.
Bug Fixes
- Code Security (ASM)
  - Fixes a bug in the IAST patching process where `AttributeError` exceptions were being caught, interfering with the proper application cycle.
  - Resolves an issue where exploit prevention was not properly blocking requests with custom redirection actions.
- LLM Observability
  - Fixes an issue where the OpenAI and LangChain integrations would still submit integration metrics even in agentless mode. Integration metrics are now disabled when using agentless mode via `LLMObs.enable(agentless_enabled=True)` or by setting `DD_LLMOBS_AGENTLESS_ENABLED=1`.
  - Resolves an issue in the `LLMObs.annotate()` method where non-JSON-serializable arguments were discarded entirely. The method now safely handles non-JSON-serializable arguments by defaulting to a placeholder text.
  - Resolves an issue where attempting to tag non-JSON-serializable request/response parameters resulted in a `TypeError` in the OpenAI, LangChain, Bedrock, and Anthropic integrations.
  - anthropic: Resolves an issue where attempting to tag non-JSON-serializable request arguments caused a `TypeError`. The Anthropic integration now safely tags such arguments with a default placeholder text.
  - langchain: Resolves an issue where attempting to tag non-JSON-serializable tool config arguments resulted in a `TypeError`. The LangChain integration now safely tags such arguments with a default placeholder text.
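The two ways of enabling agentless mode mentioned above, sketched. The environment-variable route is runnable anywhere; the programmatic route is commented out since it requires ddtrace and a Datadog API key, and the `ml_app` value is a placeholder:

```python
import os

# Option 1: environment variable, set before ddtrace is imported.
os.environ["DD_LLMOBS_AGENTLESS_ENABLED"] = "1"

# Option 2: programmatic (requires ddtrace and a Datadog API key):
# from ddtrace.llmobs import LLMObs
# LLMObs.enable(agentless_enabled=True, ml_app="my-app")  # ml_app is a placeholder

print(os.environ["DD_LLMOBS_AGENTLESS_ENABLED"])  # → 1
```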
- Other
  - SSI: Ensures the injection denylist is included in the published OCI package.
  - postgres: Fixes circular imports raised when psycopg automatic instrumentation is enabled.
  - pymongo: Ensures instances of `pymongo.MongoClient` can be patched after pymongo is imported.
2.14.0rc1
Deprecation Notes
- tracing: Deprecates the `DD_TRACE_SPAN_AGGREGATOR_RLOCK` environment variable. It will be removed in 3.0.0.
- tracing: Deprecates support for APM Legacy App Analytics. This feature and its associated configuration options are deprecated and will be removed in a future release.
- tracing: The `DD_HTTP_CLIENT_TAG_QUERY_STRING` configuration is deprecated and will be removed in v3.0.0. Use `DD_TRACE_HTTP_CLIENT_TAG_QUERY_STRING` instead.
New Features
- google_generativeai: Introduces tracing support for Google Gemini API `generate_content` calls. See the docs for more information.
- LLM Observability: Adds support to automatically submit Gemini Python SDK calls to LLM Observability.
- LLM Observability: The OpenAI integration now captures tool calls returned from streamed responses when making calls to the chat completions endpoint.
- LLM Observability: The LangChain integration now submits tool spans to LLM Observability.
- openai: The OpenAI integration now includes a new `openai.request.client` tag with the possible values `OpenAI`/`AzureOpenAI` to help differentiate whether the request was made to Azure OpenAI or OpenAI.
- LLM Observability: LLM Observability spans generated by the OpenAI integration now have updated span name and `model_provider` values. Span names are now prefixed with the OpenAI client name (possible values: `OpenAI`/`AzureOpenAI`) instead of the default `openai` prefix to better differentiate whether the request was made to Azure OpenAI or OpenAI. The `model_provider` field now also corresponds to `openai` or `azure_openai` based on the OpenAI client.
- openai: The OpenAI integration now captures token data from streamed completions and chat completions, if provided in the streamed response. To ensure accurate token data in the traced streamed operation, ensure that the `stream_options={"include_usage": True}` option is set on the completion or chat completion call.
- LLM Observability: The OpenAI integration now ensures accurate token data from streamed OpenAI completions and chat completions, if provided in the streamed response. To ensure accurate token data in the traced streamed operation, ensure that the `stream_options={"include_usage": True}` option is set on the completion or chat completion call.
- DSM: Introduces new tracing and datastreams monitoring functionality for Avro Schemas.
- DSM: Introduces new tracing and datastreams monitoring functionality for Google Protobuf.
- LLM Observability: Introduces the `LLMObs.annotation_context()` context manager method, which allows modifying the tags of integration-generated LLM Observability spans created while the context manager is active.
- profiling: Captures `asyncio.Lock` usages with `with` context managers.
- pymongo: Adds support for pymongo>=4.9.0.
- botocore: Adds span pointers to some successful AWS botocore spans. Currently only supports S3 PutObject.
- LLM Observability: Introduces prompt template annotation, which can be passed as an argument to `LLMObs.annotate(prompt={...})` for LLM span kinds. For more information on prompt annotations, see Annotating a Span.
Bug Fixes
- library injection: Resolves an issue where the version of `attrs` installed by default on some Ubuntu installations was treated as incompatible with library injection.
- anthropic: Resolves an issue where attempting to tag non-JSON-serializable request arguments caused a `TypeError`. The Anthropic integration now safely tags such arguments with a default placeholder text.
- LLM Observability: Fixes an issue where the OpenAI and LangChain integrations would still submit integration metrics even in agentless mode. Integration metrics are now disabled when using agentless mode via `LLMObs.enable(agentless_enabled=True)` or by setting `DD_LLMOBS_AGENTLESS_ENABLED=1`.
- LLM Observability: Resolves an issue in the `LLMObs.annotate()` method where non-JSON-serializable arguments were discarded entirely. The method now safely handles non-JSON-serializable arguments by defaulting to a placeholder text.
- LLM Observability: Resolves an issue where attempting to tag non-JSON-serializable request/response parameters resulted in a `TypeError` in the OpenAI, LangChain, Bedrock, and Anthropic integrations.
- langchain: Resolves an issue where attempting to tag non-JSON-serializable tool config arguments resulted in a `TypeError`. The LangChain integration now safely tags such arguments with a default placeholder text.
- SSI: Ensures the injection denylist is included in the published OCI package.
- postgres: Fixes circular imports raised when psycopg automatic instrumentation is enabled.
- ASM: Resolves an issue where exploit prevention was not properly blocking requests with custom redirection actions.
- Code Security: Fixes a bug in the IAST patching process where `AttributeError` exceptions were being caught, interfering with the proper application cycle.
- kafka: Fixes an issue where a `TypeError` exception would be raised if the first message's `topic()` returned `None` during consumption.
- kinesis: Resolves an issue where unparsable data in a Kinesis record would cause a `NoneType` error.
- pymongo: Ensures instances of `pymongo.MongoClient` can be patched after pymongo is imported.
2.12.2
Bug Fixes
- library injection: Resolves an issue where the version of `attrs` installed by default on some Ubuntu installations was treated as incompatible with library injection.
- Code Security: Fixes a bug in the IAST patching process where `AttributeError` exceptions were being caught, interfering with the proper application cycle.
2.11.6
Bug Fixes
- library injection: Resolves an issue where the version of `attrs` installed by default on some Ubuntu installations was treated as incompatible with library injection.
- Code Security: Fixes a bug in the IAST patching process where `AttributeError` exceptions were being caught, interfering with the proper application cycle.
2.12.1
Bug Fixes
- SSI: Ensures the injection denylist is included in the published OCI package.