
feat(llmobs): add prompt and name arguments to annotation context #10711

Open · wants to merge 22 commits into base: main from evan.li/enhance-annotation-context

Conversation

@lievan (Contributor) commented Sep 18, 2024

This PR adds support for prompt and name arguments to annotation_context.

Because prompt is only valid when the span kind is llm, this also requires a small refactor of annotate: prompts are tagged in annotate before the span kind is checked. The span kind check is then done at trace processing time, where we log a warning and drop the prompt field for any non-LLM span kind.
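The deferred check described above can be sketched as follows. This is a minimal illustration of the flow, not the actual ddtrace implementation; the dict-based spans and function names here are stand-ins.

```python
# Sketch of the PR's flow: annotate() tags the prompt without validating
# span kind, and the span-kind check happens later, at trace-processing
# time, where the prompt is dropped with a warning for non-LLM spans.
import logging
from typing import Any, Dict, Optional

log = logging.getLogger(__name__)


def annotate(span: Dict[str, Any], prompt: Optional[Dict[str, Any]] = None) -> None:
    # Prompts are tagged up front; span kind is NOT checked here.
    if prompt is not None:
        span["prompt"] = prompt


def process_span(span: Dict[str, Any]) -> Dict[str, Any]:
    # Deferred check: warn and drop the prompt for non-LLM span kinds.
    if "prompt" in span and span.get("span_kind") != "llm":
        log.warning("Dropping prompt for non-LLM span kind %r", span.get("span_kind"))
        del span["prompt"]
    return span


llm_span = {"span_kind": "llm"}
workflow_span = {"span_kind": "workflow"}
annotate(llm_span, prompt={"template": "Answer: {question}"})
annotate(workflow_span, prompt={"template": "Answer: {question}"})
process_span(llm_span)       # prompt is kept
process_span(workflow_span)  # prompt is dropped with a warning
```

Validating at processing time keeps annotate cheap on the hot path and lets the prompt be attached before the span kind is known.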

Checklist

  • PR author has checked that all the criteria below are met
  • The PR description includes an overview of the change
  • The PR description articulates the motivation for the change
  • The change includes tests OR the PR description describes a testing strategy
  • The PR description notes risks associated with the change, if any
  • Newly-added code is easy to change
  • The change follows the library release note guidelines
  • The change includes or references documentation updates if necessary
  • Backport labels are set (if applicable)

Reviewer Checklist

  • Reviewer has checked that all the criteria below are met
  • Title is accurate
  • All changes are related to the pull request's stated goal
  • Avoids breaking API changes
  • Testing strategy adequately addresses listed risks
  • Newly-added code is easy to change
  • Release note makes sense to a user of the library
  • If necessary, author has acknowledged and discussed the performance implications of this PR as reported in the benchmarks PR comment
  • Backport labels are set in a manner that is consistent with the release branch maintenance policy

@lievan lievan changed the title feat(llmobs): Add Prompt and name arguments to annotation-context feat(llmobs): Add prompt and name arguments to annotation-context Sep 18, 2024
@pr-commenter (bot) commented Sep 18, 2024

Benchmarks

Benchmark execution time: 2024-09-20 16:19:31

Comparing candidate commit ed7e205 in PR branch evan.li/enhance-annotation-context with baseline commit baac738 in branch main.

Found 3 performance improvements and 0 performance regressions! Performance is the same for 353 metrics, 48 unstable metrics.

scenario:iast_aspects-aspect_iast_do_lower

  • 🟩 max_rss_usage [-2.425MB; -2.113MB] or [-8.155%; -7.108%]

scenario:iast_aspects-aspect_iast_do_modulo

  • 🟩 max_rss_usage [-2.358MB; -2.088MB] or [-7.932%; -7.024%]

scenario:iast_aspects-aspect_iast_do_str

  • 🟩 max_rss_usage [-2.663MB; -2.129MB] or [-8.949%; -7.155%]

@lievan lievan changed the title feat(llmobs): Add prompt and name arguments to annotation-context feat(llmobs): Add prompt and name arguments to annotation context Sep 19, 2024

CODEOWNERS have been resolved as:

releasenotes/notes/annotation-context-modify-name-and-prompt-cc74b3b268983181.yaml  @DataDog/apm-python
ddtrace/llmobs/_llmobs.py                                               @DataDog/ml-observability
ddtrace/llmobs/_trace_processor.py                                      @DataDog/ml-observability
tests/llmobs/test_llmobs_service.py                                     @DataDog/ml-observability
tests/llmobs/test_llmobs_trace_processor.py                             @DataDog/ml-observability

Base automatically changed from evan.li/prompt-v2 to main September 20, 2024 13:17
@lievan lievan changed the title feat(llmobs): Add prompt and name arguments to annotation context feat(llmobs): add prompt and name arguments to annotation context Sep 20, 2024
@lievan lievan marked this pull request as ready for review September 20, 2024 15:40
"""
Sets specified attributes on all LLMObs spans created while the returned AnnotationContext is active.
-Do not use nested annotation contexts to override the same tags since the order in which annotations
+Do not use nested annotation contexts to override the same attributes since the order in which annotations
are applied is non-deterministic.
Why is it non-deterministic?
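To illustrate the hazard the docstring warns about: if two nested contexts register annotations for the same attribute and those annotations are applied in an unspecified order, the final value depends on that order. This is a hypothetical sketch, not the ddtrace mechanism; the helper below is a stand-in.

```python
# Hypothetical illustration: the same pair of annotations produces
# different span attributes depending on application order, so nested
# annotation contexts must not set the same key.
from typing import Any, Dict, List


def apply_annotations(span: Dict[str, Any], annotations: List[Dict[str, Any]]) -> Dict[str, Any]:
    for ann in annotations:  # later entries win
        span.update(ann)
    return span


outer = {"model_name": "outer-model"}
inner = {"model_name": "inner-model"}

one_order = apply_annotations({}, [outer, inner])
other_order = apply_annotations({}, [inner, outer])
# one_order and other_order disagree on "model_name"
```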

@@ -230,16 +230,23 @@ def disable(cls) -> None:
         log.debug("%s disabled", cls.__name__)

     @classmethod
-    def annotation_context(cls, tags: Optional[Dict[str, Any]] = None) -> AnnotationContext:
+    def annotation_context(
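The diff above is truncated after the new signature's first line. Based on the PR description, the expanded classmethod plausibly takes prompt and name alongside tags; this is a hedged sketch with stand-in AnnotationContext internals, not ddtrace code.

```python
# Sketch of what the expanded classmethod could look like after this
# change. Parameter names come from the PR description; the
# AnnotationContext body here is a placeholder, not the real class.
from typing import Any, Dict, Optional


class AnnotationContext:
    def __init__(self, tags, prompt, name):
        self.tags, self.prompt, self.name = tags, prompt, name

    def __enter__(self):
        return self

    def __exit__(self, *exc):
        return False


class LLMObs:
    @classmethod
    def annotation_context(
        cls,
        tags: Optional[Dict[str, Any]] = None,
        prompt: Optional[Dict[str, Any]] = None,
        name: Optional[str] = None,
    ) -> AnnotationContext:
        return AnnotationContext(tags, prompt, name)


with LLMObs.annotation_context(name="summarize", prompt={"template": "Summarize: {doc}"}) as ctx:
    pass  # integration-generated spans created here would pick up the name and prompt
```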
Let's make documentation for this in our public docs, probably next to Annotate a span.

---
features:
- |
For new features such as a new integration or component. Use present tense with the following format:
Can we remove this line?

features:
- |
For new features such as a new integration or component. Use present tense with the following format:
LLM Observability: Introduces `prompt` and `name` arguments to LLMObs.annotation_context to support setting the `name` and `prompt` of integration generated spans.
Suggested change
LLM Observability: Introduces `prompt` and `name` arguments to LLMObs.annotation_context to support setting the `name` and `prompt` of integration generated spans.
LLM Observability: Introduces `prompt` and `name` arguments to ``LLMObs.annotation_context()`` to support setting an integration-generated span's name and `prompt` field.
