feat(llmobs): submit llmobs payloads from gemini integration #10619
Conversation
Datadog Report — Branch report: ✅ 0 Failed, 3724 Passed, 5180 Skipped, 53m 46.44s total duration (1h 13m 21.8s time saved)
Aside from some CI tests failing on assertions, it looks good to me 👍
The backport to `2.13` failed.
To backport manually, run these commands in your terminal:

```shell
# Fetch latest updates from GitHub
git fetch
# Create a new working tree
git worktree add .worktrees/backport-2.13 2.13
# Navigate to the new working tree
cd .worktrees/backport-2.13
# Create a new branch
git switch --create backport-10619-to-2.13
# Cherry-pick the merged commit of this pull request and resolve the conflicts
git cherry-pick -x --mainline 1 dc7e31ecd0e06735d850e5c317f2ee18fed98c32
# Push it to GitHub
git push --set-upstream origin backport-10619-to-2.13
# Go back to the original working tree
cd ../..
# Delete the working tree
git worktree remove .worktrees/backport-2.13
```

Then, create a pull request where the base branch is `2.13` and the compare branch is `backport-10619-to-2.13`.
This PR enables submitting LLMObs spans from the Gemini integration. `generate_content()`/`generate_content_async()` calls are traced by the Gemini APM integration; this PR also generates LLMObs span events from those spans and submits them to LLM Observability.

The following data is collected by Gemini LLMObs spans:
- Model parameters (`temperature`/`max_output_tokens`/`candidate_count`/`top_p`/`top_k`) if set on the model instance or in the request itself

Checklist
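As a rough illustration of the parameter capture described above, the behavior can be sketched as a helper that merges model-instance and per-request settings, with the request taking precedence. This is a hypothetical sketch for clarity, not the integration's actual implementation; the function name `collect_metadata` and the plain-dict shapes are assumptions.

```python
# Hypothetical sketch: names and shapes are illustrative only, not the
# actual dd-trace-py integration code.
CAPTURED_PARAMS = ("temperature", "max_output_tokens", "candidate_count", "top_p", "top_k")

def collect_metadata(model_config, request_config=None):
    """Merge model-instance and per-request settings (request wins) and
    keep only the parameters the LLMObs span would record."""
    merged = {**(model_config or {}), **(request_config or {})}
    return {k: v for k, v in merged.items() if k in CAPTURED_PARAMS}

metadata = collect_metadata(
    {"temperature": 0.7, "top_k": 40},          # set on the model instance
    {"temperature": 0.2, "max_output_tokens": 256},  # set in the request itself
)
# The request-level temperature (0.2) overrides the model default (0.7),
# and parameters outside CAPTURED_PARAMS are dropped.
```

The merge-with-request-precedence mirrors the PR's note that parameters are picked up whether they are set on the model instance or in the request itself.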
Reviewer Checklist