
DRAFT: metrics sharding handwritten #2304

Closed
wants to merge 9 commits

Conversation

@fraillt fraillt (Contributor) commented Nov 17, 2024

As we discussed in our meeting, I'm creating a draft version with sharding implemented.
It mainly consists of these steps:

We could probably use dashmap directly as well, but it's quite trivial to write our own, and we avoid an extra dependency :)

Waiting for your test results :)
It would be good to know the CPU and Rust version.
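
A minimal sketch of the hand-rolled sharding idea described in this comment (not the PR's actual code): a fixed array of mutex-guarded HashMaps, with the shard picked from the key's hash so writers to different shards don't contend. The shard count, key/value types, and the `with_entry` helper are illustrative assumptions.

```rust
use std::collections::hash_map::RandomState;
use std::collections::HashMap;
use std::hash::{BuildHasher, Hash};
use std::sync::Mutex;

// Assumed shard count; a real implementation might size this by CPU count.
const SHARDS: usize = 16;

struct ShardedMap<K, V> {
    hasher: RandomState,
    shards: Vec<Mutex<HashMap<K, V>>>,
}

impl<K: Hash + Eq, V: Default> ShardedMap<K, V> {
    fn new() -> Self {
        Self {
            hasher: RandomState::new(),
            shards: (0..SHARDS).map(|_| Mutex::new(HashMap::new())).collect(),
        }
    }

    // Hash once to pick a shard, then lock only that shard.
    fn with_entry<R>(&self, key: K, f: impl FnOnce(&mut V) -> R) -> R {
        let idx = (self.hasher.hash_one(&key) as usize) % SHARDS;
        let mut shard = self.shards[idx].lock().unwrap();
        f(shard.entry(key).or_default())
    }
}

fn main() {
    let map: ShardedMap<&'static str, u64> = ShardedMap::new();
    map.with_entry("requests", |count| *count += 1);
    map.with_entry("requests", |count| *count += 1);
    map.with_entry("requests", |count| println!("requests = {count}")); // prints 2
}
```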

@fraillt fraillt requested a review from a team as a code owner November 17, 2024 20:41

codecov bot commented Nov 17, 2024

Codecov Report

Attention: Patch coverage is 93.56725% with 11 lines in your changes missing coverage. Please review.

Project coverage is 79.6%. Comparing base (849778d) to head (88bf683).

| Files with missing lines | Patch % | Lines |
|---|---|---|
| opentelemetry-sdk/src/metrics/internal/hashed.rs | 90.7% | 6 Missing ⚠️ |
| opentelemetry-sdk/src/metrics/internal/mod.rs | 95.5% | 4 Missing ⚠️ |
| opentelemetry-sdk/src/metrics/mod.rs | 93.7% | 1 Missing ⚠️ |
Additional details and impacted files
@@          Coverage Diff          @@
##            main   #2304   +/-   ##
=====================================
  Coverage   79.5%   79.6%           
=====================================
  Files        123     124    +1     
  Lines      21293   21384   +91     
=====================================
+ Hits       16938   17023   +85     
- Misses      4355    4361    +6     

☔ View full report in Codecov by Sentry.

@fraillt fraillt (Contributor, Author) commented Nov 22, 2024

I did some latency testing using #2323.
I wanted to compare it to #2305.
The Flurry concurrent hashmap performs pretty well. However, it LOSES SOME UPDATES! So if we want to move forward, we definitely need to fix that first, but for testing purposes I just ignored this "little" issue :)
The observations are as follows:

  • for updates, this PR performs a bit worse than flurry, especially at p99 (1.5x slower on low load), although on high loads this PR comes close
  • for inserts, it performs much better (roughly 3-8x better, especially at p99)
  • mixed loads (mostly updates with a few inserts) are mixed as well: on low loads this PR wins, on high loads flurry takes over
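
A rough sketch of how per-operation latency percentiles like the p99 figures above could be collected; this is not the harness from #2323, and the thread count, iteration count, and the atomic counter standing in for a metric update are assumptions.

```rust
use std::sync::atomic::{AtomicU64, Ordering};
use std::sync::Arc;
use std::thread;
use std::time::Instant;

fn main() {
    // Stand-in for a metric update; a real harness would exercise the map under test.
    let counter = Arc::new(AtomicU64::new(0));
    let mut handles = Vec::new();

    // Each worker records the duration of every operation it performs.
    for _ in 0..4 {
        let counter = Arc::clone(&counter);
        handles.push(thread::spawn(move || {
            let mut latencies_ns = Vec::with_capacity(100_000);
            for _ in 0..100_000 {
                let start = Instant::now();
                counter.fetch_add(1, Ordering::Relaxed);
                latencies_ns.push(start.elapsed().as_nanos() as u64);
            }
            latencies_ns
        }));
    }

    // Merge all samples and read percentiles from the sorted vector.
    let mut all: Vec<u64> = handles
        .into_iter()
        .flat_map(|h| h.join().unwrap())
        .collect();
    all.sort_unstable();
    let pct = |p: f64| all[((all.len() - 1) as f64 * p) as usize];
    println!("p50 = {} ns, p99 = {} ns", pct(0.50), pct(0.99));
}
```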

@cijothomas cijothomas (Member) commented:
Closing. This is linked from the tracking issue so it can be revisited later: #2450

@cijothomas cijothomas closed this Dec 18, 2024