consumption_metrics: very old cached values can be sent because of migrations #9032
Comments
Discussed in meeting: requiring the generation number to increase by at most 1 before reusing cached values is the best filter. Open question: how to store generations in the cache? The cache might not be leaked outside, so perhaps that is easy to do.
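A minimal sketch of the filter discussed here, under the assumption that each cache entry records the generation that produced it. The types (`Generation`, `CachedMetric`, `may_reuse`) are hypothetical illustrations, not the actual pageserver structures:

```rust
// Hypothetical illustration types; the real pageserver code differs.
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
struct Generation(u32);

struct CachedMetric {
    generation: Generation,
    value: u64,
}

/// Reuse a cached value only if the current generation equals, or is
/// exactly one ahead of, the generation that produced the cache entry.
/// Anything older may be a stale value from before a migration.
fn may_reuse(cached: &CachedMetric, current: Generation) -> bool {
    current.0 == cached.generation.0 || current.0 == cached.generation.0 + 1
}

fn main() {
    let cached = CachedMetric { generation: Generation(5), value: 42 };
    assert!(may_reuse(&cached, Generation(5)));  // same generation: ok
    assert!(may_reuse(&cached, Generation(6)));  // one migration later: ok
    assert!(!may_reuse(&cached, Generation(8))); // too old: do not report
    println!("ok");
}
```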
@skyzh can you take a look at this one when you're back in the office?
I think storing the generation would make sense, and I would like to have a new format for consumption_metrics.json (I assume nobody else except the pageserver is using this JSON file).
I'm refactoring the storing/restoring code right now...
... it seems that for
Otherwise, I think I'll have a PR to migrate to the new format, and another patch to implement the ">= generation => do not report" logic.
… cache (#9470)

In #9032, I would like to eventually add a `generation` field to the consumption metrics cache. The current encoding is not backward compatible, and it is hard to add another field to the cache. Therefore, this patch refactors the format to store "field -> value", which makes it easier to maintain backward/forward compatibility.

## Summary of changes

* Add `NewRawMetric` as the new format.
* Add an upgrade path: when opening the disk cache, the codepath first inspects the `version` field and decides how to decode.
* Refactor the metrics generation code and tests.
* Add tests on upgrade / compatibility with the old format.

Signed-off-by: Alex Chi Z <[email protected]>
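The version-gated decode path described in the PR can be sketched roughly as follows. This is a simplified illustration: the struct layouts and the `DiskCache`/`OldRawMetric` names are assumptions for the example, and JSON (de)serialization is omitted; only `NewRawMetric` is a name from the PR itself.

```rust
// Illustrative sketch of a versioned cache with an upgrade path.
// Field layouts are assumptions, not the real consumption_metrics.json schema.

#[derive(Debug)]
struct NewRawMetric {
    key: String,
    value: u64,
    // Room for new fields, e.g. the `generation` planned in #9032.
    generation: Option<u32>,
}

/// Old format: a bare (key, value) pair with no extra fields.
struct OldRawMetric {
    key: String,
    value: u64,
}

/// On-disk cache tagged with an explicit version, so the decoder can
/// inspect the version first and decide how to interpret the payload.
enum DiskCache {
    V1(Vec<OldRawMetric>),
    V2(Vec<NewRawMetric>),
}

/// Upgrade old entries to the new format; new entries pass through.
fn upgrade(cache: DiskCache) -> Vec<NewRawMetric> {
    match cache {
        DiskCache::V1(old) => old
            .into_iter()
            .map(|m| NewRawMetric { key: m.key, value: m.value, generation: None })
            .collect(),
        DiskCache::V2(new) => new,
    }
}

fn main() {
    let v1 = DiskCache::V1(vec![OldRawMetric { key: "synthetic_size".into(), value: 7 }]);
    let upgraded = upgrade(v1);
    assert_eq!(upgraded[0].value, 7);
    assert_eq!(upgraded[0].generation, None); // old entries carry no generation
    println!("ok");
}
```

Keeping the version tag outermost means a later V3 (say, with a mandatory `generation`) only adds one more match arm rather than breaking existing decoders.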
This week: add the generation number to the disk cache, plus the logic to invalidate the cache based on the generation number.
Internal slack thread: https://neondb.slack.com/archives/C061CPK7UQL/p1726556769067039?thread_ts=1726493755.104459&cid=C061CPK7UQL
Symptom:
Sent/exported synthetic sizes:
More broadly, what happened:
How to fix it: