The Besu team ❤️ having a resource like hivetests.ethdevops.io to consult. It is a very helpful piece of infrastructure which we do not have to maintain! We would like to track Besu's performance on the wide variety of test suites and integrate it into our periodic reviews; it's important to be aware of how our test results are trending, hopefully always improving. So we're suggesting a feature that may be useful to all client teams: simple access to historical test results, over a medium-term timeframe.
This data wouldn't need to be deeply analyzed over time; we just want to keep a running metric of which test suites are passing, and at what rates. This could be as simple as appending to a static CSV file every night and making that file available. Possible implementation details:
- Separate the test-results JSON from all logging output from clients and simulators. The logs are a large volume of data that isn't needed for this purpose and can be pruned much more aggressively; the test-results JSON files are much smaller and should be retained longer.
- `clientInfo` fields would need to be populated consistently. They currently seem to be populated only for the first test a client executes; including them consistently on each test makes aggregating easier.
- A periodic job to aggregate results. Nightly is probably fine.
- A means to serve the results up for use by client teams.
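The nightly CSV-append idea above could be sketched roughly as follows. This is a minimal illustration, not a proposal for Hive's actual code: the per-suite JSON layout (`name` plus a `testCases` map with `summaryResult.pass` flags) and the function name are assumptions chosen for the example.

```python
import csv
import json
from pathlib import Path

def append_nightly_summary(results_dir, csv_path, run_date):
    """Scan per-suite result JSON files and append pass-rate rows to a CSV.

    Assumes each *.json file in results_dir holds a suite summary shaped
    like {"name": ..., "testCases": {id: {"summaryResult": {"pass": bool}}}}.
    That layout is a stand-in for whatever Hive actually emits.
    """
    rows = []
    for path in sorted(Path(results_dir).glob("*.json")):
        suite = json.loads(path.read_text())
        cases = suite.get("testCases", {})
        passed = sum(1 for c in cases.values() if c["summaryResult"]["pass"])
        rows.append({
            "date": run_date,
            "suite": suite.get("name", path.stem),
            "passed": passed,
            "total": len(cases),
        })

    csv_file = Path(csv_path)
    write_header = not csv_file.exists()  # only emit a header on first run
    with csv_file.open("a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["date", "suite", "passed", "total"])
        if write_header:
            writer.writeheader()
        writer.writerows(rows)
    return rows
```

Because the file is append-only, the full pass-rate history stays in one small artifact that can be served statically and charted by any client team.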
Eventually, we'd like to get to a point where we can proudly display a HIVE-PASSING badge on our GitHub home :)