spectool: Execute JSON test cases #245
base: master
Conversation
Thanks for the PR! Hmm... I would want to at least wait for #238 to get in before merging this, although, that said, it might be better to not actually merge this here and instead finally start on w3f/jamtestvectors#7?
I think waiting for #238 makes a lot of sense; that's the only context in which I'm actually interested in the runner. A common test harness definitely makes sense, however I still think that a JSON-based test case specification is and will be valuable. I imagine the following workflow (correct me if I'm missing something):
Hence, being able to run that single JSON case on PolkaVM would be something that this PR enables. Contrast this with running presumably more complex test harness machinery just to debug a single test case.
Well, isn't this orthogonal to having a test harness? (: The harness can still take the JSON test cases, just potentially run them on any PVM, not only on PolkaVM. In other words, I don't see why we should limit this to only PolkaVM; anyone interested should be able to use exactly the same code to load up the tests and run them on their PVM if they so desire. This wouldn't necessarily make it much more complex if we wrapped the PolkaVM API in a trait (which could have the same, or a very similar, API to PolkaVM).
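(Purely for illustration, a minimal sketch of what such an abstraction might look like; the trait, its methods, and the register count are hypothetical, not an existing PolkaVM API.)

```rust
/// Hypothetical abstraction over a PVM implementation. The method set loosely
/// mirrors what a JSON test case needs: load a program, seed the initial
/// state, run to completion, and inspect the results.
trait PvmBackend {
    type Error: std::fmt::Debug;

    /// Load a program blob and prepare a fresh instance.
    fn load(&mut self, program: &[u8]) -> Result<(), Self::Error>;
    /// Set the initial register values and the gas budget before execution.
    fn set_initial_state(&mut self, regs: &[u64; 13], gas: i64) -> Result<(), Self::Error>;
    /// Run until the program halts, traps, or runs out of gas.
    fn run(&mut self) -> Result<ExecutionStatus, Self::Error>;
    /// Read back the register values for comparison against the expected ones.
    fn registers(&self) -> [u64; 13];
}

/// Hypothetical terminal states a test case could assert on.
#[derive(Debug, PartialEq, Eq)]
enum ExecutionStatus {
    Halted,
    Trapped,
    OutOfGas,
}
```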
Yes! :)
That's what I meant by "presumably more complex test harness machinery" :) But I agree it doesn't necessarily have to be more complex. My assumption is that most PVM implementers already have a setup to load JSON test cases for their PVM, and my point is that it's currently missing for PolkaVM. As mentioned, I'm probably biased towards grabbing a low-hanging fruit, since all the other parts already exist, but if you prefer to focus on the test harness development instead, I'll humbly follow too :) I'll add some discussion points to the harness issue to push the ball a bit.
Hm, I guess yeah, this is a good point. (: But since PolkaVM itself is used to generate those test vectors, having such a loader is not as useful as it is for other PVMs. Of course it's still useful to guard against regressions, but it's probably more sensible to just go all the way and start on the harness, since we do want to have a harness anyway, and we will want to integrate PolkaVM into the harness natively anyway for fast differential fuzzing.
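(Again just as an illustration, a rough sketch of the differential idea, reusing the hypothetical PvmBackend trait and ExecutionStatus enum sketched above; none of this is existing code.)

```rust
/// Run the same program with the same initial state on two PVM
/// implementations and compare the observable outcomes; any divergence
/// points to a bug in at least one of them.
fn run_differential<A: PvmBackend, B: PvmBackend>(
    a: &mut A,
    b: &mut B,
    program: &[u8],
    regs: &[u64; 13],
    gas: i64,
) -> Result<(), String> {
    a.load(program).map_err(|e| format!("A failed to load: {e:?}"))?;
    b.load(program).map_err(|e| format!("B failed to load: {e:?}"))?;
    a.set_initial_state(regs, gas).map_err(|e| format!("A: {e:?}"))?;
    b.set_initial_state(regs, gas).map_err(|e| format!("B: {e:?}"))?;

    let status_a = a.run().map_err(|e| format!("A failed to run: {e:?}"))?;
    let status_b = b.run().map_err(|e| format!("B failed to run: {e:?}"))?;

    if status_a != status_b {
        return Err(format!("status mismatch: {status_a:?} vs {status_b:?}"));
    }
    if a.registers() != b.registers() {
        return Err("register state mismatch".to_string());
    }
    Ok(())
}
```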
@koute let me know if/when you'd like me to get this updated to the latest |
Yes, I'm most likely going to contribute there. Could you hook up your PVM in there too? |
This PR attempts to implement the main_test function from spectool. main.rs was moved to a new file, generate.rs, to avoid cluttering the main file, and a test.rs module was created to implement JSON test vector execution.

Some open issues:
- While running a bunch of generated test vectors from here, I've noticed that the interpreter may hang instead of going OOG.
- Memory is set up via the setup_memory function - I didn't find a more generic API for this.
- Example program that hangs: https://pvm-debugger.netlify.app/#/load?program=0x0000015101 (BRANCH_EQ_IMM).

Edit: Apologies, the program has an extremely large gas value and hits an edge case in parameter parsing. Our implementations were rejecting it, while polkavm correctly attempts to execute it.
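(For reference, a minimal sketch of how such a JSON test case could be deserialized with serde; the struct and its field names are illustrative only and may not match the actual jamtestvectors schema.)

```rust
use serde::Deserialize;

/// Illustrative shape of a single PVM JSON test case; the real schema used by
/// the published test vectors may differ in field names and types.
#[derive(Debug, Deserialize)]
struct TestCase {
    name: String,
    /// Raw program blob bytes.
    program: Vec<u8>,
    /// Initial register values before execution.
    #[serde(rename = "initial-regs")]
    initial_regs: Vec<u64>,
    #[serde(rename = "initial-gas")]
    initial_gas: i64,
    /// Expected register values after execution.
    #[serde(rename = "expected-regs")]
    expected_regs: Vec<u64>,
    #[serde(rename = "expected-status")]
    expected_status: String,
}

fn load_test_case(path: &std::path::Path) -> Result<TestCase, Box<dyn std::error::Error>> {
    let contents = std::fs::read_to_string(path)?;
    // serde_json does the actual parsing; any error bubbles up to the caller.
    Ok(serde_json::from_str(&contents)?)
}
```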