
Emulate GPU without sleep #161

Open
spitzcor opened this issue Mar 8, 2024 · 0 comments
spitzcor commented Mar 8, 2024

Let me preface my comments with a declaration that this is just my understanding of how things work. I haven't inspected or even run this code myself.

To keep the focus on I/O and to make it possible to test without GPUs, the benchmark has no requirement for a real GPU. That makes sense; however, it uses a sleep() to simulate the GPU computation time, which seems unnecessary. As far as I can tell, the I/O and the GPU emulation do not overlap, so the sleep could simply be removed. In that case it is just a waste of time: while it may be necessary to simulate the I/O-compute flow of a real code, it isn't necessary to simulate the wallclock runtime. Even if there is some overlap between I/O and GPU emulation, long-duration sleeps could be short-circuited once the I/O for the stage has completed. The metrics for compute time, I/O time, fill time, and samples/sec could then be calculated as if the full wallclock time had been spent, if necessary.
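A minimal sketch of what I mean (all names here, e.g. `compute_time_per_batch` and `read_batch`, are hypothetical placeholders, not taken from the benchmark code): instead of calling time.sleep() for the emulated compute duration, the loop would only accumulate the simulated compute time and derive the reported metrics from it.

```python
import time

def run_epoch(batches, compute_time_per_batch, read_batch):
    """Emulate GPU compute without sleeping (illustrative only)."""
    simulated_compute = 0.0   # time we *would* have spent in sleep()
    io_time = 0.0
    start = time.perf_counter()

    for batch in batches:
        t0 = time.perf_counter()
        read_batch(batch)                      # real I/O still happens
        io_time += time.perf_counter() - t0

        # Instead of time.sleep(compute_time_per_batch),
        # just account for the emulated compute time.
        simulated_compute += compute_time_per_batch

    real_elapsed = time.perf_counter() - start
    # Report metrics as if the emulated compute had actually been spent.
    effective_wallclock = real_elapsed + simulated_compute
    return {
        "io_time": io_time,
        "compute_time": simulated_compute,
        "samples_per_sec": len(batches) / effective_wallclock,
    }
```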
