Hi there,
I am wondering if this algorithm is used in any practical application at all.
I get that it is too resource-intensive to run on the CPU, and that your implementation is the only maintained one that runs on a GPU.
So I am wondering why the seemingly most advanced upscaling/shader algorithm is not more widely used and contributed to by the community. Do you know of any emulator that uses it yet?
Is there any way to run this on AMD GPUs? (Oddly, you only mentioned that it doesn't run on Intel, with no mention of AMD.)
Thanks
I cannot say much regarding the emulation community.
Maybe no one has made them aware of it.
Regarding the AMD & Intel GPU issues: in theory the OpenGL pipeline and shaders should also work on those devices, but I have never had access to an AMD GPU, so I have not found a way to test/debug it.
A couple of weeks ago I found time to fix it so that it at least runs on my current setup (GeForce GTX 960).
Have you tried the current commit ac7f42f on AMD?
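If you do get a chance to test it on AMD, one generic thing worth checking is the GLSL compile log: AMD's and Intel's GLSL compilers tend to be stricter than NVIDIA's, so a shader that runs fine on my GTX 960 may fail to compile there. A minimal sketch of such a check (a generic OpenGL helper, not code from this repo):

```cpp
#include <GL/glew.h>
#include <iostream>
#include <vector>

// Hypothetical helper: returns true if the shader compiled, otherwise
// prints the driver's info log, which usually states exactly which GLSL
// construct the stricter AMD/Intel compilers rejected.
bool compileOk(GLuint shader) {
    GLint status = GL_FALSE;
    glGetShaderiv(shader, GL_COMPILE_STATUS, &status);
    if (status == GL_TRUE) return true;

    GLint len = 0;
    glGetShaderiv(shader, GL_INFO_LOG_LENGTH, &len);
    std::vector<GLchar> log(len > 1 ? len : 1);
    glGetShaderInfoLog(shader, static_cast<GLsizei>(log.size()), nullptr, log.data());
    std::cerr << "shader compile failed:\n" << log.data() << "\n";
    return false;
}
```

Calling something like this after each glCompileShader (and the analogous glGetProgramiv/glGetProgramInfoLog check after linking) should surface the driver's error message if the shaders are the problem.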
Also, this was one of my first encounters with C++/OpenGL and software development in general, so the code is very, very messy as it is right now. Sorry :)
It's remarkable to begin programming with a project like this.
I am quite new to programming as well, and certainly not yet able to comprehend code like this, which might also be due to the fact that I am not proficient in the underlying area.
So, ultimately, this is implemented as a simple shader?
I would like to test the code in the coming days, but I can't promise anything.
My idea is to clean up the code and then pitch it to RetroArch and to the standalone emulators.