🎉 Introduce VR Cube #152
Conversation
@ila-embsys what do you mean by "there is no actual 3d due to no SBS"? SBS is toggled on the Nreal Air (in the latest firmware) by, as far as I remember, holding one of the buttons for about 3 or 5 seconds. Or do you mean you didn't implement it yet? |
I didn't implement it, so there is no need to switch the glasses to SBS mode. |
Hey there, thanks a ton for this awesome initiative! And sorry for the late reply; I am currently quite busy as we are moving to a different city. However, in a few days I'll be back at work where I will have access to several AR and VR devices. Hopefully I'll get the chance to test this on actual hardware! How do you currently initiate the head tracking? Is it always active in overview mode? Or is it also active in normal desktop mode? |
Good luck with that!
Currently, head tracking is active while the 'Enable VR' switch in the settings is toggled on. So tracking is active in normal desktop mode too, but I don't apply that rotation data there due to bad rotation math. Any attempt to change the math sometimes causes workspaces to get lost somewhere in space, with no way to step back and disable the extension. Maybe I just don't know how to debug extensions properly; currently, I just run it and look at what changed. This whole approach raises some questions: for example, it is unknown whether interaction is even possible while workspaces are moving in desktop mode. Even if it is technically possible, it is unknown whether interaction would be comfortable, because the cursor is still in display coordinates. However, I discovered that dragging windows between workspaces in overview mode by head movement rather than by mouse movement works quite well. I decided to release this draft at an early stage to gather comments on the approach and maybe attract community power to bring it to completion. |
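For reference, much of the "rotation math" here boils down to converting the HMD's orientation quaternion into yaw/pitch/roll angles for the cube. A minimal sketch in plain JavaScript (the function name is hypothetical, not part of the extension; OpenHMD reports the quaternion as [x, y, z, w]):

```javascript
// Convert an orientation quaternion [x, y, z, w] into yaw/pitch/roll Euler
// angles in radians. Illustrative helper only, not the extension's code.
function quatToEuler([x, y, z, w]) {
  // Roll: rotation around the forward axis.
  const roll = Math.atan2(2 * (w * x + y * z), 1 - 2 * (x * x + y * y));

  // Pitch: rotation around the right axis, clamped to avoid NaN at the poles.
  const sinPitch = Math.max(-1, Math.min(1, 2 * (w * y - z * x)));
  const pitch = Math.asin(sinPitch);

  // Yaw: rotation around the up axis.
  const yaw = Math.atan2(2 * (w * z + x * y), 1 - 2 * (y * y + z * z));

  return { yaw, pitch, roll };
}
```

With the identity quaternion [0, 0, 0, 1] all three angles are zero, so logging these values makes it easier to spot the kind of bug that sends workspaces "somewhere in space".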
Update:
|
Hi there! I just wanted to set this up but I am struggling a bit to get the dependencies right - I am on Ubuntu 23.10 currently. Your prebuilt package for OpenHMD does not exist for Ubuntu, right? Therefore I installed the one from the Ubuntu repositories. This installs version 0.3.0-1 of I then proceeded to install your prebuilt GOpenHMD wrapper. This ships However, when I try to open the prefs of Desktop Cube, I get the error Do you have an idea? |
I'm glad to see your interest and your progress in setting it up.
It exists, but OpenBuildService generates wrong links. The right one for Debian/Ubuntu is that link. Also, the repository's binaries can be found on the Repositories tab. However, that package only contains additional patches for Nreal Air support, so the package from the native Ubuntu repo should be fine too.
That is my mistake. I have now fixed the package's build script, and the 'typelib' file is added. However, I still haven't tested the package because it is not for my main system.
I'm not very familiar with GObject and its related things. I guess that in your case the generated 'typelib' is loaded, but then it cannot find symbols in the related C library. I will set up an Ubuntu distro and try to investigate the 'typelib' issue. |
I've set up a fresh Ubuntu 23.10 and installed libopenhmd0 from the native repo and the new prebuilt package of gopenhmd.

```
$ gjs
gjs> const GOpenHMD = imports.gi.GOpenHMD;
gjs> GOpenHMD.version().to_string()
"0.3.0"
```

I guess the fix I mentioned before that adds the 'typelib' helped. |
The extension crash at the end of the desktop gesture was fixed. Now, observing the cube in desktop mode works stably, but it still stops immediately after the mouse button is released. Any ideas how to stop abusing the gesture? |
Hi again, sorry for the lack of responses from my side. I just realized that we currently only have a bazillion Meta Quests at work but no OpenHMD-compatible HMDs. This will make testing this a bit tricky! But I still hope that I can find a compatible HMD somewhere...

Anyways, your updated packages do work now and I can finally enable HMD mode. I'll look into your implementation and see how the interaction could be improved. Yet it will be difficult without compatible hardware.

I am not really up-to-date with Linux support for various HMDs, runtimes, and libraries. However, it seems that OpenHMD is somewhat deprecated. If we really want to make this a stable part of the Desktop Cube extension, wouldn't it be better to use a properly maintained library? (if there is any 😄). This field is really confusing but it seems that Monado is the follow-up project, isn't it? There is also gxr which seems to be a GLib wrapper around OpenVR and OpenXR APIs and thus could be compatible with Monado. I think it should be possible to create GJS bindings for gxr. Maybe we could then even use the SteamVR runtime which could bring support for even more hardware. What do you think? Have you considered alternatives to OpenHMD? |
A quick follow-up: I just compiled gxr from source and created corresponding typelib files. I can import them in GJS, but the lib is missing GObject introspection annotations. I just asked for them, but it seems we would have to add them ourselves. I do not have much experience there, but maybe it is not too difficult to add them. But still, I am not sure if I understood the relationship of OpenVR, OpenXR, OpenHMD, Monado, etc. right. Would GJS types for gxr give us easy access to HMD rotation data? Also, most HMDs I have worked with require distorted images. This would most likely be beyond the scope of the Desktop Cube extension. The Nreal Air is simply detected as an external display and can display the video signal properly on a 2D plane in 3D space without any prior distortion? |
These are my first steps in packaging, so I am happy to hear the packages work correctly.
Sad that you can't see the result.
Well, it is a real problem. If I had known the library would be deprecated, I would have chosen another one. When it happened, I was in the middle of the whole work.
Sure, I guess Monado is now at the centre of the entire VR stack, and gxr definitely looks good if it can provide us access to IMU data as OpenHMD does.
OpenHMD was chosen because, in comparison to Monado, it is super simple, doesn't have a lot of dependencies, and only provides access to rotational data. Also, it doesn't have an unnecessary 3D engine and related stuff. Since Mutter has its own Cogl-based rendering, the choice looked like a good idea.
It is a subject for research. I've worked with Monado a little, but I don't know if it can be used without its 3D engine. If so, gxr will fit. I will try to investigate it.
Exactly. The Nreal Air is just a display attached via DisplayPort over a USB Type-C cable. It doesn't need any additional distortion done in software; a flat image stays flat. I bought them specifically for a virtual desktop experience. In my opinion, it is the best fit for that: high resolution, good pixel density, simple, light, cheap, and nothing redundant. Another similar device is the Rokid Air. This MR is the last part of getting a virtual desktop. For devices that do need a distorted image, I guess an additional shader like this could be added somewhere in GNOME's rendering pipeline. But I'm pretty far from that 3D stuff and don't have knowledge of how it works. However, I believe it's possible. Summarizing: we need to try to get the rotational data by |
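For devices that do need correction, the per-point math behind such a shader is usually a radial "barrel" distortion. A hedged sketch of that model (the coefficients k1/k2 are illustration values; real HMD drivers take them from the device profile):

```javascript
// Radial (barrel) distortion of a point in lens-centered normalized
// coordinates: r' = r * (1 + k1*r^2 + k2*r^4). Classic polynomial model,
// shown only to illustrate the kind of correction a shader would apply.
function barrelDistort(x, y, k1, k2) {
  const r2 = x * x + y * y;                 // squared distance from lens center
  const scale = 1 + k1 * r2 + k2 * r2 * r2; // radial scale factor
  return [x * scale, y * scale];
}
```

With k1 = k2 = 0 the mapping is the identity, and the lens center (0, 0) is always a fixed point, which matches the "flat image stays flat" case of displays that need no correction.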
That's very true! And I think I may get my hands on an HTC Vive Pro (a colleague mentioned that there may be one somewhere around) and then I can at least test the rotation input. What do you imagine the experience to be like in the end?
I think this would boil down to this: Everything stays as it is in the I am pretty sure that Mutter has a pretty strict concept of the active workspace. Sending input to multiple workspaces will be hard, but maybe we can silently change the active workspace depending on the mouse pointer location. Is this somehow what you imagine? |
Nice!
I agree. Several commits ago, I added that feature. It works, but only while the desktop drag gesture is active. For some reason, retrieving rotation angles stops immediately after the gesture finishes. I still can't find a workaround for that. Now, switching between desktop mode and overview mode only has the effect of reorganizing windows for their previews.
Yep. Several commits ago I added that too. E.g., moving the workspaces that are in front is still possible by mouse scrolling, while it is possible to observe them at the same time by head rotation.
You have described it exactly how I tried to do it. And, I guess, it is more or less implemented like this. At least, I tried only to add the retrieved rotation wherever possible without changing existing behaviour.
It is a little unclear. Do you mean the workspaces could be hidden in desktop mode when the drag gesture is not active? Currently, all workspaces can be observed by head rotation while the desktop drag gesture is active.
I don't think there is a need to send input to multiple workspaces. I guess it would be enough to change them by mouse location, as you proposed. Or, another variant: by activating the workspace that is most in the user's view. E.g., activate the first workspace if the user is looking at it, activate the second if they are looking at the second, and so on.
Awesome, you described it in as much detail as I imagine it. |
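The "activate the workspace the user is looking at" idea above reduces to mapping the head's yaw angle to a workspace index. A small sketch under assumed names (angles in degrees, workspaces assumed to be spread evenly in front of the user):

```javascript
// Map head yaw (degrees, 0 = looking at the first workspace) to the index of
// the workspace the user is most likely looking at. `spacingDeg` is the
// angular distance between adjacent workspaces. Names are illustrative.
function workspaceForYaw(yawDeg, workspaceCount, spacingDeg) {
  const index = Math.round(yawDeg / spacingDeg);
  // Clamp so extreme head rotation sticks to the first/last workspace.
  return Math.max(0, Math.min(workspaceCount - 1, index));
}
```

The active workspace would then only be switched when the returned index changes, to avoid flapping right at the boundary between two workspaces.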
(force-pushed 46c4f7e to 5ff26cf)
Good news: I could check window interaction in desktop mode. It works. The problem is that the cursor position doesn't match the actual point where the input happens. This can be seen when I draw an image in GIMP. Another new issue: after activating the gesture that enables observing in desktop mode, it is impossible to move windows. They just get stuck, and newly opened windows don't show up. |
(force-pushed 5ff26cf to b2ae021)
Just a little update: window movement in desktop mode is fixed, as well as window creation. The usual workflow of opening windows, dragging them, closing them, and moving them between spatial workspaces works well enough now. The last issue is the cursor position. As I mentioned previously, the visible cursor is displayed in display coordinates, and the actual input comes from the same offsets on the workspace, but the workspace is distorted by the 3D view, so it is difficult to guess where the cursor input will land. I'm thinking about adding something like a laser beam to point at the right position. |
(force-pushed b2ae021 to 441ad61)
I added the laser beam via a shader effect to indicate the mouse input location. The mouse is more usable now. It doesn't point precisely at the mouse input location, because I didn't implement point projection. First, I didn't find out whether a vertex shader can be used within a GNOME extension. Second, maybe for a single point the projection could be done on the CPU. I am not sure, since I haven't had to deal with 3D stuff before. At this point, I am ready to start looking for a replacement for the outdated OpenHMD library. Since you @Schneegans mentioned 'gxr', I will look into it. |
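The CPU-side projection mentioned above is indeed cheap for a single point. A hedged sketch of a standard perspective projection (all names hypothetical, a symmetric frustum assumed, not the extension's actual code):

```javascript
// Project a camera-space point onto screen pixels with a symmetric
// perspective frustum. fovY is the vertical field of view in radians;
// the camera looks down the -Z axis. Purely illustrative.
function projectPoint([x, y, z], fovY, screenW, screenH) {
  const f = 1 / Math.tan(fovY / 2);     // focal length factor
  const aspect = screenW / screenH;
  const ndcX = (f / aspect) * (x / -z); // normalized device coordinates
  const ndcY = f * (y / -z);
  return [
    (ndcX * 0.5 + 0.5) * screenW,       // pixel X
    (1 - (ndcY * 0.5 + 0.5)) * screenH, // pixel Y, flipped: screen Y grows down
  ];
}
```

Running this once per frame for the cursor point would let the beam land exactly where the input actually goes, without needing a vertex shader at all.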
🪲 Fix opacity error
(force-pushed 441ad61 to ef87d45)
I added an option to select OpenXR as a backend for communication with the HMD. So now it is possible to use the Monado runtime and the GXR library. The original GXR library needs to be extended with GObject-Introspection annotations to be usable from GJS. Here is my branch with those annotations. Additionally, to import GXR into a GNOME Shell extension, all 'Gdk-3.0' dependencies of the GIR targets needed to be changed to 'Gdk-4.0', at least for 'gxr' itself and 'gulkan'. With this addition of the OpenXR variant, a new issue appeared. As I expected, creating a GXR context opens Monado's window for drawing content into it. That window is not needed, but it uselessly stays open. Though, it does not affect head tracking. While all of Monado's HMDs are now expected to be usable, it still makes no sense for displays that do not show their image flat and that need additional image corrections. However, I already added a shader here for drawing the cursor pointer beam, so an additional one could be added for distortion correction too. |
Sorry for the long silence! Thanks for your continued work on this! By now I have access to an HTC Vive, so I will be able to test this. I already see the device rotation as reported by OpenHMD and the inside-out cube also rotates accordingly. However, I obviously see the desktop still on my monitor, not inside the HMD. I think the Vive cannot easily act as a secondary display. Or can it? Then we would need the proper distortion shader. The Gxr backend did not work out-of-the-box for me. I think I still have some issues with the paths. At least GJS still complains that it does not find the typelib files. But I guess I'll figure this out. But I am still confused about the scope of gxr and OpenHMD. Can we somehow render directly to the HMD or will it only work if it is attached as a display? |
Well, if I understand correctly, all Monado-supported devices are some kind of external display. However, most of those devices need a side-by-side stereoscopic image and lens distortion correction, so those HMDs don't show an image on their displays by default until some 'driver' has prepared the necessary image modifications and switched the display on. This is what I think Monado does: it shows a full-screen SBS distorted image on the related display and enables the screen. So, I guess if you get Monado working, GXR will enable the screen in your HMD. The only additional thing I expect will be needed is to avoid showing Monado's full-screen window on the HMD's display and instead show what the inside-out cube shows now. However, we will still need distortion correction and, of course, SBS. I think at this point, the best next moves are to find ways of adding SBS, adding a distortion correction shader, and adding FOV adjustment. However, first we need to be sure that Monado's HMDs can act as an external display. Since I have nothing more than my Nreal Air glasses, I cannot check that, but I think you can. Maybe, while you check the assumption about using HMDs as an external display, I could start with the FOV adjustment. For my glasses, I don't need SBS or distortion correction and can skip them, but a correct FOV is important for making GNOME's workspaces feel spatially static. |
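On the FOV adjustment point: making workspaces feel spatially static means matching the rendered field of view to the glasses' real angular size. One way to frame it is computing the camera distance at which a plane of a given height exactly fills a given vertical FOV (a sketch with assumed units, not the extension's actual code):

```javascript
// Distance at which a plane of height `planeHeight` exactly fills a vertical
// field of view of `fovY` radians:
//   tan(fovY / 2) = (planeHeight / 2) / d   =>   d = (planeHeight / 2) / tan(fovY / 2)
// If the rendered FOV matches the HMD's physical FOV, the plane appears
// world-fixed as the head rotates; a mismatch makes it seem to "swim".
function distanceForFov(planeHeight, fovY) {
  return (planeHeight / 2) / Math.tan(fovY / 2);
}
```

An FOV slider in the settings would then effectively move the virtual camera along this distance until the workspaces stop drifting against head motion.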
Introduce VR Cube mode! 🥳
Turn nostalgic uselessness into modern usefulness 🚀
What is it?
It is a cool new feature that turns Desktop Cube's cube inside out to get the view from inside the cube, and attaches the cube's rotation angles to an HMD's rotation angles.
How does it work?
To get it to work, two additional libraries are needed:
That's all.
Get started
I have personally tested it on Xreal Air glasses, but it could probably work with any HMD that OpenHMD supports.
Known issues and limitations
Further ideas