
🎉 Introduce VR Cube #152

Draft · wants to merge 12 commits into main

Conversation

@ila-embsys commented Feb 11, 2024

Introduce VR Cube mode! 🥳

Turn nostalgic uselessness into modern usefulness 🚀

[Screenshot: Cube VR]
[Screenshot: Cube VR settings]

What is it?

It is a cool new feature that turns the Desktop Cube's cube inside out, so you get the view from inside the cube, and binds the cube's rotation angles to the HMD's rotation angles.

How does it work?

To get it to work, two additional libraries are needed:

  • The first is OpenHMD, a small library that provides the HMD's rotational data.
  • The second is GOpenHMD, a GObject wrapper around the first. It exposes the HMD's rotation data to the GNOME extension via GObject Introspection and GJS.

That's all.

Get started

  • To install GOpenHMD, try the prebuilt package from the Open Build Service: gopenhmd.
  • If you also need OpenHMD, try this prebuilt package: OpenHMD. It also includes patches for Nreal Air support.

I have personally tested it on Xreal Air glasses, but it should work with any HMD that OpenHMD supports.

Known issues and limitations

  • Three workspaces must be set in the settings.
  • There is no lens distortion correction, so it only works on glasses like the Nreal Air that display a flat, undistorted image.
  • There is no actual 3D, because side-by-side (SBS) rendering is not implemented.
  • Pitch and yaw are tracked, but not roll, so don't tilt your head.
  • FOV values are not retrieved from the driver, so the visible FOV of the virtual image currently mismatches the real one.
  • The rotation maths is poor, so some horizontal rotations behave strangely.
  • I have no idea how exactly I turned the cube inside out, so sometimes it is not a cube at all. This behaviour is visible in 'show apps'.
  • The Nreal Air driver that was ported to OpenHMD from Monado has a bug that crashes anything using the driver after two connections, so use it with caution. It affects the whole display manager.
  • Dynamic loading of the GOpenHMD library from JS code is not implemented, so the extension breaks for those who don't need VR mode.
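The pitch-and-yaw-without-roll behaviour can be sketched in plain JavaScript (a minimal illustration; the quaternion convention and axis mapping are my assumptions, not the extension's actual code):

```javascript
// Sketch: extract pitch and yaw from an orientation quaternion and
// deliberately drop roll, as the current VR Cube mode does.
// The (x, y, z, w) convention and axis mapping are assumptions.
function pitchYawFromQuaternion(q) {
  // Pitch: rotation around the X axis.
  const sinPitch = 2 * (q.w * q.x - q.y * q.z);
  const pitch = Math.asin(Math.max(-1, Math.min(1, sinPitch)));
  // Yaw: rotation around the Y (up) axis.
  const yaw = Math.atan2(2 * (q.w * q.y + q.x * q.z),
                         1 - 2 * (q.x * q.x + q.y * q.y));
  return { pitch, yaw }; // roll is ignored, so head tilt has no effect
}

// Identity orientation: looking straight ahead, pitch = yaw = 0.
console.log(pitchYawFromQuaternion({ x: 0, y: 0, z: 0, w: 1 }));
// A 90° turn around the up axis: yaw = π/2, pitch = 0.
const s = Math.SQRT1_2;
console.log(pitchYawFromQuaternion({ x: 0, y: s, z: 0, w: s }));
```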

Further ideas

  • A follow mode for when GNOME Shell is not in overview mode: when you try to turn away from a workspace, the workspace slides back to the position in front of your eyes.
  • Workspace zooming: the image inside the glasses is still not ideal, and small text is hard to read.

@alesya-h

@ila-embsys what do you mean by "there is no actual 3d due to no SBS"? SBS is toggled on the Nreal Air (in the latest firmware) by, as far as I remember, holding one of the buttons for about 3 or 5 seconds. Or do you mean you didn't implement it yet?

@ila-embsys
Author

Or do you mean you didn't implement it yet?

I didn't implement it, so there is no need to switch the glasses to SBS mode.

@Schneegans
Owner

Hey there, thanks a ton for this awesome initiative! And sorry for the late reply; I am currently quite busy as we are moving to a different city. However, in some days I'll be back at work where I will have access to several AR and VR devices. Hopefully I'll get the chance to test this on actual hardware!

How do you currently initiate the head tracking? Is it always active in overview mode? Or is it also active in normal desktop mode?

@ila-embsys
Author

ila-embsys commented Feb 13, 2024

I am currently quite busy as we are moving to a different city.

Good luck with that!

How do you currently initiate the head tracking? Is it always active in overview mode? Or is it also active in normal desktop mode?

Currently, head tracking is active while the 'Enable VR' switch in the settings is toggled on. So tracking is active in normal desktop mode too, but I don't apply the rotation data there due to the bad rotation maths. Attempts to change the maths sometimes lose the workspaces somewhere in space, with no way to step back and disable the extension. Maybe I just don't know how to debug extensions properly; currently, I just run it and look at what changed.

This whole approach raises some questions: for example, it is unknown whether interaction is even possible while workspaces are moving in desktop mode. Even if it is technically possible, it is unclear whether interaction would be comfortable, because the cursor is still in display coordinates. However, I discovered that dragging windows between workspaces in overview mode by head movement rather than by mouse movement works quite well.

I decided to release this draft at an early stage to gather comments on the approach and maybe attract community power to bring it to completion.

@ila-embsys
Author

Update:

  • Rotations now go in the right directions and workspaces are in the right order, e.g. the second and third workspaces are on the right, not on the left.
  • Scrolling workspaces with the mouse wheel is restored, and the active workspace now moves to the front. A workspace can also be activated and moved to the front by a mouse click. It feels a little unusual, probably because the movement is too fast, but it is better than sitting with a turned head to look at workspaces to the sides. Workspaces can now be selected and moved to the front by looking at them in overview mode and clicking.
  • Rotation works in desktop mode too, but is still activated by a desktop-drag or panel-drag gesture.

@Schneegans
Owner

Hi there! I just wanted to set this up but I am struggling a bit in getting the dependencies right - I am on Ubuntu 23.10 currently.

Your prebuilt package for OpenHMD does not exist for Ubuntu, right? Therefore I installed the one from the Ubuntu repositories. This installs version 0.3.0-1 of /usr/lib/x86_64-linux-gnu/libopenhmd.so.

I then proceeded to install your prebuilt GOpenHMD wrapper. This ships /usr/share/gir-1.0/GOpenHMD-1.0.gir but no compiled typelib file. I manually created this with g-ir-compiler --output GOpenHMD-1.0.typelib /usr/share/gir-1.0/GOpenHMD-1.0.gir. This seems to work, I can now import GOpenHMD in gjs.

However, when I try to open the prefs of Desktop Cube, I get the error Could not locate gopen_hmd_version: 'gopen_hmd_version': /usr/bin/gjs: undefined symbol: gopen_hmd_version. So maybe there is some version mismatch? Or maybe the library has the wrong name? I do not know how the dynamic loading of the shared objects works here...

Do you have an idea?

@ila-embsys
Author

I'm glad to see your interest and your progress setting things up.

Your prebuilt package for OpenHMD does not exist for Ubuntu, right?

It exists, but the Open Build Service generates wrong links. The right one for Debian/Ubuntu is that link. The repository's binaries can also be browsed on the Repositories tab.

However, that package just contains additional patches for Nreal Air support, so the package from the native Ubuntu repo should be fine too.

I then proceeded to install your prebuilt GOpenHMD wrapper. This ships /usr/share/gir-1.0/GOpenHMD-1.0.gir but no compiled typelib file.

That is my mistake. I have now fixed the package's build script, and the 'typelib' file is added. However, I still haven't tested the package, because it is not for my main system.

I get the error Could not locate gopen_hmd_version: 'gopen_hmd_version': /usr/bin/gjs: undefined symbol: gopen_hmd_version

I'm not very familiar with GObject and related things. I guess that in your case the generated 'typelib' loaded, but then it could not find the symbols in the related C library.

I will set up an Ubuntu system and try to investigate the 'typelib' issue.

@ila-embsys
Author

ila-embsys commented Mar 16, 2024

I've set up a fresh Ubuntu 23.10, installed libopenhmd0 from the native repo and the new prebuilt gopenhmd package.
Then I checked the setup with these commands:

$ gjs
gjs> const GOpenHMD = imports.gi.GOpenHMD;
gjs> GOpenHMD.version().to_string()
"0.3.0"

I guess the fix I mentioned before, which adds the 'typelib', helped.

@ila-embsys
Author

The extension crash at the end of the desktop gesture is fixed. Looking around the cube in desktop mode now works stably, but it still stops immediately after the mouse button is released.

Any ideas on how to stop abusing the gesture for this?

@Schneegans
Owner

Hi again, sorry for the lack of responses from my side. I just realized that we currently only have a bazillion Meta Quests at work but no OpenHMD-compatible HMDs. This will make testing this a bit tricky! But I still hope that I can find a compatible HMD somewhere...

Anyways, your updated packages do work now and I can finally enable HMD mode. I'll look into your implementation and see how the interaction could be improved. Yet it will be difficult without compatible hardware.

I am not really up-to-date with Linux support for various HMDs, runtimes, and libraries. However it seems that OpenHMD is somewhat deprecated. If we really want to make this a stable part of the Desktop Cube extension, wouldn't it be better to use a properly maintained library? (if there is any 😄). This field is really confusing but it seems that Monado is the follow-up project, isn't it? There is also gxr which seems to be a GLib wrapper around OpenVR and OpenXR APIs and thus could be compatible with Monado. I think it should be possible to create GJS bindings for gxr. Maybe we could then even use the SteamVR runtime which could bring support for even more hardware. What do you think? Have you considered alternatives to OpenHMD?

@Schneegans
Owner

A quick follow-up: I just compiled gxr from source and created corresponding typelib files. I can import them in GJS, but the lib is missing GObject introspection annotations. I just asked for them but it seems we would have to add them by ourselves. I do not have much experience there, but maybe it is not too difficult to add them.

But still, I am not sure if I understood the relationship of OpenVR, OpenXR, OpenHMD, Monado, etc right. Would GJS types for gxr give us easy access to HMD rotation data? Also, most HMDs I have worked with require distorted images. This would most likely be beyond the scope of the Desktop Cube extension.

The Nreal Air is simply detected as an external display and can display the video signal properly on a 2D plane in 3D space without any prior distortion?

@ila-embsys
Author

ila-embsys commented Mar 31, 2024

updated packages do work now and I can finally enable HMD mode

These are my first steps in packaging, so I am happy to hear the packages work correctly.

no OpenHMD compatible HMDs. This will make testing this a bit tricky!

It's sad that you can't see the result.

However it seems that OpenHMD is somewhat deprecated.

Well, it is a real problem. If I had known the library would be deprecated, I would have chosen another one. When it happened, I was in the middle of the whole work.

This field is really confusing but it seems that Monado is the follow-up project, isn't it? There is also gxr which seems to be a GLib wrapper around OpenVR and OpenXR APIs and thus could be compatible with Monado.

Sure, I guess Monado is now at the centre of the whole VR stack, and gxr definitely looks good if it can give us access to IMU data as OpenHMD does.

Have you considered alternatives to OpenHMD?

OpenHMD was chosen because, compared to Monado, it is super simple, doesn't have a lot of dependencies, and only provides access to rotational data. It also doesn't bring an unnecessary 3D engine and related stuff. Since Mutter has its own Cogl-based rendering, that choice looked like a good idea.

But still, I am not sure if I understood the relationship of OpenVR, OpenXR, OpenHMD, Monado, etc right. Would GJS types for gxr give us easy access to HMD rotation data?

That is a subject for research. I've worked with Monado a little, but I don't know whether it can be used without its 3D engine. If so, gxr will fit. I will try to investigate it.

Also, most HMDs I have worked with require distorted images. This would most likely be beyond the scope of the Desktop Cube extension. The Nreal Air is simply detected as an external display and can display the video signal properly on a 2D plane in 3D space without any prior distortion?

Exactly. The Nreal Air is just a display attached via DisplayPort over a USB Type-C cable. It doesn't need any additional distortion in software: a flat image stays flat. I bought them specifically for a virtual desktop experience. In my opinion, they are the best fit for that: high resolution, good pixel density, simple, light, cheap, and nothing redundant. Another similar device is the Rokid Air.

This MR is the last part needed for a virtual desktop. For devices that need a distorted image, I guess an additional shader like this could be added somewhere in GNOME's rendering pipeline. But I'm pretty far from that 3D stuff and don't know how it works. However, I believe it's possible.

Summarizing: we need to try to get rotational data via gxr if we want support for more devices, and be ready to attempt image distortion. But first, I think it is still important to check whether interaction with windows is possible in desktop mode. Otherwise, all of this may be in vain.

@Schneegans
Owner

Schneegans commented Apr 1, 2024

But first, I think still important to check if interaction with windows is possible in desktop mode. Otherwise, maybe all that in vain.

That's very true! And I think I may get my hands on an HTC Vive Pro (a colleague mentioned that there may be one somewhere around) and then I can at least test the rotation input.

How do you imagine the experience to be like in the end?

  • I guess it would be cool if you always had the possibility to look around to the other workspaces. No need to explicitly enter the workspace-switch mode.
  • Manual rotation of the inside-out cube should still be possible for reorienting your head.

I think this would boil down to this: Everything stays as it is in the main branch, only in VR mode, an additional rotation is applied to the camera which comes from the head tracking. And we would need to ensure that the other workspaces are shown during normal interaction with the desktop. This will be the tricky part.
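That "additional rotation applied to the camera" boils down to a quaternion composition, roughly like this (a sketch in plain JavaScript with assumed names and composition order, not the extension's real code):

```javascript
// Sketch: compose the manual cube rotation with the head-tracking
// rotation via quaternion multiplication (Hamilton product). The
// variable names and composition order are assumptions.
function qMul(a, b) {
  return {
    w: a.w * b.w - a.x * b.x - a.y * b.y - a.z * b.z,
    x: a.w * b.x + a.x * b.w + a.y * b.z - a.z * b.y,
    y: a.w * b.y - a.x * b.z + a.y * b.w + a.z * b.x,
    z: a.w * b.z + a.x * b.y - a.y * b.x + a.z * b.w,
  };
}

function qFromYaw(angle) { // rotation around the up axis
  return { w: Math.cos(angle / 2), x: 0, y: Math.sin(angle / 2), z: 0 };
}

// Manual rotation from scrolling, plus the current HMD orientation:
const manual = qFromYaw(Math.PI / 2);       // one workspace to the side
const headTracking = qFromYaw(Math.PI / 4); // head turned 45°
const camera = qMul(manual, headTracking);  // combined camera orientation
console.log(camera);
```

Since quaternion multiplication is not commutative, the order of `manual` and `headTracking` determines whether head tracking is applied on top of the manually rotated cube or the other way around.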

I am pretty sure that Mutter has a pretty strict concept of the active workspace. Sending input to multiple workspaces will be hard, but maybe we can silently change the active workspace depending on the mouse pointer location.

Is this somehow what you imagine?

@ila-embsys
Author

I think I may get my hands on an HTC Vive Pro

Nice!

I guess it would be cool if you always had the possibility to look around to the other workspaces. No need to explicitly enter the workspace-switch mode.

I agree. Several commits ago, I added that feature. It works, but only while the desktop drag gesture is active. For some reason, retrieving rotation angles stops immediately after the gesture finishes. I still can't find a workaround for that. Now, switching between desktop mode and overview mode only has the effect of rearranging the windows for their preview.

Manual rotation of the inside-out cube should still be possible for reorienting your head.

Yep. Several commits ago I added that too. E.g., moving the workspaces in front is still possible with mouse scrolling, while it is possible to observe them at the same time by head rotation.

An additional rotation is applied to the camera which comes from the head tracking.

You have described it exactly as I tried to do it. And, I guess, it is more or less implemented like this. At least, I tried to only add the retrieved rotation wherever possible, without changing existing behaviour.

And we would need to ensure that the other workspaces are shown during normal interaction with the desktop.

This is a little unclear. Do you mean the workspaces could be hidden in desktop mode when the drag gesture is not active? Currently, all workspaces can be observed by head rotation while the desktop drag gesture is active.

Sending input to multiple workspaces will be hard, but maybe we can silently change the active workspace depending on the mouse pointer location.

I don't think we need to send input to multiple workspaces. I guess it would be enough to change them by mouse location, as you proposed. Or, as another variant, by activating the workspace that is most under the user's gaze, e.g. activate the first workspace if the user is looking at it, the second if they are looking at the second, and so on.
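The "activate the workspace under the user's gaze" idea could be sketched like this (a hedged illustration; the even angular layout and workspace numbering are my assumptions):

```javascript
// Sketch: pick the workspace "most under the user's gaze" by dividing
// the horizontal circle evenly between N workspaces. The layout
// (workspace 0 straight ahead, neighbours every 2π/N radians) is an
// assumption for illustration.
function workspaceUnderGaze(yaw, workspaceCount) {
  const step = (2 * Math.PI) / workspaceCount;
  // Normalize yaw into [0, 2π) and snap to the nearest workspace centre.
  const normalized = ((yaw % (2 * Math.PI)) + 2 * Math.PI) % (2 * Math.PI);
  return Math.round(normalized / step) % workspaceCount;
}

console.log(workspaceUnderGaze(0, 3));                // ahead → 0
console.log(workspaceUnderGaze(2 * Math.PI / 3, 3));  // → 1
console.log(workspaceUnderGaze(-2 * Math.PI / 3, 3)); // → 2
```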

Is this somehow what you imagine?

Awesome, you described it in detail exactly as I imagine it.

@ila-embsys
Author

Good news: I was able to check window interaction in desktop mode. It works. The problem is that the cursor position doesn't match the actual point where the input lands. This can be seen when I draw an image in GIMP.

Another new issue: after activating the gesture that enables looking around in desktop mode, it is impossible to move windows. They just get stuck, and newly opened windows don't show up.

Preview: Switching between desktop and overview modes and dragging windows between workspaces

[Video: output2]

Preview: Interacting with windows in desktop mode

[Video: output3]

@ila-embsys
Author

Just a little update: moving windows in desktop mode is fixed, as is window creation. The usual workflow of opening windows, dragging them, closing them, and moving them between spatial workspaces works well enough now.

The last issue is the cursor position. As I mentioned previously, the visible cursor is drawn in display coordinates, while the actual input is taken from the same offsets on the workspace; but the workspace is distorted by the 3D view, so it is difficult to guess where the cursor input will land.

I'm thinking about adding something like a laser beam to point at the right position.

@ila-embsys
Author

ila-embsys commented Aug 23, 2024

I added the laser beam as a shader effect to point at the mouse input location. The mouse is more usable now.

It doesn't point precisely at the mouse input location, because I didn't implement point projection. First, I couldn't find out whether a vertex shader can be used within a GNOME extension. Second, for a single point, the projection could maybe be done on the CPU. I am not sure, since I haven't had to deal with 3D stuff before.
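For a single point, the CPU-side projection could look roughly like the following plain-JavaScript sketch (the pinhole camera model and its parameters are assumptions for illustration, not the extension's code):

```javascript
// Sketch: project one 3D point (e.g. the cursor on the rotated
// workspace plane) to 2D screen coordinates on the CPU. A simple
// pinhole camera with a given vertical FOV is assumed.
function projectPoint(p, fovY, width, height) {
  const f = 1 / Math.tan(fovY / 2); // focal length from vertical FOV
  const aspect = width / height;
  // Perspective divide; the camera looks down -Z, so the point must
  // have a negative z to be in front of it.
  const xNdc = (f / aspect) * p.x / -p.z;
  const yNdc = f * p.y / -p.z;
  return {
    x: (xNdc * 0.5 + 0.5) * width,        // NDC [-1, 1] → pixels
    y: (1 - (yNdc * 0.5 + 0.5)) * height, // flip Y for screen coords
  };
}

// A point straight ahead projects to the screen centre:
console.log(projectPoint({ x: 0, y: 0, z: -2 }, Math.PI / 2, 1920, 1080));
// → { x: 960, y: 540 }
```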

At this point, I am ready to make some attempts to find a replacement for the outdated OpenHMD library. Since you mentioned 'gxr', @Schneegans, I am going to look into it.

@ila-embsys
Author

I added an option to select OpenXR as the backend for communicating with the HMD. So now it is possible to use the Monado runtime and the GXR library.

The original GXR library needs to be extended with GObject Introspection annotations to be usable from GJS. Here is my branch with those annotations.

Additionally, to import GXR into the GNOME Shell extension, all 'Gdk-3.0' dependencies of the GIR targets needed to be changed to 'Gdk-4.0', at least for 'gxr' itself and 'gulkan'.

With the addition of the OpenXR variant comes a new issue. As I expected, creating a GXR context opens Monado's window for drawing content into. That window is not needed, but it uselessly stays open. It does not affect head tracking, though.

While all of Monado's HMDs are now expected to be usable, it still makes no sense for displays that do not show their image flat and that need additional image corrections. However, I already added a shader here for drawing the cursor pointer beam, so an additional one could be added for distortion correction too.

@Schneegans
Owner

Sorry for the long silence! Thanks for your continued work on this! By now I have access to an HTC Vive, so I will be able to test this.

I already see the device rotation as reported by OpenHMD and the inside-out cube also rotates accordingly. However, I obviously see the desktop still on my monitor, not inside the HMD. I think the Vive cannot easily act as a secondary display. Or can it? Then we would need the proper distortion shader.

The Gxr backend did not work out-of-the-box for me. I think I still have some issues with the paths. At least GJS still complains that it does not find the typelib files. But I guess I'll figure this out.

But I am still confused about the scope of gxr and OpenHMD. Can we somehow render directly to the HMD or will it only work if it is attached as a display?

@ila-embsys
Author

Well, if I understand correctly, all Monado-supported devices are some kind of external display. However, most of those devices need a side-by-side stereoscopic image and lens distortion correction, so those HMDs don't show an image on their displays by default until some 'driver' has prepared the necessary image modifications and switched the display on.

This is what I think Monado does: it shows a full-screen SBS distorted image on the related display and enables the screen. So I guess if you get Monado working, GXR will enable the screen in your HMD. The only additional thing I expect will be needed is to avoid showing Monado's full-screen window on the HMD's display and instead show what the inside-out cube shows now. However, we will still need distortion correction and, of course, SBS.

I think at this point the best next moves will be to find ways to add SBS, a distortion correction shader, and FOV adjustment. However, first we need to make sure that Monado's HMDs can act as an external display. Since I have nothing more than my Nreal Air glasses, I cannot check that, but I think you can.

Maybe, while you check the assumption about using HMDs as an external display, I could start with FOV adjustment. For my glasses, I don't need SBS or distortion correction and can skip them, but a correct FOV is important for the GNOME workspaces to feel spatially static.
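For reference, a distortion-correction shader for lens-bound HMDs typically evaluates a polynomial radial model per pixel. A minimal sketch in plain JavaScript (the coefficients are made up for illustration; real values would come from the headset's lens calibration, and this is not Monado's actual implementation):

```javascript
// Sketch: a common polynomial radial (barrel) distortion model of the
// kind a distortion-correction shader would evaluate per pixel.
// k1 and k2 are made-up example coefficients.
function distort(u, v, k1, k2) {
  // (u, v) are lens-centred coordinates in [-1, 1].
  const r2 = u * u + v * v;
  const scale = 1 + k1 * r2 + k2 * r2 * r2;
  return { u: u * scale, v: v * scale };
}

// The centre of the lens is unaffected...
console.log(distort(0, 0, 0.22, 0.24)); // → { u: 0, v: 0 }
// ...while points near the edge are pushed outwards to pre-compensate
// the lens squeezing them inwards:
console.log(distort(0.8, 0, 0.22, 0.24));
```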
