
Build and Run OpenXR in O3DE


OpenXR is a royalty-free, open standard that provides high-performance access to Augmented Reality (AR) and Virtual Reality (VR)—collectively known as XR—platforms and devices. OpenXR can be seen as a mechanism for interacting with VR/AR systems in a platform-agnostic way.

The XR and OpenXRVk gems integrate the OpenXR library into O3DE. The XR gem is an abstraction layer for any XR platform, and the OpenXRVk gem provides an OpenXR implementation that uses Vulkan as the render backend.

IMPORTANT

The XR and OpenXRVk gems are in an early stage of development and are considered a proof of concept. Features, workflows, APIs, and code are subject to change.

Features

  • O3DE supports running OpenXR in two ways:
    • Running on PC and streaming to the OpenXR device over a tethered (Link) connection.
    • Running natively on the OpenXR device.
  • O3DE Editor's game mode running on an OpenXR device (tethered connection only).
  • O3DE game launcher running on an OpenXR device.
  • Support for Meta Quest 2.
  • Head & Hand pose information.
  • Controller input state.
  • Haptic Response.

System requirements

How to set up the environment

  • Set up o3de from GitHub and sync to the development branch. For details refer to Setting up O3DE from GitHub.
  • Set up o3de-extras from GitHub and sync to the development branch. For details refer to O3DE-Extras readme.
  • Register the XR and OpenXRVk gems with O3DE:
    >scripts\o3de.bat register -gp <your path to o3de-extras>/Gems/XR
    >scripts\o3de.bat register -gp <your path to o3de-extras>/Gems/OpenXRVk
    

How to enable XR gems

Once the gems are registered with O3DE, you can enable the XR and OpenXRVk gems in your project as usual, with the Project Manager or the command line (see the example below). For details refer to Adding and Removing Gems.
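
For example, to enable the gems from the command line (a minimal sketch, assuming the o3de CLI's enable-gem command with its gem-name and project-path options; adjust the paths and names to your setup):

>scripts\o3de.bat enable-gem -gn XR -pp <your project path>
>scripts\o3de.bat enable-gem -gn OpenXRVk -pp <your project path>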

The OpenXRTest project is already set up with the XR and OpenXRVk gems enabled and provides test levels (DefaultLevel and XR_Office).

How to build and run OpenXR using tethered connection

Build steps

  • Build O3DE on a Windows PC with your project as usual (a typical invocation is sketched below). For details refer to the User Guide Build page.
  • Run Asset Processor and wait for all the assets to be processed.
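
A minimal sketch of a typical profile build run from the project folder, assuming a Visual Studio 2019 generator and a project named MyProject (both are assumptions; use the generator, paths, and targets that match your setup):

>cmake -B build/windows -S . -G "Visual Studio 16 2019"
>cmake --build build/windows --target MyProject.GameLauncher Editor --config profile -- /m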

Running OpenXR with the Editor

  • Connect the Meta Quest 2 to the PC and launch Quest Link.
  • From the command line, run the Editor with the following options: <project-build-path>/bin/profile> Editor.exe -rhi=vulkan -openxr=enable
  • Open a level.
  • Enter game mode by pressing Ctrl+G.
  • The level will be rendered on both the PC and the Meta Quest 2.
  • Pressing ESC exits game mode and stops rendering on the Meta Quest 2.

Running OpenXR with Game launcher

  • Connect the Meta Quest 2 to the PC and launch Quest Link.
  • From the command line, run the Game Launcher with the following options: <project-build-path>/bin/profile> <ProjectName>.GameLauncher.exe -rhi=vulkan -openxr=enable

    Note: Make sure to specify an initial level to load (see the sketch after this list).

  • The level will be rendered on both the PC and the Meta Quest 2.
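
One way to specify the initial level is to pass the LoadLevel console command on the launcher command line. This is a hedged sketch, assuming the launcher accepts console commands with the "+<command>" syntax and that a level named DefaultLevel exists (substitute your own level name):

<project-build-path>/bin/profile> <ProjectName>.GameLauncher.exe -rhi=vulkan -openxr=enable +LoadLevel DefaultLevel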

How to build and run OpenXR natively on Meta Quest 2

Build steps

  • Download the Oculus OpenXR Mobile SDK and unzip it inside the OpenXRVk gem, in the folder OpenXRVk\External\OculusOpenXRMobileSDK.
  • Make sure to specify an initial level to load.
  • Create the <project-path>/Registry/AssetProcessor.setreg file to generate Android assets in the cache:
{
    "Amazon": {
        "AssetProcessor": {
            "Settings": {
                "Platforms": {
                    "android": "enabled"
                }
            }
        }
    }
}
  • Create the <project-path>/Registry/OpenXR.setreg file to enable OpenXR. You can also set the ViewResolutionScale setting per platform to scale down the resolution and gain performance at the cost of image quality:
{
    "O3DE": {
        "Atom": {
            "OpenXR": {
                "Enable": true,
                "android_ViewResolutionScale": 0.75
            }
        }
    }
}
  • Run Asset Processor and wait for all the assets to be processed.
  • Connect Meta Quest 2 to the PC.
  • Build and deploy for Android following these steps, with the following alterations (see the sketch after this list):
    • Use your project instead of o3de-atom-sampleviewer.
    • When running the generate_android_project.py command, use LOOSE for %O3DE_ANDROID_ASSET_MODE% and add the --oculus-project option.

      Note: When targeting OpenXR devices other than Meta Quest 2, do not include the --oculus-project option.

    • When running the deploy_android.py command, use APK for %O3DE_ANDROID_DEPLOY_TYPE%.
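
A minimal sketch of those alterations, assuming the environment-variable driven workflow from the referenced Android steps (every other option, path, and SDK location comes from those instructions):

>set O3DE_ANDROID_ASSET_MODE=LOOSE
>set O3DE_ANDROID_DEPLOY_TYPE=APK

With these set, run generate_android_project.py as in the referenced steps, appending --oculus-project when targeting Meta Quest 2, and then run deploy_android.py as in the referenced steps.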

Running the app

  • Launch the app from the Meta Quest 2.

Head & controller input

When running a level using OpenXR, the headset's input controls the view of the camera. By default, the camera location remains stationary. To provide camera movement using the controllers, you can add the XR Camera Movement component to the camera's entity.

The XR Camera Movement component provides the following behavior to the camera:

  • The thumbstick of the left-hand controller strafes the camera sideways and also moves it forward and backward.
  • The A and B buttons of the right-hand controller move the camera down and up, respectively.
  • The movement speed and sensitivity can be adjusted by modifying the properties of the XR Camera Movement component.


There is also the option to visualize the location of the controllers in the world. Running the console command xr_DebugDrawInput true draws one axis in 3D space per controller. There is also a shortcut to toggle it: on the left-hand controller, press the Menu and Trigger buttons at the same time.

Known Issues/Limitations

  • 2D rendering (ImGui, LyShine) is not yet supported in VR.
  • The VR render pipeline is currently not optimized to run natively on Quest 2, so FPS can be low. You can use the setting O3DE/Atom/OpenXR/android_ViewResolutionScale to lower the resolution and increase the performance of your level. This render pipeline is also quite lean and mainly focuses on rendering meshes with simple lights and a PBR-based BRDF shading model. It currently does not have other features such as shadows, reflections, or various types of post processing (other than tonemapping). The plan is to gradually add cheaper, TBDR-hardware-optimized versions of these features based on our use case.