-
I think ultimately this is a good idea, but I'd rather focus on getting direct 1:1 APIs implemented first. Then we can identify commonality + start building abstractions (which should definitely go through the RFC process). Godot built a high-level XR interface, so we can probably learn from that.
Hmm, this is a tough call. I don't have an opinion off the top of my head. We should probably build a separate path first, then try to unify if possible. Ideally users don't have to build entirely new render logic when adding VR.
We'll probably need to change the graph structure according to whether or not XR is enabled (e.g. replace the window swapchain node with the XR swapchain node). This will definitely require some thought / design work and, like (1), ideally users shouldn't need to build entirely new render graphs to account for this.
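A minimal sketch of what that conditional wiring could look like. Note that `RenderGraph`, `WindowSwapChainNode`, and `XrSwapChainNode` here are simplified, hypothetical stand-ins for Bevy's real render-graph types; this only illustrates swapping the swapchain source while user render logic keeps targeting the same slot:

```rust
// Hypothetical, simplified stand-ins for Bevy's render-graph API;
// the real types differ, this only illustrates the conditional wiring.
use std::collections::HashMap;

trait Node {
    fn name(&self) -> &'static str;
}

struct WindowSwapChainNode;
impl Node for WindowSwapChainNode {
    fn name(&self) -> &'static str { "window_swap_chain" }
}

struct XrSwapChainNode;
impl Node for XrSwapChainNode {
    fn name(&self) -> &'static str { "xr_swap_chain" }
}

#[derive(Default)]
struct RenderGraph {
    nodes: HashMap<&'static str, Box<dyn Node>>,
}

impl RenderGraph {
    fn add_node(&mut self, slot: &'static str, node: Box<dyn Node>) {
        self.nodes.insert(slot, node);
    }
    fn node_name(&self, slot: &str) -> Option<&'static str> {
        self.nodes.get(slot).map(|n| n.name())
    }
}

/// Build the graph, picking the swapchain source based on whether XR is
/// enabled, so downstream nodes keep reading from "primary_swap_chain".
fn build_graph(xr_enabled: bool) -> RenderGraph {
    let mut graph = RenderGraph::default();
    if xr_enabled {
        graph.add_node("primary_swap_chain", Box::new(XrSwapChainNode));
    } else {
        graph.add_node("primary_swap_chain", Box::new(WindowSwapChainNode));
    }
    graph
}
```

The point of the single slot name is that user graphs wired against `"primary_swap_chain"` would not change when XR is toggled.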
I've already removed Window from
-
I'd personally like to see `bevy_arcore`.
-
Hi, I wanted to help with XR integration in bevy. @blaind is currently working on it, but I wanted to take a step back and discuss some details about how this integration should be implemented (I can't actively work on it right now, but this discussion can help any contributor).
First of all, we may want a unified system for handling VR, AR and XR devices. A single crate, `bevy_xr`, could be written as a unified interface over OpenXR, WebXR, ARCore and ARKit backends, but much work needs to be done in other bevy crates for a proper integration. For now I will discuss only the rendering code paths.
1. Should the cameras be handled by the existing camera node or by `bevy_xr`? If we want full control in the backend we may need a new `XR_CAMERA` node.
2. The primary swapchain is currently created from a window through `WindowSwapChainNode`, but OpenXR uses an offscreen ad-hoc swapchain. If XR mode is enabled, should the primary swapchain be a `XrSwapChainNode`, or should we add a new `XR_SWAP_CHAIN` node? A point in favor of `XR_SWAP_CHAIN` is that `bevy_ui` uses the `PRIMARY_SWAP_CHAIN` and expects it to be linked to a window.
3. `RenderResourceContext` requires a `Window` handle. Should we add `create_swap_chain_xr()` and `next_swap_chain_texture_xr()` methods, or should we abstract `Window` and a (new) `XrScreen` into a single (enum) type?

To house most of the XR logic I thought of adding a `bevy_xr` crate, which should handle input and should be renderer agnostic. The `bevy_xr` plugin struct can be used by `bevy_wgpu` to construct wgpu context objects and swapchains (`bevy_wgpu` will receive just the backend-specific handles).

About wgpu and gfx-hal: @blaind is working on wgpu and it seems on the right path. I already submitted a PR on gfx-hal for constructing objects from raw handles, but it still needs some tweaks to allow gfx-hal to take ownership of these handles (which should simplify the implementation of `bevy_xr`). Currently there is no support for multiview in gfx-hal and wgpu; this can be added later, and the bevy integration can take it into account for future support.
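To make the `Window` / `XrScreen` question concrete, here is one possible shape for the enum abstraction. All names here (`RenderSurface`, `XrScreen`, the method signatures) are hypothetical illustrations, not existing Bevy API:

```rust
// Hypothetical sketch of abstracting a window vs. an XR screen behind one
// enum, so swapchain creation can take a single surface type instead of
// growing parallel *_xr() variants. All names are illustrative.

/// Stand-in for Bevy's window handle.
struct Window { width: u32, height: u32 }

/// Stand-in for an XR display surface (e.g. one eye's render target).
struct XrScreen { width: u32, height: u32 }

/// The proposed unifying enum: swapchain code matches on this
/// instead of requiring a Window handle.
enum RenderSurface {
    Window(Window),
    Xr(XrScreen),
}

impl RenderSurface {
    /// Extent of the surface, regardless of the backing kind.
    fn extent(&self) -> (u32, u32) {
        match self {
            RenderSurface::Window(w) => (w.width, w.height),
            RenderSurface::Xr(s) => (s.width, s.height),
        }
    }
}

/// Sketch of a single create_swap_chain that could replace the
/// create_swap_chain / create_swap_chain_xr pair.
fn create_swap_chain(surface: &RenderSurface) -> String {
    let (w, h) = surface.extent();
    match surface {
        RenderSurface::Window(_) => format!("window swapchain {}x{}", w, h),
        RenderSurface::Xr(_) => format!("offscreen XR swapchain {}x{}", w, h),
    }
}
```

The trade-off is that every call site matching on the surface must handle both variants, whereas separate `*_xr()` methods keep the XR path fully isolated.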
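One way to read the proposed split is that `bevy_xr` owns the XR runtime session and only hands raw, backend-specific handles to `bevy_wgpu`, keeping the renderer free of OpenXR/ARCore types. A hypothetical sketch of that boundary (all names invented for illustration):

```rust
// Hypothetical boundary between a renderer-agnostic bevy_xr and bevy_wgpu:
// bevy_xr drives the XR runtime and exposes only opaque handles, so the
// renderer never depends on OpenXR/ARCore types directly.

/// Opaque, backend-specific handles that the renderer consumes.
/// In a real implementation these would be raw Vulkan/GL objects.
struct RawGraphicsHandles {
    instance: u64,
    device: u64,
    swapchain_images: Vec<u64>,
}

/// What bevy_xr would implement per backend (OpenXR, WebXR, ARCore, ...).
trait XrBackend {
    fn graphics_handles(&self) -> RawGraphicsHandles;
}

/// Dummy backend standing in for an OpenXR session.
struct FakeOpenXrBackend;

impl XrBackend for FakeOpenXrBackend {
    fn graphics_handles(&self) -> RawGraphicsHandles {
        RawGraphicsHandles {
            instance: 1,
            device: 2,
            swapchain_images: vec![10, 11, 12],
        }
    }
}

/// Stand-in for the renderer side: bevy_wgpu receives just the handles
/// and would construct its wgpu context objects from them.
fn init_renderer(backend: &dyn XrBackend) -> usize {
    let handles = backend.graphics_handles();
    // Construct wgpu context objects and swapchains from the raw
    // handles here; for the sketch we just report the image count.
    handles.swapchain_images.len()
}
```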