OpenXR doesn't support acceleration and angularAcceleration #504
The runtime is expected to use acceleration data itself to perform the tracking for the app. We do not want to delegate dead-reckoning tracking to the application because there are far too many ways to get it subtly wrong, among other reasons. So it is a conscious decision not to expose it to the application. It is definitely assumed to be used by the runtime, however. (In such a situation you would probably expect POSITION_VALID (but not POSITION_TRACKED), as well as ORIENTATION_VALID and ORIENTATION_TRACKED.) Do you have any other use cases besides handling degraded tracking quality?
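For reference, the VALID/TRACKED distinction described above can be tested against the core `XrSpaceLocationFlags` bits. A minimal sketch, with the flag values reproduced from `openxr.h` so it compiles without the SDK headers:

```c
#include <stdint.h>

/* XrSpaceLocationFlags bits, values reproduced from openxr.h (OpenXR 1.0)
 * so this sketch is self-contained. */
typedef uint64_t XrSpaceLocationFlags;
#define XR_SPACE_LOCATION_ORIENTATION_VALID_BIT   0x00000001u
#define XR_SPACE_LOCATION_POSITION_VALID_BIT      0x00000002u
#define XR_SPACE_LOCATION_ORIENTATION_TRACKED_BIT 0x00000004u
#define XR_SPACE_LOCATION_POSITION_TRACKED_BIT    0x00000008u

/* Returns 1 when the runtime reports a position that is usable (VALID)
 * but inferred rather than directly observed (not TRACKED), i.e. the
 * state an app would see while the runtime dead-reckons a lost controller. */
int is_dead_reckoned_position(XrSpaceLocationFlags flags)
{
    return (flags & XR_SPACE_LOCATION_POSITION_VALID_BIT) != 0 &&
           (flags & XR_SPACE_LOCATION_POSITION_TRACKED_BIT) == 0;
}
```

In a real application these bits arrive in `XrSpaceLocation::locationFlags` after a call to `xrLocateSpace`.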
An issue (number 2373) has been filed to correspond to this issue in the internal Khronos GitLab (Khronos members only: KHR:openxr/openxr#2373), to facilitate working group processes. This GitHub issue will continue to be the main site of discussion.
Hi,
Please let me explain why linear and angular accelerations are, in my humble opinion, precious data:
1) First of all, headsets without external sensors for controller detection and location are the most popular products. For these products, as soon as a controller leaves the field of view of the tracking cameras, its position and velocity are lost, and become available again only upon (for the position) and one frame after (for the velocity) the re-detection of the controller.
Consequently, all gesture information in the meantime is lost, and the first velocity value after re-detection is at best (0, 0, 0) and in the worst case completely "random" (which is the case on the Meta Quest 1, 2, and 3).
The only information that remains valid while the controllers are invisible is the controller acceleration (linear and angular). It allows reconstruction of the whole gesture during the interval in which the controller is not detected. This reconstruction is crucial for the following types of games (non-exhaustive list):
a) FPS games in which you throw a grenade
b) Ball-sport games
c) Games requiring identification of large gestures, etc.
Part of this reconstruction is performed by the built-in software (or OpenXR), but it stops quickly and adds a supplementary delay to the estimated velocity once the controller becomes visible again.
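The reconstruction described in point 1 is essentially dead reckoning: integrating the controller's IMU acceleration while it is out of view. A minimal sketch, assuming acceleration samples that are already gravity-compensated and expressed in world space (real runtimes additionally estimate sensor bias and integrate orientation, which is exactly where the working group fears applications would get it subtly wrong):

```c
/* Dead-reckoning sketch (illustrative only): while a controller is outside
 * the tracking cameras' FOV, integrate its IMU linear acceleration to keep
 * the velocity and position estimates, and hence the gesture, alive.
 * Assumption: `accel` is already gravity-compensated and in world space. */
typedef struct { float x, y, z; } Vec3;

void dead_reckon_step(Vec3 *pos, Vec3 *vel, Vec3 accel, float dt)
{
    /* semi-implicit Euler: update velocity first, then position with it */
    vel->x += accel.x * dt; vel->y += accel.y * dt; vel->z += accel.z * dt;
    pos->x += vel->x * dt;  pos->y += vel->y * dt;  pos->z += vel->z * dt;
}
```

In practice each step would be driven by one IMU sample, seeded with the last tracked pose and velocity before the controller left the FOV.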
2) For addressing motion sickness, two kinds of values matter: the velocity of the avatar relative to the scenery, which indicates the discrepancy between the eye and the inner ear, and the headset's instantaneous linear and angular acceleration, which inform on the perturbation of the inner ear.
Acceleration is therefore invaluable to me. Several of my projects are blocked by the "disappearance" of accelerations, since they work only on Unity up to 2020 and Unreal Engine up to 4.27. This represents years of work. I thought the "loss" of acceleration was a bug to be corrected; now that I understand it was a design choice, I feel I am at a dead end.
I deeply and sincerely hope you will consider the inclusion of acceleration in OpenXR. Would it be possible to hear news of this issue when you have reached a decision?
Best Regards,
Renaud Maroy
Note: I am also a scientist in applied mathematics working on tools for game developers that would allow them to develop games that are impossible or difficult to develop right now.
"We do not want to delegate dead-reckoning tracking to the application because there are far too many ways to get it subtly wrong, among other reasons. So it is a conscious decision to not expose it to the application." I understand your concern and your decision. Could it be possible not to expose accelerations by default, but to allow acceleration to be retrieved when specifically needed, which is my case? I would be instantly unblocked.
My other use cases are:
Hi,
Hardware providers rely more and more on OpenXR. However, OpenXR does not support acceleration and angular-acceleration input (not derived from velocity), which are, like position and orientation, key inputs of XR devices and are available on many headset products on the market. Linear and angular accelerations are the only vectors that remain available when the controllers are not visible to the headset. As such, they allow estimation of the movement when the controllers disappear momentarily.
Would it be possible to add a feature to get the instantaneous acceleration and angular acceleration from the devices? This gap is a major issue since, at present, there is no other way to obtain this information.
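One conceivable shape for such a feature would be a chained output struct modeled on the existing core `XrSpaceVelocity`, which already returns `linearVelocity` and `angularVelocity` through `XrSpaceLocation::next`. Everything below is hypothetical, not a ratified OpenXR extension; the types are re-declared locally so the sketch stands alone:

```c
#include <stdint.h>

/* HYPOTHETICAL sketch of how acceleration could be surfaced, modeled on
 * the real core XrSpaceVelocity. Every name below is invented for
 * illustration; this is NOT a ratified OpenXR extension. */
typedef struct { float x, y, z; } XrVector3f;   /* shape as in openxr.h */
typedef uint64_t XrSpaceAccelerationFlags;      /* hypothetical */

#define SPACE_ACCELERATION_LINEAR_VALID_BIT  0x1u  /* hypothetical */
#define SPACE_ACCELERATION_ANGULAR_VALID_BIT 0x2u  /* hypothetical */

typedef struct XrSpaceAccelerationEXT {          /* hypothetical */
    uint32_t   type;                /* would be a new XrStructureType value */
    void      *next;                /* chained via XrSpaceLocation::next    */
    XrSpaceAccelerationFlags accelerationFlags;
    XrVector3f linearAcceleration;  /* m/s^2, expressed in the base space   */
    XrVector3f angularAcceleration; /* rad/s^2                              */
} XrSpaceAccelerationEXT;
```

Exposing the data through an opt-in chained struct like this would keep the default behavior unchanged while letting applications that genuinely need raw acceleration request it explicitly.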
Best Regards