Change how stage features are exposed #12
The way I've thought of this is: for AR, the stage (or the stage "center") can travel with the user. The exact behavior can be defined by the UA (or the virtual reality), but I imagine a type of "leashing", such that the stage center is fixed at an appropriate point on the floor near the user until the user moves beyond a certain radius, at which point the stage is dragged along. This is similar to the way that I've seen virtual realities implemented for HoloLens.
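The "leashing" behavior described above can be sketched roughly as follows. This is only an illustration of the idea, not a proposed API; the function name, the coordinate shape, and the leash radius are all made up here.

```javascript
// Illustrative sketch of "leashing": the stage center stays fixed on the
// floor until the user moves beyond a radius, then is dragged along so it
// sits at the edge of the leash. All names and values are hypothetical.
const LEASH_RADIUS = 1.5; // meters; an assumed UA-chosen value

function updateStageCenter(stageCenter, userPosition) {
  // Work in the floor plane (x, z); y is assumed to be up.
  const dx = userPosition.x - stageCenter.x;
  const dz = userPosition.z - stageCenter.z;
  const dist = Math.hypot(dx, dz);
  if (dist <= LEASH_RADIUS) {
    return stageCenter; // user still within the leash: center stays fixed
  }
  // Drag the center toward the user just far enough that the user is
  // exactly LEASH_RADIUS away again.
  const t = (dist - LEASH_RADIUS) / dist;
  return {
    x: stageCenter.x + dx * t,
    z: stageCenter.z + dz * t,
  };
}
```

A UA implementing this would run something like `updateStageCenter` each frame against the tracked head position, keeping the stage origin loosely attached to the user.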
When looking at what HoloLens does, and to a certain extent what ARKit and ARCore do (no notion of a stage, just anchors), compared to systems that do have a notion of a "stage" (e.g., fixed VR setups like Vive/Rift), it makes less sense to think of this as a "stage". In fixed VR, the user generally clears the floor of a room for the VR rig, and the "stage" actually corresponds to the physical area in which the player can move. In a mobile setup (both AR systems and untethered VR such as the newer Windows MR devices), the floor cannot be assumed to be flat, so a "stage" means less. More importantly, the idea of "attaching things to the world near the user" can be done in other ways using anchors and other application-specific code.

The thought with the "floor" is to have the UA make (at any point in time) its best guess as to where the floor is under the user. Some systems, such as those based on 3DOF VR or current ARCore/ARKit, might have to guess; others might use the VR stage; others (such as HoloLens, which has a full mesh) might actually make a pretty good guess. A system that has a stage would expose that coordinate system. On other systems, given a
Working on this in PR #29
In AR, the idea that the local area consists of a flat floor (aka a stage) will often not be true. In VR, platforms may expose a stage center in the tracker coordinate system, and may expose a 2D polygon that defines the clear space in which to move.
- Expose a 'floor' XRAnchor that is always at floor level (as near as the platform can determine) underneath the current head pose.
- Expose the VR-style stage info (center point, polygon) through API on XRSession, and remove the 'stage' XRCoordinateSystem.
- Update the examples to use the floor anchor.
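If the clear-space polygon were exposed as proposed, an application's main use for it would be testing whether a position (say, the user's head pose projected onto the floor) is inside the clear space. That check is a standard ray-casting point-in-polygon test; the sketch below assumes the polygon arrives as an array of floor-plane vertices, which is not something this issue specifies.

```javascript
// Hypothetical helper: is a floor-plane point inside the 2D clear-space
// polygon a platform might expose? Standard ray-casting (even-odd) test.
// `polygon` is assumed to be an array of { x, z } vertices; this shape is
// illustrative, not part of any proposed API.
function insideStageBounds(point, polygon) {
  let inside = false;
  for (let i = 0, j = polygon.length - 1; i < polygon.length; j = i++) {
    const a = polygon[i];
    const b = polygon[j];
    // Does the horizontal ray from `point` cross edge a-b?
    const crosses =
      (a.z > point.z) !== (b.z > point.z) &&
      point.x < ((b.x - a.x) * (point.z - a.z)) / (b.z - a.z) + a.x;
    if (crosses) inside = !inside;
  }
  return inside;
}
```

An app could use this to warn the user when they approach the edge of the clear space, independent of whether the floor itself is flat.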