
Change how stage features are exposed #12

Closed
TrevorFSmith opened this issue Sep 11, 2017 · 3 comments

Comments

@TrevorFSmith
Contributor

TrevorFSmith commented Sep 11, 2017

In AR, the assumption that the local area consists of a flat floor (aka a stage) will often not hold. In VR, the platform may expose a stage center in the tracker coordinate system, and may expose a 2D polygon that delimits the clear space in which the user can move.

- Expose a 'floor' XRAnchor that is always at floor level (as near as the platform can determine) underneath the current head pose.
- Expose the VR-style stage info (center point, polygon) through API on XRSession, and remove the 'stage' XRCoordinateSystem.
- Update the examples to use the floor anchor.
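To make the proposal concrete, here is a minimal sketch of how an app might consume the two pieces described above. The names `stageBounds`, `stageCenter`, and `requestFloorAnchor` are placeholders invented for illustration, not part of any shipped API:

```javascript
// Given a session-like object, prefer the platform's VR-style stage info
// (center point + clear-space polygon) when it exists; otherwise fall back
// to the UA's best-guess 'floor' anchor under the current head pose.
// All property/method names here are hypothetical.
function describeStage(session) {
  if (session.stageBounds) {
    // Fixed VR setups (e.g. Vive/Rift) can report a real cleared area.
    return {
      kind: 'stage',
      center: session.stageCenter,
      polygon: session.stageBounds, // 2D polygon of clear floor space
    };
  }
  // AR / mobile setups: no stage, just an anchor at estimated floor level.
  return { kind: 'floor', anchor: session.requestFloorAnchor() };
}
```

An app that only needs "something at floor level under the user" would take the `floor` branch everywhere; only apps that care about the cleared play area need the `stage` branch.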

@speigg

speigg commented Sep 12, 2017

The way I've thought of this is: for AR, the stage (or the stage "center"), can travel with the user. The exact behavior can be defined by the UA (or the virtual reality), but I imagine a type of "leashing", such that the stage center is fixed at an appropriate point on the floor near the user, until the user moves beyond a certain radius, at which point the stage is dragged along. This is similar to the way that I've seen virtual realities implemented for HoloLens.
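The "leashing" behavior described above can be sketched as a small update function. The function name and the 2-meter radius are illustrative choices, not from any spec:

```javascript
// Keep the stage center "leashed" to the user: it stays fixed until the
// user moves beyond `radius` meters away (measured on the floor plane),
// at which point it is dragged to the floor point under the user.
// Positions are [x, y, z] arrays in a shared tracker coordinate system.
function updateStageCenter(stageCenter, userPosition, floorY, radius = 2.0) {
  const dx = userPosition[0] - stageCenter[0];
  const dz = userPosition[2] - stageCenter[2];
  const horizontalDistance = Math.hypot(dx, dz);
  if (horizontalDistance <= radius) {
    return stageCenter; // user is still inside the leash; stage stays put
  }
  // User walked out of range: re-center the stage on the floor below them.
  return [userPosition[0], floorY, userPosition[2]];
}
```

Calling this once per frame with the current head position gives the dragged-along behavior; a UA could also smooth the jump rather than snapping.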

@blairmacintyre
Contributor

When looking at what HoloLens does, and to a certain extent what ARKit and ARCore do (no notion of a stage, just anchors), compared to systems that do have a notion of a "stage" (e.g., fixed VR setups like Vive/Rift), it makes less sense to think of this as a "stage". In fixed VR, the user generally clears the floor of a room for the VR rig, and the "stage" actually does correspond to the physical area the player can move in. In a mobile setup (both AR systems, and untethered VR such as the newer Windows MR devices) the floor cannot be assumed to be flat, so a "stage" means less there. More importantly, the idea of "attaching things to the world near the user" can be done in other ways, using anchors and other application-specific code.

The thought with the "floor" is to have the UA make (at any point in time) its best guess as to where the floor is under the user. Some systems (such as those based on 3DOF VR or current ARCore/ARKit) might have to guess; others might use the VR stage; others (such as HoloLens, which has a full mesh) might actually make a pretty good guess.

A system that has a stage would expose that coordinate system. On other systems, given a floor, the application can implement the idea of a "traveling stage" as they need. I would imagine a simple demo system would have a method like getOrCreateStage(): Anchor to just retrieve a stage Anchor if it exists or create one based on the current value of the floor. A more complex system would, as you say, move it with the user.
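A minimal sketch of the `getOrCreateStage(): Anchor` helper suggested above, written against a hypothetical session object. `session.stageAnchor`, `session.floorEstimate()`, and `session.createAnchor()` are assumed names for illustration only:

```javascript
// Return the platform-provided stage anchor if one exists; otherwise
// lazily create an app-level stage anchor from the UA's current floor
// estimate and reuse it on subsequent calls.
function getOrCreateStage(session) {
  if (session.stageAnchor) {
    // Fixed VR setups expose a real stage coordinate system.
    return session.stageAnchor;
  }
  if (!session._appStage) {
    // No platform stage: synthesize one at the estimated floor point
    // under the user, once, and cache it.
    session._appStage = session.createAnchor(session.floorEstimate());
  }
  return session._appStage;
}
```

A "traveling stage" implementation would replace the cached anchor whenever the user moves too far from it, as described in the previous comment.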

@TrevorFSmith
Contributor Author

Working on this in PR #29
