
Get the object information with respect to the position of observer #73

Closed
ajay-sh21 opened this issue Mar 30, 2020 · 8 comments
Labels
enhancement XAUR XR User Needs and Requirements draft

Comments

@ajay-sh21

Help users explore the environment using input gestures. Each object should have meta information associated with it, including its distance from the person and coordinates giving its direction with respect to the observer.
For example: “Jasmine flowers in a vase, 2 meters at your 11 o’clock.”
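The “distance at your N o’clock” description above can be derived from an observer pose and an object position. A minimal sketch, assuming flat 2-D coordinates in meters and a heading angle; the function name and parameters are hypothetical, not part of any existing API:

```python
import math

def describe_relative_position(observer_pos, observer_heading_deg, object_pos, label):
    """Describe an object's distance and clock direction relative to an observer.

    observer_pos, object_pos: (x, y) coordinates in meters.
    observer_heading_deg: observer's facing direction, in degrees
    counter-clockwise from the +x axis.
    """
    dx = object_pos[0] - observer_pos[0]
    dy = object_pos[1] - observer_pos[1]
    distance = math.hypot(dx, dy)
    # Bearing of the object relative to the observer's facing direction
    # (positive = to the observer's left).
    bearing = math.degrees(math.atan2(dy, dx)) - observer_heading_deg
    # Clock positions run clockwise, with 12 o'clock straight ahead,
    # so negate the bearing and divide the circle into 30-degree slots.
    clock = round((-bearing % 360) / 30) % 12 or 12
    return f"{label}, {distance:.0f} meters at your {clock} o'clock"
```

With the observer at the origin facing the +y axis, an object 2 meters away and 30 degrees to the left would be announced as "... 2 meters at your 11 o'clock".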

@RealJoshue108 RealJoshue108 added the XAUR XR User Needs and Requirements draft label Mar 31, 2020
@RealJoshue108 RealJoshue108 self-assigned this Apr 6, 2020
@RealJoshue108
Contributor

@ajay-sh21 Yes, getting context information that is relevant to the user's relative positioning is useful.

@RealJoshue108
Contributor

RealJoshue108 commented Apr 15, 2020

Discussed in Research Questions Task Force. This could be useful depending on context, but it needs to remain useful as the user moves. It could be a request-type function.

@RealJoshue108
Contributor

There are features in VR a11y such as specifying a location you wish to go to, so you can portal to it. There is also the ability to follow another character in a scene via a follow command, which is easier to use than having to follow directions.
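The follow command mentioned above could work by stepping the follower toward the target each frame, stopping at a comfortable distance so the user never has to steer manually. A minimal sketch, assuming 2-D positions in meters; all names and parameters here are hypothetical:

```python
import math

def follow_step(follower_pos, target_pos, speed, dt, stop_distance=1.0):
    """Advance the follower one frame toward the target.

    follower_pos, target_pos: (x, y) positions in meters.
    speed: follower movement speed in meters per second.
    dt: frame duration in seconds.
    stop_distance: how close the follower gets before stopping.
    """
    dx = target_pos[0] - follower_pos[0]
    dy = target_pos[1] - follower_pos[1]
    dist = math.hypot(dx, dy)
    if dist <= stop_distance:
        return follower_pos  # close enough; do not crowd the target
    # Move at most one frame's worth of distance, never past the stop radius.
    step = min(speed * dt, dist - stop_distance)
    return (follower_pos[0] + step * dx / dist,
            follower_pos[1] + step * dy / dist)
```

Called once per frame, this keeps the follower trailing the target without any directional input from the user; portal-to-location is the degenerate case where the position is set to the destination in one step.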

@RealJoshue108
Contributor

RealJoshue108 commented Apr 28, 2020

@ajay-sh21 Could you give me some more detail on the user need for this kind of functionality? Also, what kinds of interactions do you think this would be useful for?

@ajay-sh21
Author

Virtual a11y could make the task easy; I didn't know whether anything like this existed. However, for someone who wants to carry out tasks on their own without guidance, it would be good to know what objects are around them.

@RealJoshue108
Contributor

RealJoshue108 commented Jun 9, 2020

Discussed in Research Questions Task Force

https://www.w3.org/2020/04/15-rqtf-minutes.html#item03

@RealJoshue108
Contributor

@ajay-sh21 This needs more detail on the user need you are suggesting, thanks.

@RealJoshue108
Contributor

@ajay-sh21 On reflection, this could be largely an implementation issue, or enabled via a user preference. I feel the user need outlined in 'User Need 1: A user of assistive technology wants to navigate, identify locations, objects and interact within an immersive environment', together with our call for personalisation/customisation, is currently sufficient here.
