[New User Need] 4.21 Understanding Actionable Targets - Review of XR Accessibility User Requirements COGA #15
Filed on behalf of COGA @lseeman / John Kirkwood / Dave Fazio
User Need 21: Users with cognitive impairments need to understand what items in a visual display are actionable targets, and how to interact with them.
REQ 21b: Users need to be provided a clear understanding of how to interact with actionable targets. For example, a slider, where the user needs to click and drag.
Users with cognitive disabilities can have difficulty determining what elements/components/features of a digital environment can be interacted with, and how to interact with them, as well as what the end result will be.
David F says: Personalization requires a user to first recognize which features don't meet their needs — in this case, which buttons, links, controls, etc. they don't recognize. If a user doesn't recognize that something is there, they won't understand how to change it, or even that they need to.
The onus should not be on the user with a cognitive impairment. It should be on the designer.
Asking a user to personalize content can be a cognitively heavy task. It shouldn't be asked of a person with a cognitive impairment. At the same time, personalization should still be available.
@Helixopp Please note that I find the COGA Content Usable document to be a good example of advice and practice in this area, as it relates to issues with affordances on regular webpages etc. I think this is a good jumping-off point for more complex affordances in XR.
The questions are: is this list sufficient? Are there other aspects of XR that we should consider?
@joshueoconnor I know you'll be attending the COGA TF meeting tomorrow and going into this in more detail, but I'm also trying to suss out what you need from us in this issue.
Following the task force meeting tomorrow, I'll review the transcript and see if I better understand this particular issue.
What I'm trying to understand:
Also, if I understand correctly, at this point you are looking for more examples. Can you list the examples you already have, and/or point us directly to the place in the document where this is discussed, so that we can reference what has already been done?
Regarding the comments on personalization: how things are personalized is really an implementation detail based on the user's needs. If there are particular requirements in the context of XR that we can add to the XR A11y User Requirements document, then that would be helpful (and is what I am getting at here). Is this clearer?
After the COGA call today, I note that issue #12 is NOT a duplicate, just similar, and includes the additional suggested text for User Need 4:
@rainbreaw Also regarding personalization: if the XR environment were aware that the user has cognitive or learning disabilities, it could provide the customisation itself, so the user doesn't have to. This would require a clear idea of how to specify the requirements and how to customise the various affordances (to make things more accessible to users with cognitive and learning disabilities).
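To make the idea above concrete, here is a minimal sketch of how an XR runtime might map a stored preference profile onto affordance adaptations automatically, so the user never has to configure anything per scene. All names here (the preference keys, the hint strings, `adapt_affordances` itself) are illustrative assumptions, not part of XAUR or any W3C specification:

```python
# Hypothetical sketch: an XR environment that knows a user's preference
# profile applies affordance adaptations on the user's behalf.
# Preference keys and hint strings are invented for illustration only.

def adapt_affordances(preferences, scene_objects):
    """Return per-object adaptation hints derived from user preferences."""
    hints = {}
    for obj in scene_objects:
        obj_hints = []
        if preferences.get("highlight_actionable") and obj["actionable"]:
            # Make actionable targets visually distinct (REQ 21a-style need).
            obj_hints.append("add persistent outline")
        if preferences.get("show_interaction_labels") and obj["actionable"]:
            # Explain *how* to interact, e.g. "click and drag" (REQ 21b).
            obj_hints.append(f"label: '{obj['interaction']}'")
        if preferences.get("simplify_scene") and not obj["actionable"]:
            # De-emphasize decoration so actionable items stand out.
            obj_hints.append("reduce visual prominence")
        hints[obj["id"]] = obj_hints
    return hints


profile = {
    "highlight_actionable": True,
    "show_interaction_labels": True,
    "simplify_scene": True,
}
scene = [
    {"id": "volume-slider", "actionable": True, "interaction": "click and drag"},
    {"id": "backdrop", "actionable": False, "interaction": None},
]
print(adapt_affordances(profile, scene))
```

The point of the sketch is only the shape of the contract: the profile lives with the user, and the environment (not the user) does the per-scene work of applying it.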
Based on discussion in Research Questions, the task force feels these user needs/requirements are best expressed as a call for the development of suitable accessibility APIs. I've therefore drafted a note on this in this branch: