Remote access: Add a type input_gesture to execute gesture on follower's side #19899

@beqabeqa473

Description

Is your feature request related to a problem? Please describe.

I am developing an NVDA remote companion, which includes an Android application.

I want to be able to control a PC with a touch screen. Because touch input offers limited possibilities, several modes should be added, as is done in NVDA's touch screen support.

These modes would include a text mode, an object navigation mode, a browse mode, and so on.

Currently, the Remote client in NVDA only supports forwarding key presses and a subset of braille input gestures. Taking object navigation mode as an example, we cannot simply forward keys, because the gesture may be different on the remote side, and the leader may not know which keys to send. And when it comes to control with a touch screen, input is limited to swipes in four directions, which must be mapped to the object hierarchy.
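To illustrate the mapping problem described above, here is a minimal sketch of how four-direction swipes could map onto object-navigation gesture identifiers. The swipe names and the chosen desktop-layout gestures are assumptions for illustration, not part of the actual NVDA Remote protocol or this proposal.

```python
# Hypothetical mapping from companion-app swipe directions to NVDA
# object-navigation gesture identifiers (desktop keyboard layout).
# All names here are illustrative assumptions.
SWIPE_TO_GESTURE = {
    "swipe_left": "kb:NVDA+numpad4",   # move to previous object
    "swipe_right": "kb:NVDA+numpad6",  # move to next object
    "swipe_up": "kb:NVDA+numpad8",     # move to parent object
    "swipe_down": "kb:NVDA+numpad2",   # move to first child object
}


def gesture_for_swipe(direction: str):
    """Return the gesture identifier to forward, or None if unmapped."""
    return SWIPE_TO_GESTURE.get(direction)
```

Forwarding a gesture identifier like this, rather than raw key presses, lets the follower resolve it against its own input configuration.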

Describe the solution you'd like

I would like to propose an additional message type, input_gesture, which would be handled on the follower's side and execute the gesture sent by the leader.

I am ready to contribute the final solution and send a pull request.
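A minimal sketch of what such a message might look like, assuming the Remote Access protocol's JSON framing with a "type" field; the field names and helper functions here are illustrative assumptions, not a finalized specification.

```python
import json


def build_input_gesture_message(gesture_id: str) -> str:
    """Serialize a hypothetical input_gesture message on the leader's side."""
    return json.dumps({"type": "input_gesture", "gesture": gesture_id})


def handle_message(raw: str):
    """On the follower's side, decode a message and return the gesture
    identifier to execute, or None for messages of other types."""
    msg = json.loads(raw)
    if msg.get("type") == "input_gesture":
        return msg.get("gesture")
    return None
```

For example, the leader would send `build_input_gesture_message("kb:NVDA+numpad6")`, and the follower would pass the decoded identifier to its own gesture-execution machinery.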

Describe alternatives you've considered

n/a

Additional context

No response

Metadata

Assignees

No one assigned

    Labels

    component/remote — Code or other developer-facing aspects of NVDA's Remote Access functionality
    feature
    p5 — https://github.com/nvaccess/nvda/blob/master/projectDocs/issues/triage.md#priority
    triaged — Has been triaged, issue is waiting for implementation.

    Projects

    No projects

    Milestone

    No milestone

    Relationships

    None yet

    Development

    No branches or pull requests