
Consider a simple API for low-latency pointer trails #204

Open · RByers opened this issue May 5, 2017 · 7 comments
Labels: v3

@RByers (Contributor) commented May 5, 2017

In drawing scenarios, low input-to-screen latency is extremely important. In some cases the UA may be able to draw from input to screen faster than an application ever could (even with APIs like off-thread canvas (OffscreenCanvas), which can reliably draw at 60fps).

I wonder if it's worth exposing a declarative API, similar to the CSS cursor property, which can specify an image for the UA to stamp out as a trail behind the pointer for a couple of frames (maybe for however long it predicts the typical app input-to-paint latency to be, plus a frame of overlap for safety). This could be used to "close the gap" between the application's normal (higher-latency) painting and the tip of the pointing device (possibly even using a prediction algorithm internally).

We might want some ability to specify separate images for different pointerType values, eg. maybe:

  pointer-trail: pentrail.png pen, touchtrail.png touch, none
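The "prediction algorithm internally" mentioned above could be as simple as linear extrapolation from the last two pointer samples. A minimal sketch, assuming nothing beyond this issue (none of these names are a proposed API):

```typescript
// Hypothetical sketch of the kind of linear extrapolation a UA might use
// internally to place trail stamps ahead of the last delivered event.
// All names here are illustrative, not part of any proposed API.

interface Point {
  x: number;
  y: number;
  t: number; // timestamp in ms
}

// Predict where the pointer will be `lookaheadMs` after the latest sample,
// by extrapolating the velocity between the last two samples.
function predictPoint(prev: Point, last: Point, lookaheadMs: number): Point {
  const dt = last.t - prev.t;
  if (dt <= 0) return { ...last, t: last.t + lookaheadMs };
  const vx = (last.x - prev.x) / dt;
  const vy = (last.y - prev.y) / dt;
  return {
    x: last.x + vx * lookaheadMs,
    y: last.y + vy * lookaheadMs,
    t: last.t + lookaheadMs,
  };
}
```

In practice a UA would likely smooth over more than two samples and clamp the lookahead, but even this naive version illustrates how a trail could be stamped slightly ahead of the last delivered event.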
RByers added the enhancement label May 5, 2017
@RByers (Contributor, Author) commented May 5, 2017

We'd probably want a size restriction on these - maybe 32x32px is more than enough?

We'd probably want to make sure these images can be efficiently generated from a canvas. Since they're required to be small, perhaps canvas.toBlob() plus data URIs is good enough?
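To make the canvas-plus-data-URI idea concrete, here is a hedged sketch. The canvas calls in the comments are browser-only, and the `pointer-trail` property is only the strawman from this issue, not a shipped feature; the only real code is the data-URI helper:

```typescript
// Sketch only: generating a small trail stamp and handing it to the
// strawman `pointer-trail` property via a data URI. Nothing here is a
// shipped API; `pointer-trail` is hypothetical.

// Pure helper: wrap raw image bytes in a base64 data URI.
function toDataURI(bytes: Uint8Array, mimeType = "image/png"): string {
  let binary = "";
  for (const b of bytes) binary += String.fromCharCode(b);
  return `data:${mimeType};base64,${btoa(binary)}`;
}

// Browser-side usage (illustrative, not runnable outside a browser):
// const canvas = new OffscreenCanvas(32, 32);
// ... draw the 32x32 trail stamp ...
// const blob = await canvas.convertToBlob({ type: "image/png" });
// const uri = toDataURI(new Uint8Array(await blob.arrayBuffer()));
// element.style.setProperty("pointer-trail", `url("${uri}") pen`);
```

With a 32×32 cap, the resulting data URIs stay well under a few kilobytes, which is one argument that they are "good enough" for this purpose.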

patrickhlauke added the v3 label and removed the enhancement label Feb 15, 2018
@NavidZ (Member) commented May 15, 2019

Discussed in today's group meeting: although pointerrawupdate and getPredictedEvents can reduce latency, the Microsoft team's testing shows that latency is still present and can hurt the user experience. However, the API format proposed in this issue may not be the best fit for the use case. They will look into this and try to come up with an API that addresses the latency issue better.

@dlibby- commented Sep 3, 2019

Yes, it is true that even with pointerrawupdate and prediction, there is still a gap in latency. I like the simplicity of specifying a cursor-like image for the UA to draw trailing the pointer, but in practice it will lead to undesirable artifacts, mainly because the stroke that is drawn by the app would not be taken into account.

Inking applications on the web already follow the pattern of ‘consume pointer event, produce strokes on a canvas’. This can be enhanced by things like pointerrawupdate, prediction, input delivered to workers, etc. without requiring web developers to change their underlying programming or data model. Similarly, I’d like to propose an additional enhancement to expose OS capabilities for drawing a 'cursor trail'.

Conceptually, we need to communicate the following information to the OS to enable this:

• Which pointer events were consumed and rendered as ink strokes
• Metadata around the ‘style’ of the drawn stroke, e.g. color, radius, tilt, pressure

Current thinking is that the metadata is limited in scope and is not designed to accommodate arbitrary shaders or complex images. Another option could be to expose supported metadata properties via a capabilities query.
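A minimal sketch of what the limited style metadata and a capabilities query might look like, assuming the two bullet points above. Every name here is hypothetical; the eventual API shape may differ entirely:

```typescript
// Hypothetical shape for the limited stroke-style metadata and a
// capabilities query, as described in the proposal above. None of these
// names are a real API.

interface InkTrailStyle {
  color: string;    // CSS color for the OS-drawn trail
  diameter: number; // stroke width in CSS pixels
}

// The UA would advertise which style properties the OS compositor honors,
// rather than accepting arbitrary shaders or complex images.
const supportedProperties = new Set(["color", "diameter"]);

function isStyleSupported(style: InkTrailStyle): boolean {
  return Object.keys(style).every((key) => supportedProperties.has(key));
}
```

Keeping the metadata to a small, queryable set is what lets the OS render the trail without round-tripping through the app's own stroke-rendering code.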

We’re still working on the ergonomics and doing some prototyping, but I would love to get this group’s feedback on whether a proposal along these lines makes sense.

@gked (Contributor) commented Sep 12, 2019

Creating a reference for the RFC at TPAC: w3c/webappswg#10

@dlibby- commented Sep 30, 2019

We shared the following proposal at TPAC:

https://github.com/MicrosoftEdge/MSEdgeExplainers/blob/master/WebInkEnhancement/explainer.md

It's unclear whether this would fit under the Pointer Events WG charter, but we'd love to get this group's feedback on the proposal. I will post a link to the WICG Discourse thread once it is available.

Thanks!

@NavidZ (Member) commented Oct 1, 2019

@dlibby- I'm not quite sure about the charter question. From our discussions at TPAC, I do believe we need to somehow tie this to pointer events and pointerId. But whether it fits under the current PEWG charter, I leave to @patrickhlauke to decide.

Regarding feedback on the proposal: could you link, here or in the explainer, the native Windows API that supports this feature? I'd like to follow up with the Android and ChromeOS teams to see what they have for this.

@dlibby- commented Oct 2, 2019

The native Windows API does not yet exist and is currently a work in progress; however, the proposed web API maps closely to how we expect this to be exposed on Windows (a method for setting 'style' properties, a method for communicating the last point rendered, and a method for communicating pen input points as they arrive).

The work to create a public API is about exposing this pen-tip trail rendering functionality, which is currently an implementation detail encapsulated behind the InkPresenter and DirectInk family of APIs on Windows.

We should have more updates regarding the Windows API in the next couple of weeks.
