Extend pointer events to support raw trackpad data #206
Note that there is presumably still a connection to the screen in this case - the normal coordinates and event targeting would presumably all represent the mouse cursor location. So it's not an entirely "off-screen pointing device".
Note Edge is shipping something kinda like this where touchpad emulates a touchscreen, but without any of the API / spec changes we've discussed years ago. @dtapuska and I expressed our reservations about this plan to @scottlow at TPAC. Chrome has no intention of following this approach. Chrome gets the same perf benefit with passive wheel listeners (sadly opt-in) and soon nearly the same by default with async wheel events. Although we'd probably have to define a 'wheel-action' CSS property to handle all the same edge cases with full fidelity. We don't believe we could stop firing wheel events completely for touchpad scrolling for web compat reasons. We know the vast majority of websites still don't support pointer events for touchscreen operations like panning. Also it could be confusing to emulate a touchscreen on devices with a touchpad but no touchscreen (MacBook, some Chromebooks). All that said, I don't anticipate too much Interop pain from developers being surprised by the difference between Edge and Chrome here. I guess we'll see.
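For context, the passive wheel listener pattern mentioned here looks roughly like this. This is a minimal sketch; `makeWheelHandler` and the `state` object are made-up names for illustration. Registering with `{ passive: true }` promises the browser the handler will never call `preventDefault()`, so scrolling need not wait on the handler.

```javascript
// Sketch of a passive wheel listener. The handler only accumulates
// deltas; it never calls preventDefault(), which is exactly what
// { passive: true } promises the browser.
function makeWheelHandler(state) {
  return function (event) {
    state.scrollY += event.deltaY; // accumulate vertical scroll delta
  };
}

// Guarded so the snippet also runs outside a browser environment.
if (typeof window !== 'undefined') {
  const state = { scrollY: 0 };
  window.addEventListener('wheel', makeWheelHandler(state), { passive: true });
}
```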
A few questions for @scottlow on the design in Edge:
In terms of the Interop risk, I expect the main developer pain point to continue to be the difference in presence of wheel events. So I don't think this is net-worse than today (I can see an argument for it being net-positive for Edge, even without a path for Chrome to be able to match).
I'd love access to these for a drawing surface, e.g. the drawing notes in keep.google.com. I don't have any way to differentiate between events from a mouse wheel and a touchpad. We previously had a default behavior where wheel would zoom and ctrl+wheel would scroll, since zooming was much more common, but this worked poorly with touchpad gestures. Detecting rotation gestures would be a neat feature too.
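Lacking a real API, the only option today is a heuristic along these lines. `classifyWheelEvent` and its thresholds are made-up assumptions for this sketch; the ctrlKey-on-pinch convention is observed browser behaviour in Chromium and Firefox, not a spec guarantee, and this will misclassify some devices.

```javascript
// Hedged heuristic, since no API exposes the wheel source directly.
// In Chromium and Firefox, trackpad pinch gestures arrive as wheel
// events with ctrlKey set; classic mouse wheels in Firefox often
// report deltaMode === 1 (line-based scrolling). Fine fractional
// deltas suggest a trackpad. All thresholds here are guesses.
function classifyWheelEvent(event) {
  if (event.ctrlKey) return 'pinch-zoom';
  if (event.deltaMode === 1) return 'mouse-wheel'; // line-based: likely a wheel
  if (Math.abs(event.deltaY) < 10 && !Number.isInteger(event.deltaY)) {
    return 'probably-trackpad';
  }
  return 'unknown';
}
```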
Closed (discussed in PEWG meeting today https://www.w3.org/2019/07/10-pointerevents-minutes.html) - concerns about being too low-level, crossing over with things such as the sensor API, and - at least at this stage - few use cases/resources in this area.
@patrickhlauke As web apps facilitate more and more touch gestures, this will become a necessity. 3D web apps are a good example. The ScrollMaps use case mentioned above, for instance, is hindered by this limitation.
I wish this would be reconsidered. 😊 Just as an example, macOS has a scroll bounce/overflow effect used in every native app on macOS. Blink (and thus Electron) only supports this effect on the main body of the HTML page. This means it's currently impossible to replicate this effect with web technologies. There are two possibilities to resolve this:
I'd take either approach, but it really feels like the raw trackpad event data should be made available. In 2020, we have WebUSB, Bluetooth APIs and so much more, but for some reason we don't have trackpad event data to allow web apps to replicate a really fundamental macOS app design element.
It is impossible to make proper custom inertial scrolling based on the currently available events. The closest I could get is the following, but because there's no way to detect the start/end of interaction, the janky inertial scroll happens even when fingers are still on the trackpad (tested in Chrome): https://codepen.io/trusktr/pen/YzGbeKG The janky inertia is my fault: I could probably average the last few deltas. But if you try flinging the rotation, that works really well without needing such a trick. I hope for something as nice as the rotation (pointer events) but for the zoom (touchpad). It would be super great to have the ability to know when any finger goes down, moves, and goes up on a trackpad. I would imagine pointer events on touchpads would work the same, but the events would have some flag that says it's happening on a touchpad instead of in the window.
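The delta-averaging trick mentioned here can be sketched as a small ring buffer. `DeltaSmoother` is a hypothetical helper, not an existing API: recent wheel deltas are kept and their mean would serve as the launch velocity for a custom inertia animation.

```javascript
// Minimal sketch of "average the last few deltas": keep a small ring
// buffer of recent wheel deltas and read the mean as an estimate of
// the current scroll velocity.
class DeltaSmoother {
  constructor(size = 5) {
    this.size = size;     // how many recent samples to keep
    this.samples = [];
  }
  push(delta) {
    this.samples.push(delta);
    if (this.samples.length > this.size) this.samples.shift(); // drop oldest
  }
  average() {
    if (this.samples.length === 0) return 0;
    return this.samples.reduce((a, b) => a + b, 0) / this.samples.length;
  }
}
```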
I also believe it's important to expose touch locations on trackpads. If I could react to custom gestures I could improve the workflow on my website immensely.
Just like @trusktr and @aeharding mentioned, there is no way to properly implement macOS's rubber-band effect/elastic scroll in browsers because of current API limitations. The main limitation for the rubber-band effect is that there is no way to find out whether the scroll ended, as in whether the finger is off the trackpad. This is the same effect implemented in react-indiana-drag-scroll using mouse events. In 2022 I believe this issue should be given some serious consideration.
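The only workaround today is a timing heuristic: treat a long enough gap between wheel events as "fingers lifted". The sketch below uses hypothetical names (`ScrollEndDetector`, a 150 ms gap) chosen for illustration; OS-generated inertial wheel events still blur the boundary, which is exactly the limitation being complained about.

```javascript
// Timing heuristic for "did the scroll gesture end?". There is no
// real signal, so we guess: if no wheel event arrived within gapMs,
// assume the fingers have left the trackpad.
class ScrollEndDetector {
  constructor(gapMs = 150) {
    this.gapMs = gapMs;
    this.lastEventTime = null;
  }
  onWheel(timestampMs) {
    this.lastEventTime = timestampMs; // call from the wheel handler
  }
  hasEnded(nowMs) {
    return this.lastEventTime !== null && nowMs - this.lastEventTime >= this.gapMs;
  }
}
```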
Once the finger is lifted, it should fire the
Note I am talking about scroll events; as far as I noticed, pointer events aren't fired on scroll, at least not on macOS. Are pointer events supposed to be fired on scrolls on trackpads?
It is not currently possible to detect when a user lifts a finger (and puts a finger down) on a macOS touchpad. View the codepen, move a finger around the div, pause, and lift the finger up. No event is fired. https://codepen.io/aeharding/pen/JjvBvpR We are missing a lower-level API for the macOS touchpad. Native apps have this ability via NSTouch. https://developer.apple.com/documentation/appkit/nstouch
no, by their very nature pointer events do not fire when a gesture is executed with the pointer input. if the browser or OS takes over for a particular movement (like doing a scroll in response to a trackpad gesture), then the pointer is cancelled and control is ceded directly to the browser/OS, so no further pointer events are fired. as this is fundamental to the current behaviour/way that pointer events are specified, this is not a feature that is likely to be changed. you'd have to do something like that react-indiana-drag-scroll you mentioned, using mouse events.
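The mouse-event fallback referred to here amounts to a small drag state machine. `DragScroller` is a hypothetical sketch, not the library's actual implementation; the point is that with mouse events the end of the interaction is explicit, because mouseup fires.

```javascript
// Sketch of a drag-scroll fallback built on mousedown/mousemove/
// mouseup. Pointer movement while pressed is converted into a scroll
// offset; mouseup gives an unambiguous "interaction ended" signal,
// which trackpad scrolling never provides.
class DragScroller {
  constructor() {
    this.dragging = false;
    this.lastX = 0;
    this.offsetX = 0;
  }
  down(x) { this.dragging = true; this.lastX = x; }  // mousedown
  move(x) {                                          // mousemove
    if (!this.dragging) return;
    this.offsetX += this.lastX - x; // dragging left scrolls content right
    this.lastX = x;
  }
  up() { this.dragging = false; }                    // mouseup
}
```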
This is a shame. For interacting in 3D space, mouse interaction and trackpad interaction are completely different, and there is literally no way to tell whether pointer events come from a mouse or a trackpad. They're all pointerType: "mouse".
@patrickhlauke I would like to weigh in on the "demand" side and heavily vote for a reopen. As @stam already said, this is currently an unsolvable issue for web-based 3D applications, and it severely limits the use of web technologies at all for 3D applications whose controls even remotely resemble non-web industry standards and that need to use all available degrees of freedom for spatial navigation. There is no real acceptable workaround.

The underlying assumptions of the current high-level solution - which gestures correspond to which intentions - are simply not true anymore, I fear. Right now, we have to use heuristics to guess what the original gesture was, and thus have tons of users complaining that with certain input devices the inputs are misinterpreted, rendering the web app unusable.

I understand there are some concerns about certain technical implementations, but it is hard to imagine that whatever solution would be proposed is worse than us training an ML model running in the frontend just to guess the lost information of mouse events. This is actually our current plan if there will be no fix.
@patrickhlauke Do you see the lengths developers who need trackpad data have to go to? ML to interpret mangled input data. It's horrible what efforts they have to make just because that data is not passed through…
as this issue has now morphed beyond "get raw trackpad data" it seems (because if you got raw data, you'd still have to interpret it to work out what gesture, if any, the user actually intended to do), suggest opening a more specific new issue that specifically outlines the exact ask that you have here with regards to 3D applications. |
Heya, colleague of @Exatex here 👋 I have opened a new issue to explain the problem in more detail. |
"get raw trackpad data" is still the only issue. Interpreting the gestures can be handled by developers themselves, instead of being provided by the W3C.
define what exactly you mean by "get raw trackpad data". what does this data look like? how does it differ from what pointer events currently offer? also, how would this be limited to just trackpads - is there any more generalised ask here for touchscreens or other pointer input types? (it's not going to make it into version 3, and it may not even make it into pointer events but rather a sibling spec, but the conversation here has veered into an odd "i can't tell a scrollwheel and a trackpad apart", which is a different concern altogether)
discussed this at today's pointer events working group meeting https://www.w3.org/2023/02/01-pointerevents-minutes.html#t04 while we understand the developer need/use cases here, we feel it's not quite the appropriate fit for the pointer events API - it is arguably far more low-level, and may be more closely related to the pointer lock API or a more generalised sensors API. we recommend posting a proposal (including use cases) to the Web Incubator Community Group (WICG) https://discourse.wicg.io/ where it may find more traction and a more appropriate venue. |
From old bug tracker:
I still don't think this is at all urgent, but I'd like to have a place in GitHub to refer to for this possible future scenario.