---
title: Using Touch Events
slug: Web/API/Touch_events/Using_Touch_Events
page-type: guide
---
{{DefaultAPISidebar("Touch Events")}}
Today, most Web content is designed for keyboard and mouse input. However, devices with touch screens (especially portable devices) are mainstream, and Web applications can either process touch-based input directly by using {{domxref("Touch_events","Touch Events")}} or rely on _interpreted mouse events_ for input. A disadvantage of using mouse events is that they do not support concurrent user input, whereas touch events support multiple simultaneous inputs (possibly at different locations on the touch surface), thus enhancing user experiences.
The touch events interfaces support application specific single and multi-touch interactions such as a two-finger gesture. A multi-touch interaction starts when a finger (or stylus) first touches the contact surface. Other fingers may subsequently touch the surface and optionally move across the touch surface. The interaction ends when the fingers are removed from the surface. During this interaction, an application receives touch events during the start, move, and end phases. The application may apply its own semantics to the touch inputs.
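The start, move, and end phases of this lifecycle can be sketched by tracking each touch point's unique `identifier` across events. The handler names and `Map`-based bookkeeping below are illustrative, not part of the API:

```js
// Illustrative lifecycle tracker: each touch point keeps the same
// `identifier` from touchstart through touchend, so a Map can follow
// every active contact on the surface.
const activeTouches = new Map();

function handleStart(ev) {
  for (const touch of ev.changedTouches) {
    activeTouches.set(touch.identifier, touch);
  }
}

function handleMove(ev) {
  for (const touch of ev.changedTouches) {
    activeTouches.set(touch.identifier, touch); // record latest position
  }
}

function handleEnd(ev) {
  for (const touch of ev.changedTouches) {
    activeTouches.delete(touch.identifier);
  }
}
```

The same `handleEnd` function can serve the `touchcancel` event, since both remove touch points from the active set.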
## Interfaces
Touch events consist of three interfaces ({{domxref("Touch")}}, {{domxref("TouchEvent")}} and {{domxref("TouchList")}}) and the following event types:
- {{domxref("Element/touchstart_event", "touchstart")}} - fired when a touch point is placed on the touch surface.
- {{domxref("Element/touchmove_event", "touchmove")}} - fired when a touch point is moved along the touch surface.
- {{domxref("Element/touchend_event", "touchend")}} - fired when a touch point is removed from the touch surface.
- {{domxref("Element/touchcancel_event", "touchcancel")}} - fired when a touch point has been disrupted in an implementation-specific manner (for example, too many touch points are created).
The {{domxref("Touch")}} interface represents a single contact point on a touch-sensitive device. The contact point is typically referred to as a _touch point_ or just a _touch_. A touch is usually generated by a finger or stylus on a touchscreen, pen or trackpad. A touch point's [properties](/en-US/docs/Web/API/Touch#instance_properties) include a unique identifier, the touch point's target element as well as the _X_ and _Y_ coordinates of the touch point's position relative to the viewport, page, and screen.
The {{domxref("TouchList")}} interface represents a _list_ of contact points with a touch surface, one touch point per contact. Thus, if the user activated the touch surface with one finger, the list would contain one item, and if the user touched the surface with three fingers, the list length would be three.
The {{domxref("TouchEvent")}} interface represents an event sent when the state of contacts with a touch-sensitive surface changes. The state changes are starting contact with a touch surface, moving a touch point while maintaining contact with the surface, releasing a touch point and canceling a touch event. This interface's attributes include the state of several _modifier keys_ (for example the <kbd>shift</kbd> key) and the following touch lists:
- {{domxref("TouchEvent.touches","touches")}} - a list of all of the touch points currently on the screen.
- {{domxref("TouchEvent.targetTouches","targetTouches")}} - a list of the touch points on the _target_ DOM element.
- {{domxref("TouchEvent.changedTouches","changedTouches")}} - a list of the touch points whose items depend on the associated event type:
- For the {{domxref("Element/touchstart_event", "touchstart")}} event, it is a list of the touch points that became active with the current event.
- For the {{domxref("Element/touchmove_event", "touchmove")}} event, it is a list of the touch points that have changed since the last event.
- For the {{domxref("Element/touchend_event", "touchend")}} event, it is a list of the touch points that have been removed from the surface (that is, the set of touch points corresponding to fingers no longer touching the surface).
Together, these interfaces define a relatively low-level set of features, yet they support many kinds of touch-based interaction, including the familiar multi-touch gestures such as multi-finger swipe, rotation, pinch and zoom.
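As a rough sketch of how the three lists differ, a handler can summarize them side by side. The function name and the plain-object event shape used in testing are illustrative, not part of the API:

```js
// Illustrative helper: summarize the three touch lists carried
// by a TouchEvent.
function describeTouches(ev) {
  return {
    onScreen: ev.touches.length, // all active touch points on the screen
    onTarget: ev.targetTouches.length, // touches on the target element
    changed: ev.changedTouches.length, // touches that triggered this event
  };
}
```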
## From interfaces to gestures
An application may consider different factors when defining the semantics of a gesture. For instance, the distance a touch point traveled from its starting location to its location when the touch ended. Another potential factor is time; for example, the time elapsed between the touch's start and the touch's end, or the time lapse between two _consecutive_ taps intended to create a double-tap gesture. The directionality of a swipe (for example left to right, right to left, etc.) is another factor to consider.
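For example, the distance, time, and direction factors can be combined into a hypothetical helper that classifies a single-touch swipe. The function name and thresholds below are illustrative choices, not standardized values:

```js
// Illustrative thresholds an application might choose.
const MIN_DISTANCE = 30; // px a touch must travel to count as a swipe
const MAX_DURATION = 500; // ms; slower movements are treated as drags

// Classify a swipe from its start and end coordinates plus elapsed time.
// Returns "left", "right", "up", "down", or null if it is not a swipe.
function classifySwipe(start, end, elapsedMs) {
  if (elapsedMs > MAX_DURATION) return null;
  const dx = end.x - start.x;
  const dy = end.y - start.y;
  if (Math.abs(dx) >= Math.abs(dy)) {
    // Predominantly horizontal movement
    if (Math.abs(dx) < MIN_DISTANCE) return null;
    return dx > 0 ? "right" : "left";
  }
  // Predominantly vertical movement
  if (Math.abs(dy) < MIN_DISTANCE) return null;
  return dy > 0 ? "down" : "up";
}
```

In practice the start coordinates would be captured in a `touchstart` handler and the end coordinates in the matching `touchend` handler, correlated by the touch point's `identifier`.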
The touch list(s) an application uses depends on the semantics of the application's _gestures_. For example, if an application supports a single touch (tap) on one element, it would use the {{domxref("TouchEvent.targetTouches","targetTouches")}} list in the {{domxref("Element/touchstart_event", "touchstart")}} event handler to process the touch point in an application-specific manner. If an application supports two-finger swipe for any two touch points, it will use the {{domxref("TouchEvent.changedTouches","changedTouches")}} list in the {{domxref("Element/touchmove_event", "touchmove")}} event handler to determine whether two touch points have moved and then implement the semantics of that gesture in an application-specific manner.
Browsers typically dispatch _emulated_ mouse and click events when there is only a single active touch point. Multi-touch interactions involving two or more active touch points will usually only generate touch events. To prevent the emulated mouse events from being sent, use the {{domxref("Event.preventDefault()","preventDefault()")}} method in the touch event handlers. If you want to handle both mouse and touch input, use {{domxref("Pointer events", "", "", "nocode")}} instead.
## Basic steps
This section shows basic usage of the above interfaces. See the {{domxref("Touch_events","Touch Events Overview")}} for a more detailed example.
Register an event handler for each touch event type.
```js
// Register touch event handlers
someElement.addEventListener("touchstart", process_touchstart, false);
someElement.addEventListener("touchmove", process_touchmove, false);
someElement.addEventListener("touchcancel", process_touchcancel, false);
someElement.addEventListener("touchend", process_touchend, false);
```
Process an event in an event handler, implementing the application's gesture semantics.
```js
// touchstart handler
function process_touchstart(ev) {
// Use the event's data to call out to the appropriate gesture handlers
switch (ev.touches.length) {
case 1:
handle_one_touch(ev);
break;
case 2:
handle_two_touches(ev);
break;
case 3:
handle_three_touches(ev);
break;
default:
gesture_not_supported(ev);
break;
}
}
```
Access the attributes of a touch point.
```js
// Create touchstart handler
someElement.addEventListener(
"touchstart",
(ev) => {
// Iterate through the touch points that were activated
// for this element and process each event 'target'
for (let i = 0; i < ev.targetTouches.length; i++) {
process_target(ev.targetTouches[i].target);
}
},
false,
);
```
Prevent the browser from processing _emulated mouse events_.
```js
// touchmove handler
function process_touchmove(ev) {
// Call preventDefault() to suppress the emulated mouse events
ev.preventDefault();
}
```
## Best practices
Here are some _best practices_ to consider when using touch events:
- Minimize the amount of work that is done in the touch handlers.
- Add the touch point handlers to the specific target element (rather than the entire document or nodes higher up in the document tree).
- Add {{domxref("Element/touchmove_event", "touchmove")}}, {{domxref("Element/touchend_event", "touchend")}} and {{domxref("Element/touchcancel_event", "touchcancel")}} event handlers within the {{domxref("Element/touchstart_event", "touchstart")}} event handler.
- The target touch element or node should be large enough to accommodate a finger touch. If the target area is too small, touching it could result in firing other events for adjacent elements.
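The practice of registering the `touchmove`, `touchend`, and `touchcancel` handlers from within the `touchstart` handler can be sketched as follows, so those handlers exist only while a gesture is in progress. The helper name is hypothetical:

```js
// Hypothetical helper: attach move/end/cancel handlers lazily, on the
// first touchstart, and remove them when the gesture ends, so the
// browser does no extra work between gestures.
function attachGestureHandlers(el, onMove) {
  el.addEventListener("touchstart", () => {
    const move = (ev) => onMove(ev);
    const cleanup = () => {
      el.removeEventListener("touchmove", move);
      el.removeEventListener("touchend", cleanup);
      el.removeEventListener("touchcancel", cleanup);
    };
    el.addEventListener("touchmove", move);
    el.addEventListener("touchend", cleanup);
    el.addEventListener("touchcancel", cleanup);
  });
}
```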
## Implementation and deployment status
The [touch events browser compatibility data](/en-US/docs/Web/API/Touch_events#browser_compatibility) indicates touch event support among mobile browsers is relatively broad, with desktop browser support lagging although additional implementations are in progress.
Some new features regarding a touch point's [touch area](/en-US/docs/Web/API/Touch#touch_area) - the area of contact between the user and the touch surface - are in the process of being standardized. The new features include the _X_ and _Y_ radius of the ellipse that most closely circumscribes a touch point's contact area with the touch surface. The touch point's _rotation angle_ - the number of degrees of rotation to apply to the described ellipse to align with the contact area - is also being standardized, as is the amount of pressure applied to a touch point.
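Assuming a browser that exposes these properties, reading a touch point's contact area, rotation angle, and pressure might look like the sketch below. The fallback defaults are illustrative, since some browsers report the radii as 0 or omit the properties entirely:

```js
// Sketch, assuming the touch-area properties are available on a
// Touch object; the `?? ` fallbacks are illustrative defaults.
function touchArea(touch) {
  const rx = touch.radiusX ?? 1;
  const ry = touch.radiusY ?? 1;
  return {
    area: Math.PI * rx * ry, // ellipse circumscribing the contact
    angle: touch.rotationAngle ?? 0, // degrees to rotate the ellipse
    pressure: touch.force ?? 0, // 0.0 (no pressure) to 1.0 (maximum)
  };
}
```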
## What about Pointer Events?
The introduction of new input mechanisms results in increased application complexity to handle various input events, such as key events, mouse events, pen/stylus events, and touch events. To help address this problem, the [Pointer Events standard](https://www.w3.org/TR/pointerevents/) _defines events and related interfaces for handling hardware agnostic pointer input from devices including a mouse, pen, touchscreen, etc._ That is, the abstract _pointer_ creates a unified input model that can represent a contact point for a finger, pen/stylus, or mouse. See the [Pointer Events MDN article](/en-US/docs/Web/API/Pointer_events).
The pointer event model can simplify an application's input processing since a pointer represents input from any input device. Additionally, the pointer event types are very similar to mouse event types (for example, `pointerdown` and `pointerup`) thus code to handle pointer events closely matches mouse handling code.
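For instance, a single `pointerdown` handler can branch on the standard `pointerType` property to tell which device produced the input. The function name and return strings below are illustrative:

```js
// Illustrative handler: `pointerType` reports which device is behind
// the abstract pointer ("mouse", "pen", or "touch").
function describePointer(ev) {
  switch (ev.pointerType) {
    case "touch":
      return "finger on a touchscreen";
    case "pen":
      return "pen/stylus";
    case "mouse":
      return "mouse";
    default:
      return "unknown device"; // empty string when undetectable
  }
}
```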
The implementation status of pointer events in browsers is [relatively high](https://caniuse.com/#search=pointer), with Chrome, Firefox, IE11, and Edge having complete implementations.
## Examples and demos
The following documents describe how to use touch events and include example code:
- {{domxref("Touch_events","Touch Events Overview")}}
- [Implement Custom Gestures](https://web.dev/articles/add-touch-to-your-site)
- [Add touch screen support to your website (The easy way)](https://www.codicode.com/art/easy_way_to_add_touch_support_to_your_website.aspx)
Touch event demonstrations:
- [Paint Program (by Rick Byers)](https://rbyers.github.io/paint.html)
- [Touch/pointer tests and demos (by Patrick H. Lauke)](https://patrickhlauke.github.io/touch/)
## Community
- [Touch Events Community Group](https://github.com/w3c/touch-events)
- [Mail list](https://lists.w3.org/Archives/Public/public-touchevents/)
- [W3C #touchevents IRC channel](irc://irc.w3.org:6667/)
## Related topics and resources
- [Pointer Events Standard](https://www.w3.org/TR/pointerevents/)