Copyright © 2022 the Contributors to the Touch Events - Level 2 Specification, published under the W3C Community Contributor License Agreement (CLA). A human-readable summary is available.
This specification was published by a W3C Community Group. It is not a W3C Standard nor is it on the W3C Standards Track. Please note that under the W3C Community Contributor License Agreement (CLA) there is a limited opt-out and other conditions apply. Learn more about W3C Community and Business Groups.
By publishing this Recommendation, W3C expects that the functionality specified in this Touch Events Recommendation will not be affected by changes to HTML5 or Web IDL as those specifications proceed to Recommendation. The Working Group has completed and approved this specification's Test Suite and created an Implementation Report showing that two or more independent implementations pass each test. This version of the specification includes fixes and improvements to Level 1, and incorporates the features previously published as Touch Event Extensions.
The Touch Events specification defines a set of low-level events that represent one or more points of contact with a touch-sensitive surface, and changes of those points with respect to the surface and any DOM elements displayed upon it (e.g. for touch screens) or associated with it (e.g. for drawing tablets without displays). It also addresses pen-tablet devices, such as drawing tablets, with consideration toward stylus capabilities.
This section is non-normative.
User Agents that run on terminals which provide touch input to use web applications typically use interpreted mouse events to allow users to access interactive web applications. However, these interpreted events, being normalized data based on the physical touch input, tend to have limitations on delivering the intended user experience. Additionally, it is not possible to handle concurrent input regardless of device capability, due to constraints of mouse events: both system level limitations and legacy compatibility.
Meanwhile, native applications are capable of handling both cases with the provided system APIs.
The Touch Events specification provides a solution to this problem by specifying interfaces to allow web applications to directly handle touch events, and multiple touch points for capable devices.
As well as sections marked as non-normative, all authoring guidelines, diagrams, examples, and notes in this specification are non-normative. Everything else in this specification is normative.
The key word MUST in this document is to be interpreted as described in BCP 14 [ RFC2119 ] [ RFC8174 ] when, and only when, it appears in all capitals, as shown here.
This specification defines conformance criteria that apply to a single product: the user agent that implements the interfaces that it contains.
WindowProxy is defined in [ HTML5 ].
The IDL blocks in this specification are conforming IDL fragments as defined by the WebIDL specification [ WEBIDL ].
A conforming user agent must also be a conforming ECMAScript implementation of the IDL fragments in this specification, with the following exception:
Note: Both ways of reflecting IDL attributes allow for simply getting and setting the property on the platform object to work. For example, given a Touch object aTouch, evaluating aTouch.target would return the EventTarget for the Touch object. If the user agent implements IDL attributes as accessor properties, then the property access invokes the getter which returns the EventTarget. If the user agent implements IDL attributes as data properties on the platform object with the same behavior as would be found with the accessor properties, then the object would appear to have an own property named target whose value is an EventTarget object, and the property access would return this value.
Touch Interface

This interface describes an individual touch point for a touch event. Touch objects are immutable; after one is created, its attributes must not change.
WebIDL
enum TouchType { "direct", "stylus" };

dictionary TouchInit {
  required long identifier;
  required EventTarget target;
  double clientX = 0;
  double clientY = 0;
  double screenX = 0;
  double screenY = 0;
  double pageX = 0;
  double pageY = 0;
  float radiusX = 0;
  float radiusY = 0;
  float rotationAngle = 0;
  float force = 0;
  double altitudeAngle = 0;
  double azimuthAngle = 0;
  TouchType touchType = "direct";
};

[Exposed=Window]
interface Touch {
  constructor(TouchInit touchInitDict);
  readonly attribute long identifier;
  readonly attribute EventTarget target;
  readonly attribute double screenX;
  readonly attribute double screenY;
  readonly attribute double clientX;
  readonly attribute double clientY;
  readonly attribute double pageX;
  readonly attribute double pageY;
  readonly attribute float radiusX;
  readonly attribute float radiusY;
  readonly attribute float rotationAngle;
  readonly attribute float force;
  readonly attribute float altitudeAngle;
  readonly attribute float azimuthAngle;
  readonly attribute TouchType touchType;
};
identifier
An identification number for each touch point. When a touch point becomes active, it must be assigned an identifier that is distinct from any other active touch point. While the touch point remains active, all events that refer to it must assign it the same identifier.
target
The EventTarget on which the touch point started when it was first placed on the surface, even if the touch point has since moved outside the interactive area of that element.
Some implementations alter the target element to correct for the imprecision of coarse input. Therefore, the target element may not necessarily be the element directly at the coordinates of the event. The methods used to target/disambiguate coarse input are out of scope for this specification.
screenX
The horizontal coordinate of the point relative to the screen, in pixels.
screenY
The vertical coordinate of the point relative to the screen, in pixels.
clientX
The horizontal coordinate of the point relative to the viewport, in pixels, excluding any scroll offset.
clientY
The vertical coordinate of the point relative to the viewport, in pixels, excluding any scroll offset.
pageX
The horizontal coordinate of the point relative to the viewport, in pixels, including any scroll offset.
pageY
The vertical coordinate of the point relative to the viewport, in pixels, including any scroll offset.
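As a non-normative illustration of the relationship between these attributes, pageX/pageY differ from clientX/clientY only by the document's scroll offsets. The helper below is a sketch, not part of the specification; in a browser the user agent computes these values itself.

```javascript
// Sketch: derive page coordinates from client (viewport) coordinates
// plus the document's scroll offsets.
function toPageCoords(clientX, clientY, scrollX, scrollY) {
  return {
    pageX: clientX + scrollX, // includes horizontal scroll offset
    pageY: clientY + scrollY  // includes vertical scroll offset
  };
}

// A touch at viewport position (100, 50) while the document is
// scrolled down by 400 pixels has page coordinates (100, 450).
const p = toPageCoords(100, 50, 0, 400);
```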
radiusX
The radius of the ellipse which most closely circumscribes the touching area (e.g. finger, stylus) along the axis indicated by rotationAngle, in CSS pixels (as defined by [ CSS-VALUES ]) of the same scale as screenX; 0 if no value is known. The value must not be negative.
radiusY
The radius of the ellipse which most closely circumscribes the touching area (e.g. finger, stylus) along the axis perpendicular to that indicated by rotationAngle, in CSS pixels (as defined by [ CSS-VALUES ]) of the same scale as screenY; 0 if no value is known. The value must not be negative.
rotationAngle
The angle (in degrees) that the ellipse described by radiusX and radiusY is rotated clockwise about its center; 0 if no value is known. The value must be greater than or equal to 0 and less than 90.

If the ellipse described by radiusX and radiusY is circular, then rotationAngle has no effect. The user agent may use 0 as the value in this case, or it may use any other value in the allowed range. (For example, the user agent may use the rotationAngle value from the previous touch event, to avoid sudden changes.)
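As a non-normative illustration, radiusX and radiusY are the semi-axes of an ellipse, so an approximate contact area can be derived from them; the helper names below are illustrative only.

```javascript
// Sketch: approximate the contact area described by radiusX/radiusY.
// An ellipse with semi-axes rx and ry has area PI * rx * ry.
function contactArea(radiusX, radiusY) {
  if (radiusX < 0 || radiusY < 0) {
    throw new RangeError("radii must not be negative");
  }
  return Math.PI * radiusX * radiusY;
}

// When the ellipse is circular (rx === ry), rotationAngle has no effect.
function isCircular(radiusX, radiusY) {
  return radiusX === radiusY;
}
```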
force
A relative value of pressure applied, in the range 0 to 1, where 0 is no pressure, and 1 is the highest level of pressure the touch device is capable of sensing; 0 if no value is known. In environments where force is known, the absolute pressure represented by the force attribute, and the sensitivity in levels of pressure, may vary.
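Because the absolute pressure and sensitivity vary by device, an implementation (or a shim over raw sensor data) typically normalizes a raw reading into the [0, 1] range. A minimal non-normative sketch, assuming a known device-specific maximum reading:

```javascript
// Sketch: normalize a raw pressure reading into the [0, 1] force range.
// deviceMax is an assumed hardware-specific maximum reading.
function normalizeForce(rawPressure, deviceMax) {
  if (deviceMax <= 0) return 0;        // unknown sensor range: report 0
  const f = rawPressure / deviceMax;
  return Math.min(Math.max(f, 0), 1);  // clamp into [0, 1]
}
```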
altitudeAngle
The altitude (in radians) of the transducer (e.g. pen/stylus), in the range [0, π/2], where 0 is parallel to the surface (X-Y plane) and π/2 is perpendicular to the surface. For hardware and platforms that do not report tilt or angle, the value MUST be 0. For example, an altitudeAngle of π/4 indicates a transducer held 45 degrees from the X-Y plane.
azimuthAngle
The azimuth angle (in radians) of the transducer (e.g. pen/stylus), in the range [0, 2π] — where 0 represents a transducer whose cap is pointing in the direction of increasing screenX values (pointing to "3 o'clock" if looking straight down at the X-Y plane), and π/2 represents a transducer whose cap is pointing in the direction of increasing screenY values. The values progressively increase when going clockwise (π/2 at "6 o'clock", π at "9 o'clock", 3π/2 at "12 o'clock"). When the transducer is perfectly perpendicular to the surface (altitudeAngle of π/2), the value MUST be 0. For hardware and platforms that do not report tilt or angle, the value should be 0. For example, an azimuthAngle of π/6 corresponds to a cap pointing toward "4 o'clock".
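The altitudeAngle/azimuthAngle pair describes the stylus orientation in spherical terms. The following non-normative sketch converts the pair to a 3D unit vector (X toward increasing screenX, Y toward increasing screenY, Z away from the surface), which may clarify the geometry; this conversion is illustrative, not part of the specification.

```javascript
// Sketch: convert altitudeAngle/azimuthAngle (radians) into a unit
// vector pointing from the contact point toward the stylus cap.
function stylusDirection(altitudeAngle, azimuthAngle) {
  const horizontal = Math.cos(altitudeAngle); // 1 when parallel, 0 when perpendicular
  return {
    x: horizontal * Math.cos(azimuthAngle), // toward increasing screenX
    y: horizontal * Math.sin(azimuthAngle), // toward increasing screenY
    z: Math.sin(altitudeAngle)              // away from the surface
  };
}

// A perpendicular stylus (altitudeAngle = PI/2) points straight away
// from the surface, so azimuthAngle is irrelevant and reported as 0.
const up = stylusDirection(Math.PI / 2, 0);
const flatRight = stylusDirection(0, 0); // lying flat, cap toward "3 o'clock"
```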
touchType
The type of device used to trigger the touch.
TouchType
An enumeration representing the different types of possible touch input.
direct
A direct touch from a finger on the screen.
stylus
A touch from a stylus or pen device.
TouchList Interface

This interface defines a list of individual points of contact for a touch event. TouchList objects are immutable; after one is created, its contents must not change.
A TouchList object's supported property indices ([ WEBIDL ]) are the numbers in the range 0 to one less than the length of the list.
WebIDL
[Exposed=Window]
interface TouchList {
  readonly attribute unsigned long length;
  getter Touch? item(unsigned long index);
};
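The indexed-access contract above — item(index) returns a Touch for indices 0 to length-1 and null otherwise — can be emulated with a small non-normative sketch. Real TouchList objects are created by the user agent; this stand-in only mirrors the item()/length behavior for illustration.

```javascript
// Sketch: an array-backed stand-in for TouchList.
function makeTouchList(touches) {
  return {
    get length() { return touches.length; },
    item(index) {
      // Supported property indices are 0 .. length-1; anything else is null.
      return (index >= 0 && index < touches.length) ? touches[index] : null;
    }
  };
}

const list = makeTouchList([{ identifier: 0 }, { identifier: 1 }]);
```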
TouchEvent Interface

This interface defines the touchstart, touchend, touchmove, and touchcancel event types. TouchEvent objects are immutable; after one is created and initialized, its attributes must not change.
TouchEvent inherits from the UIEvent interface defined in [ DOM-LEVEL-3-EVENTS ].

The TouchEventInit dictionary is used by the TouchEvent interface's constructor to provide a mechanism by which to construct untrusted (synthetic) touch events. It inherits from the EventModifierInit dictionary defined in [ DOM-LEVEL-3-EVENTS ]. The steps for constructing an event are defined in [ DOM4 ]. See the example for sample code demonstrating how to fire an untrusted touch event.

WebIDL
dictionary TouchEventInit : EventModifierInit {
  sequence<Touch> touches = [];
  sequence<Touch> targetTouches = [];
  sequence<Touch> changedTouches = [];
};

[Exposed=Window]
interface TouchEvent : UIEvent {
  constructor(DOMString type, optional TouchEventInit eventInitDict = {});
  readonly attribute TouchList touches;
  readonly attribute TouchList targetTouches;
  readonly attribute TouchList changedTouches;
  readonly attribute boolean altKey;
  readonly attribute boolean metaKey;
  readonly attribute boolean ctrlKey;
  readonly attribute boolean shiftKey;
  getter boolean getModifierState(DOMString keyArg);
};
touches
A list of Touch objects for every point of contact currently touching the surface.
targetTouches
A list of Touch objects for every point of contact that is touching the surface and started on the element that is the target of the current event.
changedTouches
A list of Touch objects for every point of contact which contributed to the event.

For the touchstart event, this must be a list of the touch points that just became active with the current event. For the touchmove event, this must be a list of the touch points that have moved since the last event. For the touchend and touchcancel events, this must be a list of the touch points that have just been removed from the surface, with the last known coordinates of the touch points before they were removed.
altKey
true if the alt (Alternate) key modifier is activated; otherwise false.
metaKey
true if the meta (Meta) key modifier is activated; otherwise false. On some platforms this attribute may map to a differently-named key modifier.
ctrlKey
true if the ctrl (Control) key modifier is activated; otherwise false.
shiftKey
true if the shift (Shift) key modifier is activated; otherwise false.
getModifierState(keyArg)
Queries the state of a modifier using a key value. Returns true if it is a modifier key and the modifier is activated, false otherwise.
This section is non-normative.
User agents should ensure that all Touch objects available from a given TouchEvent are all associated to the same document that the TouchEvent was dispatched to. To implement this, user agents should maintain a notion of the current touch-active document. On first touch, this is set to the target document where the touch was created. When all active touch points are released, the touch-active document is cleared. All TouchEvents are dispatched to the current touch-active document, and each Touch object they contain refers only to DOM elements (and co-ordinates) in that document. If a touch starts entirely outside the currently touch-active document, then it is ignored entirely.
This section is non-normative.
The examples below demonstrate the relations between the different TouchList members defined in a TouchEvent.

touches and targetTouches of a TouchEvent

This example demonstrates the utility of, and relations between, the touches and targetTouches members defined in the TouchEvent interface. The following code will generate different output based on the number of touch points on the touchable element and the document:
<div id='touchable'>This element is touchable.</div>
<script>
document.getElementById('touchable').addEventListener('touchstart', function(ev) {
if (ev.touches.item(0) == ev.targetTouches.item(0))
{
/**
* If the first touch on the surface is also targeting the
* "touchable" element, the code below should execute.
* Since targetTouches is a subset of touches which covers the
     * entire surface, ev.touches.length >= ev.targetTouches.length
* is always true.
*/
document.write('Hello Touch Events!');
}
if (ev.touches.length == ev.targetTouches.length)
{
/**
* If all of the active touch points are on the "touchable"
* element, the length properties should be the same.
*/
document.write('All points are on target element')
}
if (ev.touches.length > 1)
{
/**
* On a single touch input device, there can only be one point
* of contact on the surface, so the following code can only
* execute when the terminal supports multiple touches.
*/
document.write('Hello Multiple Touch!');
}
}, false);
</script>
changedTouches of a TouchEvent

This example demonstrates the utility of changedTouches and its relation with the other TouchList members of the TouchEvent interface. The code is an example which triggers whenever a touch point is removed from the defined touchable element:
<div id='touchable'>This element is touchable.</div>
<script>
document.getElementById('touchable').addEventListener('touchend', function(ev) {
/**
* Example output when three touch points are on the surface,
* two of them being on the "touchable" element and one point
* in the "touchable" element is lifted from the surface:
*
* Touch points removed: 1
* Touch points left on element: 1
* Touch points left on document: 2
*/
document.write('Touch points removed: ' + ev.changedTouches.length);
document.write('Touch points left on element: ' + ev.targetTouches.length);
document.write('Touch points left on document: ' + ev.touches.length);
}, false);
</script>
Firing a TouchEvent from script

This example demonstrates how to create and fire a TouchEvent from script.
if (Touch.length < 1 || TouchEvent.length < 1)
throw "TouchEvent constructors not supported";
var touch = new Touch({
identifier: 42,
target: document.body,
clientX: 200,
clientY: 200,
screenX: 300,
screenY: 300,
pageX: 200,
pageY: 200,
radiusX: 5,
radiusY: 5
});
var touchEvent = new TouchEvent("touchstart", {
cancelable: true,
bubbles: true,
composed: true,
touches: [touch],
targetTouches: [touch],
changedTouches: [touch]
});
document.body.dispatchEvent(touchEvent);
List of TouchEvent types

This section is non-normative.

The following table provides a summary of the TouchEvent types defined in this specification. All of these events should participate in the bubbling phase. All of these events should be composed [ WHATWG-DOM ] events.
Event Type | Sync / Async | Bubbling phase | Composed | Trusted proximal event target types | Interface | Cancelable | Default Action
---|---|---|---|---|---|---|---
touchstart | Sync | Yes | Yes | Document, Element | TouchEvent | Varies | undefined
touchend | Sync | Yes | Yes | Document, Element | TouchEvent | Varies | Varies: user agents may dispatch mouse and click events
touchmove | Sync | Yes | Yes | Document, Element | TouchEvent | Varies | undefined
touchcancel | Sync | Yes | Yes | Document, Element | TouchEvent | No | none
Canceling a touch event can prevent or otherwise interrupt scrolling (which could be happening in parallel with script execution). For maximum scroll performance, a user agent may not wait for each touch event associated with the scroll to be processed to see if it will be canceled. In such cases the user agent should generate touch events whose cancelable property is false, indicating that preventDefault cannot be used to prevent or interrupt scrolling. Otherwise cancelable will be true.
In particular, a user agent may generate only uncancelable touch events when it observes that there are no non-passive listeners for the event.
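The decision described above can be sketched as simple dispatch bookkeeping: if every registered listener for the event is passive, the user agent can mark the event uncancelable and continue scrolling without waiting. This is a non-normative sketch of the logic, not how a real event loop is implemented.

```javascript
// Sketch: decide whether a touch event needs to be cancelable,
// based on whether any registered listener is non-passive.
function makeDispatcher() {
  const listeners = [];
  return {
    addListener(fn, options = {}) {
      listeners.push({ fn, passive: !!options.passive });
    },
    dispatch(type) {
      // Cancelable only if at least one listener could call preventDefault().
      const cancelable = listeners.some(l => !l.passive);
      const event = { type, cancelable, defaultPrevented: false };
      for (const l of listeners) l.fn(event);
      return event;
    }
  };
}

const d = makeDispatcher();
d.addListener(() => {}, { passive: true }); // only a passive listener registered
const ev = d.dispatch('touchmove');
```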
touchstart event
A user agent must dispatch this event type to indicate when the user places a touch point on the touch surface.
The target of this event must be an Element. If the touch point is within a frame, the event should be dispatched to an element in the child browsing context of that frame.
If this event is canceled , it should prevent any default actions caused by any touch events associated with the same active touch point , including mouse events or scrolling.
touchend event
A user agent must dispatch this event type to indicate when the user removes a touch point from the touch surface, also including cases where the touch point physically leaves the touch surface, such as being dragged off of the screen.
The target of this event must be the same Element on which the touch point started when it was first placed on the surface, even if the touch point has since moved outside the interactive area of the target element.

The touch point or points that were removed must be included in the changedTouches attribute of the TouchEvent, and must not be included in the touches and targetTouches attributes.
If this event is canceled , any sequence of touch events that includes this event must not be interpreted as a click .
touchmove event
A user agent must dispatch this event type to indicate when the user moves a touch point along the touch surface.
The target of this event must be the same Element on which the touch point started when it was first placed on the surface, even if the touch point has since moved outside the interactive area of the target element.

Note that the rate at which the user agent sends touchmove events is implementation-defined, and may depend on hardware capabilities and other implementation details.
A user agent should suppress the default action caused by any touchmove event until at least one touchmove event associated with the same active touch point is not canceled. Whether the default action is suppressed for touchmove events after at least one touchmove event associated with the same active touch point is not canceled is implementation dependent.
touchcancel event
A user agent must dispatch this event type to indicate when a touch point has been disrupted in an implementation-specific manner, such as a synchronous event or action originating from the UA canceling the touch, or the touch point leaving the document window into a non-document area which is capable of handling user interactions (e.g. the UA's native user interface, or an area of the document which is managed by a plug-in). A user agent may also dispatch this event type when the user places more touch points on the touch surface than the device or implementation is configured to store, in which case the earliest Touch object in the TouchList should be removed.
The target of this event must be the same Element on which the touch point started when it was first placed on the surface, even if the touch point has since moved outside the interactive area of the target element.

The touch point or points that were removed must be included in the changedTouches attribute of the TouchEvent, and must not be included in the touches and targetTouches attributes.
The following section describes retargeting steps , defined in [ WHATWG-DOM ].
Each Touch object has an associated unadjustedTarget (null or EventTarget). Unless stated otherwise it is null.

TouchEvent's retargeting steps, given a touchEvent, must run these steps:

For each touch in touchEvent's touches, targetTouches, and changedTouches members:
1. Set touch's unadjustedTarget to touch's target if touch's unadjustedTarget is null.
2. Set touch's target to the result of invoking retargeting touch's unadjustedTarget against touchEvent's target.
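The two steps above can be sketched over plain objects. This is a non-normative illustration: `retarget` here is a stub standing in for the DOM's retargeting algorithm [ WHATWG-DOM ], and the string targets are placeholders for real EventTargets.

```javascript
// Stub for the DOM retargeting algorithm; here it simply returns
// the unadjusted target unchanged.
function retarget(a, b) { return a; }

// Sketch of the TouchEvent retargeting steps.
function retargetTouchEvent(touchEvent) {
  for (const listName of ['touches', 'targetTouches', 'changedTouches']) {
    for (const touch of touchEvent[listName]) {
      // Step 1: remember the original target the first time through.
      if (touch.unadjustedTarget === null) touch.unadjustedTarget = touch.target;
      // Step 2: recompute the exposed target via the retargeting algorithm.
      touch.target = retarget(touch.unadjustedTarget, touchEvent.target);
    }
  }
}

const t = { target: 'div#a', unadjustedTarget: null };
const evt = { target: 'document', touches: [t], targetTouches: [], changedTouches: [t] };
retargetTouchEvent(evt);
```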
User agents have an associated boolean expose legacy touch event APIs whose value is implementation-defined .
Existing web content often use the existence of these APIs as a signal that the user agent is a touch-enabled "mobile" device, and therefore exposing these APIs on non-mobile devices, even if they are touch-enabled, could lead to a suboptimal user experience for such web content.
GlobalEventHandlers mixin

The following section describes extensions to the existing GlobalEventHandlers mixin, defined in [ HTML5 ], to facilitate the event handler registration.

For user agents where expose legacy touch event APIs is false, this mixin must not be implemented.
WebIDL
partial interface mixin GlobalEventHandlers {
  attribute EventHandler ontouchstart;
  attribute EventHandler ontouchend;
  attribute EventHandler ontouchmove;
  attribute EventHandler ontouchcancel;
};
ontouchstart
The event handler IDL attribute (see [ HTML5 ]) for the touchstart event type.
ontouchend
The event handler IDL attribute (see [ HTML5 ]) for the touchend event type.
ontouchmove
The event handler IDL attribute (see [ HTML5 ]) for the touchmove event type.
ontouchcancel
The event handler IDL attribute (see [ HTML5 ]) for the touchcancel event type.
Interaction with Mouse Events and click

The user agent may dispatch both touch events and (for compatibility with web content not designed for touch) mouse events [ DOM-LEVEL-2-EVENTS ] in response to the same user input. If the user agent dispatches both touch events and mouse events in response to a single user action, then the touchstart event type must be dispatched before any mouse event types for that action. If touchstart, touchmove, or touchend are canceled, the user agent should not dispatch any mouse event that would be a consequential result of the prevented touch event.
If a Web application can process touch events, it can cancel the events, and no corresponding mouse events would need to be dispatched by the user agent. If the Web application is not specifically written for touch input devices, it will react to the subsequent mouse events instead.
User agents will typically dispatch mouse and click events only for single-finger activation gestures (like tap and long press). Gestures involving movement of the touch point or multi-touch interactions – with two or more active touch points – will usually only generate touch events.
If the user agent interprets a sequence of touch events as a tap gesture, then it should dispatch mousemove, mousedown, mouseup, and click events (in that order) at the location of the touchend event for the corresponding touch input. If the contents of the document have changed during processing of the touch events, then the user agent may dispatch the mouse events to a different target than the touch events.
The default actions and ordering of any further touch and mouse events are implementation-defined, except as specified elsewhere.
The activation of an element (e.g., in some implementations, a tap) would typically produce the following event sequence (though this may vary slightly, depending on specific user agent behavior):
1. touchstart
2. Zero or more touchmove events, depending on movement of the finger
3. touchend
4. mousemove (for compatibility with legacy mouse-specific code)
5. mousedown
6. mouseup
7. click
If, however, either the touchstart, touchmove or touchend event has been canceled during this interaction, no mouse or click events will be fired, and the resulting sequence of events would simply be:

1. touchstart
2. Zero or more touchmove events, depending on movement of the finger
3. touchend
Even if a user agent supports Touch Events, this does not necessarily mean that a touchscreen is the only input mechanism available to users. Particularly in the case of touch-enabled laptops, or traditional "touch only" devices (such as phones and tablets) with paired external input devices, users may use the touchscreen in conjunction with a trackpad, mouse or keyboard. For this reason, developers should avoid binding event listeners with "either touch or mouse/keyboard" conditional code, as this results in sites/applications that become touch-exclusive, preventing users from being able to use any other input mechanism.
// conditional "touch OR mouse/keyboard" event binding
// DON'T DO THIS, as it makes interactions touch-exclusive
// on devices that have both touch and mouse/keyboard
if ('ontouchstart' in window) {
// set up event listeners for touch
target.addEventListener('touchend', ...);
...
} else {
// set up event listeners for mouse/keyboard
target.addEventListener('click', ...);
...
}
Instead, developers should handle different forms of input concurrently.
// concurrent "touch AND mouse/keyboard" event binding
// set up event listeners for touch
target.addEventListener('touchend', function(e) {
// prevent compatibility mouse events and click
e.preventDefault();
...
});
...
// set up event listeners for mouse/keyboard
target.addEventListener('click', ...);
...
To avoid processing the same interaction twice for touch (once for the touch event, and once for the compatibility mouse events), developers should make sure to cancel the touch event, suppressing the generation of any further mouse or click events. Alternatively, see the InputDeviceCapabilities API for a way to detect mouse events that were generated as a result of touch events.
A touch point becomes active when the user agent dispatches a touchstart event indicating its appearance. It ceases to be active after the user agent dispatches a touchend or touchcancel event indicating that the touch point is removed from the surface or no longer tracked.

An event is canceled by calling preventDefault(), returning false in an event handler, or other means as defined by [ DOM-LEVEL-3-EVENTS ] and [ HTML5 ].
This section is non-normative.
The working group maintains a list of open issues in this specification . These issues may be addressed in future revisions of the specification.
This section is non-normative.
Many thanks to the WebKit engineers for developing the model used as a basis for this spec, Neil Roberts (SitePen) for his summary of WebKit touch events, Peter-Paul Koch (PPK) for his write-ups and suggestions, Robin Berjon for developing the ReSpec.js spec authoring tool , and the WebEvents WG for their many contributions.
Many others have made additional comments as the spec developed, which have led to steady improvements. Among them are Matthew Schinckel, Andrew Grieve, Cathy Chan, Boris Zbarsky, Patrick H. Lauke, and Simon Pieters. If we inadvertently omitted your name, please let me know.
The group acknowledges the following contributors to this specification's test suite: Matt Brubeck, Olli Pettay, Art Barstow, Cathy Chan and Rick Byers.
This section is non-normative.
This is a summary of the major changes made since the 10 October 2013 Recommendation was published. Full commit history is also available.