Security and privacy considerations for DOMHighResTimeStamp resolution #79
Note: the intent of this issue is to provide a reference and track the ongoing research, discussions, proposals, and implementation techniques employed by various browsers on how they expose or provide access to high resolution time.
Paraphrasing Section 7.1: Clock resolution...
This problem space remains an unsolved and evolving one. There is no existing industry consensus or definitive set of recommendations that applies to all browsers, which is reflected in the range of different implementations and platform-specific techniques used by various browsers.
Relevant prior art and discussions:
Thanks for filing that!
When looking at this space, I think it's critical to understand which of the potential attack vectors are not either of:
I also think that any such remaining attack vectors are something we should take into consideration with any of the implicit timers we have, or are planning to have, in the platform (SharedArrayBuffers, postMessage, requestAnimationFrame, Date.now(), etc).
@tomlowenthal per my note in the introduction..
As discussed in #20 (comment) (please see linked doc), HR-Time as a spec offers more than sub-millisecond resolution and the intent of this thread is specifically to track ongoing discussions on various implementation strategies of providing sub-millisecond resolution. If that's what you're referring to in your comment, then yes this is the right thread to continue the conversation.
Thanks for following up here, everyone. Just to reiterate: there are a number of concerns with exposing or adding more HRT timers; the concern is not only that Spectre mitigations don't address all the privacy-violating timing attacks on the platform. Here's an overview of the topics that came up in the previous issue and during the calls we had.
Spectre mitigations (e.g. process isolation) address only some of the ways HRT can be used to harm privacy. Exactly which attacks depends on implementation details, but a partial list includes:
1. Microarchitectural attacks
Would be happy to share more examples as needed…
Existing timing sources are rarer than they appear
Much of the argument is that there are already many existing HRT sources, so adding more has low marginal privacy cost. Besides being a privacy-harming way of approaching the issue (because functionality, once shipped, is hard to remove from the platform; privacy debt, etc.), we also believe that there are fewer existing sources of HRT than is suggested.
The frequently cited paper, "Fantastic Timers", conflates async implicit timers (e.g. the SharedArrayBuffer- and postMessage-based clocks mentioned above) with explicit timer APIs.
The W3C generally expects shipping implementations of a standard prior to standardization. Neither Firefox nor Safari implements the core feature of the standard (namely, returning high-resolution timestamps), so the standard does not seem to meet that bar.
Additionally, it's not good practice to ship this standard under the expectation that a future standard will correct it. That removes flexibility and options from the people working on that future standard, leaves users vulnerable in the interim, and makes it difficult to impossible to assess the privacy impact of the current standard while the future standard is in flux.
Not User Serving
To echo the point @samuelweiler made on the call, the user-serving cases for the feature seem narrow and uncommon. Making HRT globally available when its user-serving cases are narrow violates basic privacy and security principles (e.g. least privilege). The user-serving cases mentioned in the standard appear to be achievable with greatly restricted feature availability.
There are mountains of papers and attacks that leverage HRT timing signals. It's worth being extremely cautious before introducing more sources of HRT into the environment, especially (but not only) because the positive use cases seem very uncommon.
@snyderp thanks for your feedback. I feel that we're (still) talking past each other a bit here, though.
These bits, I believe, we agree on.
I disagree with this evaluation. Please see our earlier discussion in #20 (comment). The specification offers multiple important features, one of which is exposing DOMHighResTimeStamp, and it specifically allows user agents to adjust the exposed resolution. Implementations that choose to raise it to the same resolution as Date.now(), or anything in between 5us and 1ms, are compliant.
This touches on the same themes I covered in #20 (comment). Unfortunately, I'm not aware of any access-restriction strategies that have reasonable consensus amongst implementers.
@snyderp - Thanks for outlining your concerns. I understand you're relatively new to the standards world, but here we typically try to be respectful of people's time and avoid sending multiple ~20 page papers as an exercise to the reader. In the future when referring to such work, it'd be great if you could link to it, and sum up the different attacks that it details.
In order to better communicate your concerns to folks on this issue, I took the liberty to try and sum up the attacks you pointed out.
This attack is a "pixel stealing" attack (which can reveal sensitive contents inside a cross-origin resource or iframe, or the user's history through `:visited` link styling).
It is using the following facts:
The attack then:
While the attack does use high-resolution timestamps, the timer is only the measurement channel, not the underlying leak.
If I had to look for the root cause here, I'd probably start with the application of SVG filters across origins in unrestricted ways (e.g. in ways that enable rendering time to depend on cross-origin pixel values).
This fascinating talk explores mitigations to the previously described pixel-stealing vulnerability, and bugs in those mitigations.
The mitigations described:
Note that no browser considered disabling, or adding a permission prompt for, the use of cross-origin SVG filters.
This attack is based on the fact that some CPU caches are shared between different processes and other trust boundaries, and timing attacks can be used to inspect those caches and infer user activity (e.g. keystroke/mouse events, presence of a user through ambient light sensors, etc).
A few highlights from the paper:
IIRC, that last point is the reason the granularity of HR-time was limited to 5 microseconds.
@snyderp - Do you know how native OSes deal with this type of attack? Do they limit access to timers in similar scenarios? Are there other mitigations used there that we could borrow?
This paper presents four different attacks which reveal the user's history.
IMO, the root cause for all these attacks that inspect visited-link state is that the user's history is exposed to rendering at all, rather than the timer resolution itself.
Could you share examples where these 3 conditions are met?
Could you expand on how