CORS and RFC1918 #143
https://bugzilla.mozilla.org/show_bug.cgi?id=1481298 has some background on this. I don't have time to do the digging right now, but there are some concerns there. If this is part of Spectre mitigations, maybe a summary explanation would help. That thread (like the bugs) is long.
We would like to re-enable High Resolution Timers and Shared Array Buffer. Doing so presents a risk of a Spectre attack: either a Variant 1 attack that we are missing a mitigation for (we have no known holes, but nonetheless), or a different variant or attack for which someone finds a practical application to the web platform. To mitigate what information gets loaded into a content process (and is thus vulnerable to Spectre), we would like to require the web page to opt into a stricter mode in order to re-enable High Res Timers. Websites would signal their intention with an opt-in header. The header would enforce:

1. All cross-origin resources must be loaded with CORS.
2. All resources must be loaded over HTTPS.
3. Any resource at a private or local address requires a CORS preflight.
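To make the opt-in concrete, here is a minimal sketch of a server opting a document into such a mode. The thread never names the header, so `Isolation-Opt-In` and its token syntax below are invented purely for illustration:

```ts
// Hypothetical only: the proposal does not name the opt-in header, so the
// header name and value syntax below are invented for illustration.
import { createServer } from "node:http";

createServer((_req, res) => {
  res.setHeader("Content-Type", "text/html");
  // Opt this document into the stricter mode; in exchange the browser would
  // re-enable Shared Array Buffer and high-resolution timers for it.
  res.setHeader(
    "Isolation-Opt-In", // hypothetical header name
    "require-cors; require-https; preflight-private"
  );
  res.end("<!doctype html><title>opted in</title>");
}).listen(8080);
```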
We'd like to require CORS all the time, but that may not match what other browsers want to do. (1) is intended to keep web resources that are meant to be private, private. However, there is also the category of resources that are private but can be loaded solely on IP-based authentication. (2) is targeted at those: there seems to be a general opinion that most such resources don't use HTTPS. However, that isn't the direction we'd like the web to take: we want to encourage HTTPS adoption, not make enabling HTTPS leave you less secure. Hence (3): requiring the stronger CORS preflight for any private resource. This would not protect IP-based authenticated resources that sit on a public IP address.

The concern in 1481298 is that we may be legitimizing bad practice. It's bad practice to rely on IP-based authentication for anything, even local resources and even local-network resources. While I tend to agree with this, I'm not certain we can meaningfully move this ship. Especially since, even if the local intranet site does require authentication in a proper way, the unauthenticated login page or its resources can be an information disclosure for the user, identifying their place of employment, locally running software, etc.

The problem from 354493 is, as far as I can tell, that it blocked all loads from private addresses by default. Requiring CORS would initially cause the same sort of breakage; but at least for SAB re-enablement, we only intend to require a CORS preflight when the site opts in. So I'm supportive of this draft, though we'll have to figure out the breakage aspect before deployment. Either way, I think it's a good idea for SAB re-enablement.
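For reference, this is roughly what the stronger preflight of point (3) looks like on the wire. The header names below come from Chrome's later Private Network Access draft, the successor to CORS-RFC1918, and are an assumption about the eventual shape rather than anything settled in this thread; the public origin is likewise assumed:

```ts
// A device on a private address answering the extra preflight. Header names
// follow Chrome's later Private Network Access draft (the successor to
// CORS-RFC1918); the public origin is an assumption for the example.
import { createServer } from "node:http";

createServer((req, res) => {
  res.setHeader("Access-Control-Allow-Origin", "https://public.example");
  if (
    req.method === "OPTIONS" &&
    req.headers["access-control-request-private-network"] === "true"
  ) {
    // Explicit opt-in: this device agrees to be reachable from that origin.
    res.setHeader("Access-Control-Allow-Private-Network", "true");
    res.writeHead(204);
    res.end();
    return;
  }
  res.setHeader("Content-Type", "application/json");
  res.end(JSON.stringify({ status: "ok" }));
}).listen(8080);
```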
That analysis seems reasonable to me. If the goal of Spectre mitigations is to create processes that have absolutely zero secrets in them, even accidental ones, then we would need this, and I would say that option 1 is necessary. Networks that use IP authentication do use public address space; I worked for many years at such an institution. On that basis, we should preflight absolutely everything when entering a mode that enables SharedArrayBuffer or removes fuzzing on timers. That's 1, not 3. 2 isn't hugely useful in this context, except, as you say, to cut out the most insecure stuff; I support it anyway.

We also need to ensure that the timing signal from other processes is sufficiently obscured. We can't have hostile processes equipped with high-resolution timers attacking other processes over legitimate channels. That might mean obscuring the timing of events from those processes by adding delays. The site then makes a trade-off when it enables this option: preferring its own performance and responsiveness at the cost of external communication latency. (We can continue this discussion separately; I just wanted to point out that this is only a tiny part of the overall strategy.)
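A minimal sketch of the delay idea, assuming uniform jitter; the bound and distribution are illustrative only, not anything proposed here:

```ts
// Illustrative only: delay (jitter) delivery of events that originate
// outside the opted-in process, so its high-resolution timer cannot time
// them precisely. The delay bound and distribution are assumptions.
function deliverWithJitter(deliver: () => void, maxDelayMs = 2): void {
  // Uniform random delay; a real implementation would pick the distribution
  // carefully to bound both the leak and the added latency.
  setTimeout(deliver, Math.random() * maxDelayMs);
}

// Example: a message from another process reaches the page a little late.
deliverWithJitter(() => console.log("cross-process message delivered"));
```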
Where is best to do so?
Requiring CORS is indeed safer for the case you mention. (Note that this doesn't mean preflights. Preflights only happen in certain cases; in "CORS and RFC1918" they are done to protect against XSRF, not to protect the response.) However, CORS as the only solution doesn't seem to have enough cross-browser buy-in, which is why we're considering this alternative (perhaps simplified to blocking private and local IP addresses outright, rather than having a mechanism to make them work). We might also want to put out an advisory of some kind that relying on IP-based authentication is bad practice.
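To illustrate the parenthetical: a cross-origin request can use CORS without ever triggering a preflight. The private address and custom header below are assumptions chosen for the example:

```ts
// A cross-origin GET with only safelisted headers is a "simple" CORS
// request: the browser sends it directly and merely checks
// Access-Control-Allow-Origin on the response; no OPTIONS preflight occurs.
await fetch("http://10.0.0.5/status", { mode: "cors" });

// A non-safelisted header makes the request preflighted: the browser sends
// OPTIONS first and only issues the GET if the server approves.
await fetch("http://10.0.0.5/status", {
  mode: "cors",
  headers: { "X-Example": "1" }, // hypothetical header; forces a preflight
});
```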
Some interesting research published by the PortSwigger blog is relevant to RFC1918. Hopefully, any implementation of this RFC would help prevent this sort of attack: Exposing Intranets with reliable Browser-based Port scanning.
Here's another example: Remote Code Execution on most Dell computers. Dell was running a localhost server on port 8884.
Update: this is no longer relevant as a path toward high-resolution timers. It still seems like a good idea to have something like this as defense-in-depth, e.g., for the links mentioned above. However, deploying this without opt-in might not be web-compatible. And given that Chrome has not moved forward with this recently, maybe they realized as much as well?
For documentation purposes: Here's another example of this same abuse of a localhost server that could have been prevented by this RFC:
There actually was an RCE in that webserver: |
Here's another case from 2016 that would have been really bad if exploited: a vulnerability in all JetBrains IDEs that enabled RCE. Edit: wrong link
Who would do the opting in? And what sorts of compatibility problems would you expect? Mainly ones involving

This does seem like a behavior that would be desirable if it were compatible.
An idea we had for COEP was that, instead of requiring a Cross-Origin-Resource-Policy header on responses, we would not send credentials with any request and would require CORS and RFC1918 to be used as well for the affected URLs.
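A sketch of what that idea looks like from the document's side. It uses `credentialless`, the COEP value this idea eventually shipped under in Chromium; at the time of this comment it was only a design sketch, so treat the exact names as a later development:

```ts
// Rough sketch of the idea above: instead of demanding
// Cross-Origin-Resource-Policy on every subresource, the document asks the
// browser to strip credentials from no-cors subresource requests.
import { createServer } from "node:http";

createServer((_req, res) => {
  res.setHeader("Cross-Origin-Embedder-Policy", "credentialless");
  res.setHeader("Cross-Origin-Opener-Policy", "same-origin");
  res.setHeader("Content-Type", "text/html");
  res.end("<!doctype html><title>crossOriginIsolated page</title>");
}).listen(8080);
```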
Yeah, I think it's desirable as well, especially for |
There's been a recent discussion about how sites like eBay are doing local port scans on users' computers as a security measure. They aren't asking users for their permission to do this.
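For context, a rough sketch of how such a scan can work from an ordinary web page; the threshold is an assumed heuristic, and the scanners reported in that discussion (which reportedly used WebSocket timing) were considerably more refined:

```ts
// Probe a loopback port and classify it from how the opaque request fails.
// The 50 ms threshold is an assumed heuristic, not a measured constant.
async function probe(port: number): Promise<"maybe-open" | "closed"> {
  const started = performance.now();
  try {
    // mode: "no-cors" yields an opaque response, but getting any response
    // at all means something answered on that port.
    await fetch(`http://127.0.0.1:${port}/`, { mode: "no-cors" });
    return "maybe-open";
  } catch {
    // A refused connection tends to fail almost instantly; a non-HTTP
    // listener tends to fail more slowly.
    return performance.now() - started < 50 ? "closed" : "maybe-open";
  }
}
```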
I think this is something the web should have and I welcome Chrome's experiment here. I suggest we mark this as worth prototyping for these reasons:
This could go even further and better protect local-network-to-local-network requests, but I have some hope we'll be able to get there eventually.
Thanks, Anne, for the update. If this is something worth prototyping, would Mozilla also be open to giving the green light to web-platform-tests/rfcs#72, which would allow testing the spec correctly?
Request for Mozilla Position on an Emerging Web Specification
Other information
This might be on the critical path toward enabling high-resolution timers, so it'd be good to know how much buy-in this has.