
CORS and RFC1918 #143

Open
annevk opened this issue Mar 12, 2019 · 8 comments

@annevk commented Mar 12, 2019

Request for Mozilla Position on an Emerging Web Specification

Other information

This might be on the critical path toward enabling high-resolution timers, so it'd be good to know how much buy-in this has.

@martinthomson commented Mar 13, 2019

https://bugzilla.mozilla.org/show_bug.cgi?id=1481298 has some background on this. I don't have time to do the digging right now, but there are some concerns there.

If this is part of the Spectre mitigations, maybe a summary explanation would help. That thread (like the bugs) is long.

@tomrittervg commented Mar 15, 2019

We would like to re-enable High Resolution Timers and SharedArrayBuffer. Doing so presents a risk of a Spectre attack: either a Variant 1 attack that we are missing a mitigation for (we have no known holes, but nonetheless) or a different variant or attack for which someone finds a practical application to the web platform.

To limit what information is loaded into a Content Process (and is thus vulnerable to Spectre), we would like to require the web page to opt into a stricter mode in order to re-enable High Res Timers.

It would require websites to signal their intention with an opt-in header. The header would enforce:

  1. All resource loads (JavaScript, CSS, images, etc.) EITHER have all third-party credentials (cookies and HTTP auth entries) disabled OR require CORS (we'd prefer the latter)
  2. IF we don't require CORS all the time, probably require all resource loads to be over HTTPS
  3. IF we don't require CORS all the time, we would consider requiring a CORS preflight for any load from a private IP address (either 127.0.0.1, RFC1918, or their IPv6 analogs; see the sketch below)

We'd like to require CORS all the time, but that may not match what other browsers want to do.
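
For concreteness, here is a minimal sketch (mine, not from the proposal) of the address classification item 3 implies. Treating unique-local fc00::/7 addresses as the "IPv6 analog" of RFC1918 is an assumption on my part:

```typescript
// Hypothetical classifier for the "private" targets of item 3: loopback
// (127.0.0.0/8, ::1), RFC1918 ranges, and assumed IPv6 analogs (fc00::/7).
function isPrivateAddress(ip: string): boolean {
  if (ip === "::1") return true;                    // IPv6 loopback
  if (/^f[cd][0-9a-f]{2}:/i.test(ip)) return true;  // fc00::/7 unique-local
  const octets = ip.split(".").map(Number);
  if (octets.length !== 4 || octets.some(isNaN)) return false;
  const [a, b] = octets;
  return (
    a === 127 ||                          // loopback 127.0.0.0/8
    a === 10 ||                           // RFC1918 10.0.0.0/8
    (a === 172 && b >= 16 && b <= 31) ||  // RFC1918 172.16.0.0/12
    (a === 192 && b === 168)              // RFC1918 192.168.0.0/16
  );
}
```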

(1) is intended to keep web resources that are meant to be private, private. However, there is also the category of resources that are private but are protected solely by IP-based authentication. (2) is targeted at those: there seems to be a general opinion that most such resources don't use HTTPS. However, that isn't the direction we'd like to take the web: we want to encourage HTTPS adoption, not make enabling HTTPS leave you less secure.

Hence (3): requiring the stronger CORS preflight for any load from a private address. This would not protect IP-authenticated resources that sit on a public IP address.
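
To illustrate what (3) would ask of services on private addresses, here is a sketch of a device answering the CORS preflight, using Node's built-in http module. The origin allow-list, port, and address are hypothetical:

```typescript
import * as http from "http";

const TRUSTED_ORIGINS = new Set(["https://app.example"]); // hypothetical

http.createServer((req, res) => {
  const origin = req.headers.origin ?? "";
  if (req.method === "OPTIONS") {         // the CORS preflight
    if (TRUSTED_ORIGINS.has(origin)) {
      res.writeHead(204, {
        "Access-Control-Allow-Origin": origin,
        "Access-Control-Allow-Methods": "GET",
      });
    } else {
      res.writeHead(403);                 // deny unknown web origins
    }
    res.end();
    return;
  }
  res.writeHead(200, { "Content-Type": "text/plain" });
  res.end("hello from the intranet\n");
}).listen(8080, "192.168.1.10");          // hypothetical private address
```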


The concern in bug 1481298 is that we may be legitimizing bad practice. It's bad practice to rely on IP-based authentication for anything, even local resources and even local-network resources. While I tend to agree with this, I'm not certain we can meaningfully turn this ship. Especially since, even if a local intranet site does require authentication properly, its unauthenticated login page or resources can be an information disclosure about the user, identifying their place of employment, locally running software, etc.

The problem in bug 354493 is, as far as I can tell, that it blocked all loads from private addresses by default. Requiring CORS would initially cause the same sort of breakage; but at least for SAB re-enablement, we only intend to require a CORS preflight when the site opts in.


So I'm supportive of this draft, but we'll have to figure out the breakage aspect of it before deployment. Either way, I do think it's a good idea for SAB re-enablement.

@martinthomson commented Mar 20, 2019

That analysis seems reasonable to me. If the goal of Spectre mitigations is to create processes that hold absolutely zero secrets, even accidental ones, then we would need this, but I would say that option 1 is necessary: networks that use IP authentication do use public address space. I worked for many years at such an institution.

On that basis, we should preflight absolutely everything when entering a mode that enables SharedArrayBuffer or removes fuzzing on timers. That's option 1, not option 3.

Option 2 isn't hugely useful in this context, except, as you say, to cut out the most insecure stuff. I support it anyway.

We also need to ensure that the timing signal from other processes is sufficiently obscured. We can't have hostile processes equipped with high-resolution timers attacking other processes via legitimate channels. That might mean obscuring the timing of events from those processes by adding delays. That means the site makes a trade-off when it enables this option: preferring its own performance and responsiveness at the cost of external communication latency.
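
For illustration only (this is not Gecko's implementation), the timer fuzzing referred to above amounts to something like clamping timestamps to a coarse bucket; real implementations additionally add per-bucket jitter derived from a secret seed so the clamping can't be averaged away. The granularity below is an assumed example value:

```typescript
const GRANULARITY_MS = 0.1; // assumed example: 100 microsecond buckets

// Clamp a high-resolution timestamp so callers can't resolve events
// more finely than the bucket size.
function clampTimestamp(nowMs: number): number {
  return Math.floor(nowMs / GRANULARITY_MS) * GRANULARITY_MS;
}

console.log(clampTimestamp(performance.now()));
```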

(We can continue this discussion separately, I just wanted to point out that this is only a tiny part of the overall strategy.)

@tomrittervg commented Mar 20, 2019

> We can continue this discussion separately

Where is best to do so?

@annevk commented Apr 9, 2019

Requiring CORS is indeed safer for the case you mention. (Note that this doesn't mean preflights. Preflights only happen in certain cases. Preflights are done in "RFC1918 and CORS" to protect against XSRF, not to protect the response.) However, CORS as the only solution doesn't seem to have enough cross-browser buy-in, which is why we're considering this alternative (perhaps simplified to blocking loads to private and local IP addresses outright, rather than having a mechanism to make them work). We might also want to put out an advisory of some kind that relying on IP-based authentication is not a good practice.
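
To make the "certain cases" concrete, under standard CORS rules a request with only safelisted methods and headers goes out directly, while a non-simple request triggers an OPTIONS preflight first. The URLs and custom header here are hypothetical:

```typescript
// No preflight: a "simple" GET with only safelisted headers.
void fetch("http://192.168.1.10/status");

// Preflight first: the non-safelisted request header makes this
// non-simple, so the browser sends an OPTIONS preflight before the GET.
void fetch("http://192.168.1.10/status", {
  headers: { "X-Device-Token": "abc" }, // hypothetical custom header
});
```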

@JLLeitschuh commented Apr 29, 2019

Some interesting research published on the PortSwigger blog is relevant to RFC1918. Hopefully, any implementation of this specification would help prevent this sort of attack.

Exposing Intranets with reliable Browser-based Port scanning

Proof of concept link
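
Roughly, the technique works because cross-origin requests to private addresses are permitted today, and open versus closed ports fail with measurably different timing. The sketch below is my approximation of the idea, not the PoC itself:

```typescript
// Probe an intranet host from a public web page. mode: "no-cors" yields
// an opaque response rather than a CORS error; the elapsed time still
// leaks whether something answered on the port.
async function probe(host: string, port: number): Promise<number> {
  const start = performance.now();
  try {
    await fetch(`http://${host}:${port}/`, { mode: "no-cors" });
  } catch {
    // network-level failure; timing is the signal either way
  }
  return performance.now() - start;
}

probe("192.168.1.1", 80).then((ms) => console.log(`took ${ms} ms`));
```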

@JLLeitschuh commented May 2, 2019

Here's another example:

Remote Code Execution on most Dell computers

Dell runs a localhost server on port 8884.
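
The exposure class is the same one this proposal targets: today any web page can issue requests to a localhost service. A sketch (the path is hypothetical, not the actual exploit):

```typescript
// Any origin can currently reach a service listening on 127.0.0.1:8884.
fetch("http://127.0.0.1:8884/api", { mode: "no-cors" })
  .then(() => console.log("localhost service responded"))
  .catch(() => console.log("nothing listening on 8884"));
```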

@annevk commented Jun 25, 2019

Update: this is no longer relevant as a path toward high-resolution timers. It still seems like a good idea to have something like this as defense-in-depth, e.g., against the attacks linked above. However, deploying this without an opt-in might not be web-compatible. And given that Chrome has not moved forward with this recently, maybe they realized as much as well?
