This repository has been archived by the owner. It is now read-only.

Specific script element attributes prevent injections #16

Open
coloco21 opened this issue Dec 8, 2015 · 23 comments

Comments

@coloco21 commented Dec 8, 2015

I have found that this extension broke the Google Play Store (http://play.google.com/) for me. It doesn't show the menu on the left, the search bar doesn't work, and clicking on install on an app's page doesn't work either. Seems like a JS script is broken, but I don't have time to figure out which one.

Deactivating the extension fixes the problem as expected.

@Synzvato Synzvato changed the title Breaks the Google Play Store Breaks the Google Play Store website Dec 8, 2015
@Synzvato Synzvato added the bug label Dec 8, 2015
@Synzvato (Owner) commented Dec 8, 2015

@coloco21 Thanks for reporting the issues.

Let me share some initial findings. The bug occurs when a website assigns the "crossorigin" attribute to a script element that references an injectable resource.

<script src='//example.cdn.com/jquery/1.9.0/jquery.min' crossorigin></script>

According to MDN's documentation, the crossorigin attribute allows for error logging on sites that use a separate domain to serve static media. However, it also appears to trigger an onerror event when a CORS request fails, which probably makes Firefox ignore the contents of the payload.

Without the "crossorigin" attribute, no errors are raised. Here's a reproduction of the bug. To anyone: ideas, thoughts and creative suggestions or solutions are highly welcome!
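
Roughly, the failure can be modeled like this (a sketch of browser behavior, not extension code): without "crossorigin" a script loads in no-cors mode and the response body is always executed; with the attribute, the browser performs a CORS check that a locally injected response cannot pass, since it carries no Access-Control-Allow-Origin header.

```javascript
// Sketch (not Decentraleyes code): models whether a script response
// would be accepted, based on the element's crossorigin attribute and
// the Access-Control-Allow-Origin (ACAO) response header.
function scriptResponseAccepted(hasCrossorigin, pageOrigin, acaoHeader) {
  if (!hasCrossorigin) {
    return true; // no-cors mode: the body is executed regardless of headers
  }
  // CORS mode: the ACAO header must be a wildcard or match the page origin.
  return acaoHeader === '*' || acaoHeader === pageOrigin;
}
```

A locally injected resource produces no such header, so under this model any script carrying the attribute fails to load.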

@Synzvato (Owner) commented Dec 9, 2015

For now, a temporary fix has been applied through f15ba32. It simply keeps injections from occurring on play.google.com, which is obviously not a solution to the underlying problem described above.

@RoxKilly commented Dec 10, 2015

I'm not familiar with how extensions work so what I propose here may not be possible but... Do you have access to the DOM? If so you could use the beforescriptexecute event handler (docs) and examine the event's target property (the script element I think) for the crossorigin attribute. If the attribute is present you can decide whether to:

  1. Allow the script to execute (let the browser contact the CDN) instead of serving the JS locally
  2. Modify the element to remove the crossorigin attribute, then serve the JS locally
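
Option 2 could look roughly like this (a hedged sketch; beforescriptexecute is a non-standard, Firefox-only event, and the wiring line assumes a content script with DOM access):

```javascript
// Sketch of option 2 above. The attribute check is factored out so it
// can run against any element-like object.
function stripCrossorigin(scriptEl) {
  if (scriptEl.hasAttribute('crossorigin')) {
    scriptEl.removeAttribute('crossorigin');
    return true; // attribute was present and has been removed
  }
  return false;
}

// Browser-only wiring (requires a page/content-script context):
// document.addEventListener('beforescriptexecute',
//   (e) => stripCrossorigin(e.target), true);
```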

@Synzvato (Owner) commented Dec 14, 2015

@RoxKilly Thanks for joining in, very helpful, and much appreciated!

I did try out these and other methods and have found that it is indeed possible (but takes effort) to get the DOM context responsible for a request. I first tried to check for crossorigin attributes as soon as a valid candidate is detected. However, the requests go out before the DOM is properly parsed.

This means that, when taking this approach, all local injections will need to be delayed until the DOM is parsed, which would have a relatively big performance impact. I tried pulling the dreaded crossorigin attribute from a script element on beforescriptexecute, but that does not help.

Even after taking out the attribute before the scripts execute, the resources fail to load. I suspect there must be some form of pre-processing involved, and think it could be an idea to try and find out where this happens. Knowing this might be helpful as it could lead to a simple fix.

@Synzvato Synzvato changed the title Breaks the Google Play Store website Specific script element attributes prevent resource injections Jan 23, 2016
@Synzvato Synzvato changed the title Specific script element attributes prevent resource injections Specific script element attributes prevent injections Jan 23, 2016
@stewie commented Feb 8, 2016

Okay, I'm not claiming it's SIMPLE (and it sure isn't well-documented), but it should be possible via nsITraceableChannel.

> Even after taking out the attribute before the scripts execute, the resources fail to load. I suspect there must be some form of pre-processing involved, and think it could be an idea to try and find out where this happens. Knowing this might be helpful as it could lead to a simple fix.

Here are some food-for-thought references:
http://www.softwareishard.com/blog/firebug/nsitraceablechannel-intercept-http-traffic/
https://github.com/BruceBerry/addon-proxy
https://code.google.com/archive/p/classilla/wikis/ByblosSteleDocs.wiki
The Classilla terms ('stele' etc.) are weird/creative... but the wiki clearly indicates the goal of string intercept-and-replace is achieved.

Beyond the immediate problem at hand (stripping crossorigin=blah subresourceblah=blah), gaining this functionality would be hugely beneficial, akin to achieving an in-browser Proxomitron.
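
To make the intercept-and-replace idea concrete, here is a rough sketch; only the string-rewrite step is shown, since the nsITraceableChannel listener wiring itself is legacy XPCOM. The regex is purely illustrative and not HTML-safe:

```javascript
// Illustrative only: strip crossorigin attributes from markup before
// the parser sees it. A real implementation would buffer the response
// via nsITraceableChannel and use an HTML-aware rewrite, not a regex.
function stripCrossoriginFromMarkup(html) {
  return html.replace(/<script\b([^>]*?)\s+crossorigin(="[^"]*")?/gi, '<script$1');
}
```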

@Synzvato (Owner) commented Feb 11, 2016

@stewie Thanks a lot for sharing your thoughts and findings.

The references are definitely inspiring. Detecting known issues before pages are interpreted by the browser could be very helpful, depending on what approach we take. We could try to:

  • modify source documents before the browser interprets them;
  • analyze DOM nodes that request a given resource and act accordingly;
  • provide more complex emulation by serving HTTP responses;
  • detect any Content Security Policy errors and act accordingly.

As a first step, I propose we use nsIContentPolicy to analyze DOM nodes that try to send out requests to known Content Delivery Networks. If an issue is detected, we mark the domain internally.

That should give us some breathing room to come up with proper ways to circumvent these policies. I'm sure the final solution can be clean and simple, but taking the right approach is essential.

I'm open to any further ideas, suggestions, prototypes, and Pull Requests.
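
That first step might look something like this (a hedged sketch of the legacy nsIContentPolicy hook; the numeric constants mirror nsIContentPolicy.TYPE_SCRIPT and ACCEPT, and isKnownCdn is a hypothetical helper):

```javascript
// Sketch of an observing content policy: mark a site as "tainted" when
// one of its script nodes targets a known CDN with a crossorigin
// attribute. It never blocks anything itself.
const TYPE_SCRIPT = 2; // nsIContentPolicy.TYPE_SCRIPT
const ACCEPT = 1;      // nsIContentPolicy.ACCEPT
const taintedDomains = new Set();

function isKnownCdn(host) {
  // Hypothetical lookup against the extension's list of supported CDNs.
  return ['ajax.googleapis.com', 'cdnjs.cloudflare.com'].includes(host);
}

function shouldLoad(contentType, contentLocation, requestOrigin, node) {
  if (contentType === TYPE_SCRIPT && node &&
      node.hasAttribute('crossorigin') && isKnownCdn(contentLocation.host)) {
    // Mark the requesting site so its resources are treated as missing.
    taintedDomains.add(requestOrigin.host);
  }
  return ACCEPT; // observe only; never block the request itself
}
```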

Synzvato added a commit that referenced this issue Feb 11, 2016
@Synzvato (Owner) commented Feb 11, 2016

The new Load Watcher should be able to successfully detect a large number of signals indicating that it's best to leave a specific domain alone for now. This keeps a lot of (reported) sites from breaking.

@RoxKilly commented Feb 12, 2016

@Synzvato If you will use heuristics to automatically skip emulation when you expect the page to break, please consider giving users a way to set that behavior. Some users may not want to connect to a CDN even in that situation, other users will be OK connecting automatically if necessary, yet other users will prefer to manually allow connection only after the page breaks.

@Synzvato (Owner) commented Feb 19, 2016

@RoxKilly Very good suggestion, and thanks for making sure! All resources from tainted domains are treated as missing. Requests will go through unless you block requests for missing resources.

@Synzvato (Owner) commented Dec 3, 2017

Subresource integrity attributes can cause breakage because of discrepancies between the various supported CDNs, and strict CORS modes prevent any redirects to web accessible resources. Redirects will structurally fail, because the corresponding responses do not contain the required HTTP headers.

When one or more integrity hashes are specified for a given third-party resource, they need to be accompanied by a CORS-mode indication. This means that a single piece of information can tell us everything we need to know in order to prevent websites from breaking.

<script src="cdn.example.org/jquery.js" integrity="..." crossorigin="anonymous"></script>

An example of a script element that would prevent local injections.
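
In code, that single signal could be checked roughly as follows (a sketch using generic DOM APIs, not the extension's actual implementation):

```javascript
// Sketch: an integrity hash on a third-party script implies a CORS
// mode, so either attribute alone signals that a local injection
// would be rejected by the browser.
function preventsInjection(scriptEl) {
  return scriptEl.hasAttribute('integrity') ||
         scriptEl.hasAttribute('crossorigin');
}
```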

Now that the dust around the WebExtension has settled, this issue has top priority. I have decided to create a bug (1419459) on Mozilla's bugtracker. Hopefully, this will help us gain access to the data we'll need to improve overall stability. Feel free to upvote the bug there, if you agree with this approach.

@onodera-punpun commented Dec 19, 2017

What about some default pages (the pages that break) in the whitelist for now? Or maybe an option like [x] Whitelist known broken pages?

@Synzvato (Owner) commented Dec 19, 2017

> What about some default pages (the pages that break) in the whitelist for now?

Hi @onodera-punpun, and thanks for the suggestion! I'm happy to say that this idea has already been implemented. Libraries requested by known problematic domains are always treated as missing.

@adamhotep commented Jan 2, 2018

While DOM parsing like document.querySelectorAll('script[crossorigin]') is not available early enough to solve this issue (see @Synzvato's response to @RoxKilly's suggestion), it might still be useful for whitelisting purposes.

This might even be feasible as a long-term workaround to Mozilla Bug 1419459: create an option to auto-whitelist sites detected as having this issue, with or without reloading the page after the first discovery (another checkbox in the configuration). I think this is what @Synzvato was alluding to in February.

It could even be an interactive prompt. Disable the feature by default and the first time a user runs into the issue, pop open the Decentraleyes dialog and briefly describe the problem (with a link to more documentation, which itself links here and to the moz bug), providing two buttons ("Whitelist" and "Dismiss") and a checkbox ("Do not ask this again" with rollover text "Always answer this question like this"). Pressing "Whitelist" would also reload the page. (You'll need another checkbox in the config to toggle these interactive prompts.)

Don't forget to isolate sites whitelisted in this manner from sites whitelisted for other reasons. This way, when (erm, if) the Mozilla bug is fixed, those sites can be removed while other sites can remain (this won't affect sites whitelisted through other means, so some users may have permanent CDN leaks to some sites).
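
The late detection pass could be sketched like this (hypothetical helper name, assuming it runs after the DOM is parsed):

```javascript
// Sketch of the late-stage detection described above: once the DOM is
// available, collect external scripts whose attributes would have
// broken local injection, as input for an auto-whitelist prompt.
function detectBrokenCandidates(doc) {
  const selector = 'script[crossorigin], script[integrity]';
  return Array.from(doc.querySelectorAll(selector))
    .map((s) => s.getAttribute('src'))
    .filter(Boolean); // drop inline scripts without a src
}
```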

@Cerberus-tm commented Feb 5, 2018

Hmm, too bad Firefox is still not really ready for the switch to WebExtensions. Everyone who wants Decentraleyes to work properly, do vote on the bug for Mozilla to fix it sooner!
https://bugzilla.mozilla.org/show_bug.cgi?id=1419459

@crssi commented Feb 6, 2018

Gladly, but how do you vote? Cannot find any voting options. :(

@gitressa commented Feb 6, 2018

After logging in, I could vote:

[screenshot of the voting option]

@crssi commented Feb 6, 2018

Thank you... I wasn't aware that voting is "hidden" under the details.

Done

@gitressa commented Feb 6, 2018

Great, thanks!

@Synzvato (Owner) commented Feb 20, 2018

Bug 1419459 has been approved during today's WebExtensions API triage, which means we are one step closer to closing this issue. Many thanks for all of your votes and support, much appreciated!

@Cerberus-tm commented Feb 21, 2018

Great, well done! Let's hope Firefox will find the time to implement this soonish.

@Jackymancs4 commented Mar 9, 2018

Hello @Synzvato
Has any ETA been discussed in the triage? Like Firefox 59 or 60?
