Specific script element attributes prevent injections #16
@coloco21 Thanks for reporting the issues. Let me share some initial findings. The bug occurs when a website assigns the "crossorigin" attribute to a script element that references an injectable resource:

<script src='//example.cdn.com/jquery/1.9.0/jquery.min' crossorigin></script>

According to this MDN article by Mozilla, the attribute allows for error logging on sites that use a separate domain to serve static media. However, it also seems to trigger an "onerror" event when a CORS request fails, which probably makes Firefox ignore the contents of the payload. Without the "crossorigin" attribute, no errors are raised. Here's a reproduction of the bug. To anyone: ideas, thoughts, and creative suggestions or solutions are highly welcome!
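For illustration, the problematic pattern can be detected at the string level; this is a minimal sketch (the helper name is mine, not part of Decentraleyes):

```javascript
// Hypothetical helper (not Decentraleyes code): detect whether a serialized
// script tag carries a "crossorigin" attribute, with or without a value.
function hasCrossoriginAttribute(scriptTag) {
  return /<script\b[^>]*\bcrossorigin\b[^>]*>/i.test(scriptTag);
}
```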
For now, a temporary fix has been applied through f15ba32. It simply keeps injections from occurring on play.google.com, which is obviously not a solution to the underlying problem described above.
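A rough sketch of what such a domain-based stopgap could look like (the list and function names here are hypothetical, not the actual code from f15ba32):

```javascript
// Sketch of the stopgap approach: skip injections entirely on known
// problematic domains. Names and list contents are illustrative only.
const problematicDomains = ['play.google.com'];

function shouldSkipInjection(requestUrl) {
  const { hostname } = new URL(requestUrl);
  return problematicDomains.some(
    (domain) => hostname === domain || hostname.endsWith('.' + domain)
  );
}
```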
I'm not familiar with how extensions work, so what I propose here may not be possible, but... do you have access to the DOM? If so, you could use the
@RoxKilly Thanks for joining in; very helpful, and much appreciated! I did try out these and other methods, and found that it is indeed possible (but takes effort) to get the DOM context responsible for a request. I first tried to check for This means that, when taking this approach, all local injections will need to be delayed until the DOM is parsed, which would have a relatively big performance impact. I tried pulling the dreaded Even after taking out the attribute before the scripts execute, the resources fail to load. I suspect there must be some form of pre-processing involved, and think it could be worthwhile to try and find out where this happens. Knowing this might be helpful, as it could lead to a simple fix.
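For what it's worth, removing the attribute from serialized markup is the easy part; the hard part, as described above, is doing it before Firefox pre-processes the request. A hypothetical string-level helper:

```javascript
// Hypothetical helper: strip "crossorigin" (bare, or with a value) from a
// serialized script tag. This only illustrates the rewrite itself; in
// practice the attribute would need to go before Firefox processes the
// request, which is exactly what the pre-processing seems to prevent.
function stripCrossorigin(scriptTag) {
  return scriptTag.replace(/\s+crossorigin(?:=(?:"[^"]*"|'[^']*'|[^\s>]+))?/gi, '');
}
```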
Okay, I'm not claiming it's SIMPLE (and it sure isn't well-documented).
Here are some food-for-thought references: Beyond the immediate problem at hand (stripping crossorigin=blah subresourceblah=blah)
@stewie Thanks a lot for sharing your thoughts and findings. The references are definitely inspiring. Detecting known issues before pages are interpreted by the browser could be very helpful, depending on what approach we take. We could try to:
As a first step, I propose we use That should give us some breathing room to come up with proper ways to circumvent these policies. I'm sure the final solution can be clean and simple, but taking the right approach is essential. I'm open to any further ideas, suggestions, prototypes, and Pull Requests.
@Synzvato If you will use heuristics to automatically skip emulation when you expect the page to break, please consider giving users a way to set that behavior. Some users may not want to connect to a CDN even in that situation; other users will be OK connecting automatically if necessary; yet other users will prefer to manually allow the connection only after the page breaks.
@RoxKilly Very good suggestion, and thanks for making sure! All resources from tainted domains are treated as missing. Requests will go through unless you block requests for missing resources.
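The behaviour described above boils down to a small decision table; here is an illustrative sketch (the names are assumptions, not the extension's actual internals):

```javascript
// Illustrative decision logic: resources from tainted domains are treated
// as missing, and a missing resource either passes through to the network
// or gets blocked, depending on the user's setting.
function resolveRequest({ taintedDomain, hasLocalCopy, blockMissing }) {
  if (taintedDomain || !hasLocalCopy) {
    return blockMissing ? 'block' : 'allow-network';
  }
  return 'inject-local';
}
```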
Subresource integrity attributes can cause breakage, because of discrepancies between the various supported CDNs, and strict In case one or more integrity hashes are specified for any given third-party resource, they need to be accompanied by a crossorigin attribute:

<script src="cdn.example.org/jquery.js" integrity="..." crossorigin="anonymous"></script>
What about some default pages (the pages that break) in the whitelist for now? Or maybe an option like
Hi @onodera-punpun, and thanks for the suggestion! I'm happy to say that this idea has already been implemented. Libraries requested by known problematic domains are always treated as missing.
While DOM parsing like This might even be feasible as a long-term workaround to Mozilla Bug 1419459: create an option to auto-whitelist sites detected as having this issue, with or without reloading the page after the first discovery (another checkbox in the configuration). I think this is what @Synzvato was alluding to in February.

It could even be an interactive prompt. Disable the feature by default, and the first time a user runs into the issue, pop open the Decentraleyes dialog and briefly describe the problem (with a link to more documentation, which itself links here and to the moz bug), providing two buttons ("Whitelist" and "Dismiss") and a checkbox ("Do not ask this again", with rollover text "Always answer this question like this"). Pressing "Whitelist" would also reload the page. (You'll need another checkbox in the config to toggle these interactive prompts.)

Don't forget to isolate sites whitelisted in this manner from sites whitelisted for other reasons. This way, when (erm, if) the Mozilla bug is fixed, those sites can be removed while other sites can remain (this won't affect sites whitelisted through other means, so some users may have permanent CDN leaks to some sites).
Hmm, too bad Firefox is still not really ready for the switch to WebExtensions. Everyone who wants Decentraleyes to work properly: do vote on the bug for Mozilla to fix it sooner!
Gladly, but how do you vote? Cannot find any voting options. :( |
Thank you... I wasn't aware the voting is "hidden" under details. Done |
Great, thanks! |
Great, well done! Let's hope Firefox will find the time to implement this soonish. |
Hello @Synzvato
I have found that this extension broke the Google Play Store (http://play.google.com/) for me. It doesn't show the menu on the left, the search bar doesn't work, and clicking on install on an app's page doesn't work either. Seems like a JS script is broken, but I don't have time to figure out which one.
Deactivating the extension fixes the problem as expected.