For users on very slow connections, block document.written scripts #17
Possible spec edits: also add a similar statement to the warning section at https://html.spec.whatwg.org/multipage/webappapis.html#document.write()
Summarizing the mail thread discussing the spec changes here, as per feedback from ojan@ and domenic@ on the spec-change process:
Per the above, I've opened the linked issue on the W3C HTML5 spec.
Wrong repo :). We were talking about a pull request to https://github.com/whatwg/html, which is what is implemented in browsers.
It's important to not include normative requirements (even "may"s) inside non-normative notes. Instead, we need to work through the normative algorithms. Let's figure out how innerHTML normatively prevents script execution. It turns out it's done in the HTML parser itself. See e.g. https://html.spec.whatwg.org/multipage/syntax.html#parsing-main-inhead, where "A start tag whose tag name is "script"" step 4 says
(Note that "in body" delegates to "in head" for script, so this is the place to look.) To fix this, your PR should probably just add an extra step that's very similar: something like
I'm happy to write this PR if you'd prefer, although we're always happy to have more contributors to HTML. Instructions at https://github.com/whatwg/html#pull-requests |
Oh, but of course we should also update the note you found, too! My suggestion for how to do that would be to give an |
Thanks for the feedback, Domenic.
This allows user agents to experiment with better heuristics for not executing such scripts, as per WICG/interventions#17.
OK! I wrote up the pull request at whatwg/html#1400. |
Great, thanks! |
Let me document the current behavior of the intervention and its associated feedback loops in Chrome.

Intervention

Chrome will block the load of document.written scripts when the following conditions are met:

Script-related criteria
Circumstances
Browser HTTP cache hit case

When the script needed by the document.write statement is found in the browser HTTP cache, Chrome will use it even if stale, up to a lifetime of 1 year. To mitigate version-skew issues, Chrome might queue a low-priority asynchronous request in order to update stale assets.

Feedback loops

In order to make the intervention actionable for developers, we will offer the following feedback loops.

Warning in DevTools

Since Chrome M53, DevTools issues warnings for potentially problematic document.write statements. The warning is issued regardless of connectivity. Example:

Intervention HTTP header (strawman)

When a script inserted via document.write has been blocked, Chrome will send:
When a script inserted via document.write is found and could be blocked in different circumstances, Chrome might send:
The intervention header will be sent as part of the GET request for the script (asynchronously in the case of an actual intervention).

Updates

10/25/2016: fixed incorrect usage of the term "cross origin" where we meant "cross site" (i.e. hosts with different eTLD+1).
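The criteria described above can be condensed into a small predicate. This is only a sketch under assumptions: the field names are illustrative, and the authoritative conditions are the lists given earlier in this comment, not this code.

```javascript
// Hypothetical summary of when the intervention would block a
// document.written script; every field name here is illustrative.
function wouldBlockDocWrittenScript(ctx) {
  return ctx.effectiveConnection === '2g' && // user on a slow connection
         ctx.insertedViaDocumentWrite &&     // script came from document.write()
         ctx.parserBlocking &&               // no async/defer attribute
         ctx.crossSite &&                    // different eTLD+1 than the page
         !ctx.inHttpCache &&                 // cache hits are served, even if stale
         !ctx.isReload;                      // reloads are assumed exempt
}
```

Note the cache clause mirrors the "browser HTTP cache hit case" above: a stale cached copy is used rather than blocked.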
I think this is assuming that document.write() is only used for additive third-party scripts like ad networks, but one common use case of document.write is to fall back to a local script when a CDN fails. You can see an example of this from a very popular starter kit. This pattern has existed for as long as CDNs have been around and is probably encoded in a large number of websites. |
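For reference, the fallback pattern in question usually looks like the sketch below (the local path is illustrative). An inline check runs synchronously after the CDN script tag and document.writes a local copy only when the CDN load failed:

```javascript
// Returns the markup that the classic CDN-fallback pattern would
// document.write when the CDN copy of jQuery failed to load.
// In a page this is usually written inline as:
//   window.jQuery || document.write('<script src="/js/jquery.min.js"><\/script>');
function jqueryFallbackMarkup(cdnLoaded) {
  return cdnLoaded ? '' : '<script src="/js/jquery.min.js"><\/script>';
}
```

Because the fallback script here is same-origin, this particular shape would not be hit by a cross-site-only intervention, which is the point made in the reply below.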
In your specific scenario this intervention will not trigger, because your fallback script is not cross-origin. |
That's only true in Chrome. My concern is that giving UAs permission to do this will cause sites to behave differently in some browsers and break in some browsers. |
I object to blindly targeting cross-origin scripts. It's too restrictive, with the potential to break some sites' core functionality, including half a dozen of my own sites. As I mentioned in a tweet to @PaulKinlan, I have been using cross-browser patterns such as:
or with conditionals to load browser shims such as:
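The conditional-shim variant of the pattern might look like this sketch (the feature test and shim URL are illustrative, not the exact code referenced above):

```javascript
// Emit a document.write tag for a shim only when the feature is missing.
// (Hypothetical URL; real pages substitute their own shim and feature test.)
function shimMarkupIfMissing(hasFeature) {
  return hasFeature
    ? ''
    : '<script src="https://cdn.example.com/es6-shim.min.js"><\/script>';
}
```

On capable browsers nothing is written at all, which is why authors reach for document.write here: the shim stays fully out of the fast path.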
As described, prohibiting cross-origin scripts from CDNs breaks the above use cases. In addition, given the above jQuery version method, I use jQuery fallbacks from CDNJS to Google's API, both via the same pattern. Some additional points:
I am all for preventing poorly implemented document.write "for third party content such as ads and trackers" that injects late content into the page. But we need a reasonable compromise here. As mitigation, here's an option to consider, which I'd be fine with: allow cross-origin scripts only when loaded in the head. |
Why do you believe that script usage (actually, document.write usage) early in the head has no visible impact on performance? It janks the page just as badly as document.write anywhere else (it stops further parsing of the page, to say nothing of rendering and layout, until the script is downloaded!). |
For what it's worth, this may not be true in all UAs. It needs to stop observable DOM construction, and it needs to act as if parsing had been stopped, but you can in fact parse ahead speculatively if you want, and some UAs do that. |
The combination of DOM construction blocking and speculative parsing is actually the problem here. The speculative parsing will load resources that are needed later in the page, delaying the load of the doc.written script. On a 2G connection, by the time you doc.write the script, the network connection is already flooded loading other resources, even if it's in the head. You can't just reprioritize the resource loads at that point because it's the upstream network bandwidth that's flooded. We were consistently seeing page load improvements of 10-30 seconds on many pages on 2G. @hexalys do you have a page you could point us to that doesn't see a significant improvement on a 2G connection? I'd like to understand it better. Maybe we should be doing something more nuanced. |
@ojanvafai A page might load quick, but load incorrectly if a needed script is ignored. If I write the entire page using jQuery and jQuery doesn't load because of this algorithm, the page is broken on such a connection. |
By tests and observation of network waterfalls in virtually all browsers. There is little to no timing penalty for the overall document when loading jQuery via document.write(). Compare these two tests as examples:

Test A: Nexus 7 - Chrome - 2G speed (using document.write jQuery).

The only difference is a very small, negligible priority penalty of 10-20ms for the script itself.
It doesn't stop rendering any more than any other synchronous script; it's not that it's a magical no-jank container. As an aside, I just assume that having it at the top, where the body is not yet parsed, has a more minimal impact given the effect document.write can have on the document. The main point here is that it obviously loads those scripts much faster, and it is the only proper way to use document.write() without penalty. In contrast, see how bad this is:

Test C: Nexus 7 - Chrome - 2G speed (adding an ES6 shim using document.write at the bottom).

The impact on page load or 'Document Complete' there is minimal, as it is a small script that doesn't do anything other than being parsed. It does, however, clearly delay 'Render' by 1-2s and 'Interactive' by 4-5 seconds. That is the kind of harmful practice everyone should be told not to use at all.
Correct. That's true of Chrome since the Chrome PLT Improvements Q1 2013 work. According to that document, even if a script blocks, Chrome may speculatively download images if the network is idle (i.e. a light head). The important thing here is that document.write() does not block assets in the head from being preloaded. Images are low priority anyway and won't start if you are busy with quite a few script and CSS downloads.
Indeed as shown on C. But it's only a performance problem past the
Agreed past
I can see that with any bottom document.write(s), especially if those also modify the DOM themselves, or do worse by loading additional assets. Those are really obvious performance killers and an anti-pattern. I totally get that document.write() isn't ideal. But that's the only way the job gets done, at little to no cost, without recourse to a more modular approach, which itself usually requires an additional script and carries a performance cost of its own (and which mostly makes sense for a web-app-centric page, as opposed to the average website). Clearly |
As anyone who follows Blink intent threads knows, I'm very concerned about web compat (often arguing that risking breaking even 0.01% of page views is unacceptable). There's no doubt that this is going to break some content. But when evaluating the compat cost (even in isolation, before considering the benefits), we have to consider that this fixes 10% of page loads on 2G that are otherwise so slow the user gives up waiting before anything paints. So to even be at neutral compat impact, this would have to break at least 10% of 2G page loads. The initial evidence suggests it's well below that threshold. So this may be a rare intervention where we get to BOTH improve the user experience on many pages AND increase the number of pages that load correctly! Of course there's still a risk of some class of pages being harmed more than they're helped. I suggest we focus on collecting examples of real pages that are broken with little benefit, to see if there are any patterns that can be used to tweak the discrimination of this heuristic. Also, interventions are premised on the idea that developers who follow best practices will not be impacted by them. @hexalys describes some use cases that seem pretty reasonable to me. Perhaps there's some way we can allow those to work without losing most of the benefit? |
I think it is OK for an inline script to use document.write; technically, an inline script is nearly identical to having the script tag in the HTML source. |
Interventions generally have an opt-out. If there are some legit use cases, should this get an opt-out too, at least for now? E.g. what if the script loaded was previously listed as a |
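One shape such an opt-out could take (an assumption on my part; the elided suggestion above may differ) is pre-declaring the script with a preload hint, so the browser knows it is intentional and can fetch it early:

```html
<!-- Hypothetical opt-out: pre-declare the script that will later be
     document.written, so the UA can treat it as expected content
     and start the fetch from the preload scanner. -->
<link rel="preload" href="https://cdn.example.com/jquery.min.js" as="script">
```

This would also address the performance objection, since the preload scanner sees the declaration even though it cannot see inside the document.write call.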
@RByers If we could expand that to a |
I'm not an expert here, but Ojan's argument above is that it's not - at least in Chrome. The script tag is exposed to the preload scanner (so can trigger loads early) while the doc.written script is not and so may block loading longer (especially due to all the later speculative resources that are now hogging the network). Also the biggest problems come when one render-blocking script doc.writes another which doc.writes another and so on. As I understand it this can't happen with script tags, even when added via DOM APIs. |
You mean to opt out of the restriction? It would work for the multi-version jQuery load, yes. Also, should you decide to preload document.write(s), it's not too hard to get around it for case 2, which I already do. If you look at my source on this site, I have two custom methods for document.write(s) and async loads for both contexts. So I technically load this:
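A minimal async loader of the kind alluded to above (the function name is illustrative; this is the standard createElement/appendChild pattern, not the site's actual code):

```javascript
// Insert a script asynchronously instead of via document.write; the tag is
// not parser-blocking, so the intervention has nothing to block.
function loadScriptAsync(src) {
  var s = document.createElement('script');
  s.src = src;
  s.async = true;
  document.head.appendChild(s);
  return s;
}
```

The trade-off discussed throughout the thread: an async script cannot document.write, so anything it produces must be inserted via DOM APIs instead.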
Well, you can load a script via synchronous Ajax with similar effects, though Chrome already issues a "Synchronous XMLHttpRequest on the main thread" deprecation warning in the console to deter that practice. |
First shipped in Chrome 55 |
This ends up in my
And many similar ones that are fed into my syslog through compiz.
It looks like this is a place where issues like these are considered; if I'm wrong, I apologize, and please correct me in that case! |
Hmm, I'm not sure why messages Chrome writes to the console are ending up in your syslog. I'd recommend disabling that (not sure how, will depend on how your system is set up) if you don't want these to show up there. |
I see this symptom in Chrome over a fast FTTH connection (>100Mbps symmetric), which is definitely not a slow 2G connection. It occurs with various websites using scripts to load maps or satellite views from Google Maps (and I doubt the connection is slow); not all sites are affected, and not Google Maps itself. Those that are affected use document.write() to insert a script that then loads the map asynchronously (this works) and then loads additional data or markers onto it: some of those markers are visible, then the map and markers disappear (the canvas turns gray), and then it fails. Apparently Chrome does not correctly track the session between successive requests, then fails with the XSS issue when GoogleTagServices trackers are loaded, and then we are returned an obscure error about the daily quota being exceeded for using the map... Example of what is logged in the console when visiting http://www.maplandia.com/belarus/mahilyow/podgoritsa/:

www.maplandia.com/:39 A Parser-blocking, cross site (i.e. different eTLD+1) script, http://www.googletagservices.com/tag/js/gpt.js, is invoked via document.write. The network request for this script MAY be blocked by the browser in this or a future page load due to poor network connectivity. If blocked in this page load, it will be confirmed in a subsequent console message. See https://www.chromestatus.com/feature/5718547946799104 for more details. |
Apparently this is caused by Google Analytics when a page inserts its tiny script with document.write() near the top of the page, which will post an asynchronous request once the page is loaded. |
So Google Chrome blocks Google Tag Services (which uses such bad practices of user tracking with cross-site scripting). It is prevalent anyway on so many sites to track visitors and monetize websites via profiling, in order to post "relevant" ads. These services won't work; that may be a good thing, but websites will no longer get monetized using the Tag Services they subscribed to from Google, so Google won't pay them. Is Google Tag Services to be dead if it cannot postpone scripts that run at the end of page load (even when they are scheduled to execute, as here, after 10 seconds, when the page has long since completely loaded and the bandwidth is fully available for posting trackers and loading profiled ads)? Apparently there's a conflict between the site's own profiling needs and the tracking done by Google itself when using its map services, where Google will also profile users to send ads or customize the rendered map. Note: it may as well be a bug in using GPT and Google Maps together on the same page, because I can see several objects in the console that should have been already initialized but that were cleared when Google Maps loaded first. Note: I'm not the author of the "maplandia.com" website; it's just no longer usable in Chrome 60, though it still works in IE, Edge, Firefox, or Opera on Windows 10 x64. I did not test on macOS, iOS, or Android. |
The issue lies in how fast is fast and how slow is slow. A 2G network may seem slow to someone who is impatient to visit a large web page, but it is perfectly fine for a small web page viewed by somebody who visits it occasionally with a casual mind. Google seems to have decided that it wants to be the authority that defines the words "fast" and "slow" for us, making them universal values that all Chrome users and developers should comply with. |
Note that the logs mention this as a warning for developers so they know that their site may behave differently on slower networks. But from the logs given above (copied here for reference), it does not seem that the network is considered slow and scripts are blocked, since there is no subsequent error message.

www.maplandia.com/:39 A Parser-blocking, cross site (i.e. different eTLD+1) script, http://www.googletagservices.com/tag/js/gpt.js, is invoked via document.write. The network request for this script MAY be blocked by the browser in this or a future page load due to poor network connectivity. If blocked in this page load, it will be confirmed in a subsequent console message. See https://www.chromestatus.com/feature/5718547946799104 for more details. |
Google stopped the implementation of that browser intervention because the performance improvement for real users was too small. https://bugs.chromium.org/p/chromium/issues/detail?id=575850 |
@fwebdev that bug you link to is not related to the intervention discussed here. It has to do with document.write() and script tags, but with executing them more aggressively, not with blocking them. |
I get this error on a fast connection:

cap.js:153 A Parser-blocking, cross site (i.e. different eTLD+1) script, https://back20.keycaptcha.com/swfs/caps.js?uid=79849&u=http%3A%2F%2Fstatic.y8.com%2Fupload&r=0.10667958468013206, is invoked via document.write.

I thought it was for 2G only? |
That's right. This is a warning to let site developers know that their scripts may be blocked for users on slow connections, even if the current user isn't on a slow connection. The full message is: |
This intervention was being triggered because we were using old Google Charts loader code, and Chrome was warning that on slower connections the charts may not be loaded (to help the user experience). Using the latest loader code listed at https://developers.google.com/chart/interactive/docs/basic_load_libs resolves this issue. See WICG/interventions#17
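For anyone hitting the same warning with Google Charts: the older jsapi loader pulled in its pieces via document.write, while the current loader does not. The migration looks roughly like this (a sketch based on the linked loading docs; `drawChart` and the package list are placeholders for your own code):

```html
<!-- Old (jsapi) loader, which triggered the document.write warning:
     <script src="https://www.google.com/jsapi"></script>
     <script>google.load('visualization', '1', {packages: ['corechart']});</script> -->

<!-- Current loader, per the linked docs: -->
<script src="https://www.gstatic.com/charts/loader.js"></script>
<script>
  google.charts.load('current', {packages: ['corechart']});
  google.charts.setOnLoadCallback(drawChart); // drawChart is your own callback
</script>
```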
Am I to understand that I should not get this error in my console log on a 30/10Mbps (u/d) cable-modem connection? Or...why might I get this error if I'm on a home connection? |
@Astara read the comment above #17 (comment) |
I read that; in fact I remember almost that exact wording in the message I saw in Opera's console window. OK, it's a dev-level message about what could happen, for any reason, with cross-site content inserted by a doc.write. A bit arcane, but a useful feature to have to preserve interactivity. Is there, or should there be, a configurable setting somewhere to define 'slow'? Or is it done algorithmically based on performance of the static, cross-site content on the page (i.e. hypothetically, not in some browser-specific instance)? I'm guessing 'unspecified', possibly, given that this is talking about allowed behavior. OK, thanks for the feedback! |
Is there any way to override this or prevent it from occurring automatically in the browser? My apologies in advance if I am in the wrong place to report this (I'm not sure whether reporting it as a new issue was applicable); I have tried resolving this matter for days to no avail and was led to this discussion from the page shown in the console message here: I am hardwired on a desktop running Chrome Version 83.0.4103.116 on a connection of greater than 300Mbps, as shown in the speedtest here. That seems contrary to the remarks about this only happening to users on 2G, as confirmed by comment #17. My efforts include, but are not limited to, disabling my firewall, enabling pop-ups, enabling insecure content from all associated websites of this page, changing default programs, verifying group policies, and whatever else I or any troubleshooting manual could suggest to determine why this page will not open the Java application; these console messages are all I have left to go on. Please correct me if something other than the browser may be preventing me from loading this page, or is it possibly something else, not shown in the console, preventing me from loading the application? I am trying to load an application called MarketPro from merrilledge.com; it is a stock-market trading application that is not loading as intended and continues to lead me to another page. In case there is any question about my attempt to contact Merrill Edge directly: they have confirmed that it should be working, but even after going through the troubleshooting guide and taking what other steps I could think of unrelated to Chrome, this is all I have left.
I realize this is not a support or technical-support page, but this seems to be an error related to this particular parser warning in Chrome's console that I have yet to eliminate as a reason for being unable to access the application. So any redirection or help would be greatly appreciated. |
@nucleare: The console warnings do not imply that this resource was actually blocked, since there was no error log confirming it. I suggest testing whether this works in other browsers. If the issue occurs only in Chrome, you might want to open an issue via https://bugs.chromium.org/p/chromium/issues/entry |
Thank you for the reply and insight. I've only tested it on another computer with a fresh install of Chrome, and with Microsoft Edge on that same other computer, with no difference in results. Since it produces the error on Microsoft Edge as well, it seems it's not exclusive to Chrome. Once again, I appreciate the insight and the pointer in the right direction had it been a Chrome issue. |
@nucleare: I would also suggest testing on a non-Chromium browser like Firefox or Safari (both Chrome and Edge use Chromium code) to make sure it is not a bug in Chromium. |
Does it mean this intervention will block both the download and the execution of the script? I set my Chrome to 2G mode, but I still have not observed this intervention. |
As part of shutting down this repo (see #72), I'll close this issue. The behavior here is standardized in HTML so further spec-level discussions aren't necessary. For those interested in the Chromium project's implementation, and not the standardization process, I encourage you to discuss on the Chromium issue tracker, at https://crbug.com/. Here are some parting notes on things I discovered while investigating the status of Chromium's implementation: Chromium shipped (in version 55):
Chromium has a disabled-by-default feature to block such scripts on slow connections in general, not just connections that advertise as 2G. I've updated the ChromeStatus for these two features: shipped 2G feature, not-shipped slow-connections feature. I've also inquired on the bug for the slow-connections feature about the future of that work. Again, all discussion on these Chromium-specific things should happen on the Chromium issue tracker, and not here :). |
Hello, I have received your email and will handle it as soon as possible!
For users on high-latency connections, parser-blocking scripts loaded via document.write cause significant delays in user-perceived page load latency.
Most scripts inserted via document.write are for third-party content. A quick survey of the third parties suggests that async solutions are commonly offered. Given how bad the user experience can get for users on slow connections, it's quite likely that a large fraction of page visits never succeed. The hope is that these newly rescued page views will incentivize publishers to adopt async solutions.
Chrome is exploring the following intervention: