Compress static assets served by Jupyter Lab #13189
Comments
I played around with this a bit more. Could I be looking at the wrong spot for the webpack config? I set a few simple log statements in the file mentioned above and confirmed that it is executed. Any pointers here would be helpful.
I can reproduce this; it indeed includes assets such as JavaScript and settings JSON. It seems quite important, as Lighthouse claims we could save many seconds on page load here, though the question is how much time the extra compression will take.
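The size-versus-time trade-off is easy to measure in isolation. A rough sketch (the payload below is a synthetic, highly repetitive stand-in, not a real JupyterLab bundle, so real numbers will differ):

```python
import gzip
import time

# Synthetic stand-in for a JS bundle (not an actual JupyterLab asset).
payload = b"function f(x) { return x + 1; } /* static asset text */\n" * 4000

for level in (1, 6, 9):
    start = time.perf_counter()
    compressed = gzip.compress(payload, compresslevel=level)
    elapsed_ms = (time.perf_counter() - start) * 1000
    ratio = len(compressed) / len(payload)
    print(f"level {level}: {len(compressed)} bytes ({ratio:.1%}) in {elapsed_ms:.2f} ms")
```

Running something like this against the actual `static/lab/` output would tell us whether high-level compression is cheap enough to do on the fly, or only as a one-time build step.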
FWIW, we have also thought about this in jupyter-server/jupyter_server#312 (comment). It may be better not to compress in Jupyter Server and instead give us a way to "eject" the assets and put a high-performance web server in front (e.g. nginx, Caddy, Apache); that server would take care of compression and serving. Alternatively, we can precompress assets and, if a client accepts a compression type, serve that with the right headers. That is, we gzip/brotli everything at a high compression level and store the compressed files in memory or on disk. Then if a client accepts that type, we use it; if not, we serve the asset uncompressed, or maybe compress it on the fly. This can be done for all extensions.
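A minimal sketch of that precompress-and-negotiate idea (the function names, the `.gz` sibling convention, and the extension list are all hypothetical illustrations, not anything Jupyter Server ships today):

```python
import gzip
from pathlib import Path
from typing import Optional, Tuple

# Hypothetical allowlist of text-like assets worth compressing.
COMPRESSIBLE = {".js", ".css", ".json", ".svg", ".html", ".map"}

def precompress(asset_dir: str, level: int = 9) -> None:
    """One-time (build-time) pass: write a .gz sibling next to each asset."""
    for path in Path(asset_dir).rglob("*"):
        if path.is_file() and path.suffix in COMPRESSIBLE:
            data = path.read_bytes()
            compressed = gzip.compress(data, compresslevel=level)
            if len(compressed) < len(data):  # keep only if it saves space
                Path(str(path) + ".gz").write_bytes(compressed)

def pick_variant(path: Path, accept_encoding: str) -> Tuple[Path, Optional[str]]:
    """At request time: serve the .gz sibling iff the client accepts gzip.

    Returns the file to serve and the Content-Encoding value (or None).
    """
    gz = Path(str(path) + ".gz")
    if "gzip" in accept_encoding and gz.exists():
        return gz, "gzip"
    return path, None
```

A real implementation would also need a brotli variant, `Vary: Accept-Encoding` on responses, and proper parsing of the `Accept-Encoding` header (the substring check above is a simplification).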
I should have added this in my first update, apologies. I did try starting JupyterLab with
If tornado compresses on the fly, much of the time savings could be negated by the compression itself (depending on a variety of factors specific to one's environment). What we care about most here is page-load time. Since JupyterLab (and most plugins) hash assets, we could compress them once at a very high compression level and serve them (assuming the request accepts that encoding type) with much better caching headers. This is similar to nginx's `gzip_static`. I'll note that the many small assets generated by lab are actually bad for the default settings of HTTP/1.1 and only really benefit from HTTP/2 (which requires another server in front, due to lack of support in tornado). There is an interesting article about this. FWIW, I don't think we should optimize for HTTP/1.1 but rather make it easy to run a server that supports HTTP/2.
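For reference, tornado's built-in on-the-fly gzip can be switched on from a Jupyter Server config file via the `tornado_settings` trait; a sketch, with the caveat above that this compresses every eligible response on every request:

```python
# jupyter_server_config.py (sketch -- trades CPU per request for bandwidth)
c.ServerApp.tornado_settings = {
    # Enables tornado's GZipContentEncoding output transform.
    "compress_response": True,
}
```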
To make sure we've considered all angles here, I've logged jupyter-server/jupyter_server#1016. If the concern is time spent compressing on the fly, it might be worthwhile to explore static (ahead-of-time) compression as well, just to evaluate all options. I also like Marc's alternative of caching compressed assets. On a related note, regardless of where/how compression occurs, jfyi, I'm trying out the nginx + symlink setup Marc brought up in jupyter-server/jupyter_server#312 (comment) to understand how HTTP/2 helps with overall page-load time. If there are any insights on compression along the way, I'll update here.
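For anyone trying the same thing, a minimal nginx sketch of that setup (server name, paths, and port are placeholders; `gzip_static` needs nginx built with `ngx_http_gzip_static_module` and serves a precompressed `.gz` sibling when one exists):

```nginx
server {
    listen 443 ssl http2;            # HTTP/2 effectively requires TLS
    server_name jupyter.example;     # placeholder

    # Static assets symlinked/copied out of the Jupyter environment.
    location /static/ {
        alias /srv/jupyterlab/static/;   # hypothetical symlink target
        gzip_static on;                  # serve foo.js.gz if present
        expires 1y;                      # hashed filenames cache well
        add_header Cache-Control "public, immutable";
    }

    # Everything else is proxied to the Jupyter server over HTTP/1.1.
    location / {
        proxy_pass http://127.0.0.1:8888;
        proxy_http_version 1.1;
        proxy_set_header Host $host;
        proxy_set_header Upgrade $http_upgrade;   # kernel websockets
        proxy_set_header Connection "upgrade";
    }
}
```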
I looked into the highlighted section. To compress static assets we could use
Yep, this is still a good idea, and something I brought up on #14038 (ironically, Binder is one of the places where we could guarantee HTTP/2 and SSL). I don't think the general case of HTTP/2 on the desktop is reasonable, as it requires acquiring an SSL cert, which is still a bridge too far for most end users, even if Let's Encrypt makes it very easy. From a privacy perspective, that would be yet another third-party system that security-minded folks would have to disable in the name of "performance" for some theoretical enterprise user. The work is really probably "just" overriding. As to compressing with
But indeed: all of this is just wind if it's not measured, in real browsers, over time.
Problem
JupyterLab's static assets are flagged by Chrome's Lighthouse audits for missing text compression, even when no extensions are installed. Given the ongoing push in the JupyterLab community to improve performance, this might be a candidate worth looking into.
I have been trying to compress the static assets associated with a few custom extensions I have, which are pretty large, and I noticed that the static assets served from
static/lab/
are flagged even without any extensions. I'm wondering if compression might help here.

Proposed Solution
Consider compressing the static assets served by JupyterLab.
I dug through JupyterLab's build process a bit and noticed that compression is set to false here. Is there a specific reason for this, or does compression occur anywhere else? Any pointers to relevant discussions are helpful! I have a high-level understanding of the build process; please redirect me if I should be looking anywhere else.