
websocket message size limitation introduced in tornadoweb causes kernel crashes #3468

Closed
AaronWatters opened this issue Mar 26, 2018 · 15 comments

Comments

@AaronWatters

I have some IPython widgets that recently stopped working for larger data sets. With the help of people in https://gitter.im/jupyter-widgets/Lobby, in particular Jason Grout @jasongrout, it turns out that the failures are due to a new message size limit in Tornado's websocket handling, which Jupyter uses for the zmq-over-websocket channels.

See first entry here: http://www.tornadoweb.org/en/stable/releases/v4.5.0.html

This commit: tornadoweb/tornado@104a302

This issue: tornadoweb/tornado#1269

This new limitation results in kernel crashes that I didn't see before if (for example) a JavaScript widget attempts to transfer more than 10 MB of data in a single message to a Python kernel.

Is it possible to fix or work around the limitation in a robust, general way within the Jupyter framework?

@takluyver
Member

It's possible to increase the limit, but you might just hit the higher limit anyway. There are probably good reasons to have some kind of limit on it - it's better to kill that kernel than use up memory until the server crashes.

@AaronWatters
Author

I agree. However, it would be nice to have a way to send an arbitrary amount of data from JavaScript to Jupyter. One approach is to break large messages into smaller ones and send those in sequence. I'm probably going to do this at the widget implementation level to allow my widget(s) to send large data to the kernel. I don't know if it makes sense to implement support for large data transfers further down the Jupyter stack (at the comm level or something), but it might.
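For illustration, here is a minimal, library-agnostic sketch of that chunking idea in Python. The names chunk_bytes and ChunkAssembler are made up for this example; the actual jp_proxy_widget protocol linked further down this thread differs in the details.

CHUNK_SIZE = 1024 * 1024  # 1 MB per chunk, well under Tornado's 10 MB default

def chunk_bytes(payload, chunk_size=CHUNK_SIZE):
    """Yield (index, total, chunk) triples covering the whole payload."""
    total = max(1, -(-len(payload) // chunk_size))  # ceiling division
    for i in range(total):
        yield i, total, payload[i * chunk_size:(i + 1) * chunk_size]

class ChunkAssembler:
    """Receiver-side accumulator; feed it each chunk message as it arrives."""
    def __init__(self):
        self._parts = {}
        self._total = None

    def add(self, index, total, chunk):
        self._parts[index] = chunk
        self._total = total

    def complete(self):
        return self._total is not None and len(self._parts) == self._total

    def payload(self):
        return b"".join(self._parts[i] for i in range(self._total))

Each (index, total, chunk) triple would be sent as its own widget message, and the receiving side calls add() on every message until complete() returns True, then reads payload().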

By the way, in the current implementation I have crashed the Jupyter server when attempting to send large enough messages. I believe I can reproduce this by uploading a 200 MB file in a JavaScript widget and then cramming the contents across the websocket as a single message. Boom.

@maartenbreddels
Contributor

This is indeed quite a serious showstopper for widgets and binary transfers. Did you find a good workaround for this?

@jasongrout
Member

Of course, you can reset websocket_max_message_size at the tornado level...

@harmon

harmon commented Aug 20, 2018

I believe I'm also running into this. I have a 12 MB ipywidgets Output field update being pushed from the kernel to the web browser, and it reliably closes the websocket connection with "Message too big (1009)". Can this websocket_max_message_size value be exposed so it's configurable?

@AaronWatters
Author

jp_proxy_widget implements its own meta-protocol to transfer data in chunks:
AaronWatters/jp_proxy_widget@45186ae

@cosmoscalibur

Of course, you can reset websocket_max_message_size at the tornado level...

How can I reset this value when launching the notebook? This value is not present in the NotebookApp tornado settings.

@cosmoscalibur

cosmoscalibur commented Jul 30, 2019

For future reference:

Example to limit the size to 100 MB.
Create a config file:

jupyter notebook --generate-config

Edit (uncomment) the following line in jupyter_notebook_config.py (generated file) and change the value (desired size in MB * 1024 * 1024):

c.NotebookApp.tornado_settings = {"websocket_max_message_size": 100 * 1024 * 1024}

Launch the notebook with the --config parameter pointing to the config file:

jupyter notebook --config="jupyter_notebook_config.py" your_notebook.ipynb
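On newer installs that run on jupyter_server (e.g. JupyterLab 3+ or Notebook 7), the equivalent setting is expected to live on ServerApp rather than NotebookApp, e.g. in a jupyter_server_config.py generated with jupyter server --generate-config:

c.ServerApp.tornado_settings = {"websocket_max_message_size": 100 * 1024 * 1024}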

@slw07g

slw07g commented Sep 17, 2020

In addition to @cosmoscalibur's excellent instructions, you could alternatively store the config file in one of the paths jupyter watches, and it will load the config automatically.

To see the paths jupyter is configured to watch, run jupyter --paths
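For example, on a typical Linux or macOS install, jupyter notebook --generate-config writes the file to ~/.jupyter/jupyter_notebook_config.py, which is one of those watched paths, so no --config flag is needed; the exact locations vary by platform and environment.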

@ElisonSherton

Hi @cosmoscalibur

Thanks for posting the solution above. I tried to use that workaround. In the notebook below, I tried to upload a 99 MB file since I have increased the upload limit in jupyter_notebook_config.py:

[screenshot: jupyter_notebook_config.py with websocket_max_message_size set to 100 * 1024 * 1024, and the upload failing]

Can you advise what could be going wrong here? Also, among the paths Jupyter looks in there is the .jupyter folder, which contains the jupyter_notebook_config.py file as mentioned by @slw07g, but the upload is still not working properly...


I would be obliged if you could help with this issue please.

Thanks & Regards,
Vinayak.

@slw07g

slw07g commented Feb 12, 2021

@ElisonSherton I notice you have 100*1024*1024 rather than 1000*1024*1024. If you add the zero, does it correct the issue?

@ElisonSherton

@ElisonSherton I notice you have 100*1024*1024 rather than 1000*1024*1024. If you add the zero, does it correct the issue?

Hi @slw07g ,

Thanks for that advice. The upload now started working. However, as per @cosmoscalibur, the config needs to be set to (desired size in MB * 1024 * 1024), right? Then I wonder why a 99.1 MB file wouldn't upload when I set the limit to 100 * 1024 * 1024 instead of 1000 * 1024 * 1024?


The size of the file, for your reference, is as follows:
[screenshot: the file size is 99.1 MB]

@slw07g

slw07g commented Feb 13, 2021

@ElisonSherton This is because when you upload the file, it gets encoded (Base64, I believe), which results in a payload that is larger than the actual file. There's some additional context on base-64 encoding here.
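As a rough back-of-the-envelope check (a sketch that ignores any additional message framing): Base64 turns every 3 bytes of input into 4 output characters, so the encoded payload is roughly a third larger than the raw file.

import math

raw_size = 99_100_000                        # ~99.1 MB file, approximate
encoded_size = 4 * math.ceil(raw_size / 3)   # Base64 output length in bytes
print(encoded_size)                          # ~132 MB, above a 100 * 1024 * 1024 limit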

You can verify this yourself by using the developer tools in your browser and monitoring the messages sent over the websocket (make sure developer tools are open prior to loading the page so that it can capture the websocket).

If you increased the size to, say, 135 MB to cover that overhead, it'd probably be sufficient for the 99.1 MB file.

Hope this helps.

@ElisonSherton

Thanks @slw07g

That was really helpful.

@Zsailer
Member

Zsailer commented Feb 16, 2021

Closing this issue in favor of @cosmoscalibur's reply (thank you for this contribution! 😎 ): #3468 (comment)

@Zsailer Zsailer closed this as completed Feb 16, 2021
@github-actions github-actions bot locked as resolved and limited conversation to collaborators Aug 16, 2021