IOPub data rate exceeded #1181

Closed
jbednar opened this issue Mar 8, 2017 · 24 comments

Labels: tag: notebook, type: docs (Related to the documentation and examples)
@jbednar
Member

jbednar commented Mar 8, 2017

With the 5.0.0b1 version of the Jupyter notebook server, I'm getting error messages instead of some of my HoloViews plots:

[screenshot: the "IOPub data rate exceeded" error message shown in place of the plot output]

If this is a new thing, should HoloViews be increasing the limit to something more in line with the amount of data that HoloViews notebooks typically generate?

You are using Jupyter notebook.

The version of the notebook server is 5.0.0b1 and is running on:
Python 2.7.13 |Anaconda 2.5.0 (x86_64)| (default, Dec 20 2016, 23:05:08) 
[GCC 4.2.1 Compatible Apple LLVM 6.0 (clang-600.0.57)]

Python 2.7.13 |Anaconda 2.5.0 (x86_64)| (default, Dec 20 2016, 23:05:08) 
Type "copyright", "credits" or "license" for more information.

IPython 5.3.0 -- An enhanced Interactive Python.
@jbednar
Member Author

jbednar commented Mar 8, 2017

For the Showcase notebook, I was able to get the output to appear in place of those error messages if I launched the server with:

jupyter notebook --NotebookApp.iopub_data_rate_limit=10000000  Showcase.ipynb 

Not sure what units that value is in; 10 million somethings per something...

@philippjfr
Member

Here's what their docs say:

NotebookApp.iopub_data_rate_limit : Float
Default: 1000000

(bytes/sec) Maximum rate at which messages can be sent on iopub before they are limited.

Surely there must be a way to override this for certain outputs; it was just meant to prevent a huge amount of text being dumped into the browser at once, which is a real problem. I'll do some digging.

@philippjfr philippjfr added this to the v1.7.0 milestone Mar 13, 2017
@philippjfr
Member

philippjfr commented Mar 13, 2017

As far as I can tell there is no way to override this except setting a higher rate in your Jupyter config, which would be a real pain.
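For reference, raising the limit persistently would look something like the sketch below in ~/.jupyter/jupyter_notebook_config.py; the 1e10 value is an arbitrary "large enough" choice for illustration, not a recommendation from this thread.

# ~/.jupyter/jupyter_notebook_config.py
# (generate it with `jupyter notebook --generate-config` if it does not exist yet)
c = get_config()
c.NotebookApp.iopub_data_rate_limit = 1e10  # bytes/sec allowed on the iopub channel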

@jbednar
Member Author

jbednar commented Mar 14, 2017

Should we file an issue with Jupyter?

@philippjfr
Member

Looks like this will be fixed in Jupyter 5.1, which means we'll have to pin 4.2.2 in our requirements or provide guidance to increase the limit.

@philippjfr philippjfr added type: docs Related to the documentation and examples tag: notebook labels Mar 15, 2017
@philippjfr
Member

I can confirm that with 5.0 a bunch of examples in Showcase and Exploring Data now raise this error. What should we do until 5.1 is out?

@jbednar
Member Author

jbednar commented Apr 5, 2017

Is there a dev release we can rely on? Otherwise, pin 4.2.2?

@philippjfr
Member

philippjfr commented Apr 5, 2017

The actual fix has not even been merged yet (see jupyter/notebook#2368), so there is no dev release we can use. I suppose pinning 4.2.2 is our only option, but that's bound to annoy people if we downgrade their notebook on install.

@jbednar
Member Author

jbednar commented Apr 5, 2017

Yep, it's very annoying. I would argue that, for visualization purposes, the Jupyter 5.0 release is broken with its default data rate limit, so I don't see any other option. Basically, anything users do with any viz library is likely to run into problems with 5.0, so if they are doing viz in a notebook, downgrading to 4.2.2 is actually doing them a favor.

@jlstevens
Contributor

jlstevens commented Apr 5, 2017

... if they are doing viz in a notebook, downgrading to 4.2.2 is actually doing them a favor.

I agree!

We just need to make it clear in the release somehow that we're not the ones causing the problem, and that Jupyter 5.0 is what is causing general issues for a lot of notebook users.

@jlstevens
Contributor

This may deserve a section in the release notes, not only to explain why we are pinning but also what people who've upgraded to 5.0 anyway should do if they see those warnings.

@jbednar
Member Author

jbednar commented Apr 5, 2017

Right.

@jbednar
Member Author

jbednar commented Apr 5, 2017

Note that the PR has now been merged into Jupyter master, so whatever dev release comes next should be fine again.

@philippjfr
Member

Basically, anything they do with any viz program is likely to run into problems with 5.0

I don't really think this is true; you'll be hard pressed to reach the limit with Bokeh plots, as Bokeh will likely fall over before the limit bites, and in matplotlib you have to push at least a 2000x2000-pixel image (I suspect a lot more) to trigger it. I'm still fairly annoyed by it because it doesn't make sense to limit the data throughput, but I actually think that outside of HoloMaps this will be a fairly rare problem.

@jbednar
Member Author

jbednar commented Apr 5, 2017

I ran into the issue with vastly smaller plots in mpl, I just wasn't able to reproduce it reliably. It's not the total size that matters, it's the data rate...

@jlstevens
Contributor

I ran into the issue with vastly smaller plots in mpl, I just wasn't able to reproduce it reliably. It's not the total size that matters, it's the data rate...

That is worrying - I was about to suggest we update the tutorials so they don't trigger this warning and leave notebook unpinned (and warn the user in the release notes). Sounds like that might not be possible.

@philippjfr
Member

I ran into the issue with vastly smaller plots in mpl, I just wasn't able to reproduce it reliably. It's not the total size that matters, it's the data rate...

It is related to total size, though, since the throughput is averaged over a small window of time, and anything that completes before that window is up won't be affected. As far as I can tell the effective limit is about 2MB (after 100 attempts I haven't been able to reproduce it at that size).
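For what it's worth, a rough back-of-envelope sketch of why the effective burst ceiling lands in the low-megabyte range, assuming the server averages over NotebookApp.rate_limit_window (documented default of 3 seconds); the exact averaging behaviour is an assumption here, not something verified in this thread.

# Rough estimate only; values are the documented notebook 5.0 defaults.
iopub_data_rate_limit = 1000000  # bytes/sec (NotebookApp.iopub_data_rate_limit default)
rate_limit_window = 3            # seconds (NotebookApp.rate_limit_window default)

burst_ceiling = iopub_data_rate_limit * rate_limit_window
print(burst_ceiling)  # 3000000 bytes, i.e. ~3MB, the same order as the ~2MB observed above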

@jbednar
Member Author

jbednar commented Apr 5, 2017

A 2MB plot is not all that much...

@philippjfr
Member

philippjfr commented Apr 5, 2017

A 2MB plot is not all that much...

Agreed; for a completely random image exported to PNG that's only about 1200x1200 pixels. Fortunately most charts aren't random, and I was able to get 3000x3000 pixel PNGs of a Curve to export reliably.

@BoomBidiBuyBuy

Hello, sorry if my question is a little dumb, but I wonder how to resolve this for JupyterHub? I've tried passing the corresponding argument directly to 'jupyterhub' and tried setting it in the config file, but without any success.

@philippjfr
Member

@BoomBidiBuyBuy That's probably a question for the JupyterHub repo.
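For later readers: one possible way to forward the flag through JupyterHub is via the Spawner's extra command-line arguments; a minimal, untested sketch follows (the 1e10 value is arbitrary, as above).

# jupyterhub_config.py -- untested sketch; Spawner.args passes extra
# command-line flags to each user's single-user notebook server.
c.Spawner.args = ['--NotebookApp.iopub_data_rate_limit=1e10']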

@philippjfr
Member

There really isn't much more we can do here except hope that Jupyter 5.1 is released soon. Going to close.

@DataPsycho

Good choice: jupyter notebook --NotebookApp.iopub_data_rate_limit=1.0e10. It works for me; I had the same problem. https://community.plot.ly/t/tips-for-using-plotly-with-jupyter-notebook-5-0-the-latest-version/4156

@antadlp

antadlp commented Oct 5, 2018

As a workaround, something that worked for me was putting a "time.sleep(0.3)" inside the loop; of course it should work for other time intervals as well.
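A minimal sketch of that workaround, assuming a loop that emits a lot of output per iteration; the loop body and the 0.3-second interval are only illustrative.

import time

for i in range(1000):
    # ... whatever produces the large output, e.g. print() or display() calls ...
    print("chunk %d of output" % i)
    time.sleep(0.3)  # pause so the iopub data rate limit is not exceeded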
