Hello Jupyter community,
Apologies for asking this here; I've been trying to find information about performance profiling online, but my googling skills have failed me.
I use the Jupyter notebook daily to plot and analyse time series, mainly with seaborn and pandas.
Sometimes, once the number of data points exceeds a certain threshold, the kernel hangs (10+ minutes) or dies outright.
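For reference, here is a minimal sketch of the kind of workload I mean. The data is synthetic and the sizes are arbitrary placeholders, not my real dataset:

```python
import time

import matplotlib.pyplot as plt
import numpy as np
import pandas as pd
import seaborn as sns

# Synthetic random-walk timeseries at a few arbitrary sizes
# (placeholders, not my actual data). The largest one can take
# a very long time to draw, which is exactly the problem.
for n in (10_000, 100_000, 1_000_000):
    df = pd.DataFrame({
        "t": pd.date_range("2020-01-01", periods=n, freq="s"),
        "y": np.random.randn(n).cumsum(),
    })
    start = time.perf_counter()
    sns.lineplot(data=df, x="t", y="y")
    plt.close("all")  # discard the figure; we only care about the timing
    print(f"n={n:>9,}: {time.perf_counter() - start:.1f}s")
```

In the notebook itself I would normally just wrap the plotting call in `%time`, but `time.perf_counter()` keeps the sketch runnable as a plain script too.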
Have there been any efforts to document the performance of the kernel for large-scale data visualization and analysis? I would like to establish some guidelines to help me decide when plotting something in the notebook is not a good idea.