cell execution asterisk does not change to cell execution number #13383

Open · jamesdbrock opened this issue Nov 5, 2022 · 18 comments

@jamesdbrock commented Nov 5, 2022

Description

When executing cells with the IHaskell kernel, the cell execution asterisk does not change to a cell execution number after execution has finished.

However, it will change after some number of other cells have been executed.

In this demo GIF, the cell execution asterisk changes after two more cells have completed execution. (I have sometimes observed the asterisk change only after five more cells, or some other number, have completed execution.)

Note that the execution result is displayed immediately — it’s only the execution asterisk which doesn’t update until two more cells have executed.

(demo GIF: Peek 2022-11-05 18-50)

Reproduce

You can observe the bug in this Docker image from ihaskell-notebook, which is based on https://hub.docker.com/layers/jupyter/base-notebook/lab-3.5.0/images/sha256-cc9e4b7e76519ef2c294b1b15188a012da4aad3b8b8a4b09c366e0c83a90b45f

docker run --rm -p 8888:8888 -v $PWD:/home/jovyan/pwd --name ihaskell_notebook ghcr.io/ihaskell/ihaskell-notebook@sha256:a33460db8792d8da5e77d41a5cd9d1a3ce2126ec367b2de958ad59b7a5960388 jupyter lab --ServerApp.token=''

Context

This bug was not present in JupyterLab v3.0.16, but it is present in every later version I've checked:

  • JupyterLab v3.2.5
  • JupyterLab v3.3.1
  • JupyterLab v3.4.6
  • JupyterLab v3.5.0

Maybe this bug is caused by some strange behavior of the IHaskell kernel? If so, that strange behavior did not manifest as this bug until something changed in JupyterLab after v3.0.16. And cell execution for IHaskell works fine in VS Code notebooks.

I would welcome any hints about how I should investigate this bug further.

@jamesdbrock (Author) commented Nov 6, 2022

The JavaScript error

[Violation] Added non-passive event listener to a scroll-blocking 'touchstart' event.
Consider marking event handler as 'passive' to make the page more responsive.
See https://www.chromestatus.com/feature/5745543795965952
codemirror.js:536 

was also happening in JupyterLab v3.0.16, which doesn't have this bug, so I don't think that's the problem.

@jamesdbrock (Author)

Possibly related? #5366

@jamesdbrock (Author) commented Nov 7, 2022

I checked more images for the presence of this bug.

Bug is not present:

Bug is present:

jamesdbrock added a commit to IHaskell/learn-you-a-haskell-notebook that referenced this issue Nov 7, 2022
Unfortunately subject to bug jupyterlab/jupyterlab#13383
@jamesdbrock (Author)

So there it is: this bug was introduced somewhere in JupyterLab v3.2.5 https://github.com/jupyterlab/jupyterlab/blob/master/CHANGELOG.md#325

jamesdbrock added a commit to IHaskell/ihaskell-notebook that referenced this issue Nov 8, 2022
@jamesdbrock (Author)

Here are some versions of packages installed in the JupyterLab v3.2.4 image:

ipykernel                 6.5.0            py39hef51801_1    conda-forge
ipython                   7.29.0           py39hef51801_2    conda-forge
ipython_genutils          0.2.0                      py_1    conda-forge
ipywidgets                7.7.1              pyhd8ed1ab_0    conda-forge
jupyter_client            7.0.6              pyhd8ed1ab_0    conda-forge
jupyter_core              4.9.1            py39hf3d152e_1    conda-forge
jupyter_server            1.11.2             pyhd8ed1ab_0    conda-forge
jupyter_telemetry         0.1.0              pyhd8ed1ab_1    conda-forge
jupyterhub                1.5.0            py39hf3d152e_1    conda-forge
jupyterhub-base           1.5.0            py39hf3d152e_1    conda-forge
jupyterlab                3.2.4              pyhd8ed1ab_0    conda-forge
jupyterlab_pygments       0.1.2              pyh9f0ad1d_0    conda-forge
jupyterlab_server         2.8.2              pyhd8ed1ab_0    conda-forge
jupyterlab_widgets        1.1.1              pyhd8ed1ab_0    conda-forge

And here's what's installed in the JupyterLab v3.2.5 image:

ipykernel                 6.6.0            py39hef51801_0    conda-forge
ipython                   7.30.1           py39hf3d152e_0    conda-forge
ipython_genutils          0.2.0                      py_1    conda-forge
ipywidgets                7.7.1              pyhd8ed1ab_0    conda-forge
jupyter_client            7.1.0              pyhd8ed1ab_0    conda-forge
jupyter_core              4.9.1            py39hf3d152e_1    conda-forge
jupyter_server            1.13.1             pyhd8ed1ab_0    conda-forge
jupyter_telemetry         0.1.0              pyhd8ed1ab_1    conda-forge
jupyterhub                2.0.1                hd8ed1ab_0    conda-forge
jupyterhub-base           2.0.1              pyhd8ed1ab_0    conda-forge
jupyterlab                3.2.5              pyhd8ed1ab_0    conda-forge
jupyterlab_pygments       0.1.2              pyh9f0ad1d_0    conda-forge
jupyterlab_server         2.10.2             pyhd8ed1ab_0    conda-forge
jupyterlab_widgets        1.1.1              pyhd8ed1ab_0    conda-forge

jamesdbrock added a commit to IHaskell/ihaskell-notebook that referenced this issue Nov 8, 2022
@jamesdbrock (Author) commented Nov 8, 2022

I've published some IHaskell Docker images for anyone who wants to try to debug this issue. These two images are the same except that the base image has a different version of JupyterLab.

JupyterLab v3.2.4 image:

docker run --rm -p 8888:8888 --name ihaskell_notebook_dev_3.2.4 ghcr.io/ihaskell/ihaskell-notebook@sha256:4544c8aecdc4521782c21fa30593f2775c99f707847c25112bf057dfb27d2b1b jupyter lab --ServerApp.token=''

JupyterLab v3.2.5 image:

docker run --rm -p 8888:8888 --name ihaskell_notebook_dev_3.2.5 ghcr.io/ihaskell/ihaskell-notebook@sha256:ae73e5bca9bef4c66b3d20e16340eecd2532ad7e310acac7c6d934349b0490e8 jupyter lab --ServerApp.token=''

@JasonWeill (Contributor)

Triage notes: @krassowski reports the same issue in the R kernel, particularly if there is an error.

@fcollonval (Member)

@jamesdbrock could you install jupyterlab-kernelspy and display the kernel reply when executing a cell?

@jamesdbrock (Author)

Thanks for the advice @fcollonval, I'll try that.

@jamesdbrock (Author)

First Execution

(screenshot from 2022-11-15 10-00-30)

Second Execution

(screenshot from 2022-11-15 10-03-28)

Sixth Execution

When I clicked Execute for the sixth time, kernelspy reported receipt of a shell.comm_info_reply, which seems to be what caused the first cell to display execution_count: 1. But the timestamp indicates that this shell.comm_info_reply was supposedly received before the first cell's shell.execute_reply.

(screenshot from 2022-11-15 10-14-47)
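One can also watch the arrival order without kernelspy by connecting to the kernel connection's anyMessage signal. A minimal sketch, assuming a small JupyterLab extension with access to the current NotebookPanel (the function name and wiring are illustrative, not from kernelspy):

import { NotebookPanel } from '@jupyterlab/notebook';
import { Kernel } from '@jupyterlab/services';

// Log every message crossing the kernel connection, with its direction
// ('send' or 'recv') and a local timestamp, so the client-side arrival
// order of shell.execute_reply vs. iopub.status can be inspected.
function spyOnKernelMessages(panel: NotebookPanel): void {
  const kernel = panel.sessionContext.session?.kernel;
  if (!kernel) {
    return;
  }
  kernel.anyMessage.connect((_sender, args: Kernel.IAnyMessageArgs) => {
    const { msg, direction } = args;
    console.log(Date.now(), direction, msg.channel, msg.header.msg_type);
  });
}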

@fcollonval (Member)

@jamesdbrock thanks for all the information - I was looking at this a bit.

I tried your Docker image and can confirm your issue (thanks for that). I also tried your Binder at https://mybinder.org/v2/gh/gibiansky/IHaskell/mybinder and the issue does not appear for me (JupyterLab is 3.2.9 in that Binder). Did you deploy a workaround on the Binder, or not?

@fcollonval (Member) commented Nov 16, 2022

Trying to dive into this one a bit more, I added jupyterlab-kernelspy to the docker image (JLab 3.2.5) and compared what we get for the Haskell kernel (above) and the Python kernel (below):

(screenshot: kernelspy message logs for the Haskell kernel and the Python kernel)

... no error this time - you can actually see that I get shell.execute_reply as part of the shell.execute_request group (this happens each time).

I uninstalled jupyterlab-kernelspy and the error was back. Debugging the JavaScript, I see that the shell.execute_reply is received first (even before the iopub.status message that sets the status to busy). So this smells like a race condition.
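For reference, this is roughly the exchange the frontend drives for each executed cell. A minimal sketch with @jupyterlab/services (the kernel name and the standalone setup here are assumptions for illustration):

import { KernelManager } from '@jupyterlab/services';

// Execute one cell's worth of code and log what comes back. As far as I
// understand, future.done resolves only once the frontend has seen both
// the shell.execute_reply and an iopub.status 'idle' message, so a reply
// that arrives at an unexpected time can leave the prompt stuck at [*].
async function traceExecute(code: string): Promise<void> {
  const manager = new KernelManager();
  const kernel = await manager.startNew({ name: 'haskell' }); // kernel name is an assumption
  const future = kernel.requestExecute({ code });
  future.onIOPub = msg => {
    console.log('iopub', msg.header.msg_type);
  };
  future.onReply = msg => {
    const content = msg.content as { status: string; execution_count?: number };
    console.log('shell.execute_reply', content.status, 'execution_count:', content.execution_count);
  };
  await future.done;
  console.log('future.done resolved');
  await kernel.shutdown();
}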

@jamesdbrock (Author)

Thanks for the investigation @fcollonval .

I didn't know that https://github.com/IHaskell/IHaskell/blob/master/Dockerfile works with JupyterLab 3.2.9 (and it does work; I confirmed it).

I've been using https://github.com/IHaskell/ihaskell-notebook/blob/master/Dockerfile for this issue.

@fcollonval (Member)

I did not try updating the JupyterLab version within the local Docker image I used. But my guess is that, because Binder is a distant server, the network messaging is slower and we fall on the good side of the race condition.

I'm confident the error arises because this promise is not resolved at the expected time:

this._done.resolve(this._replyMsg);

But I haven't yet figured out which flag combination is not set properly (there is something about needing to have received a reply and to be in the idle state).
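Roughly, the pattern in question is this (a simplified sketch, not the actual @jupyterlab/services source; the names are illustrative):

// Two flags must both be set before the done promise can resolve.
enum DoneFlag {
  None = 0,
  GotReply = 1, // shell.execute_reply received
  GotIdle = 2, // iopub.status with execution_state 'idle' received
}

class ExecuteFutureSketch {
  private flags: number = DoneFlag.None;
  private replyMsg: unknown = null;

  onReply(msg: unknown): void {
    this.replyMsg = msg;
    this.flags |= DoneFlag.GotReply;
    this.maybeResolve();
  }

  onIdleStatus(): void {
    this.flags |= DoneFlag.GotIdle;
    this.maybeResolve();
  }

  // The real code resolves this._done with this._replyMsg at this point.
  // If one of the two flags is never set (for example, if the reply slips
  // in before the handler expects it), the promise never resolves and the
  // prompt stays at [*] until later traffic flushes the state.
  private maybeResolve(): void {
    if (this.flags === (DoneFlag.GotReply | DoneFlag.GotIdle)) {
      console.log('done resolved with', this.replyMsg);
    }
  }
}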

@jamesdbrock (Author) commented Nov 17, 2022

This bug happens consistently when I run the whole thing locally on my fast computer. EDIT: Oh, I see. You're saying the network latency causes the bug not to appear.

@jamesdbrock (Author)

Another note: when I build and run JupyterLab with the IHaskell kernel using the latest https://github.com/tweag/jupyterWith I get JupyterLab v3.4.3, and the bug occurs. So the bug isn't caused only by the https://github.com/IHaskell/ihaskell-notebook/blob/master/Dockerfile . This bug is a puzzle.

@jamesdbrock (Author)

This bug is still present in JupyterLab v3.6.1.

@jamesdbrock (Author)

This bug is still present in JupyterLab v4.0.2.
