sqlite3.OperationalError: database is locked #52

Closed
SamLau95 opened this issue Oct 7, 2016 · 16 comments

@SamLau95

SamLau95 commented Oct 7, 2016

I've deployed a JupyterHub instance and I'm running into a sqlite3.OperationalError: database is locked from nbformat/sign.py whenever I try to open a notebook. I can open the user/samlau95/tree URL, but clicking a notebook or trying to create a new notebook hangs for ~45 seconds until it fails with a 504 Gateway error.

Later, the container running the notebook server will output:

[E 2016-10-05 19:44:18.016 samlau95 handlers:468] Unhandled error in API request
    Traceback (most recent call last):
      File "/opt/conda/lib/python3.5/site-packages/traitlets/traitlets.py", line 501, in get
        value = obj._trait_values[self.name]
    KeyError: 'db'

    During handling of the above exception, another exception occurred:

    Traceback (most recent call last):
      File "/opt/conda/lib/python3.5/site-packages/notebook/base/handlers.py", line 457, in wrapper
        result = yield gen.maybe_future(method(self, *args, **kwargs))
      File "/opt/conda/lib/python3.5/site-packages/tornado/gen.py", line 1008, in run
        value = future.result()
      File "/opt/conda/lib/python3.5/site-packages/tornado/concurrent.py", line 232, in result
        raise_exc_info(self._exc_info)
      File "<string>", line 3, in raise_exc_info
      File "/opt/conda/lib/python3.5/site-packages/tornado/gen.py", line 282, in wrapper
        yielded = next(result)
      File "/opt/conda/lib/python3.5/site-packages/notebook/services/contents/handlers.py", line 124, in get
        path=path, type=type, format=format, content=content,
      File "/opt/conda/lib/python3.5/site-packages/notebook/services/contents/filemanager.py", line 358, in get
        model = self._notebook_model(path, content=content)
      File "/opt/conda/lib/python3.5/site-packages/notebook/services/contents/filemanager.py", line 318, in _notebook_model
        self.mark_trusted_cells(nb, path)
      File "/opt/conda/lib/python3.5/site-packages/notebook/services/contents/manager.py", line 447, in mark_trusted_cells
        trusted = self.notary.check_signature(nb)
      File "/opt/conda/lib/python3.5/site-packages/nbformat/sign.py", line 220, in check_signature
        if self.db is None:
      File "/opt/conda/lib/python3.5/site-packages/traitlets/traitlets.py", line 529, in __get__
        return self.get(obj, cls)
      File "/opt/conda/lib/python3.5/site-packages/traitlets/traitlets.py", line 508, in get
        value = self._validate(obj, dynamic_default())
      File "/opt/conda/lib/python3.5/site-packages/nbformat/sign.py", line 127, in _db_default
        self.init_db(db)
      File "/opt/conda/lib/python3.5/site-packages/nbformat/sign.py", line 139, in init_db
        )""")
    sqlite3.OperationalError: database is locked

I can verify that the database is locked:

$ sqlite3 .local/share/jupyter/nbsignatures.db
SQLite version 3.11.0 2016-02-15 17:29:24
Enter ".help" for usage hints.
sqlite> .schema
Error: database is locked

And that the process holding the lock is the notebook server:

$ fuser .local/share/jupyter/nbsignatures.db
/home/samlau95/.local/share/jupyter/nbsignatures.db: 23697
$ ps aux | grep 23697
samlau95 23697  0.0  0.6 188472 48464 ?        S    04:26   0:01 python3 /usr/local/bin/jupyterhub-singleuser --port=8888 --ip=0.0.0.0 --user=samlau95 --cookie-name=jupyter-hub-token-samlau95 --base-url=/user/samlau95 --hub-prefix=/hub/ --hub-api-url=http://10.128.0.7:8081/hub/api

This is running on Ubuntu 16.04 using the setup in https://github.com/data-8/jupyterhub-deploy, which has been successfully deployed multiple times before. This is the first time I'm deploying it on Ubuntu 16.04 (we've used 14.04 previously), so perhaps that is related?

Here are the versions of packages installed:

root@4f4030e758f9:~# pip show notebook

---
Metadata-Version: 1.1
Name: notebook
Version: 4.2.1
Summary: A web-based notebook environment for interactive computing
Home-page: http://jupyter.org
Author: Jupyter Development Team
Author-email: jupyter@googlegroups.com
License: BSD
Location: /opt/conda/lib/python3.5/site-packages
Requires:
Classifiers:
  Intended Audience :: Developers
  Intended Audience :: System Administrators
  Intended Audience :: Science/Research
  License :: OSI Approved :: BSD License
  Programming Language :: Python
  Programming Language :: Python :: 2.7
  Programming Language :: Python :: 3

root@4f4030e758f9:~# pip show nbformat

---
Metadata-Version: 1.1
Name: nbformat
Version: 4.0.1
Summary: The Jupyter Notebook format
Home-page: http://jupyter.org
Author: Jupyter Development Team
Author-email: jupyter@googlegroups.com
License: BSD
Location: /opt/conda/lib/python3.5/site-packages
Requires:
Classifiers:
  Intended Audience :: Developers
  Intended Audience :: System Administrators
  Intended Audience :: Science/Research
  License :: OSI Approved :: BSD License
  Programming Language :: Python
  Programming Language :: Python :: 2.7
  Programming Language :: Python :: 3
  Programming Language :: Python :: 3.3

Any pointers on why this might be breaking? Happy to give more info.

@minrk
Member

minrk commented Oct 7, 2016

Is home on NFS? We've seen some issues with sqlite and NFS. Moving the nbsignatures.db file out of the way resets the trust state of notebooks, which is a minor inconvenience, but not generally a big deal. I think there are fixes in nbformat 4.2 (out soon) that deal with db failures more gracefully.

@SamLau95
Author

SamLau95 commented Oct 7, 2016

Yeah, home is on NFS.

I renamed the file to nbsignatures.db.old, but it gets created again when I open a notebook and then gets locked immediately after.

It seems like nbformat supports the :memory: option; is there a way to say I want to use that in JupyterHub config?

@takluyver
Member

NotebookNotary.db_file is the config option (docs). It will forget about previously trusted notebooks every time you start it, though.

@SamLau95
Author

SamLau95 commented Oct 7, 2016

I now have lines that look like

c = get_config()
c.NotebookNotary.db_file = ':memory:'

in my JupyterHub config, but I'm still getting the same error in the logs. This is pretty puzzling to me, since it seems like the issue is happening during db initialization. Any pointers?

@takluyver
Member

That needs to be configured for the individual notebook servers, not the hub.
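
A minimal sketch of where that setting lives (the path ~/.jupyter/jupyter_notebook_config.py is assumed to be the single-user server's default config location; jupyterhub_config.py only configures the hub itself):

# ~/.jupyter/jupyter_notebook_config.py, read by the notebook server (not the hub)
c = get_config()

# Keep the signature store in memory instead of ~/.local/share/jupyter/nbsignatures.db.
# Trust state is forgotten every time the server restarts.
c.NotebookNotary.db_file = ':memory:'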

@SamLau95
Author

SamLau95 commented Oct 8, 2016

Works! Thanks @takluyver .

Do we know more about this other than "NFS causes problems"?

@takluyver
Member

The SQLite FAQ says:

This is because fcntl() file locking is broken on many NFS implementations.

I don't know any more than that.
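
A quick way to check the lock from Python is to poke the file with the standard sqlite3 module (a rough sketch; the path is the one from this issue, and the probe table is hypothetical):

import sqlite3

db_path = "/home/samlau95/.local/share/jupyter/nbsignatures.db"  # path from this issue
conn = None
try:
    # the timeout gives NFS a few seconds to release the lock before giving up
    conn = sqlite3.connect(db_path, timeout=5)
    conn.execute("CREATE TABLE IF NOT EXISTS lock_probe (x INTEGER)")  # throwaway probe table
    conn.execute("DROP TABLE lock_probe")
    conn.commit()
    print("database is writable")
except sqlite3.OperationalError as e:
    print("database problem:", e)  # e.g. "database is locked"
finally:
    if conn is not None:
        conn.close()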

@SamLau95
Author

SamLau95 commented Oct 9, 2016

Okay, thanks for the info. I'll close this issue, try to work around it, and wait for the changes in 4.2.

@SamLau95 SamLau95 closed this as completed Oct 9, 2016
@minrk minrk added this to the 4.2 milestone Nov 30, 2016
@jusjosgra

jusjosgra commented Jul 24, 2018

That needs to be configured for the individual notebook servers, not the hub.

@takluyver Can you elaborate on how to do this please?

@isthisthat

Another option is to clear the notebook output: https://gist.github.com/damianavila/5305869
That worked for me.
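
For reference, roughly the same thing can be done programmatically with nbconvert's ClearOutputPreprocessor (a sketch; "Untitled.ipynb" is a placeholder filename):

import nbformat
from nbconvert.preprocessors import ClearOutputPreprocessor

path = "Untitled.ipynb"  # placeholder; point this at the notebook you want to strip
with open(path, encoding="utf-8") as f:
    nb = nbformat.read(f, as_version=4)
nb, _ = ClearOutputPreprocessor().preprocess(nb, {})  # drops cell outputs and execution counts
with open(path, "w", encoding="utf-8") as f:
    nbformat.write(nb, f)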

@SheldonXLD

Works! Thanks @takluyver .

Do we know more about this other than "NFS causes problems"?

Hi, where do I set this configuration? Can you tell me? Thanks!

@vrt1shjwlkr

Facing the same issue. @SamLau95 @takluyver, can you please elaborate on how to set this configuration option?

@SamueLacombe

SamueLacombe commented Jul 13, 2020

I figured out how to do it:

  1. First, open a terminal in Jupyter.
  2. Run this command: jupyter notebook --generate-config
  3. Then manually edit the generated file (I did it through Windows) and uncomment and set this line:
    c.NotebookNotary.db_file = ':memory:'

Hope this helps.

Here are the references that helped me figure out how to do it:
https://jupyter-notebook.readthedocs.io/en/stable/config.html

@girishgl

girishgl commented Dec 2, 2020

The SQLite database should not be used on NFS. SQLite uses reader/writer locks to control access to the database, and this locking mechanism may not work correctly if the database file is kept on an NFS filesystem, because fcntl() file locking is broken on many NFS implementations. You should therefore avoid putting SQLite database files on NFS, since NFS does not cope well with multiple processes trying to access the file at the same time.

So ideally we should use PostgreSQL for production.

But can anyone help me change the backend database in the JupyterHub configuration?
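
For the hub's own database (as opposed to the per-user signature store), JupyterHub has a db_url setting; a minimal sketch with placeholder credentials would go in jupyterhub_config.py (a PostgreSQL driver such as psycopg2 also needs to be installed in the hub's environment):

# jupyterhub_config.py -- host, user, and password below are placeholders
c.JupyterHub.db_url = 'postgresql://jupyterhub:PASSWORD@db.example.com:5432/jupyterhub'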

@meeseeksmachine

This issue has been mentioned on Jupyter Community Forum. There might be relevant details there:

https://discourse.jupyter.org/t/how-to-change-default-db-from-sqlite-to-postgresql-mysql-in-jupyter-notebook/7052/1

@chance2021

Actually, I found a workaround for this issue. The problem is that the SQLite db is not compatible with the NFS drive. One option is to switch the single-user notebook's database from SQLite to Postgres, but I haven't figured out how to do that (by the way, you can point the hub database to Postgres, as the official docs suggest, via hub.db.type and hub.db.url). The other option, which is the workaround I am using, is to relocate the nbsignatures.db file to your k8s cluster's local disk. This can be done by modifying the configuration files inside the JupyterHub image. Below are the steps.

Add the line below to both /etc/jupyter/jupyter_notebook_config.py and /home/jovyan/.jupyter/jupyter_notebook_config.py in the Docker image:
c.NotebookNotary.data_dir = "/tmp/signature_dir"

Note: By default, in the deployment.yaml of the Helm package, only files under the /home and /share directories are stored via PVC, which is NFS in my case. Any files outside this scope live on the pod's local disk for the lifetime of the pod.

Since the signature db file is relatively small and only matters for the duration of a single session, I think it is fine to store it on local disk instead of in a Postgres database.

Hopefully this is helpful for anyone with the same issue as me.

Reference:
https://jupyter-notebook.readthedocs.io/en/stable/security.html#notebook-security
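
Pulling the thread's two working approaches together, a sketch of the relevant per-server config (the paths are the ones mentioned above; pick one of the two settings):

# /etc/jupyter/jupyter_notebook_config.py or ~/.jupyter/jupyter_notebook_config.py

# Option 1: keep nbsignatures.db on local disk instead of the NFS home directory.
c.NotebookNotary.data_dir = "/tmp/signature_dir"

# Option 2: keep the signature store in memory; trust is forgotten on restart.
# c.NotebookNotary.db_file = ':memory:'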
