Connect webKnossos with existing datasets (BossDB, Neuroglancer Precomputed)


A webKnossos compatible data connector written in Python.

webKnossos-connect serves as an adapter between the webKnossos datastore interface and alternative data storage servers (e.g. BossDB) or static files hosted on cloud storage (e.g. Neuroglancer Precomputed).


Available Adapters / Supported Data Formats:


1. Installation / Docker

Install webKnossos-connect using Docker, or use the instructions for native installation below.

    docker-compose up --build webknossos-connect
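Put together, the Docker-based installation looks like the following. The repository URL is an assumption (adjust it if your clone URL differs):

```shell
# Clone the repository (URL is an assumption) and enter it.
git clone https://github.com/scalableminds/webknossos-connect.git
cd webknossos-connect
# Build the image and start the wk-connect server (listens on port 8000).
docker-compose up --build webknossos-connect
```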

2. Connecting to webKnossos

Register your webknossos-connect instance with your main webKnossos instance. Modify the webKnossos Postgres database:

INSERT INTO "webknossos"."datastores"("name","url","key","isscratch","isdeleted","isforeign","isconnector")
VALUES (E'connect', E'http://localhost:8000', E'secret-key', FALSE, FALSE, FALSE, TRUE);
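Equivalently, the row can be inserted from Python. A minimal sketch, assuming psycopg2 and an already-open connection to the webKnossos Postgres database; name, URL, and key mirror the SQL above:

```python
# SQL mirroring the INSERT statement above; values are bound by the driver.
REGISTER_DATASTORE_SQL = """
INSERT INTO "webknossos"."datastores"
    ("name", "url", "key", "isscratch", "isdeleted", "isforeign", "isconnector")
VALUES (%s, %s, %s, FALSE, FALSE, FALSE, TRUE);
"""


def register_datastore(conn, name="connect",
                       url="http://localhost:8000", key="secret-key"):
    """Register a wk-connect datastore.

    `conn` is assumed to be an open psycopg2 connection to the
    webKnossos Postgres database.
    """
    with conn.cursor() as cur:
        cur.execute(REGISTER_DATASTORE_SQL, (name, url, key))
    conn.commit()
```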

3. Adding Datasets

Add and configure datasets in webKnossos-connect to make them available for viewing in webKnossos.


You can add new datasets to webKnossos-connect through the REST interface. POST a JSON configuration to:


The access token can be obtained from your user profile in the webKnossos main instance. Read more in the webKnossos docs.

Example JSON body. More examples can be found here.

    "boss": {
        "Test Organisation": {
            "ara": {
                "domain": "",
                "collection": "ara_2016",
                "experiment": "sagittal_50um",
                "username": "<NEURODATA_IO_USER>",
                "password": "<NEURODATA_IO_PW>"
    "neuroglancer": {
        "Test Organisation": {
            "fafb_v14": {
                "layers": {
                    "image": {
                        "source": "gs://neuroglancer-fafb-data/fafb_v14/fafb_v14_clahe",
                        "type": "image"

CURL Example

curl http://<webKnossos-connect>/data/datasets -X POST -H "Content-Type: application/json" --data-binary "@datasets.json"
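The same request can be issued from Python. A minimal sketch, assuming the requests library is installed and wk-connect is reachable at localhost:8000 (both assumptions); the configuration mirrors the example JSON above:

```python
import json

# Build the dataset configuration programmatically instead of editing
# datasets.json by hand. Names mirror the example JSON above.
def make_boss_dataset(collection, experiment, username, password, domain=""):
    return {
        "domain": domain,
        "collection": collection,
        "experiment": experiment,
        "username": username,
        "password": password,
    }


config = {
    "boss": {
        "Test Organisation": {
            "ara": make_boss_dataset(
                "ara_2016", "sagittal_50um",
                "<NEURODATA_IO_USER>", "<NEURODATA_IO_PW>",
            )
        }
    }
}


def post_datasets(config, base_url="http://localhost:8000"):
    """POST the config to a running wk-connect instance (assumed URL)."""
    import requests  # assumed to be installed
    return requests.post(
        base_url + "/data/datasets",
        headers={"Content-Type": "application/json"},
        data=json.dumps(config),
    )
```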

3.2 webKnossos UI

Alternatively, new datasets can be added directly through the webKnossos UI. Configure and import a new dataset from the webKnossos dashboard. (Dashboard -> Datasets -> Upload Dataset -> Add wk-connect Dataset)

Read more in the webKnossos docs.

3.3 Default test datasets

By default, some public datasets are added to webKnossos-connect to get you started when using the Docker image. Some of the initial datasets require credentials; for access, create a .env file with your credentials:

NEURODATA_IO_USER="<your username>"
NEURODATA_IO_PW="<your password>"
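Docker Compose reads the .env file automatically. When running natively, the credentials can be loaded with a few lines of Python; a minimal sketch, handling only the simple KEY="value" format shown above (not a full dotenv parser):

```python
import os


def load_env(path=".env"):
    """Parse a minimal KEY="value" .env file into os.environ.

    Handles only the simple format shown above: one assignment per
    line, optional surrounding double quotes, '#' comment lines.
    """
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            os.environ[key.strip()] = value.strip().strip('"')
```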


In Docker 🐳

  • Start it with docker-compose up dev
  • Run other commands docker-compose run --rm dev pipenv run lint
  • Check below for more commands.
  • If you change the packages, rebuild the image with docker-compose build dev



You need Python 3.7 with pipenv installed. The recommended way is to use pyenv and pipenv:

  • Install pyenv with
    curl -L | bash
  • Install your system requirements to build Python, see
  • To install the correct Python version, run pyenv install
  • Start a new shell to activate the Python version: bash
  • Install Pipenv: pip install --user --upgrade pipenv
  • On current Debian and Ubuntu setups, you have to fix a bug manually: Add ~/.local/bin to your PATH like this:
    echo '
    # set PATH so it includes the private local bin if it exists
    # This should be the default, but is broken in some Debian/Ubuntu versions,
    # see
    if [ -d "$HOME/.local/bin" ] ; then
        PATH="$HOME/.local/bin:$PATH"
    fi
    ' >> ~/.profile
    This will be activated automatically after your next login. To use it right now, run . ~/.profile


  • Add webknossos-connect to the webKnossos database:
    INSERT INTO "webknossos"."datastores"("name","url","key","isscratch","isdeleted","isforeign","isconnector")
    VALUES (E'connect',E'http://localhost:8000',E'k',FALSE,FALSE,FALSE,TRUE);
  • pipenv install --dev
  • pipenv run main
  • curl http://localhost:8000/api/neuroglancer/Demo_Lab/test \
      -X POST -H "Content-Type: application/json" \
      --data-binary "@datasets.json"


Useful commands:

  • Lint with pylint & flake8
  • Format with black, isort & autoflake
  • Type-check with mypy
  • Benchmark with timeit
  • Trace with py-spy

Use the commands with pipenv run …:

  • pretty
  • pretty-check
  • lint
  • lint-details
  • type-check
  • benchmarks/

Trace the server on http://localhost:8000/trace.


License: AGPLv3. Copyright scalable minds.
