Coronavirus Dresden


Screenshot of a Grafana dashboard that uses the data

This script collects official infection statistics published by the city of Dresden and saves them to InfluxDB. From there the data can be processed and visualised using the SQL-like query language InfluxQL and, for instance, Grafana.
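As a sketch of that querying step, the following builds an InfluxQL statement of the kind a Grafana panel might issue against the database. The measurement and field names (`dresden_covid`, `cases`) are illustrative assumptions, not the script's actual schema:

```python
def daily_cases_query(measurement="dresden_covid", field="cases", days=7):
    """Build an InfluxQL query that sums daily case counts over the last `days` days.

    Measurement and field names are assumptions for illustration;
    the real schema is defined by the collection script.
    """
    return (
        f'SELECT sum("{field}") FROM "{measurement}" '
        f"WHERE time > now() - {days}d GROUP BY time(1d)"
    )

print(daily_cases_query())
```

Such a query string could be pasted into Grafana's query editor or sent through any InfluxDB client.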

Subsequent changes to the published data set can also be detected and routinely logged.
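One way such change detection can be sketched (an illustration, not the script's actual implementation) is to fingerprint each downloaded JSON data set and compare it against the hash recorded for the previous run:

```python
import hashlib
import json

def fingerprint(dataset):
    """Return a stable SHA-256 hash of a JSON-compatible data set.

    Canonical serialisation (sorted keys, fixed separators) makes the
    hash independent of key order in the published file.
    """
    canonical = json.dumps(dataset, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

def has_changed(dataset, last_hash):
    """True if the data set differs from the previously recorded hash."""
    return fingerprint(dataset) != last_hash

last = fingerprint({"Fallzahl": 100})
print(has_changed({"Fallzahl": 100}, last))  # unchanged data set
print(has_changed({"Fallzahl": 101}, last))  # changed data set
```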

Data sets are archived here.

Note: The coronavirus dashboard based on this script and database was hosted at during the COVID-19 pandemic. It was discontinued on 04/26/2023, after more than two years. Thanks to everyone who accompanied and actively supported this project during that time! ❤️

Data source

The raw data provided by the city of Dresden and visualised on their Dashboard is obtained from the following source:

Data is available under an open licence compatible with CC-BY: Landeshauptstadt Dresden, dl-de/by-2-0,


Installation

Get this repository:

git clone

If desired, the data archive can be retrieved with:

cd coronavirus-dresden
git submodule update --init --recursive

If you want to load newly published data later, run git pull inside the data subdirectory:

cd data
git checkout main
git pull

Using a Python virtual environment of your choice is recommended. An example installation with venv is described below.

venv (recommended)

Set up Python environment:

python3 -m venv venv
source venv/bin/activate

pip install -r requirements.txt

InfluxDB (required)

macOS (Homebrew):

brew install influxdb
brew services start influxdb

Ubuntu/Debian:

wget -qO- | sudo apt-key add -
source /etc/lsb-release
echo "deb${DISTRIB_ID,,} ${DISTRIB_CODENAME} stable" | sudo tee /etc/apt/sources.list.d/influxdb.list
sudo apt-get update && sudo apt-get install influxdb
sudo service influxdb start
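To verify the server is up, one can query InfluxDB's /ping health endpoint, which answers HTTP 204 when the service is running. The host and port below assume a default local installation:

```python
import urllib.error
import urllib.request

def influxdb_alive(host="localhost", port=8086, timeout=2):
    """Return True if InfluxDB answers its /ping endpoint with HTTP 204."""
    try:
        url = f"http://{host}:{port}/ping"
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 204
    except (urllib.error.URLError, OSError):
        # Server not reachable (not installed, not started, or wrong port).
        return False

print(influxdb_alive())
```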

Python API

Helpful resources:
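As a sketch of how records from the downloaded JSON could be written with the influxdb Python client (pip install influxdb): the record keys (Datum, Fallzahl), the measurement name, and the database name below are assumptions for illustration, not the script's actual schema:

```python
def to_influx_points(records, measurement="dresden_covid"):
    """Convert raw JSON records into the point format expected by
    InfluxDBClient.write_points(). Key names are illustrative assumptions.
    """
    return [
        {
            "measurement": measurement,
            "time": rec["Datum"],
            "fields": {"cases": int(rec["Fallzahl"])},
        }
        for rec in records
    ]

points = to_influx_points([{"Datum": "2021-01-01T00:00:00Z", "Fallzahl": "123"}])

try:
    from influxdb import InfluxDBClient  # pip install influxdb

    client = InfluxDBClient(host="localhost", port=8086, database="corona")
    client.write_points(points)
except Exception:
    # Client library not installed or server not reachable; show the payload.
    print(points)
```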

Grafana (optional)

brew install grafana
brew services start grafana




Usage

To search for new data regularly, enter:

sudo crontab -e

Add the following line to run the script every 5 minutes (adapt paths to suit your own installation):

*/5 * * * * /root/bin/coronavirus-dresden/venv/bin/python /root/bin/coronavirus-dresden/ --log --archive-json

If you just want to routinely save newly published JSON files and are not interested in writing the data to InfluxDB, type:

python --skip-influxdb

Command line arguments

To display all data collection options, type:

python --help