
Verkehrsunfälle in Münster

Discussion in the chat

Data and tools for processing traffic accident data from the Münster police.

Quickstart data import

Initialize containers (may fail, just retry)

./ init

Run the importer. This takes a long time (> 1 hour). If you are not that patient, consider using a ready-made container instead (see Data container images).

./ import

Or just import a single file

docker-compose run --rm importer python processing/ 'VU PP 2015.xlsb'

I am getting import errors!?

Rows 10120 to 10142 in the file VU PP 2015.xlsb, row 10482 in VU PP 2016.xlsx, row 10902 in VU PP 2017.xlsx and row 10901 in VU PP 2018.xlsx are known to produce 'failed to import' errors. These are the very last rows in each of these files and are empty; the error occurs because empty rows cannot be imported.
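If you preprocess the files yourself, such rows can be filtered out before import. A minimal sketch, assuming rows arrive as plain sequences of cell values (e.g. from openpyxl's iter_rows(values_only=True)); the helper is hypothetical and not part of the importer:

```python
def is_empty_row(row):
    """Return True if every cell in the row is None or blank."""
    return all(cell is None or str(cell).strip() == "" for cell in row)

# Hypothetical rows: one real record, one trailing empty row
# like rows 10120-10142 in VU PP 2015.xlsb.
rows = [
    ("2015-01-07", "accident at crossing", 2),
    (None, "", None),
]

# Keep only non-empty rows instead of letting the import fail on them.
clean = [r for r in rows if not is_empty_row(r)]
```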


Download OSM extract of Münster for Nominatim & Overpass

wget -O nominatim/muenster-regbez-latest.osm.pbf
wget -O overpass/muenster-regbez-latest.osm.bz2

Initialize Nominatim. This will take a long time!

docker-compose run --rm nominatim-import

Initialize Overpass

docker-compose run --rm overpass

Start Nominatim & Overpass

docker-compose up -d nominatim overpass

Execute the geocoder

./ geocode

Exporting a database dump

docker-compose exec postgres pg_dump postgres://postgres@/accidents --encoding=utf8 --format=plain --no-owner --no-acl --no-privileges | gzip -9 > dump.sql.gz

Data container images

Container images with built-in data are available from

The images are based on the official postgres container images from Docker Hub. Just treat them as such.

You'll find instructions on how to build the data container images in the directory data-container-image.

Simple Usage

Find the latest container image tag in the repository, start a container from it, and wait until 'database system is ready to accept connections' is printed.

docker run --rm --name verkehrsunfaelle -p 5432:5432

Open a second terminal. Execute psql inside the container

docker exec -it verkehrsunfaelle psql -U postgres accidents

The data lives in the table objects in the column data as JSON.

Get all accidents

SELECT id, data FROM objects WHERE resource_name = 'record' AND parent_id = '/buckets/accidents/collections/accidents_raw';

Get all geometries

SELECT id, data FROM objects WHERE resource_name = 'record' AND parent_id = '/buckets/accidents/collections/geometries';

The fields available inside data can be found in the file kinto/schema.yml.


Some columns in the accidents_raw collection contain numbers referencing values in metadata collections.

Column in accidents_raw    Collection
accident_category          accident_category
cause_1_4                  accident_cause
cause_2                    accident_cause
cause_3                    accident_cause
cause_other                accident_cause
cause_02                   accident_cause
participants_01            traffic_involvement
participants_02            traffic_involvement
Collection accident_type is NOT related to the accident_type column in the accidents_raw collection.
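Resolving such a reference means looking the stored number up in the matching metadata collection. A sketch with made-up codes and labels (the real values live in the metadata collections, not here):

```python
# Hypothetical metadata collection: code -> label. The real labels
# come from the accident_cause collection in the database.
accident_cause = {1: "speed", 2: "right of way"}

# Which accidents_raw column resolves against which collection
# (taken from the table above).
references = {
    "accident_category": "accident_category",
    "cause_1_4": "accident_cause",
    "cause_2": "accident_cause",
    "cause_3": "accident_cause",
    "cause_other": "accident_cause",
    "cause_02": "accident_cause",
    "participants_01": "traffic_involvement",
    "participants_02": "traffic_involvement",
}

def resolve(record, column, metadata):
    """Look up the numeric code stored in `column` of a raw record."""
    return metadata.get(record.get(column))

record = {"cause_1_4": 2}  # hypothetical raw accident record
label = resolve(record, "cause_1_4", accident_cause)
```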

CSV files

The container images mentioned above can be used to create CSV files of the data.

You can either download the CSV from this release or create the file export.csv containing the imported raw accidents:

cat csv-export.sql | docker-compose exec -T postgres psql -qt postgres://postgres@/accidents > export.csv

You'll find instructions on how to export CSV files in the directory csv.
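The resulting export.csv can then be processed with any CSV tooling. A minimal Python sketch; the two-column layout (id, data with JSON payloads) is an assumption about what csv-export.sql produces, not a documented format:

```python
import csv
import io
import json

# Stand-in for export.csv; the real column names depend on csv-export.sql.
sample = io.StringIO('id,data\nrec-1,"{""accident_type"": 3}"\n')

rows = list(csv.DictReader(sample))
record = json.loads(rows[0]["data"])  # the data column holds JSON
```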