
Verkehrsunfälle in Münster

Data and tools for processing traffic accident data from the Münster police: a crowd geocoder plus visualization (work in progress).


Start Kinto (Datastore)

docker-compose up -d kinto

Initialize Kinto

docker-compose run --rm initializer

Run the importer. This takes a long time (> 1 hour). If you are not that patient, consider using a ready-made container instead (see Data container images).

docker-compose run --rm importer

Or just import a single file

docker-compose run --rm importer python processing/ '/data/VU PP 2015.xlsb'
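The importer derives a deterministic id for each accident as a UUIDv5 over the source file and row number, so re-running an import produces the same ids. A minimal sketch of that idea; the namespace and key format below are assumptions for illustration, not necessarily the importer's actual choices:

```python
import uuid

# Assumed namespace; the real importer may use its own namespace UUID.
NAMESPACE = uuid.NAMESPACE_URL

def accident_id(source_file: str, source_row_number: int) -> uuid.UUID:
    """Derive a stable id from the row's origin (hypothetical key format)."""
    return uuid.uuid5(NAMESPACE, f"{source_file}:{source_row_number}")

# The same input always yields the same id, making imports idempotent.
a = accident_id("VU PP 2015.xlsb", 42)
b = accident_id("VU PP 2015.xlsb", 42)
assert a == b
```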

I am getting import errors!?

Rows 10120 to 10142 in file VU PP 2015.xlsb, row 10482 in file VU PP 2016.xlsx, row 10902 in file VU PP 2017.xlsx, and row 10901 in file VU PP 2018.xlsx are known to produce "failed to import" errors. These are the very last rows in each of those files and are empty; the error is raised because empty rows cannot be imported.
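These failures are harmless, but an importer could also skip such rows up front. A sketch of the idea, assuming each spreadsheet row arrives as a plain list of cell values:

```python
def is_empty(row):
    """A row is empty when every cell is None or blank after stripping."""
    return all(cell is None or str(cell).strip() == "" for cell in row)

rows = [
    ["2015-03-01", "Hammer Str.", 2],      # illustrative values, not real data
    ["2015-03-02", "Weseler Str.", 1],
    [None, "", None],                       # trailing empty row, as in the .xlsb files
]

importable = [row for row in rows if not is_empty(row)]
assert len(importable) == 2
```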


Download OSM extract of Münster for Nominatim & Overpass

wget -O nominatim/muenster-regbez-latest.osm.pbf
wget -O overpass/muenster-regbez-latest.osm.bz2

Initialize Nominatim. This will take a long time!

docker-compose run --rm nominatim-import

Initialize Overpass

docker-compose run --rm overpass

Start Nominatim & Overpass

docker-compose up -d nominatim overpass

Execute the geocoder

docker-compose run --rm geocoder

Exporting a database dump

docker-compose exec postgres pg_dump postgres://accidents@/accidents --encoding=utf8 --format=plain --no-owner --no-acl | gzip -9 > dump.sql.gz

Data container images

Container images with built-in data are available from

The images are based on the official postgres container images from Docker Hub; just treat them as such.

Simple Usage

Find the latest container image tag in the repository, start a container from it, and wait until "database system is ready to accept connections" is printed.

docker run --rm --name verkehrsunfaelle -p 5432:5432

Open a second terminal and execute psql inside the container

docker exec -it verkehrsunfaelle psql -U postgres

The data lives in the objects table, in the data column, as JSON.

Get all accidents

SELECT id, data FROM objects WHERE resource_name = 'record' AND parent_id = '/buckets/accidents/collections/accidents_raw';

Get all geometries

SELECT id, data FROM objects WHERE resource_name = 'record' AND parent_id = '/buckets/accidents/collections/geometries';

The fields available inside data are documented in the file kinto/schema.yml.
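Since the data column holds JSON, query results can be post-processed with any JSON tooling. A sketch in Python; the record shown here is invented for illustration, so consult kinto/schema.yml for the real field names:

```python
import json

# One row's `data` column, as it might come back from psql
# (hypothetical fields, loosely modeled on the import metadata).
raw = '{"source_file": "VU PP 2015.xlsb", "source_row_number": 42}'

record = json.loads(raw)
print(record["source_file"])   # -> VU PP 2015.xlsb
```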

CSV files

The container images mentioned above can be used to create CSV files of the data.

You can either download the CSV from this release, or create the file export.csv containing the imported raw accidents:

cat csv-export.sql | docker-compose exec -T postgres psql -qt postgres://accidents@/accidents > export.csv
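The resulting export.csv is a plain CSV file and can be read with standard tooling. A sketch using Python's csv module, assuming a header row; the column names below are invented for illustration, as the actual columns are defined by csv-export.sql:

```python
import csv
import io

# Stand-in for open("export.csv"); headers here are hypothetical.
sample = io.StringIO("id,source_file\nabc123,VU PP 2015.xlsb\n")

rows = list(csv.DictReader(sample))
assert rows[0]["source_file"] == "VU PP 2015.xlsb"
```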