
S-Bahn Datensammler

A public transport data crawler script, specifically for Berlin, Germany. It collects real-time data on products of the Berlin public transport network (ÖPNV).

Usage

  1. Find the Station ID of the station you are interested in, using the bvg-rest API project (see the lookup example below this list).

  2. With Docker:

         $ docker build -t eightsq/sbahncrawler .
         $ docker run \
             -v {some_data_path_on_your_machine}:/data:rw \
             --rm \
             -e CRAWLER_STATIONID={your_station_id} \
             eightsq/sbahncrawler:latest

     Without Docker: make sure you have Python 3 and the requests package installed (e.g. pip3 install requests), then run:

         $ CRAWLER_STATIONID={your_station_id} python3 crawler.py <output_filename>
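
For the Station ID lookup in step 1, a plain HTTP request against a bvg-rest deployment is enough. The sketch below assumes the public v6.bvg.transport.rest instance and its /locations endpoint; the hostname, API version, and ID format may differ for the deployment you use:

         $ # Search stops by name; the "id" field of the matching stop is the
         $ # value to pass as CRAWLER_STATIONID
         $ curl 'https://v6.bvg.transport.rest/locations?query=Alexanderplatz&results=3'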

Since you will probably want to automate this, set up a cron job that runs the crawler regularly for you (e.g. every 10 minutes). To crawl a product other than the S-Bahn, adjust the productId filter in the crawl function inside crawler.py.
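
A minimal crontab entry for this could look like the sketch below; the 10-minute interval, the python3 path, and the placeholder paths are assumptions to adapt to your machine:

         # m    h dom mon dow   command (runs every 10 minutes)
         */10 * * * *   CRAWLER_STATIONID={your_station_id} /usr/bin/python3 /path/to/crawler.py {output_filename}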

Author

EightSQ (Blog)
