RTPA is a real-time application that fetches the availability of parking lots in Singapore and displays it on an interactive map.
This repo uses real-time data provided by LTA DataMall. In order to run the code, you will need to request an AccountKey.
This project requires a Kafka cluster to produce data to and consume data from. docker-compose-kafka.yml provides an easy way to set up a cluster using the following Bitnami Docker images:
- bitnami/kafka
- bitnami/zookeeper
docker-compose -f docker-compose-kafka.yml up -d
Since the Bitnami Kafka image doesn't provide a way to create topics on startup, we have to start ZooKeeper and Kafka first and create the topic manually.
# Start a session in the kafka container
docker container exec -it rtpa_kafka_1 /bin/bash
# List existing topics
/opt/bitnami/kafka/bin/kafka-topics.sh --list --zookeeper zookeeper:2181
# Create a topic called rtpa
/opt/bitnami/kafka/bin/kafka-topics.sh --create --zookeeper zookeeper:2181 --topic rtpa --replication-factor 1 --partitions 1
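The two commands above can be wrapped into a small script that only creates the topic when it is missing, so re-running it is harmless. This is a sketch using the same topic name, ZooKeeper address, and binary path as above; it is a no-op on machines where the Kafka CLI is not present.

```shell
#!/bin/sh
# Run inside the Kafka container. Topic name, ZooKeeper address and the
# kafka-topics.sh path match the commands above; adjust if yours differ.
TOPIC=rtpa
ZK=zookeeper:2181
KAFKA_TOPICS=/opt/bitnami/kafka/bin/kafka-topics.sh

if [ -x "$KAFKA_TOPICS" ]; then
  # Create the topic only if it is not already listed
  if ! "$KAFKA_TOPICS" --list --zookeeper "$ZK" | grep -qx "$TOPIC"; then
    "$KAFKA_TOPICS" --create --zookeeper "$ZK" \
      --topic "$TOPIC" --replication-factor 1 --partitions 1
  fi
fi
```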
You can get the pre-built images from Docker Hub, or rebuild them locally with the following steps.
In order to build the images, you will need the following developer tools installed.
- Scala 2.12.x
- Node 12.x
./docker-build.sh
This will build the following docker images.
- rtpa-backend - Kafka Producer + Spark Consumer
- rtpa-converter - Node.js app that converts time series data to GeoJSON
- rtpa-ui - Mapbox JS based HTML served from an Nginx server
Alternatively, you can get the pre-built images from Docker Hub.
- Modify the environment variables and volume mounts (optional) in docker-compose.yml (using images from Docker Hub) or docker-compose-local.yml (using locally built images).
  - DATAMALL_API_KEY - Required. AccountKey provided by LTA DataMall
  - RTPA_CSV_PATH - Optional. Directory containing CSV carpark data, produced by rtpa-backend and consumed by rtpa-converter
  - RTPA_GEOJSON_PATH - Optional. Directory containing GeoJSON carpark data, produced by rtpa-converter and consumed by rtpa-ui
  - PRODUCER_TRIGGER_INTERVAL_MINUTES - Optional. Determines how often the producer calls the DataMall APIs and writes to the Kafka topic
  - CONSUMER_TRIGGER_INTERVAL_MINUTES - Optional. Determines how often the consumer reads data from the Kafka topic
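For a local run, the same variables can also be exported in the shell before invoking docker-compose. The values below are illustrative assumptions (placeholder key, host paths, one-minute intervals), not defaults shipped with the repo.

```shell
# AccountKey from LTA DataMall (placeholder value)
export DATAMALL_API_KEY="your-account-key"
# Host directories to mount for the CSV and GeoJSON hand-off
export RTPA_CSV_PATH="$PWD/data/csv"
export RTPA_GEOJSON_PATH="$PWD/data/geojson"
# Poll the DataMall API and read from the Kafka topic once a minute
export PRODUCER_TRIGGER_INTERVAL_MINUTES=1
export CONSUMER_TRIGGER_INTERVAL_MINUTES=1
```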
- Run the Docker images.
If you are using the images from Docker Hub,
docker-compose up -d
If you are using the local images,
docker-compose -f docker-compose-local.yml up -d
- The web application is now live at http://localhost. You may need to wait a few minutes for the first batch of data to come through, depending on your PRODUCER_TRIGGER_INTERVAL_MINUTES and CONSUMER_TRIGGER_INTERVAL_MINUTES settings.
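Once the containers are up, a quick way to check that the UI is serving is to poll it with curl. This is a sketch: the URL and retry count are assumptions, and the loop simply gives up quietly if nothing answers within a few attempts.

```shell
URL=http://localhost
# Try a few times; the first data batch can take a couple of minutes
for attempt in 1 2 3 4 5; do
  if curl -fs -o /dev/null "$URL"; then
    echo "UI is up at $URL"
    break
  fi
  sleep 1
done
```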