This repository is a Master Thesis project conducted at Scania CV AB and Uppsala University, Sweden. The thesis publication can be found here.
This thesis aims to increase the temporal resolution of the data (vehicle trips) using deep neural networks. Given a temporally ordered sequence of geospatial coordinates indicating a truck’s trajectory, the objective is to estimate the intermediate positions of the truck. Furthermore, the model should implicitly learn the actual road network and predict feasible positions of the truck. An example of a trip where a truck collects data every 10 minutes can be found in Table 1.
The objective is to double the frequency of the data, i.e., to generate intermediate latitude and longitude values for the truck every 5 minutes, which can be found in Table 2.
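This 2x upsampling task can be framed as an imputation problem: insert an empty slot between every pair of consecutive observations and let the model fill it in. A minimal sketch of that framing, using invented coordinates (not from the thesis data):

```python
import numpy as np

# Hypothetical 10-minute trip: one (latitude, longitude) row per timestamp.
trip_10min = np.array([
    [59.3293, 18.0686],
    [59.3401, 18.0500],
    [59.3520, 18.0312],
])

# Double the temporal resolution: insert a NaN row between consecutive
# observations; the model's task is to impute these missing positions.
n, d = trip_10min.shape
trip_5min = np.full((2 * n - 1, d), np.nan)
trip_5min[::2] = trip_10min

# Mask: 1 where the position is observed, 0 where it must be imputed.
mask = (~np.isnan(trip_5min)).astype(int)
```

Every odd-indexed timestep in `trip_5min` is a 5-minute midpoint to be estimated by the model.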
The `Preprocessor` class can be found here, and the preprocessing utilities can be found here.
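One typical preprocessing step is splitting each long trip into fixed-length subtrips. The sketch below assumes a plain sliding-window split driven by the `traj_len` and `stride` hyper-parameters; the actual `Preprocessor` may apply additional filtering (e.g., by `traj_dist`):

```python
def split_into_subtrips(trip, traj_len=30, stride=4):
    """Slide a window of traj_len points over the trip, advancing by stride.

    Windows that would run past the end of the trip are dropped.
    """
    return [trip[i:i + traj_len]
            for i in range(0, len(trip) - traj_len + 1, stride)]


# A trip of 40 points yields 3 overlapping subtrips of 30 points each.
subtrips = split_into_subtrips(list(range(40)))
```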
A few randomly sampled, preprocessed subtrips plotted on the map are shown below:
The axes text in the following figures is black in color and may not be visible in dark mode.
- Install the requirements:

```shell
$ pip install -r requirements.txt
```
- Set the hyper-parameters in the `Config` class. The final set of hyper-parameters used in this thesis is as follows:

```python
class Config:
    model_name: str = "model"
    data_path: str = "data"
    traj_len: int = 30
    stride: int = 4
    traj_dist: int = 1000  # meters
    epochs: int = 500
    learning_rate: float = 1e-4
    batch_size: int = 128
    hid_dim: int = 100
    load_weights: bool = False
    checkpoint_freq: int = 10
    training: bool = True
```
- Navigate to the `geo_rits` package:

```shell
$ cd geo_rits
```
- Begin model training:

```shell
$ python3 train.py
```
- The trained model can be found in the `models/` directory.
An example of the input subtrips and masks can be found in the `data/` directory.
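The exact file format in `data/` is not reproduced here, but based on the hyper-parameters above, a model input batch plausibly pairs coordinate subtrips with a binary observation mask. A synthetic illustration of that assumed shape:

```python
import numpy as np

# Synthetic batch shaped like the assumed model input:
# (batch_size, traj_len, 2) coordinates plus a matching binary mask.
batch_size, traj_len = 128, 30
subtrips = np.random.randn(batch_size, traj_len, 2)

# 1 = observed position, 0 = position to be imputed.
masks = np.ones((batch_size, traj_len, 2), dtype=int)
masks[:, 1::2, :] = 0  # every other timestep missing, as in 2x upsampling
```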
The original publication of the RITS architecture, "BRITS: Bidirectional Recurrent Imputation for Time Series" by Wei Cao, Dong Wang, Jian Li, Hao Zhou, Lei Li, and Yitan Li (NeurIPS 2018), can be found here.
The TensorFlow implementation of the original RITS architecture used in this project can be found here.