
EigenTrajectory: Low-Rank Descriptors for Multi-Modal Trajectory Forecasting

Inhwan Bae · Jean Oh · Hae-Gon Jeon
ICCV 2023

Project Page · ICCV Paper · Source Code · Related Works



A common pipeline of trajectory prediction models and the proposed EigenTrajectory.


This repository contains the code for the EigenTrajectory(𝔼𝕋) space applied to 10 traditional Euclidean-based trajectory predictors.
EigenTrajectory-LB-EBM achieves ADE/FDE of 0.21/0.34 while requiring only 1 hour for training!


🌌 EigenTrajectory(𝔼𝕋) Space 🌌

  • A novel trajectory descriptor based on Singular Value Decomposition (SVD), providing an alternative to traditional parameterizations.
  • It employs a low-rank approximation to reduce complexity and creates a compact space for representing pedestrian movements (see the sketch after this list).
  • A new anchor-based refinement method effectively covers all potential futures.
  • It can significantly improve existing standard trajectory predictors simply by replacing the Euclidean space.
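
To illustrate the core idea, here is a minimal NumPy sketch of building a low-rank descriptor space with a truncated SVD and projecting trajectories into and out of it. The array layout (N, T, 2), the function names, and the fixed rank are illustrative assumptions, not the repository's actual API.

import numpy as np

def build_descriptor_space(trajectories, rank):
    # trajectories: (N, T, 2) array of N paths of length T in x/y coordinates.
    n, t, d = trajectories.shape
    X = trajectories.reshape(n, t * d)           # one flattened trajectory per row
    _, _, vt = np.linalg.svd(X, full_matrices=False)
    return vt[:rank]                             # (rank, T*d) basis of the compact space

def encode(trajectory, basis):
    # Project a single (T, 2) trajectory onto the basis -> rank coefficients.
    return basis @ trajectory.reshape(-1)

def decode(coefficients, basis, t, d=2):
    # Reconstruct an approximate trajectory from its low-rank coefficients.
    return (basis.T @ coefficients).reshape(t, d)

In this form, a predictor only needs to regress a handful of coefficients per pedestrian instead of every future coordinate, which is where the complexity reduction comes from.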

Model Training

Setup

Environment
All models were trained and tested on Ubuntu 18.04 with Python 3.7, PyTorch 1.12.1, and CUDA 11.1.

Dataset
Preprocessed ETH and UCY datasets are included in this repository, under ./datasets/. The train/validation/test splits are the same as those found in Social-GAN.

You can also download the dataset by running the following script.

./scripts/download_datasets.sh

Baseline models
This repository supports 10 baseline models: AgentFormer, DMRGCN, GPGraph-SGCN, GPGraph-STGCNN, Graph-TERN, Implicit, LBEBM, PECNet, SGCN and Social-STGCNN. We have included model source codes from their official GitHub in the ./baselines/ folder.

If you want to add your own baseline model, simply paste the model code into the ./baselines/ folder and add a few lines of constructor initialization and bridge code (a rough sketch follows).
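
As a rough illustration, the snippet below sketches what such a bridge might look like: a thin wrapper that feeds ET-space inputs to your baseline and maps its output back to descriptor coefficients. The class name, constructor arguments, and attribute names are hypothetical and will differ from the repository's actual interface.

import torch.nn as nn

class MyBaselineBridge(nn.Module):
    # Hypothetical bridge exposing a custom predictor to the EigenTrajectory pipeline.
    def __init__(self, baseline_model, descriptor_dim):
        super().__init__()
        self.baseline = baseline_model      # your model pasted into ./baselines/
        self.head = nn.Linear(baseline_model.hidden_dim, descriptor_dim)

    def forward(self, obs_coefficients, scene_context=None):
        # Observed trajectories arrive already projected into the ET space;
        # the baseline produces features and the head predicts future coefficients.
        features = self.baseline(obs_coefficients, scene_context)
        return self.head(features)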

Train EigenTrajectory

To train our EigenTrajectory on the ETH and UCY datasets at once, we provide a bash script train.sh for simplified execution.

./scripts/train.sh

We provide additional arguments for experiments:

./scripts/train.sh -t <experiment_tag> -b <baseline_model> -c <config_file_path> -p <config_file_prefix> -d <space_separated_dataset_string> -i <space_separated_gpu_id_string>

# Supported baselines: agentformer, dmrgcn, gpgraphsgcn, gpgraphstgcnn, graphtern, implicit, lbebm, pecnet, sgcn, stgcnn
# Supported datasets: eth, hotel, univ, zara1, zara2

# Examples
./scripts/train.sh -b sgcn -d "hotel" -i "1"
./scripts/train.sh -b agentformer -t EigenTrajectory -d "zara2" -i "2"
./scripts/train.sh -b pecnet -c ./config/ -p eigentrajectory -d "eth hotel univ zara1 zara2" -i "0 0 0 0 0"

If you want to train the model with custom hyper-parameters, use trainval.py instead of the script file.

python trainval.py --cfg <config_file_path> --tag <experiment_tag> --gpu_id <gpu_id> 

Model Evaluation

Pretrained Models

We provide pretrained models in the release section. You can download all 10 EigenTrajectory models at once by running the following script.

./scripts/download_pretrained_models.sh

Evaluate EigenTrajectory

To evaluate our EigenTrajectory on all datasets at once, we provide a bash script test.sh for simplified execution.

./scripts/test.sh -t <experiment_tag> -b <baseline_model> -c <config_file_path> -p <config_file_prefix> -d <space_separated_dataset_string> -i <space_separated_gpu_id_string>

# Examples
./scripts/test.sh -b sgcn -d "hotel" -i "1"
./scripts/test.sh -b agentformer -t EigenTrajectory -d "zara2" -i "2"
./scripts/test.sh -b pecnet -c ./config/ -p eigentrajectory -d "eth hotel univ zara1 zara2" -i "0 0 0 0 0"

If you want to evaluate the model individually, you can use trainval.py with custom hyper-parameters.

python trainval.py --test --cfg <config_file_path> --tag <experiment_tag> --gpu_id <gpu_id> 

📖 Citation

If you find this code useful for your research, please cite our trajectory prediction papers :)

💬 LMTrajectory (CVPR'24) 🗨️ | 1️⃣ SingularTrajectory (CVPR'24) 1️⃣ | 🌌 EigenTrajectory (ICCV'23) 🌌 | 🚩 Graph‑TERN (AAAI'23) 🚩 | 🧑‍🤝‍🧑 GP‑Graph (ECCV'22) 🧑‍🤝‍🧑 | 🎲 NPSN (CVPR'22) 🎲 | 🧶 DMRGCN (AAAI'21) 🧶

@inproceedings{bae2023eigentrajectory,
  title={EigenTrajectory: Low-Rank Descriptors for Multi-Modal Trajectory Forecasting},
  author={Bae, Inhwan and Oh, Jean and Jeon, Hae-Gon},
  booktitle={Proceedings of the IEEE/CVF International Conference on Computer Vision},
  year={2023}
}
More Information
@inproceedings{bae2024lmtrajectory,
  title={Can Language Beat Numerical Regression? Language-Based Multimodal Trajectory Prediction},
  author={Bae, Inhwan and Lee, Junoh and Jeon, Hae-Gon},
  booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition},
  year={2024}
}

@inproceedings{bae2024singulartrajectory,
  title={SingularTrajectory: Universal Trajectory Predictor Using Diffusion Model},
  author={Bae, Inhwan and Park, Young-Jae and Jeon, Hae-Gon},
  booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition},
  year={2024}
}

@article{bae2023graphtern,
  title={A Set of Control Points Conditioned Pedestrian Trajectory Prediction},
  author={Bae, Inhwan and Jeon, Hae-Gon},
  journal={Proceedings of the AAAI Conference on Artificial Intelligence},
  year={2023}
}

@inproceedings{bae2022gpgraph,
  title={Learning Pedestrian Group Representations for Multi-modal Trajectory Prediction},
  author={Bae, Inhwan and Park, Jin-Hwi and Jeon, Hae-Gon},
  booktitle={Proceedings of the European Conference on Computer Vision},
  year={2022}
}

@inproceedings{bae2022npsn,
  title={Non-Probability Sampling Network for Stochastic Human Trajectory Prediction},
  author={Bae, Inhwan and Park, Jin-Hwi and Jeon, Hae-Gon},
  booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition},
  year={2022}
}

@article{bae2021dmrgcn,
  title={Disentangled Multi-Relational Graph Convolutional Network for Pedestrian Trajectory Prediction},
  author={Bae, Inhwan and Jeon, Hae-Gon},
  journal={Proceedings of the AAAI Conference on Artificial Intelligence},
  year={2021}
}

Acknowledgement

Part of our code is borrowed from AgentFormer, DMRGCN, GP-Graph, Graph-TERN, Implicit, LB-EBM, PECNet, SGCN and Social-STGCNN. We thank the authors for releasing their code and models.