
HOP-VLN-finetune

This repository contains the fine-tuning code for HOP: History-and-Order Aware Pre-training for Vision-and-Language Navigation. The code is based on Recurrent-VLN-BERT. Thanks to Yicong Hong for releasing the Recurrent-VLN-BERT code.

Prerequisites

Installation

  • Install Docker: please check here to install Docker.
  • Create the container. To pull the image:
    docker pull starrychiao/hop-recurrent:v1
    If your CUDA version is 11.3, pull this image instead:
    docker pull starrychiao/vlnbert-2022-3090:1.0
    To create the container:
    docker run -it --ipc host --shm-size=1024m --gpus all --name your_name --volume "your_directory":/root/mount/Matterport3DSimulator starrychiao/hop-recurrent:v1
    or (if you pulled the image for CUDA 11.3):
    docker run -it --ipc host --shm-size=1024m --gpus all --name your_name --volume "your_directory":/root/mount/Matterport3DSimulator starrychiao/vlnbert-2022-3090:1.0
  • Set up the container (see the consolidated sketch after this list):
    docker start "your container id or name"
    docker exec -it "your container id or name" /bin/bash
    cd /root/mount/Matterport3DSimulator
  • Download the trained models.
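
For reference, a minimal end-to-end sketch of the setup above (the container name hop and the host path /path/to/your_directory are placeholders; substitute your own values and, for CUDA 11.3, use the starrychiao/vlnbert-2022-3090:1.0 image instead):

    # pull the image and create the container
    docker pull starrychiao/hop-recurrent:v1
    docker run -it --ipc host --shm-size=1024m --gpus all --name hop --volume "/path/to/your_directory":/root/mount/Matterport3DSimulator starrychiao/hop-recurrent:v1
    # re-enter the container later and move to the working directory
    docker start hop
    docker exec -it hop /bin/bash
    cd /root/mount/Matterport3DSimulator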

R2R

cd finetune_r2r

Data Preparation

Please follow the instructions below to prepare the data in the corresponding directories:

Initial HOP weights

  • Pre-trained HOP weights: load_model/checkpoint (see the placement sketch below)
    • Download the pytorch_model.bin from here.
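
A minimal placement sketch, assuming load_model/checkpoint is a directory under finetune_r2r and the downloaded file keeps the name pytorch_model.bin:

    # from inside finetune_r2r
    mkdir -p load_model/checkpoint
    # move the downloaded weights into place (the source path is a placeholder)
    mv /path/to/downloaded/pytorch_model.bin load_model/checkpoint/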

Training

bash run/train_agent.bash

Evaluating

bash run/test_agent.bash

NDH

cd finetune_ndh

Data Preparation

Please follow the instructions below to prepare the data in the corresponding directories:

Initial HOP weights

  • Pre-trained HOP weights for NDH: load/model (see the placement sketch below)
    • Download the pytorch_model.bin from here.
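
Similarly, a minimal placement sketch, assuming load/model is a directory under finetune_ndh and the downloaded file keeps the name pytorch_model.bin:

    # from inside finetune_ndh
    mkdir -p load/model
    # move the downloaded weights into place (the source path is a placeholder)
    mv /path/to/downloaded/pytorch_model.bin load/model/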

Training

bash run/train.bash

Evaluating

bash run/test.bash

Citation

If you use or discuss HOP in your work, please cite our paper:

@InProceedings{Qiao2022HOP,
    author    = {Qiao, Yanyuan and Qi, Yuankai and Hong, Yicong and Yu, Zheng and Wang, Peng and Wu, Qi},
    title     = {HOP: History-and-Order Aware Pre-training for Vision-and-Language Navigation},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
    month     = {June},
    year      = {2022},
    pages     = {15418-15427}
}
