
Training LAV

We use an LBC-style privileged distillation framework. Please follow the instructions below for the different training stages.

Make sure you have followed INSTALL.md before proceeding.

All training stages log visualizations and store model weights to the wandb cloud (as well as locally), so make sure you have wandb set up already.
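
If wandb is not set up yet, the standard setup is to install the client and log in (you will be prompted for your API key):

pip install wandb
wandb login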

All of the following training stages assume a multi-GPU machine; on smaller hardware, decrease the batch size accordingly.
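
If the batch size is exposed through config.yaml, reducing it for a single-GPU machine might look like the line below; note that batch_size is a hypothetical key name here, so check the actual config for the keys the training scripts read:

batch_size: 16 # hypothetical key name; reduce from the multi-GPU default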

Dataset

First, download the LAV dataset.

We have released the full set of 3425 trajectories. However, each trajectory is self-contained, so you may download only a subset of them to run the training code. You may also choose to download the split compressed files HERE.

After downloading the dataset, specify the dataset path in the following line of config.yaml:

data_dir: [PATH TO DATASET]
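
For example, assuming the dataset was extracted to a hypothetical location /data/lav:

data_dir: /data/lav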

Privileged Motion Planning

python -m lav.train_bev

You can monitor the training and visualize the progress on your wandb page under the project lav_bev.


Semantic Segmentation

python -m lav.train_seg

Similarly, monitor the progress in the wandb project lav_seg.


RGB Braking Prediction

python -m lav.train_bra

You can monitor the training and visualize the progress on your wandb page under the project lav_bra.


Point Painting

Write the painted lidar points to disk:

python -m lav.data_paint

Full Models

This is divided into two steps.

Perception Pre-training

python -m lav.train_full --perceive-only

Once it is done, update the following line in config.yaml:

lidar_model_dir: [TRAINED MODEL PATH]
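
For example, assuming the perception checkpoint from the previous step was saved to a hypothetical location:

lidar_model_dir: checkpoints/lav_perception # hypothetical path; use wherever your run actually stored the weights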

End-to-end Training

python -m lav.train_full

Visualize the progress on the wandb project page lav_full.


V2 Agent Training

To train the v2 leaderboard agent (used for team_code_v2), append _v2 to each of the training commands above, except for the seg model. For the last stage, additionally run a frozen-perception pretraining stage (--motion-only) before end-to-end fine-tuning, as sketched below.
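
As a sketch, assuming the module names simply follow the _v2 suffix convention described above (verify the actual module names in the repository), the full v2 training sequence might look like:

python -m lav.train_bev_v2
python -m lav.train_seg # the seg model has no _v2 variant
python -m lav.train_bra_v2
python -m lav.data_paint_v2
python -m lav.train_full_v2 --perceive-only # perception pre-training
python -m lav.train_full_v2 --motion-only # frozen-perception pretraining
python -m lav.train_full_v2 # end-to-end fine-tuning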