This is the official code repository accompanying the paper "CityLifeSim: A High-Fidelity Pedestrian and Vehicle Simulation with Complex Behaviors". It provides Python code for generating and running scenarios in the simulation environment, as well as links to the datasets and code used in our experiments.
The dataset contains videos (RGB, depth, and segmentation frames) of six scenarios, each containing a total of 128 pedestrians. One scenario is captured from 17 different points of view (i.e., cameras) to simulate static viewpoints; the others are captured from cameras on moving autonomous vehicles. Here are the download links for each scenario:
- CCTV: 17 cameras covering different waypoints and points of interest
- car_sunny: the front camera on a car during sunny weather
- car_weather: the front camera on a car during rainy weather
- car_snowy: the front camera on a car during snowy weather
- drone_front: the front camera on a drone
- drone_downward: a camera pointing down from a drone
You can download the CityLifeSim executable here.
- code/generate_scenario.py provides an example of how to programmatically create scenarios. (TRAVERSE_TYPE: random, a_star)
$ python generate_scenario.py --traverse_type <TRAVERSE_TYPE> --out_file <SCENARIO_FILE_NAME>.csv
- CityLife_randomwalk_128_v6.csv shows an example of the CSV output that is generated.
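The `random` traverse type above can be sketched as follows. This is a generic illustration, not the actual implementation in `generate_scenario.py`; in particular, the CSV column layout (pedestrian ID followed by x,y waypoint pairs) is an assumption, not the real CityLifeSim schema.

```python
import csv
import random

def random_walk(n_steps, x_range=(-100.0, 100.0), y_range=(-100.0, 100.0), seed=None):
    """Generate a random sequence of (x, y) waypoints for one pedestrian."""
    rng = random.Random(seed)
    return [(rng.uniform(*x_range), rng.uniform(*y_range)) for _ in range(n_steps)]

def write_scenario(path, n_peds=128, n_steps=10, seed=0):
    """Write one CSV row per pedestrian: id, then x,y pairs for each waypoint.
    NOTE: this column layout is hypothetical, not the real CityLifeSim schema."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        for ped_id in range(n_peds):
            row = [ped_id]
            for x, y in random_walk(n_steps, seed=seed + ped_id):
                row += [round(x, 2), round(y, 2)]
            writer.writerow(row)

write_scenario("scenario_randomwalk.csv")
```

An A*-based traverse would instead plan each pedestrian's waypoints along a shortest path between sampled start and goal points on the city's walkable graph.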
Set up the CityLifeSim Python client environment
- Install Anaconda and open the Anaconda Prompt
- Create the conda env for CityLifeSim
$ conda env create -f \CityLife_v1\citylifesim.yml
$ conda activate citylifesim
The CityLifeSim Python client currently runs on AirSim 1.5.0. Newer versions won't work due to syntax changes in some of the AirSim APIs.
Run CityEnv.exe
- Please check the AirSim guide on how to move around the environment in the different modes (ComputerVision, Car, Multirotor)
- Modify the settings.json in Documents\AirSim based on your needs: use ComputerVision mode for the cctv camera mode, Car mode for the car camera mode, and Multirotor mode for the drone camera mode
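A minimal settings.json for the cctv (ComputerVision) mode might look like the sketch below. `SettingsVersion` and `SimMode` are standard AirSim fields; switching `SimMode` to `"Car"` or `"Multirotor"` selects the other camera modes. Any additional fields your setup needs (cameras, recording options) are described in the AirSim settings documentation.

```json
{
  "SettingsVersion": 1.2,
  "SimMode": "ComputerVision"
}
```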
Prepare pedestrian scenarios
- Put the scenario CSV file in the \CityLifeSim\WindowsNoEditor\CityEnv\Saved folder
Run the pedestrian scenario simulation (CAM_MODE: cctv, car, drone)
$ python run_scenario.py --ped_scenario <SCENARIO_FILE_NAME> --cam_mode <CAM_MODE> --recording
- You can add --car_scenario <SCENARIO_FILE_NAME> to run the car scenario at the same time. We currently provide Scenario_[1-100] for testing.
- Please check CausalCity for generating car scenarios.
- The recorded RGB-D image folders are saved by default in the Documents folder (or the location specified in settings.json), named with the timestamp of when the recording started in %Y-%m-%d-%H-%M-%S format.
- Environmental variables (e.g., weather, time of day) that can act as confounders in a dataset can be controlled using the AirSim APIs. Please check the AirSim documentation.
- code/control_trafficlight.py provides an example of how to programmatically control the traffic lights, overriding the default behavior.
- To generate bounding boxes:
$ python seg2bbox.py --folder <RGB-D FOLDER> --seg_rgbs <FILE_PATH> --save_image
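The segmentation-to-bounding-box conversion that seg2bbox.py performs can be approximated as below. This is a generic NumPy sketch, not the script's actual implementation; it assumes each entry in the seg_rgbs file maps one segmentation color to one pedestrian instance.

```python
import numpy as np

def seg_to_bbox(seg_image, instance_rgb):
    """Return (x_min, y_min, x_max, y_max) for the pixels matching one
    instance color in a segmentation frame, or None if the instance is
    not visible. seg_image is an (H, W, 3) uint8 array."""
    mask = np.all(seg_image == np.asarray(instance_rgb, dtype=seg_image.dtype), axis=-1)
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None  # instance occluded or out of frame
    return int(xs.min()), int(ys.min()), int(xs.max()), int(ys.max())
```

Running this per instance color over every segmentation frame yields the per-pedestrian boxes that are saved to peds_bbox.json.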
- Read peds_bbox.json and plot the bounding boxes
$ python vis_bbox.py --folder <RGB-D FOLDER> --image_id <RGB_IMAGE_ID>
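Loading the boxes back for one frame (as vis_bbox.py does before plotting) might look like this. The peds_bbox.json layout shown, mapping image IDs to per-pedestrian [x_min, y_min, x_max, y_max] lists, is an assumption about the file's schema, not its documented format.

```python
import json

def load_bboxes(bbox_path, image_id):
    """Return the pedestrian boxes recorded for one RGB frame.
    Assumed schema: {image_id: {ped_id: [x_min, y_min, x_max, y_max]}}."""
    with open(bbox_path) as f:
        all_boxes = json.load(f)
    return all_boxes.get(str(image_id), {})
```

Each returned box can then be drawn over the matching RGB frame, e.g. as a rectangle patch in matplotlib.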
The following Colab downloads CityLifeSim into your drive, applies a state-of-the-art multi-object tracking (MOT) method, and evaluates it. We leverage the work of (Zhang et al., 2021). For more details, please refer to the paper or dive into the code.