MIT-REALM/dcrl

Density Constrained Reinforcement Learning

This repository contains an implementation of Density Constrained Reinforcement Learning (DCRL), accepted at ICML 2021.

Install

Clone this repository:

git clone https://github.com/Zengyi-Qin/dcrl.git

Create a virtual environment with Python 3.6 using Anaconda:

conda create -n dcrl python=3.6
conda activate dcrl

Install the dependencies:

pip install -r requirements.txt

Add dcrl to your PYTHONPATH:

export PYTHONPATH=$PYTHONPATH:'path/to/dcrl'
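To confirm the path is set correctly, a quick sanity check can be run (this assumes the repository's top-level package is importable as dcrl; the package name is an assumption, not stated above):

```shell
# Sanity check: should exit without an ImportError once PYTHONPATH is set.
# Assumes the repository exposes a top-level package named `dcrl`.
python -c "import dcrl"
```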

Training

python examples/FILE --mode train --constrained 1 --output OUTPUT_DIR
Environment                                FILE
Autonomous electric vehicle routing        ddpg_aev.py
Agricultural pesticide spraying drone      ddpg_farm.py
Direct current series motor control        ddpg_motor.py

The --constrained flag takes two values: 0 trains an unconstrained baseline and 1 enables the proposed DCRL approach.
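For example, to train the autonomous electric vehicle routing environment both with and without the constraint (the output directory names here are only illustrative):

```shell
# Train with the density constraint (the proposed DCRL approach)
python examples/ddpg_aev.py --mode train --constrained 1 --output out_dcrl

# Train an unconstrained baseline for comparison
python examples/ddpg_aev.py --mode train --constrained 0 --output out_baseline
```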

Testing

python examples/FILE --mode test --constrained 1 --output OUTPUT_DIR --weights WEIGHTS_PATH

WEIGHTS_PATH points to a weight file saved under OUTPUT_DIR/weights during training. For example, after training the autonomous electric vehicle routing environment, a weight file could be OUTPUT_DIR/weights/ddpg_aev_50.h5f.

To test all the weights in the output folder, run:

python examples/FILE --mode test_all --constrained 1 --output OUTPUT_DIR --weights OUTPUT_DIR/weights
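If you prefer to evaluate checkpoints one at a time, the same sweep can be sketched as a shell loop over the saved weight files (FILE and the paths are placeholders, as above):

```shell
# Evaluate each saved checkpoint individually;
# roughly equivalent in spirit to --mode test_all.
for w in OUTPUT_DIR/weights/*.h5f; do
    python examples/ddpg_aev.py --mode test --constrained 1 \
        --output OUTPUT_DIR --weights "$w"
done
```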
