This is the code accompanying the paper "Curiosity-Driven Energy-Efficient Worker Scheduling in Vehicular Crowdsourcing: A Deep Reinforcement Learning Approach" by Yinuo Zhao, Chi Harold Liu, et al., published at ICDE 2020.
DRL-CEWS is a novel deep reinforcement learning (DRL) approach for curiosity-driven, energy-efficient worker scheduling. It seeks an optimal trade-off between maximizing the amount of collected data and coverage fairness while minimizing the overall energy consumption of workers.
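The curiosity signal central to this approach can be illustrated with a toy intrinsic reward: the agent's forward model predicts the next state, and the prediction error becomes a bonus that rewards visiting hard-to-predict (novel) transitions. The dimensions and the linear "forward model" below are illustrative assumptions, not the paper's actual network:

```python
import random

# Toy forward-model weights mapping (state, action) -> predicted next state.
# A real curiosity module would learn these; here they are fixed at random.
random.seed(0)
STATE_DIM, ACTION_DIM = 4, 2
W = [[random.uniform(-1, 1) for _ in range(STATE_DIM)]
     for _ in range(STATE_DIM + ACTION_DIM)]

def predict_next(state, action):
    """Predict the next state from the concatenated (state, action) vector."""
    x = state + action
    return [sum(x[i] * W[i][j] for i in range(len(x)))
            for j in range(STATE_DIM)]

def intrinsic_reward(state, action, next_state, eta=0.5):
    """Curiosity bonus: eta/2 times the squared forward-prediction error."""
    pred = predict_next(state, action)
    return 0.5 * eta * sum((n - p) ** 2 for n, p in zip(next_state, pred))
```

A transition the forward model predicts perfectly yields zero bonus, so the agent is steered toward under-explored parts of the state space.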
- Clone the repo

git clone https://github.com/BIT-MCS/DRL-CEWS.git
cd DRL-CEWS
- Install the required packages
pip install -r requirements.txt
Test the model trained with 100 PoIs (2 UAVs and 2 charging stations).

Download the model from Google Drive to ckpt/. Then set trainable to False in the parameter configuration file /uav2_charge2/exper_dppo_curiosity/params.py. After that, run the following command to test the model.

python run.py

Finally, find the results under /result.
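The trainable switch toggled above might look like the following in params.py (a sketch of the relevant flag only; the actual layout of the repo's configuration file may differ):

```python
# /uav2_charge2/exper_dppo_curiosity/params.py (relevant flag, illustrative)
trainable = False  # False: load the checkpoint from ckpt/ and run inference
                   # True:  train the model from scratch
```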
Set trainable to True in the parameter configuration file /uav2_charge2/exper_dppo_curiosity/params.py, then run the following command to train the model.

python run.py

Find the results under /result.
Same as that in Quick Inference.
This work was supported by the National Natural Science Foundation of China (No. 61772072).
If you have any questions, please email ynzhao@bit.edu.cn.
If you are interested in our work, please cite our paper as:
@inproceedings{liu2020curiosity,
title={Curiosity-driven energy-efficient worker scheduling in vehicular crowdsourcing: A deep reinforcement learning approach},
author={Liu, Chi Harold and Zhao, Yinuo and Dai, Zipeng and Yuan, Ye and Wang, Guoren and Wu, Dapeng and Leung, Kin K},
booktitle={2020 IEEE 36th International Conference on Data Engineering (ICDE)},
pages={25--36},
year={2020},
organization={IEEE}
}