DQN replaybuffer missing #9
Comments
That's a third-party library for reinforcement learning. You need to install it with pip.
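For reference, the missing package is on PyPI; a minimal install-and-check (assuming the PyPI name `stable-baselines3`, which matches the import path in the error) looks like:

```shell
# Install the third-party dependency that provides the DQN replay buffer
pip install stable-baselines3

# Verify that the import from the traceback now resolves
python -c "from stable_baselines3.common.buffers import ReplayBuffer; print('ok')"
```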
Ok. Thanks!
Hi sir, I don't know how to run train.sh. Can you provide more information for me?
You can run `train.sh` from the project directory on the command line, or run the training script directly from the project directory:

```shell
python train.py --algorithm_name mappo --experiment_name check1 --scenario RANDOM --accelerate 1200 --seed 1 --n_training_threads 4 --n_rollout_threads 42 --num_mini_batch 1 --num_env_steps 1512000 --ppo_epoch 10 --gain 0.01 --gamma 0.99 --lr 5e-4 --critic_lr 5e-4 --value_loss_coef 1 --log_level NOTICE --log_interval 1 --w_qos 4 --w_xqos 0.005
```
Sorry, I don't understand. Can you explain a bit more? This is my first time working on this, and I'm trying to learn, so please help me.
Hello, I really appreciate the code you provided, but I can't find the replay buffer for DQN. The error comes from `from stable_baselines3.common.buffers import ReplayBuffer`; there seems to be no such file in your project.
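To clarify what that import provides: `ReplayBuffer` is not part of this repo; it is stable-baselines3's experience-replay store for off-policy algorithms like DQN. A minimal sketch of the same idea (an illustration using only the standard library, not the library's actual implementation):

```python
import random
from collections import deque

class SimpleReplayBuffer:
    """Fixed-size FIFO store of transitions with uniform random sampling,
    sketching what a DQN replay buffer does."""

    def __init__(self, capacity):
        # deque with maxlen evicts the oldest transition once full
        self.buffer = deque(maxlen=capacity)

    def add(self, obs, action, reward, next_obs, done):
        self.buffer.append((obs, action, reward, next_obs, done))

    def sample(self, batch_size):
        # Uniform sampling breaks the temporal correlation between
        # consecutive transitions, which stabilizes DQN training
        return random.sample(self.buffer, batch_size)

    def __len__(self):
        return len(self.buffer)
```

In practice you would just install stable-baselines3 and use its `ReplayBuffer` rather than this sketch.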