Question on training function #19
Comments
Hi! So this is a stateful LSTM implementation, so the cell state is kept and carried forward through time: the cell state at step 20 is the input to the LSTM cell at step 21. What is done here:
The hx, cx outputs of the LSTM cell are volatile Variables and cannot be used for BPTT, so we create new Variables from the underlying data in the hx, cx Variables; those are then ready for BPTT in the next update.
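To illustrate the idea, here is a minimal sketch (illustrative names and sizes, not the repo's actual code): keep the LSTM state across steps, but break the autograd graph at an update boundary so the next update can still backpropagate through time over its own window. In current PyTorch, `Tensor.detach()` plays the role of the old `Variable(hx.data)` wrapping:

```python
import torch
from torch import nn

# Hypothetical stateful LSTM step; sizes are illustrative.
cell = nn.LSTMCell(input_size=4, hidden_size=8)
hx = torch.zeros(1, 8)
cx = torch.zeros(1, 8)

x = torch.randn(1, 4)
hx, cx = cell(x, (hx, cx))  # hx, cx now carry autograd history

# Old-style: hx, cx = Variable(hx.data), Variable(cx.data)
# Modern equivalent: keep the values, drop the graph behind them.
hx, cx = hx.detach(), cx.detach()
```

After the detach, `hx` and `cx` still hold the same numbers, but gradients from future losses stop flowing past this point.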
Hi, thanks for your reply. Your action_train function is executed on every training step, and self.done is always False until the env resets, so you are actually setting
nearly every time step. If you check the project you reference, https://github.com/ikostrikov/pytorch-a3c, it doesn't have this problem, because it sets
only every args.num_steps, instead of every step.
Hmm, you're right, it looks like I changed something here. I'll take a look in a little bit, but I'm very busy at the moment.
Oh, I don't think it's the problem of GPU/CPU.
is fine for both GPU and CPU. The problem is that you don't want to put these two lines in the "else" condition: that makes them execute on every time step, except when the episode terminates (self.done = True). What you want is to execute these two lines only every args.num_steps (in your setting, args.num_steps = 20).
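A sketch of the placement being suggested (hypothetical names; args.num_steps = 20 as in the thread): the state flows intact through the whole rollout, and the graph is cut only at the rollout boundary or on episode end, not inside the per-step loop.

```python
import torch
from torch import nn

num_steps = 20  # args.num_steps in the thread
cell = nn.LSTMCell(4, 8)
hx = torch.zeros(1, 8)
cx = torch.zeros(1, 8)
done = False  # stands in for self.done

for t in range(num_steps):       # one rollout of num_steps env steps
    x = torch.randn(1, 4)
    hx, cx = cell(x, (hx, cx))   # state keeps its history across the rollout
    # ... select action, step the env, record rewards ...

# Only here, once per rollout, reset or truncate the state:
if done:
    hx = torch.zeros(1, 8)       # fresh state for a new episode
    cx = torch.zeros(1, 8)
else:
    hx = hx.detach()             # truncate BPTT at the update boundary
    cx = cx.detach()
```

Detaching inside the loop instead would cut the graph at every step, so the backward pass could never reach back through the 20-step window.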
It's fixed now, should be fine. Thanks! Wow, thanks for spotting that; I had not noticed this error in the repo. My version is not linked to GitHub, and I had just been checking using trained models, and the test part was fine, lol. Good spot! For clarity: none of the posted models' final performance numbers were trained with this bug in the code. Thanks again!
I noticed that in your player_util.py action_train function:
But how can you backpropagate gradients through time, over the past 20 steps, if you set: