Skip to content
New issue

Have a question about this project? Sign up for a free GitHub account to open an issue and contact its maintainers and the community.

By clicking “Sign up for GitHub”, you agree to our terms of service and privacy statement. We’ll occasionally send you account related emails.

Already on GitHub? Sign in to your account

In the definition of FixedNormal class have some wrong code. #39

Closed
chillybird opened this issue Apr 21, 2022 · 4 comments
Closed

In the definition of FixedNormal class have some wrong code. #39

chillybird opened this issue Apr 21, 2022 · 4 comments

Comments

@chillybird
Copy link

chillybird commented Apr 21, 2022

# Normal
class FixedNormal(torch.distributions.Normal):
def log_probs(self, actions):
return super().log_prob(actions).sum(-1, keepdim=True)
def entrop(self):
return super.entropy().sum(-1)
def mode(self):
return self.mean

@chillybird
Copy link
Author

class method entrop is wrong.

@chillybird
Copy link
Author

I just found that the modified code is wrong, so the code with continuous action can not be modified.

@chillybird
Copy link
Author

If you change it, you can't get the correct entropy.

@zoeyuchao
Copy link
Member

ic, thanks

Sign up for free to join this conversation on GitHub. Already have an account? Sign in to comment
Labels
None yet
Projects
None yet
Development

No branches or pull requests

2 participants