
Adding categorical cross_entropy loss to yaml config. #276

Open
Eirikalb opened this issue Mar 12, 2024 · 1 comment

Eirikalb commented Mar 12, 2024

After reading the recent paper from Google DeepMind, I am really tempted to try their methods myself to benchmark some online RL problems in Omniverse Isaac Gym: https://arxiv.org/abs/2403.03950.

I see some mention of experimental support for it in the code, but I also see in the change notes that cross-entropy loss has not been added to the yaml config yet. What is the easiest/best path for me to try it out using this framework? All my current experiments are based on the standard MSE loss, using standard network architectures.

@Eirikalb Eirikalb changed the title Adding categorical cross_entropy to yaml config. Adding categorical cross_entropy loss to yaml config. Mar 12, 2024
Denys88 (Owner) commented Mar 14, 2024

Hi, I experimented with it here:

def _build_value_layer(self, input_size, output_size, value_type='legacy'):

Yes, it is not in the config yet. I tested two-hot encoding. Feel free to try more options; I can add it to the yaml config.
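For reference, the two-hot encoding mentioned above (one of the categorical value-function targets benchmarked in the arXiv:2403.03950 paper) can be sketched roughly as follows. This is a minimal illustration, not the framework's actual implementation: the function names, the `support` tensor of bin centers, and the loss wrapper are all hypothetical, and a real value head would size its output layer to `len(support)` logits.

```python
import torch

def two_hot(values, support):
    # Encode scalar targets as "two-hot" distributions over fixed bins.
    # values:  (N,) scalar regression targets
    # support: (K,) sorted bin centers
    # Returns a (N, K) target distribution; each row sums to 1.
    values = values.clamp(support[0], support[-1])
    # Index of the neighbouring bin above each value.
    upper_idx = torch.searchsorted(support, values).clamp(1, len(support) - 1)
    lower_idx = upper_idx - 1
    lower, upper = support[lower_idx], support[upper_idx]
    # Weight on the lower bin grows as the value approaches it.
    w_lower = (upper - values) / (upper - lower)
    target = torch.zeros(values.shape[0], len(support))
    target.scatter_(1, lower_idx.unsqueeze(1), w_lower.unsqueeze(1))
    target.scatter_(1, upper_idx.unsqueeze(1), (1.0 - w_lower).unsqueeze(1))
    return target

def two_hot_loss(logits, values, support):
    # Categorical cross-entropy between predicted logits and two-hot targets,
    # replacing the usual MSE value loss.
    log_probs = torch.log_softmax(logits, dim=-1)
    return -(two_hot(values, support) * log_probs).sum(-1).mean()
```

A target that falls exactly on a bin center collapses to a one-hot vector, and a target halfway between two bins splits the mass 0.5/0.5, so the encoding is an invertible, piecewise-linear representation of the scalar value.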
