
Refactor losses into layers #598

Open
albertz opened this issue Aug 31, 2021 · 1 comment
albertz commented Aug 31, 2021

Multiple things:

  • Loss becomes a subclass of LayerBase
  • Loss instances will be treated as normal layers, so the standard layer name logic (e.g. for moving layers out of a recurrent loop) will apply to them.

This should greatly clean up the complexity we currently have with LayerBase.get_losses and LossHolder.

This should also fix some bugs along the way, e.g. #556.
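The proposed structure could be sketched roughly as follows. This is a hypothetical, self-contained illustration only: `LayerBase` here is a minimal stand-in, not RETURNN's actual class, and the names `get_loss_value`, `ConstLayer`, and `scale` are made up for the example.

```python
# Hypothetical sketch of the proposal: Loss as a LayerBase subclass,
# so loss instances participate in normal layer logic (naming, sources).
# All classes here are minimal stand-ins, not RETURNN's real API.

class LayerBase:
    def __init__(self, name, sources=()):
        self.name = name
        self.sources = list(sources)

    def get_absolute_name(self):
        # Simplified: real layers have hierarchical names (rec loops, subnets).
        return "/" + self.name


class Loss(LayerBase):
    """A loss would just be a layer whose output is a scalar loss value."""

    def __init__(self, name, sources=(), scale=1.0):
        super().__init__(name=name, sources=sources)
        self.scale = scale

    def get_loss_value(self):
        # Placeholder: a real implementation would reduce the layer output.
        return sum(getattr(src, "value", 0.0) for src in self.sources) * self.scale


class ConstLayer(LayerBase):
    """Trivial source layer holding a constant value, for the example."""

    def __init__(self, name, value):
        super().__init__(name=name)
        self.value = value


pred = ConstLayer("pred", value=2.5)
loss = Loss("ce_loss", sources=[pred], scale=2.0)
print(loss.get_absolute_name())  # -> /ce_loss
print(loss.get_loss_value())     # -> 5.0
```

Because the loss is just another layer, anything that walks the layer graph (name resolution, moving layers out of a rec loop) would handle it with no special casing, which is the point of dropping LossHolder.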

albertz commented Oct 14, 2022

Note that with RETURNN-common, this is not so much an issue anymore: RETURNN-common only uses the AsIsLoss, and all losses are already defined via normal layers.
