Remove deprecated generated layers #29
@Zettelkasten @Atticus1806 @JackTemaki opinions?
Yes, no deprecated layer should appear in the first "release" of the automatically generated API.
Another option I'm thinking about, which might make this less important: maybe we should not directly export the generated layer modules to the user at all (the …).
(Very similar also to how we did many things in PyTorch-to-RETURNN.)
Edit: Ok, we have #30 now, i.e. functional layer modules do not get exported but their function will be instead, all automatically. But this does not fully solve this issue here: there might also be functional layers which are deprecated, and then there are also potential non-functional deprecated layers.
I agree with @JackTemaki that we should not have layers which we mark as deprecated in the generated code.
As long as this covers all of the options a user would have with this layer (in the example), this would be fine in my opinion. But I am not sure if it hurts to provide a nice interface as access to the layer via `reduce` but still make `Reduce` available (using this as an example for the general case).
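To make the `reduce`/`Reduce` distinction (and the mechanism from #30) concrete, here is a minimal sketch of how a functional wrapper could be exported instead of the generated module. The names `_Reduce` and `reduce` and their arguments are hypothetical here, not the actual generated API:

```python
# Hypothetical sketch only, not the real returnn_common generated code.
from typing import Any


class _Reduce:
    """Auto-generated module for RETURNN's reduce layer; not exported to the user."""

    def __init__(self, *, mode: str, axis: Any):
        self.mode = mode
        self.axis = axis

    def __call__(self, source) -> dict:
        # Would create the corresponding RETURNN layer dict (simplified here).
        return {"class": "reduce", "mode": self.mode, "axis": self.axis, "from": source}


def reduce(source, *, mode: str, axis: Any) -> dict:
    """Exported functional wrapper; this is what the user imports and calls."""
    return _Reduce(mode=mode, axis=axis)(source)
```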
But …
I see, but I am not sure whether that makes it a bit tedious to keep managing interfaces for all these layers over the versions of RETURNN, because from what I understand that is what would need to happen then.
PyTorch also provides both in certain cases, such as the activations.
I don't understand. What do you mean? We generate the generated layer modules automatically. We just need to have a list of deprecated layers. Even that can maybe be automatically generated by parsing the docstring for "DEPRECATED" or so.
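As an illustration of that last idea, here is a minimal sketch of building such a list by scanning docstrings for a deprecation marker. It assumes the generator already has a mapping from layer name to RETURNN layer class; the helper names are made up:

```python
import inspect


def is_deprecated(layer_class) -> bool:
    """Heuristic: treat a layer class as deprecated if its docstring says so."""
    doc = inspect.getdoc(layer_class) or ""
    return "DEPRECATED" in doc.upper()


def collect_deprecated_layer_names(layer_classes: dict) -> set:
    """layer_classes: mapping layer name -> layer class, as the generator has it anyway."""
    return {name for name, cls in layer_classes.items() if is_deprecated(cls)}


# The generator would then simply skip these names when emitting modules/functions.
```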
I assume PyTorch mostly has the … We could make a variant of … We do not have to copy exactly every single aspect of PyTorch. We always should think whether it makes sense for us. Ok, for functions with options, e.g. …
instead of …
Although you could also argue that …
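For reference, this is the PyTorch pattern referred to above, where activations exist both as modules and as plain functions (standard PyTorch API, shown only to illustrate the module-vs-function duality under discussion):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

x = torch.randn(4, 8)

# Module form: handy inside nn.Sequential, shows up as a named child module.
relu_module = nn.ReLU()
y_module = relu_module(x)

# Functional form: a plain function call, no module object needed.
y_functional = F.relu(x)

assert torch.equal(y_module, y_functional)
```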
Many of them are removed now. I think we can close this.
This is more a question at this point:
There are a couple of deprecated layers.
- `SelfAttentionLayer` (so the `SelfAttention` module), which can be constructed more explicitly via `CumConcatLayer` etc. We should provide one implementation for self attention, but maybe directly using other atomic layers (`CumConcatLayer` etc).
- `GaussWindowAttention`

(We should specify a full list here.)
So, should we remove them now? Better now than later when people start to use them.