This repository has been archived by the owner on Nov 3, 2023. It is now read-only.

[Modular] Update Transformer Layer __init__s #4061

Merged
klshuster merged 1 commit into main from transformer_inits on Oct 8, 2021
Conversation

klshuster (Contributor)

Patch description
The TransformerDecoder(Layer) and TransformerEncoder(Layer) support passing **kwargs for maximum customizability; however, these **kwargs were unnecessarily (and, in fact, incorrectly) being forwarded to nn.Module via super().__init__(), which does not accept arbitrary **kwargs. This patch removes them from the super() call.
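A minimal sketch of the pattern being fixed; the class name and signature below are simplified stand-ins for the real ParlAI modules, not the actual code:

```python
import torch.nn as nn


class TransformerEncoderLayer(nn.Module):
    """Simplified stand-in for the real layer; the signature is illustrative."""

    def __init__(self, opt, n_heads=2, **kwargs):
        # Before this patch, the extra kwargs were forwarded to nn.Module:
        #     super().__init__(**kwargs)
        # nn.Module.__init__ does not accept arbitrary keyword arguments, so
        # any unconsumed customization kwargs raised a TypeError at construction.
        super().__init__()  # the fix: stop forwarding **kwargs to nn.Module
        self.opt = opt
        self.n_heads = n_heads
```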

Testing steps
Subclassed a TransformerEncoder, attempted to initialize it with arbitrary parameters, and confirmed it works after the fix.
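A rough sketch of that check, continuing from the simplified layer above; the subclass and its parameters are hypothetical:

```python
class MyCustomEncoderLayer(TransformerEncoderLayer):
    """Hypothetical subclass adding its own constructor parameter."""

    def __init__(self, opt, my_custom_flag=False, **kwargs):
        super().__init__(opt, **kwargs)
        self.my_custom_flag = my_custom_flag


# 'some_other_option' is not consumed by either class, so it lands in the
# parent's **kwargs. Before the fix it would then leak into nn.Module's
# __init__ and raise a TypeError; after the fix, construction succeeds.
layer = MyCustomEncoderLayer(opt={}, my_custom_flag=True, some_other_option=7)
assert layer.my_custom_flag
```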

jxmsML (Contributor) left a comment

LGTM! Thanks for the fix!

@klshuster klshuster merged commit f034747 into main Oct 8, 2021
@klshuster klshuster deleted the transformer_inits branch October 8, 2021 15:05