
Replace BertLayerNorm with LayerNorm #14385

Merged
merged 1 commit into huggingface:master on Nov 15, 2021

Conversation

eldarkurtic
Contributor

Running Movement pruning experiments with the newest HuggingFace would crash due to non-existing BertLayerNorm.

Contributor

@VictorSanh left a comment


lgtm, thank you @eldarkurtic !

@VictorSanh VictorSanh merged commit 9fd937e into huggingface:master Nov 15, 2021
Albertobegue pushed a commit to Albertobegue/transformers that referenced this pull request Jan 27, 2022
@yeeeqichen

@VictorSanh @eldarkurtic

Hello guys!

Thanks for your effort on such a great repo!

I'm wondering why the implementation of LayerNorm in BERT doesn't seem to match the commonly discussed version of LayerNorm.
I've read several tutorials about LayerNorm, and according to their explanations, the implementation looks like the following:

import torch

batch_size, sequence_length, hidden_size = 2, 4, 8
input_x = torch.rand(batch_size, sequence_length, hidden_size)
# normalize jointly over the sequence_length and hidden_size dims
layer_norm = torch.nn.LayerNorm([sequence_length, hidden_size])
output = layer_norm(input_x)

In the implementation above, normalization is computed over both the sequence_length and hidden_size dimensions, but the implementation in this repo normalizes only over the last dimension, which I think is somewhat like InstanceNorm?
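For context, the two variants can be compared directly. This is a minimal sketch (not code from the repo) assuming the BERT-style usage, where `torch.nn.LayerNorm` is constructed with only the hidden size as `normalized_shape`, so each token vector is normalized independently; passing `[sequence_length, hidden_size]` instead normalizes jointly across all tokens in a sequence, and the two generally give different outputs on the same input:

```python
import torch

torch.manual_seed(0)
batch_size, seq_len, hidden = 2, 4, 8
x = torch.rand(batch_size, seq_len, hidden)

# BERT-style: normalized_shape = hidden only, so the mean/variance
# are computed independently for each token's hidden vector.
ln_last = torch.nn.LayerNorm(hidden, elementwise_affine=False)

# Manual equivalent: statistics over the last dim only.
mu = x.mean(dim=-1, keepdim=True)
var = x.var(dim=-1, unbiased=False, keepdim=True)
manual = (x - mu) / torch.sqrt(var + ln_last.eps)
assert torch.allclose(ln_last(x), manual, atol=1e-6)

# Tutorial-style: statistics over (seq_len, hidden) jointly.
ln_both = torch.nn.LayerNorm([seq_len, hidden], elementwise_affine=False)

# Same input, different normalization axes, different outputs.
assert not torch.allclose(ln_last(x), ln_both(x))
```

Both are valid uses of `torch.nn.LayerNorm`; they just differ in which axes the statistics are taken over. Note this per-token variant is still not InstanceNorm in the usual sense: InstanceNorm normalizes over spatial positions separately for each channel, whereas here each position is normalized over its feature (channel) dimension.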

3 participants