
Expert Attention Fixes: #350

Merged
merged 1 commit into master from test_390808726 on Sep 7, 2021

Conversation

copybara-service[bot]
Contributor

Expert Attention Fixes:

  • Allow moe.py to work with a tensor that has a "memory_length" dimension.
  • Fix an Experts Attention bug in moe.py where decoding would break if the input dimension differed from the output dimension.
  • Fix a bug in ExpertsEncDecAttention where it was performing only self-attention on the decoder side, instead of attending over the encoder output.
  • Factor the expert_computation code so that different query and memory antecedents can easily be used (see the sketch after this list).
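Below is a minimal sketch of the pattern these fixes describe, in plain Python with NumPy standing in for mesh-tensorflow. All names here (expert_computation, query_antecedent, memory_antecedent, ExpertsSelfAttention, context.encoder_output) are illustrative assumptions rather than the repository's actual API; the point is that factoring out expert_computation lets self-attention and enc-dec attention share one code path, with a memory length that can differ from the query length.

```python
import numpy as np


def expert_computation(query_antecedent, memory_antecedent):
    """Shared attention computation (toy stand-in, not the real moe.py code).

    query_antecedent:  [batch, query_length, d_model]
    memory_antecedent: [batch, memory_length, d_model]

    Taking the two antecedents as separate arguments is what allows a
    "memory_length" dimension distinct from the query length, e.g. during
    incremental decoding.
    """
    # Dot-product scores over the memory axis, then a softmax.
    scores = np.einsum("bqd,bmd->bqm", query_antecedent, memory_antecedent)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return np.einsum("bqm,bmd->bqd", weights, memory_antecedent)


class ExpertsSelfAttention:
    def call(self, context, x):
        # Self-attention: queries and memories come from the same tensor.
        return expert_computation(query_antecedent=x, memory_antecedent=x)


class ExpertsEncDecAttention(ExpertsSelfAttention):
    def call(self, context, x):
        # The bug described above: the decoder previously passed x for both
        # antecedents, i.e. plain self-attention. Enc-dec attention must
        # instead attend over the encoder output.
        return expert_computation(query_antecedent=x,
                                  memory_antecedent=context.encoder_output)


# Tiny usage check: one decode step (query_length=1) attending over a
# 5-position encoder memory; the output keeps the query's shape.
class Ctx:
    pass


rng = np.random.default_rng(0)
ctx = Ctx()
ctx.encoder_output = rng.standard_normal((2, 5, 8))
x = rng.standard_normal((2, 1, 8))
out = ExpertsEncDecAttention().call(ctx, x)
assert out.shape == (2, 1, 8)
```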


google-cla bot commented Aug 16, 2021

We found a Contributor License Agreement for you (the sender of this pull request), but were unable to find agreements for all the commit authors or co-authors. If you authored these, maybe you used a different email address in the git commits than was used to sign the CLA (log in here to double-check)? If these were authored by someone else, they will need to sign a CLA as well and confirm that they're okay with these being contributed to Google.
In order to pass this check, please resolve this problem and then comment "@googlebot I fixed it.". If the bot doesn't comment, it means it doesn't think anything has changed.

ℹ️ Googlers: Go here for more info.



PiperOrigin-RevId: 395259569
copybara-service[bot] merged commit 69bd9c7 into master on Sep 7, 2021
copybara-service[bot] deleted the test_390808726 branch on September 7, 2021 at 16:03
