This repository has been archived by the owner on Nov 22, 2022. It is now read-only.

fix weight load for new fairseq checkpoints #1189

Closed

Conversation

jingfeidu
Contributor

Summary: Some parameters have been changed in the fairseq multihead attention module. Make the corresponding changes in translate_roberta_state_dict to keep it compatible.

Reviewed By: liaimi

Differential Revision: D18939274

fbshipit-source-id: 22e8898863eb8b98a2376781874fceeee15bbdff
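For context, newer fairseq checkpoints store the attention input projection as separate q/k/v weights rather than one fused tensor, which is the kind of parameter change a state-dict translation like translate_roberta_state_dict has to absorb. The sketch below is a hypothetical illustration of that remapping (the function name and the assumption that the consumer expects a fused `in_proj_weight` are mine, not taken from this PR):

```python
import torch


def merge_qkv_projections(state_dict):
    """Hypothetical sketch: fuse the split q/k/v projection weights found in
    newer fairseq checkpoints back into a single in_proj_weight (and
    in_proj_bias, when biases are present) for consumers that expect the
    older fused layout."""
    out = dict(state_dict)
    # Every key ending in "q_proj.weight" marks one attention module that
    # uses the new split layout; collect the module prefixes.
    suffix = "q_proj.weight"
    prefixes = {k[: -len(suffix)] for k in state_dict if k.endswith(suffix)}
    for p in prefixes:
        # Concatenate q, k, v along dim 0: three (embed_dim, embed_dim)
        # tensors become one (3 * embed_dim, embed_dim) tensor.
        out[p + "in_proj_weight"] = torch.cat(
            [out.pop(p + n + "_proj.weight") for n in ("q", "k", "v")], dim=0
        )
        bias_keys = [p + n + "_proj.bias" for n in ("q", "k", "v")]
        if all(b in out for b in bias_keys):
            out[p + "in_proj_bias"] = torch.cat(
                [out.pop(b) for b in bias_keys], dim=0
            )
    return out
```

The reverse direction (splitting a fused tensor into q/k/v chunks with `torch.chunk(w, 3, dim=0)`) follows the same pattern when loading old checkpoints into new module code.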
@facebook-github-bot added the CLA Signed and fb-exported labels on Dec 11, 2019
@facebook-github-bot
Contributor

This pull request was exported from Phabricator. Differential Revision: D18939274

@facebook-github-bot
Contributor

This pull request has been merged in 8597667.
