[Nllb-Moe] Fix nllb moe accelerate issue (#23758)
fix nllb moe accelerate issue
younesbelkada committed May 25, 2023
1 parent d685e33 commit f67dac9
Showing 1 changed file with 1 addition and 1 deletion.

src/transformers/models/nllb_moe/modeling_nllb_moe.py
```diff
@@ -856,7 +856,7 @@ class NllbMoePreTrainedModel(PreTrainedModel):
     config_class = NllbMoeConfig
     base_model_prefix = "model"
     supports_gradient_checkpointing = True
-    _no_split_modules = ["NllbMoeAttention"]
+    _no_split_modules = ["NllbMoeEncoderLayer", "NllbMoeDecoderLayer"]

     def _init_weights(self, module):
         """Initialize the weights"""
```
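For context, `_no_split_modules` is the list of module classes that accelerate's `device_map="auto"` placement must keep on a single device when sharding a model across GPUs and CPU. With only `NllbMoeAttention` listed, accelerate was still free to split the rest of an encoder or decoder layer (the MoE router, experts, and residual connections) across devices; listing the full `NllbMoeEncoderLayer` and `NllbMoeDecoderLayer` keeps each layer intact, which is the usual remedy for device-mismatch errors inside a layer. A minimal sketch of the code path this fix affects; the checkpoint name is illustrative, not part of the commit:

```python
from transformers import AutoModelForSeq2SeqLM

# device_map="auto" asks accelerate to spread the model across available
# devices; it consults _no_split_modules so each NllbMoeEncoderLayer and
# NllbMoeDecoderLayer lands wholly on one device.
model = AutoModelForSeq2SeqLM.from_pretrained(
    "facebook/nllb-moe-54b",  # illustrative checkpoint; any NLLB-MoE weights work
    device_map="auto",
)
```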
