
Commit

Merge pull request #5722 from simpleoier/whisper_lora_fix
Fix LoRA issues when saving all parameters.
mergify[bot] committed Mar 28, 2024
2 parents b37c040 + d2bfe39 commit 3858d84
1 changed file: espnet2/layers/create_adapter_fn.py (5 additions, 0 deletions)
@@ -145,6 +145,11 @@ def create_lora_adapter(
             f"Target modules {target_modules} not found in the base model."
         )
 
+    # Set the model (originally in train mode) to eval mode.
+    # This step avoids merging the LoRA weights a second time
+    # when loading pre-trained checkpoints.
+    model.eval()
+
 
 def create_new_houlsby_module(target_module: torch.nn.Module, bottleneck: int):
     """Create a new houlsby adapter module for the given target module.
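The rationale behind the added `model.eval()` call can be illustrated with a minimal, self-contained sketch. This is not the actual ESPnet or loralib implementation; the class name, rank parameter, and merge bookkeeping below are assumptions for illustration. The key idea is the loralib-style convention that a LoRA layer merges its low-rank delta into the base weight on `eval()` and unmerges it on `train()`, guarded by a `merged` flag so repeated `eval()` calls (e.g. before loading a checkpoint) cannot merge twice:

```python
import torch


class LoRALinear(torch.nn.Linear):
    """Sketch of a Linear layer with loralib-style merge-on-eval semantics."""

    def __init__(self, in_features: int, out_features: int, r: int = 4):
        super().__init__(in_features, out_features)
        # Low-rank factors; zero-initialized so the delta starts at zero
        self.lora_A = torch.nn.Parameter(torch.zeros(r, in_features))
        self.lora_B = torch.nn.Parameter(torch.zeros(out_features, r))
        self.merged = False

    def train(self, mode: bool = True):
        super().train(mode)
        delta = self.lora_B @ self.lora_A
        if mode and self.merged:
            # Entering train mode: unmerge so A/B train independently
            self.weight.data -= delta
            self.merged = False
        elif not mode and not self.merged:
            # Entering eval mode: merge once; the `merged` flag
            # prevents double-merging on repeated eval() calls
            self.weight.data += delta
            self.merged = True
        return self
```

With this convention, putting the freshly adapted model into eval mode right after the adapters are attached means a later `eval()` during checkpoint loading is a no-op rather than a second (corrupting) merge.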
