
fix norm issue #409

Merged · 8 commits into master · Jun 19, 2023
Conversation

kan-bayashi (Owner) commented Jun 19, 2023

The norm for the scale discriminator is not applied correctly, which causes a mismatch between the parameters and the configuration. As a result, the pretrained models cannot be loaded.

To solve this issue, when a parameter mismatch happens during loading, we first remove the norm, then load the parameters, and finally re-apply the norm in post-hook functions.
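A minimal sketch of this load strategy (not the repository's actual code): it uses the classic `torch.nn.utils.weight_norm` / `remove_weight_norm` helpers on a toy `Conv1d`, and the helper name `load_with_norm_fallback` is hypothetical.

```python
# Sketch of the "remove norm -> load -> re-apply norm" fallback, assuming
# the checkpoint was saved from a module without the norm applied.
import torch
import torch.nn as nn
from torch.nn.utils import weight_norm, remove_weight_norm

def load_with_norm_fallback(module, state_dict):
    """Try a direct load; on a parameter mismatch, remove the weight norm,
    load the plain parameters, then re-apply the norm."""
    try:
        module.load_state_dict(state_dict)
    except RuntimeError:
        remove_weight_norm(module)       # drops weight_g / weight_v, restores weight
        module.load_state_dict(state_dict)
        weight_norm(module)              # re-apply the norm after loading

# A checkpoint saved from a plain Conv1d (keys: weight, bias)...
plain = nn.Conv1d(1, 1, 3)
ckpt = plain.state_dict()

# ...loaded into a weight-normed copy (keys: weight_g, weight_v, bias),
# where a strict load would fail with a key mismatch.
normed = weight_norm(nn.Conv1d(1, 1, 3))
load_with_norm_fallback(normed, ckpt)
assert torch.allclose(normed.weight_v.data, plain.weight.data)
```

In the actual fix the re-application is done in post-hook functions rather than a try/except wrapper, but the order of operations is the same.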

It seems that applying weight norm or spectral norm on top of pretrained parameters causes unexpected behavior. Therefore, I changed it to just show a message about the training error when fine-tuning, along with instructions on how to use the pretrained parameters.

Fix the following issues:

@kan-bayashi merged commit 457148f into master on Jun 19, 2023 (22 checks passed)
@kan-bayashi deleted the fix/fix-norm-compatibility branch on June 19, 2023 at 12:22