I'm trying to fine-tune 4.0-v2 using this checkpoint I found: https://huggingface.co/cr941131/sovits-4.0-v2-hubert/tree/main (not sure if it's good or not).

But when I try to start training, this error happens:
```
Traceback (most recent call last):
  File "/home/manjaro/.conda/envs/soft-vc/lib/python3.8/site-packages/torch/multiprocessing/spawn.py", line 69, in _wrap
    fn(i, *args)
  File "/media/manjaro/NVME_2tb/NeuralNetworks/so-vits-svc-v2-44100/train.py", line 112, in run
    scheduler_g = torch.optim.lr_scheduler.ExponentialLR(optim_g, gamma=hps.train.lr_decay, last_epoch=epoch_str - 2)
  File "/home/manjaro/.conda/envs/soft-vc/lib/python3.8/site-packages/torch/optim/lr_scheduler.py", line 583, in __init__
    super(ExponentialLR, self).__init__(optimizer, last_epoch, verbose)
  File "/home/manjaro/.conda/envs/soft-vc/lib/python3.8/site-packages/torch/optim/lr_scheduler.py", line 42, in __init__
    raise KeyError("param 'initial_lr' is not specified "
KeyError: "param 'initial_lr' is not specified in param_groups[0] when resuming an optimizer"
```
Where can I find official checkpoints if that one is bad?
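For reference, here is a minimal sketch of what triggers this KeyError and a common workaround. It assumes plain PyTorch only; the tiny model, learning rate, and gamma below are illustrative placeholders rather than values from the repo, and this is a sketch, not necessarily the project's official fix:

```python
import torch

# Minimal reproduction: passing last_epoch != -1 tells the scheduler it is
# resuming, so it expects 'initial_lr' in every optimizer param group.
model = torch.nn.Linear(4, 4)  # illustrative placeholder model
optim_g = torch.optim.Adam(model.parameters(), lr=1e-4)

# Without the loop below, the ExponentialLR call raises:
#   KeyError: "param 'initial_lr' is not specified in param_groups[0] ..."
# Common workaround (an assumption, not the repo's confirmed fix): seed
# 'initial_lr' from the current lr before constructing the scheduler.
for group in optim_g.param_groups:
    group.setdefault("initial_lr", group["lr"])

scheduler_g = torch.optim.lr_scheduler.ExponentialLR(
    optim_g,
    gamma=0.999,   # placeholder for hps.train.lr_decay
    last_epoch=0,  # placeholder for epoch_str - 2; != -1 means "resume"
)
```

One likely cause in train.py is that the optimizer state (which normally carries 'initial_lr') was never restored from the checkpoint, for example because the checkpoint was stripped for inference; that may also be a sign the checkpoint is not suitable for fine-tuning.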