Hey @Jungjee, I was trying to fine-tune RawNet1 and observed that the center-loss does not decrease as training proceeds, although the total-loss does go down. I tried increasing the weight of the center-loss but still see a similar pattern. If the weight is too high, the spk-basis-loss starts to increase instead, so the two losses seem to be somewhat inversely related. Did you observe similar trends? I have attached the loss plots for different values of the c_lambda parameter (the weight of the center-loss). Is there something that could be going wrong here? In all the cases below, the learning-rate was set to 0.001.
[Attached loss plots for c_lambda = 0.001 (default), 0.1, 0.5, and 5.]
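For reference, here is a minimal sketch of how I understand the losses to combine, assuming the total loss is the spk-basis-loss plus c_lambda times the center-loss; the function and variable names below are illustrative and may not match the repository code exactly.

```python
# Minimal sketch of the assumed loss combination; names are illustrative
# and may not match the RawNet1 code exactly.
def total_loss(spk_basis_loss: float, center_loss: float, c_lambda: float = 0.001) -> float:
    # c_lambda scales the center-loss term; increasing it makes the optimizer
    # trade spk-basis-loss for center-loss, which would explain the inverse
    # trend between the two losses described above.
    return spk_basis_loss + c_lambda * center_loss
```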
Hi, first of all, this tendency is different from what I observed before.
I recall that all three losses decreased in the end.
Do the plots show training until the last epoch?
Since you mentioned the total-loss is decreasing, I suspect the center-loss may only start to decrease quite a while after the beginning of training.
My suggestion would be to train the model for more epochs or with a smaller batch size to increase the total number of iterations.
Looking at the result with c_lambda=5, I think that with a proper balance and enough iterations you can get low values for all losses.
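To illustrate this point, here is a rough sketch of the iteration-count arithmetic: for a fixed training set, more epochs or a smaller batch size both increase the number of optimizer steps. The dataset size, batch sizes, and epoch count below are placeholders, not values from this thread.

```python
# Rough sketch of the iteration-count argument; all numbers are placeholders.
def total_iterations(num_train_utterances: int, batch_size: int, num_epochs: int) -> int:
    steps_per_epoch = num_train_utterances // batch_size
    return steps_per_epoch * num_epochs

# Halving the batch size (or doubling the epochs) roughly doubles the iterations.
print(total_iterations(num_train_utterances=100_000, batch_size=120, num_epochs=20))  # 16660
print(total_iterations(num_train_utterances=100_000, batch_size=60,  num_epochs=20))  # 33320
```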