Serialisation doesn't allow variables as loss_weights #9444
Comments
I experienced the same issue, but it seems I've figured out a workaround. Define a custom loss function with an additional argument using a closure, as described in the comments on issue 2121. You will set this function as the loss for your model and pass the weight variable as the additional argument. To give you a clearer idea:
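The code from the original comment is not shown here. A minimal, framework-free sketch of the closure pattern (all names hypothetical; a one-element list stands in for a Keras backend variable such as `K.variable` so the idea stays self-contained):

```python
# Sketch of the closure workaround: the loss function captures a
# mutable weight, so a callback can change the weight during training
# without ever passing `loss_weights` to `compile` -- which is what
# breaks model saving in this issue.
def make_weighted_loss(base_loss, weight_holder):
    def loss(y_true, y_pred):
        # Scale the base loss by the current weight value.
        return weight_holder[0] * base_loss(y_true, y_pred)
    return loss

alpha = [0.0]                                  # stand-in for K.variable(0.0)
squared_error = lambda t, p: (t - p) ** 2      # stand-in for a built-in loss
loss_fn = make_weighted_loss(squared_error, alpha)

print(loss_fn(1.0, 0.0))  # 0.0 while the weight is 0
alpha[0] = 0.5            # what a callback would do via K.set_value
print(loss_fn(1.0, 0.0))  # 0.5 after the "callback" updates the weight
```

With Keras you would pass the resulting function directly as `loss=` in `model.compile`, which avoids serialising a variable inside `loss_weights`.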
@Dref360 has there been any more investigation into this, or is the only current workaround the solution that @kurapan has proposed?
@kurapan thanks for that suggested fix, it worked for me!
I had the same problem. I did a very dirty fix: I added a few lines of code to the model-saving logic beforehand.
I hope someone can figure out a better solution.
Is there any solution for this bug? I think the Keras maintainers should fix it in a later version. Is there any possibility of updating this issue? If the 'loss_weights' argument worked with variables, it would be better than using a custom loss function.
@fchollet Could really use a fix for this! Thanks.
I would need that fix too. The workaround from @kurapan does work; however, it means that if one still wants to log the unweighted loss values, one needs to introduce additional metrics.
It is not just throwing this error when trying to save the model at the end of an epoch; the loss_weights also do nothing to the total loss. My loss weights for a two-output network are {'out1': alpha, 'out2': 1 - alpha}, where alpha starts at 0 and ends at 1 according to an update equation that depends on the epoch number, applied via a callback of course. So the loss for 'out1' should be zero for the first epoch, but both losses contribute without any multiplication by the loss_weights. The only solution for me is to build a custom loss function, but I still need this issue to be fixed, because you want to use the built-in Keras losses, and sometimes it is hard to build the same loss yourself. So, is there any plan to fix this?
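The comment above does not give the actual update equation, so a linear ramp from 0 to 1 over training is assumed in this hypothetical sketch of the epoch-dependent schedule:

```python
def alpha_schedule(epoch, total_epochs):
    # Linear ramp: 0 at the first epoch, 1 at the last epoch.
    # (Assumed form -- the comment above does not state the equation.)
    return min(1.0, epoch / max(1, total_epochs - 1))

# In Keras this value would be pushed into a backend variable from a
# callback's on_epoch_begin, e.g.:
#     K.set_value(alpha, alpha_schedule(epoch, n_epochs))
print([round(alpha_schedule(e, 5), 2) for e in range(5)])
# [0.0, 0.25, 0.5, 0.75, 1.0]
```

Because the schedule only touches a backend variable, it composes with the closure workaround above, but it is exactly the pattern that currently breaks `model.save` when the variable is passed via `loss_weights`.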
I've been training a model with multiple losses, where the loss weights need updating during training via a callback. This is working fine, except when I try to save the model, when I get an error:
I've written two test cases, one passing, one failing, to demonstrate:
test cases
I'm running on OSX, Python 3.6.4, Tensorflow backend, CPU only. I freshly installed everything for the test.
pip list