
Early stopping conditioned on metric val_loss which is not available #3

Open
HypergeneticSpacekid opened this issue Aug 23, 2020 · 2 comments

@HypergeneticSpacekid

Hi,

I am getting the following warnings when executing

history = model.fit(x_train, y_train, epochs=params['epochs'], verbose=0,
                    batch_size=64, shuffle=True,
                    validation_data=(x_cv, y_cv),
                    callbacks=[es, mcp, rlp],
                    sample_weight=sample_weights)

returns:

"WARNING:tensorflow:Early stopping conditioned on metric val_loss which is not available. Available metrics are:
WARNING:tensorflow:Can save best model only with val_loss available, skipping.
WARNING:tensorflow:Reduce LR on plateau conditioned on metric val_loss which is not available. Available metrics are: lr"

Do you know why this might be happening?

best wishes and thanks,
Alex

@nayash
Owner

nayash commented Sep 29, 2020

I never faced this warning while training. Also, I haven't used Keras in a while, so it's hard to say (without seeing the rest of the code) what's happening here. But you may want to look at this.

@via986

via986 commented Dec 28, 2020

@HypergeneticSpacekid
I got almost the same error with mcp, and solved it by replacing this line:

mcp = ModelCheckpoint(best_model_path, monitor='val_f1_metric', verbose=1, save_best_only=True, save_weights_only=False, mode='max', period=1)

with:

mcp = ModelCheckpoint(best_model_path, monitor='val_f1_metric', verbose=1, save_best_only=True, save_weights_only=False, mode='max', save_freq="epoch")
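For anyone else hitting the original `val_loss` warning: Keras-style callbacks simply look up their `monitor` key in the per-epoch `logs` dict, and `val_`-prefixed entries only exist when validation actually runs (e.g. `validation_data` is passed to `model.fit`). A minimal sketch of that lookup logic, in plain Python with no TensorFlow required (the function name and dict contents here are illustrative, not the real Keras internals):

```python
def get_monitor_value(logs, monitor):
    """Return logs[monitor], or None with a warning if the key is missing.

    Mimics how EarlyStopping / ModelCheckpoint / ReduceLROnPlateau resolve
    their monitored metric from the per-epoch logs dict.
    """
    value = logs.get(monitor)
    if value is None:
        print(f"WARNING: conditioned on metric {monitor} which is not "
              f"available. Available metrics are: {','.join(logs)}")
    return value

# Without validation, only training metrics appear in `logs`,
# so monitoring 'val_loss' fails -- exactly the warning above:
train_only_logs = {"loss": 0.42, "lr": 0.001}
assert get_monitor_value(train_only_logs, "val_loss") is None

# When validation runs, Keras adds val_-prefixed entries and the lookup works:
full_logs = {"loss": 0.42, "val_loss": 0.51, "lr": 0.001}
assert get_monitor_value(full_logs, "val_loss") == 0.51
```

So besides the `save_freq` fix, it is worth checking that validation is actually being evaluated every epoch (and, for a custom metric like `val_f1_metric`, that the metric is registered with the model so it shows up in the logs).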
