Describe the bug
The bug is that the wrong `checkpoint_metric` is used in `load_best_model` at the end of `EmmentalLearner.learn`. I believe this happens because `utils.merge` doesn't delete entries, it only replaces them, which leaves multiple entries in `logging_config.checkpointer_config.checkpoint_metric`.
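A minimal stand-in illustrating the suspected behavior (this is not Emmental's actual `utils.merge`, and the metric names are hypothetical):

```python
# Stand-in for a recursive dict merge that overrides values but never deletes
# keys from the base dict (assumed to mirror what utils.merge does).
def merge(base, update):
    result = dict(base)
    for key, value in update.items():
        if isinstance(value, dict) and isinstance(result.get(key), dict):
            result[key] = merge(result[key], value)
        else:
            result[key] = value
    return result

# Hypothetical default vs. user-updated checkpoint metrics.
default_config = {
    "checkpointer_config": {"checkpoint_metric": {"model/train/all/loss": "min"}}
}
user_config = {
    "checkpointer_config": {"checkpoint_metric": {"model/valid/my_task/accuracy": "max"}}
}

merged = merge(default_config, user_config)
print(merged["checkpointer_config"]["checkpoint_metric"])
# {'model/train/all/loss': 'min', 'model/valid/my_task/accuracy': 'max'}
# Both entries survive the merge, so list(...keys())[0] still returns the
# default metric rather than the one from the updated config.
```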
To Reproduce
Steps to reproduce the behavior:

Update the Emmental config with a custom `checkpoint_metric` and inspect the merged config, as sketched below. At this point, it should be clear that there are multiple values in `logging_config.checkpointer_config.checkpoint_metric`. To see how this affects downstream tasks, run `EmmentalLearner.learn`.

Finally, print `list(learner.logging_manager.checkpointer.checkpoint_metric.keys())[0]`, which shows the value used by `Checkpointer.load_best_model` to determine whether a best model was found (`checkpointer.py`, line ~253). You will see the value from the default config here instead of the value from the updated config.
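A rough sketch of this check, assuming the `emmental.init(log_dir, config=...)` / `Meta.config` interface (exact signatures may differ in 0.0.4, and the custom metric name is hypothetical):

```python
import emmental
from emmental import Meta

# Hypothetical custom metric; anything other than the default metric works.
custom_metric = {"model/valid/my_task/accuracy": "max"}

emmental.init(
    "logs",
    config={
        "logging_config": {
            "checkpointer_config": {"checkpoint_metric": custom_metric}
        }
    },
)

# Both the default metric and the custom one show up after the merge.
print(Meta.config["logging_config"]["checkpointer_config"]["checkpoint_metric"])

# After building a model/dataloaders and running learner.learn(...), this
# prints the metric that load_best_model actually keys on -- the default one:
# print(list(learner.logging_manager.checkpointer.checkpoint_metric.keys())[0])
```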
Expected behavior
I expect the checkpoint metric I defined in the updated config to be used in `Checkpointer.load_best_model`.
Environment
- OS: Ubuntu 16.04
- Emmental Version: 0.0.4
- Python Version: 3.6