This repository has been archived by the owner on Aug 31, 2021. It is now read-only.

minimum loss is always set to training loss #131

Closed
mheilman opened this issue Mar 8, 2016 · 2 comments


mheilman commented Mar 8, 2016

The code in BaseMonitor.update currently sets the minimum loss so far as follows (link):

        if self.last_loss_seen < self.min_loss:
            self.min_loss = training_loss
            self.min_loss_i = self.steps

I believe the second line should be self.min_loss = self.last_loss_seen, since the validation monitor sets last_loss_seen here.
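A minimal, self-contained sketch of the intended behavior (the class and method names below are illustrative, not the library's actual API): tracking the minimum over last_loss_seen, which the validation monitor can overwrite with the validation loss, rather than over the raw training loss.

```python
class MonitorSketch:
    """Illustrative stand-in for BaseMonitor's min-loss tracking."""

    def __init__(self):
        self.min_loss = float("inf")
        self.min_loss_i = None
        self.last_loss_seen = None
        self.steps = 0

    def update(self, loss):
        """Record a new loss value (training or validation) for this step."""
        self.steps += 1
        self.last_loss_seen = loss
        if self.last_loss_seen < self.min_loss:
            # The fix: store last_loss_seen, not the training loss,
            # so a validation monitor's loss is tracked correctly.
            self.min_loss = self.last_loss_seen
            self.min_loss_i = self.steps


monitor = MonitorSketch()
for loss in [0.9, 0.5, 0.7]:
    monitor.update(loss)
# min_loss is now 0.5, recorded at step 2
```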

@terrytangyuan
Member

@mheilman Thanks for spotting that! I just pushed the necessary change. You're welcome to submit a PR directly to fix issues you find in the future, once you've signed the CLA.

@mheilman
Author

mheilman commented Mar 8, 2016

You're welcome. Thanks for the quick fix. If I happen to find anything else in the future, I'll submit a PR (once the CLA stuff is sorted out).
