Possible bug in meanteacher algorithm #102
Comments
This might be caused by a typo. Have you run the modified code? Does the result vary from the reported result?
Sorry, but I haven't run the modified code on any of the benchmarks, since right now I am only working locally with my own dataset. With the modified code I get an
Will check this in next update.
Fixed in PR #135
* [Update] resolve requirements.txt conflicts
* [Fix] Fix mean teacher bug in #102
* [Fix] Fix DebiasPL bug
* [Fix] Fix potential sample data bug in #119
* [Update] Add auto issue/pr closer
* [Update] Update requirements.txt
* [Fix] Fix bug in #74
* [Fix] Fix amp lighting bug in #123
* [Fix] Fix notebook bugs
* [Update] release semilearn 0.3.1
In the mean teacher `train_step` (line 55), `logits_x_ulb_s` gets assigned the logits of `outs_x_ulb_w`. So the consistency loss is calculated only with `logits_x_ulb_w`, which results in a consistency loss of 0 in every iteration; `outs_x_ulb_s` is never used.

Semi-supervised-learning/semilearn/algorithms/meanteacher/meanteacher.py, lines 54 to 56 in 9a24e00
This should be:
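The corrected snippet itself is not shown above, but the effect of the typo can be demonstrated with a minimal, stdlib-only sketch. This is not the semilearn code: the dict keys, the toy logits, and the MSE-over-softmax loss are illustrative stand-ins for the two augmentation branches and the consistency loss.

```python
import math

def softmax(logits):
    # Numerically stable softmax over a plain list of floats
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

def consistency_loss(logits_s, logits_w):
    # Mean squared error between the two branches' probability outputs
    ps, pw = softmax(logits_s), softmax(logits_w)
    return sum((a - b) ** 2 for a, b in zip(ps, pw)) / len(ps)

# Toy outputs for the weakly- and strongly-augmented unlabeled batches
outs_x_ulb_w = {"logits": [2.0, 0.5, -1.0]}
outs_x_ulb_s = {"logits": [1.0, 1.5, -0.5]}

# Buggy assignment from the issue: both variables read the weak branch,
# so the loss compares a tensor with itself and is always exactly zero.
logits_x_ulb_w = outs_x_ulb_w["logits"]
logits_x_ulb_s = outs_x_ulb_w["logits"]
assert consistency_loss(logits_x_ulb_s, logits_x_ulb_w) == 0.0

# Corrected assignment: each variable reads from its own branch,
# and the consistency loss becomes a nonzero training signal.
logits_x_ulb_s = outs_x_ulb_s["logits"]
print(consistency_loss(logits_x_ulb_s, logits_x_ulb_w) > 0.0)  # True
```

The sketch shows why the model still trains but the unsupervised term contributes nothing: a loss that is identically zero has zero gradient, so only the supervised loss updates the weights.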