
Fix typo in BaseHead default loss_factor. #446

Merged (4 commits into open-mmlab:master, Dec 16, 2020)

Conversation

@SuX97 (Collaborator) commented Dec 15, 2020

No description provided.

@innerlee (Contributor)

The problem is deeper. There must be some **kwargs not being used and also not being checked.

@codecov (bot) commented Dec 15, 2020

Codecov Report

Merging #446 (f99d07b) into master (5e0ffc1) will not change coverage.
The diff coverage is n/a.


@@           Coverage Diff           @@
##           master     #446   +/-   ##
=======================================
  Coverage   86.59%   86.59%           
=======================================
  Files         101      101           
  Lines        7201     7201           
  Branches     1161     1161           
=======================================
  Hits         6236     6236           
  Misses        733      733           
  Partials      232      232           
Flag        Coverage Δ
unittests   86.58% <ø> (ø)

Flags with carried forward coverage won't be shown.

Impacted Files                  Coverage Δ
mmaction/models/heads/base.py   86.84% <ø> (ø)

Continue to review full report at Codecov.

Legend
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 5e0ffc1...3a6f1f0. Read the comment docs.

@SuX97 (Collaborator, Author) commented Dec 15, 2020

> The problem is deeper. There must be some **kwargs not being used and also not being checked.

There are no **kwargs used in building heads and losses. The reason this was never tested and no error surfaced is that, in BaseWeightedLoss, loss_weight defaults to 1.0, and in all our usages (e.g. all configs) we override loss_cls with dict(type='XXXLoss') without specifying loss_weight.
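The masking effect described above can be sketched as follows. This is a simplified stand-in, not the actual mmaction2 source: the class, builder, and the `'CrossEntropyLoss'` config key are illustrative only.

```python
# Hedged sketch (NOT the real mmaction2 code): shows how a default of
# loss_weight=1.0 hides a misnamed keyword until someone actually passes it.

class BaseWeightedLoss:
    """Simplified stand-in for a weighted loss base class."""

    def __init__(self, loss_weight=1.0):
        self.loss_weight = loss_weight

    def __call__(self, pred, label):
        return self._forward(pred, label) * self.loss_weight

    def _forward(self, pred, label):
        # Toy loss: absolute difference, just for illustration.
        return abs(pred - label)


def build_loss(cfg):
    """Toy builder: drops 'type' and forwards the rest as keyword args."""
    cfg = dict(cfg)
    cfg.pop('type')
    return BaseWeightedLoss(**cfg)


# Every config overrides loss_cls wholesale without loss_weight, so the
# default 1.0 is silently used and the code path with the typo never runs:
loss = build_loss(dict(type='CrossEntropyLoss'))
print(loss(3.0, 1.0))  # 2.0 — weight defaults to 1.0

# Passing the misspelled key would fail immediately:
# build_loss(dict(type='CrossEntropyLoss', loss_factor=0.5))  # TypeError
```

Because the constructor only errors when the bad keyword is actually supplied, a typo in a default that no config exercises can survive indefinitely.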

@innerlee (Contributor)

Is loss_factor a valid argument?

@SuX97 (Collaborator, Author) commented Dec 15, 2020

> Is loss_factor a valid argument?

Yes, but it is always overridden together with the whole loss_cls dictionary, so we have never hit this error before.
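The whole-dict override that keeps the typo dormant can be sketched like this (hypothetical names, not the actual BaseHead code):

```python
# Hedged sketch: a typo in a default dict goes unnoticed when configs
# always replace the whole dict instead of merging individual keys.

# Hypothetical default carrying the misspelled key ('loss_factor'
# instead of 'loss_weight'):
DEFAULT_LOSS_CLS = dict(type='CrossEntropyLoss', loss_factor=1.0)


def resolve_loss_cfg(user_cfg=None):
    # Whole-dict override: if the user supplies loss_cls at all, the
    # default dict (typo included) is discarded entirely.
    return dict(user_cfg) if user_cfg is not None else dict(DEFAULT_LOSS_CLS)


# Every config in the repo effectively does this:
cfg = resolve_loss_cfg(dict(type='CrossEntropyLoss'))
print(cfg)  # {'type': 'CrossEntropyLoss'} — loss_factor never surfaces
```

Only a caller relying on the default (no override at all) would ever forward the bad key to the loss constructor, which is why the typo survived until this PR.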

Fix docstring.
Fix typo.
@innerlee innerlee merged commit 4cc48fc into open-mmlab:master Dec 16, 2020