
[Improve] Change default value of average-clips #232

Merged (5 commits into open-mmlab:master from fix_average_clips, Oct 10, 2020)

Conversation

kennymckormick (Member)

Set 'average_clips' in the test_cfg of each config file.
You can still set this parameter during testing and it will override the default value in the config file.
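For concreteness, here is a rough sketch (illustrative only, not copied from the repo) of what the change amounts to in a config file; the exact layout of test_cfg shown here is an assumption based on the description above.

```python
# Illustrative config snippets (assumed layout); each would live in its own config file.

# 3D recognizer config: default to averaging clip probabilities at test time.
test_cfg = dict(average_clips='prob')

# 2D recognizer config: the consensus op inside the head already averages the
# segments, so no extra clip-level averaging is needed.
test_cfg = dict(average_clips=None)
```

At test time the value baked into the config can still be overridden (presumably via the test script's average-clips option referred to in the title), and that override takes precedence over the config default.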

codecov bot commented Oct 2, 2020

Codecov Report

Merging #232 into master will not change coverage.
The diff coverage is 100.00%.


@@           Coverage Diff           @@
##           master     #232   +/-   ##
=======================================
  Coverage   85.15%   85.15%           
=======================================
  Files          81       81           
  Lines        5267     5267           
  Branches      846      846           
=======================================
  Hits         4485     4485           
  Misses        644      644           
  Partials      138      138           
Flag           Coverage Δ
#unittests     85.15% <100.00%> (ø)


Impacted Files                              Coverage Δ
mmaction/datasets/base.py                   68.33% <ø> (ø)
mmaction/datasets/pipelines/formating.py    93.85% <ø> (ø)
mmaction/version.py                         100.00% <100.00%> (ø)


Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Last update fafca4e...cdcfb9b.

@dreamerlin (Collaborator) commented Oct 2, 2020

The default value of average_clips should be "prob" rather than "score" for 3D models, since "prob" gives higher performance for 3D. For 2D recognizers, setting average_clips to None is fine.
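To make the difference concrete, a small self-contained sketch (my own illustration, not mmaction's implementation) of how the two modes combine the per-clip outputs of one test video:

```python
# Compare average_clips='score' vs average_clips='prob' on made-up per-clip logits.
import torch
import torch.nn.functional as F

num_clips, num_classes = 10, 400
cls_score = torch.randn(num_clips, num_classes)  # hypothetical per-clip logits

# 'score': average the raw logits over clips.
video_score = cls_score.mean(dim=0)

# 'prob': softmax each clip first, then average the per-clip probabilities.
video_prob = F.softmax(cls_score, dim=1).mean(dim=0)

# The top-1 class can differ between the two modes, which is why the reported
# accuracy depends on which mode was used at test time.
print(video_score.argmax().item(), video_prob.argmax().item())
```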

@dreamerlin changed the title from "Fix average clips" to "Change default value of average-clips" on Oct 2, 2020
@innerlee (Contributor) commented Oct 2, 2020

Do you need to change the model zoo performance when changing configs from None to 'something'?

@kennymckormick (Member, Author)

> The default value of average_clips should be "prob" rather than "score" for 3D models, since "prob" gives higher performance for 3D. For 2D recognizers, setting average_clips to None is fine.

Sorry, my mistake; the default value should be 'prob'.

@kennymckormick (Member, Author)

> Do you need to change the model zoo performance when changing configs from None to 'something'?

Hmm, yes. Since the original default value of 'average_clips' was 'score', some checkpoints trained by me should be re-tested. @SuX97, did you explicitly set 'average_clips' to 'prob' during testing previously?

@dreamerlin (Collaborator)

> Do you need to change the model zoo performance when changing configs from None to 'something'?

> Hmm, yes. Since the original default value of 'average_clips' was 'score', some checkpoints trained by me should be re-tested. @SuX97, did you explicitly set 'average_clips' to 'prob' during testing previously?

The performance of the checkpoint was tested with 'average_clips' set to 'prob', so there is no need to re-test.

@kennymckormick (Member, Author)

> Do you need to change the model zoo performance when changing configs from None to 'something'?

> Hmm, yes. Since the original default value of 'average_clips' was 'score', some checkpoints trained by me should be re-tested. @SuX97, did you explicitly set 'average_clips' to 'prob' during testing previously?

> The performance of the checkpoint was tested with 'average_clips' set to 'prob', so there is no need to re-test.

My bad, the 3D checkpoints trained by me were tested with the default setting (average_clips = 'score'), so the reported performance might be a little lower. I will create another PR to fix them.

@innerlee (Contributor) commented Oct 3, 2020

The performance table can be updated here too, so that configs and reported accuracies always match.

Review comment on docs/changelog.md (outdated, resolved)
@innerlee (Contributor) commented Oct 4, 2020

As long as the settings are consistent with the reported accuracies in the model zoo, I'm OK.

I'm a little confused by the use of None. Here are two possible meanings:

  • invalid: this argument does not make sense for this method
  • don't care: use whatever the default value is

If it is the second case, then the direction of this PR is correct: configs should be clear and concrete.

@dreamerlin (Collaborator)

> As long as the settings are consistent with the reported accuracies in the model zoo, I'm OK.
>
> I'm a little confused by the use of None. Here are two possible meanings:
>
>   • invalid: this argument does not make sense for this method
>   • don't care: use whatever the default value is
>
> If it is the second case, then the direction of this PR is correct: configs should be clear and concrete.

It is the second case. For 3D recognizers, average_clips matters because there is no consensus op to average the segments; average_clips does that job. For some 2D recognizers that use a consensus op and do not reshape along the temporal dimension (like TSM in the 3-crop + twice-sample setting, which will be supported soon), average_clips does not matter.
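As a rough sketch of this structural point (assumed shapes, not the actual recognizer heads): a 3D recognizer emits one score per clip and relies on average_clips to fuse them, while a 2D recognizer with an average consensus collapses the segment dimension inside the head, leaving nothing for average_clips to do.

```python
# Why average_clips matters for 3D recognizers but not for consensus-based 2D ones.
import torch

num_clips, num_classes = 10, 400

# 3D recognizer: one score per clip; fusing them is exactly what average_clips controls.
clip_scores = torch.randn(num_clips, num_classes)
video_score_3d = clip_scores.softmax(dim=1).mean(dim=0)  # average_clips='prob'

# 2D recognizer with an average consensus: the segment dimension is already
# collapsed inside the head, so the model returns a single video-level score
# and average_clips has nothing left to average.
segment_scores = torch.randn(num_clips, num_classes)
video_score_2d = segment_scores.mean(dim=0)  # AvgConsensus-like reduction in the head
```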

@kennymckormick (Member, Author)

> As long as the settings are consistent with the reported accuracies in the model zoo, I'm OK.
>
> I'm a little confused by the use of None. Here are two possible meanings:
>
>   • invalid: this argument does not make sense for this method
>   • don't care: use whatever the default value is
>
> If it is the second case, then the direction of this PR is correct: configs should be clear and concrete.

> It is the second case. For 3D recognizers, average_clips matters because there is no consensus op to average the segments; average_clips does that job. For some 2D recognizers that use a consensus op and do not reshape along the temporal dimension (like TSM in the 3-crop + twice-sample setting, which will be supported soon), average_clips does not matter.

I agree. @innerlee

@kennymckormick changed the title from "Change default value of average-clips" to "[Improve] Change default value of average-clips" on Oct 10, 2020
@innerlee merged commit 539984a into open-mmlab:master on Oct 10, 2020
@kennymckormick deleted the fix_average_clips branch on October 15, 2020 at 09:44