
Add TIN model #53

Merged
merged 5 commits into open-mmlab:master from tin on Aug 27, 2020
Conversation

dreamerlin (Collaborator)

No description provided.

@dreamerlin added the WIP (work in progress) label on Jul 25, 2020
codecov bot commented Jul 25, 2020

Codecov Report

Merging #53 into master will decrease coverage by 2.08%.
The diff coverage is 27.70%.

Impacted file tree graph

@@            Coverage Diff             @@
##           master      #53      +/-   ##
==========================================
- Coverage   86.50%   84.41%   -2.09%     
==========================================
  Files          72       73       +1     
  Lines        4030     4178     +148     
  Branches      623      633      +10     
==========================================
+ Hits         3486     3527      +41     
- Misses        440      547     +107     
  Partials      104      104              
Flag Coverage Δ
#unittests 84.41% <27.70%> (-2.09%) ⬇️

Flags with carried forward coverage won't be shown.

Impacted Files Coverage Δ
mmaction/models/__init__.py 100.00% <ø> (ø)
mmaction/models/backbones/resnet_tin.py 18.93% <18.93%> (ø)
mmaction/core/__init__.py 100.00% <100.00%> (ø)
mmaction/models/backbones/__init__.py 100.00% <100.00%> (ø)
mmaction/models/backbones/resnet.py 94.66% <100.00%> (+0.38%) ⬆️

Continue to review full report at Codecov.

Legend
Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 78a54ee...c9c0c48.

setup.cfg (review thread, outdated, resolved)
"""Initiate the parameters either from existing checkpoint or from
scratch."""
# we set the initial bias of the convolution
# layer to 0, and the final initial output will be 1.0
Contributor

what is "final initial" output?

dreamerlin (Collaborator, Author)

[screenshot of the corresponding comment in the original repo and the paper]
This comment is the same as in the original repo and the paper: it means the initialized conv layer will output 1.0.

Contributor

What are the initial weights?

dreamerlin (Collaborator, Author)

The conv weights are initialized with PyTorch's default kaiming_uniform_ (or loaded from a checkpoint); the code only changes the bias, as described in the paper.
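To make the exchange above concrete, here is a minimal sketch of how a zero-initialized bias leads to an initial output of roughly 1.0, assuming the weight-prediction branch ends with a sigmoid scaled by 2 as in the TIN paper. Class and argument names here are illustrative, not the exact resnet_tin.py code.

```python
import torch
import torch.nn as nn


class WeightBranch(nn.Module):
    """Illustrative weight-prediction branch (hypothetical names, not the
    exact classes in resnet_tin.py)."""

    def __init__(self, in_channels, groups):
        super().__init__()
        self.conv = nn.Conv1d(in_channels, groups, kernel_size=3, padding=1)
        self.sigmoid = nn.Sigmoid()
        self.init_weights()

    def init_weights(self):
        # conv weights keep PyTorch's default kaiming_uniform_ init;
        # only the bias is reset to 0, as the paper describes
        self.conv.bias.data[...] = 0

    def forward(self, x):
        # with a zero bias the pre-sigmoid activation is centered at 0,
        # so 2 * sigmoid(.) starts out around 2 * 0.5 = 1.0, i.e. the
        # "final initial output" is approximately 1.0
        return 2 * self.sigmoid(self.conv(x))


# quick check of the initial output scale on random input
branch = WeightBranch(in_channels=8, groups=4)
out = branch(torch.randn(2, 8, 16))
print(out.mean())  # close to 1.0 right after initialization
```

Keeping the weight branch's initial output near 1.0 means the interlacing module starts out close to an identity mapping, so training begins from behavior similar to the plain backbone.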

@innerlee merged commit c9aea9c into open-mmlab:master on Aug 27, 2020
@dreamerlin deleted the tin branch on Sep 1, 2020 at 16:07