Add TIN model #53
Conversation
Codecov Report
@@ Coverage Diff @@
## master #53 +/- ##
==========================================
- Coverage 86.50% 84.41% -2.09%
==========================================
Files 72 73 +1
Lines 4030 4178 +148
Branches 623 633 +10
==========================================
+ Hits 3486 3527 +41
- Misses 440 547 +107
Partials 104 104
Flags with carried forward coverage won't be shown.
"""Initiate the parameters either from existing checkpoint or from | ||
scratch.""" | ||
# we set the initial bias of the convolution | ||
# layer to 0, and the final initial output will be 1.0 |
what is "final initial" output?
what are the initial weights?
The conv weights are initialized by default with PyTorch's kaiming_uniform_ (or loaded from a checkpoint); the code only changes the bias, as described in the paper.
No description provided.