[Luke]add PretrainedConfig and unit test #5560
Conversation
Thanks for your contribution!
It seems the test failure is caused by the reformer model from another task?
Please merge develop into this branch; the tests should pass then.
@@ -525,111 +425,25 @@ class LukeModel(LukePretrainedModel):
    /docs/en/api/paddle/fluid/dygraph/layers/Layer_en.html>`__ subclass. Use it as a regular Paddle Layer
    and refer to the Paddle documentation for all matter related to general usage and behavior.

    Args:
Please add documentation for LukeConfig under Args.
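A hedged sketch of what the requested Args entry could look like. The exact wording is an assumption, modeled on how other PaddleNLP models document a PretrainedConfig argument; it is not the final text merged in this PR.

```python
# Hypothetical docstring sketch -- the wording is an assumption,
# patterned after other PaddleNLP models that accept a config object.
class LukeModel:
    """The bare Luke Model outputting raw hidden-states.

    Args:
        config (:class:`LukeConfig`):
            An instance of LukeConfig used to construct LukeModel.
    """

    def __init__(self, config):
        self.config = config
```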
self.num_classes = num_classes
self.dropout = nn.Dropout(self.luke.config["hidden_dropout_prob"])
self.classifier = nn.Linear(self.luke.config["hidden_size"], num_classes)
self.num_classes = config.num_classes
Suggested change:
- self.num_classes = config.num_classes
+ self.num_labels = config.num_labels
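For context, the reviewer's suggestion fits the broader pattern this PR applies: task heads read hyperparameters as attributes of a PretrainedConfig-style object instead of indexing a plain dict. A minimal self-contained sketch of that pattern; the class names and defaults here are illustrative, not PaddleNLP's real ones.

```python
# Illustrative stand-in for a PretrainedConfig-style object; the real
# LukeConfig in PaddleNLP carries many more fields.
class FakeConfig:
    def __init__(self, num_labels=2, hidden_size=768, hidden_dropout_prob=0.1):
        self.num_labels = num_labels
        self.hidden_size = hidden_size
        self.hidden_dropout_prob = hidden_dropout_prob


class SequenceClassificationHead:
    def __init__(self, config):
        # Attribute access (config.num_labels) replaces dict access
        # (config["num_labels"]) after the PretrainedConfig migration.
        self.num_labels = config.num_labels
        self.in_features = config.hidden_size


cfg = FakeConfig(num_labels=5)
head = SequenceClassificationHead(cfg)
print(head.num_labels)  # → 5
```

The attribute-based style lets every model share one construction signature (`Model(config)`) and makes defaults and serialization live in one place, which is what the unit test in this PR exercises.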
self.num_classes = num_classes
self.dropout = nn.Dropout(self.luke.config["hidden_dropout_prob"])
self.classifier = nn.Linear(self.luke.config["hidden_size"] * 2, num_classes, bias_attr=False)
self.num_classes = config.num_classes
Suggested change:
- self.num_classes = config.num_classes
+ self.num_labels = config.num_labels
self.num_classes = num_classes
self.dropout = nn.Dropout(self.luke.config["hidden_dropout_prob"])
self.classifier = nn.Linear(self.luke.config["hidden_size"] * 3, num_classes)
self.num_classes = config.num_classes
Suggested change:
- self.num_classes = config.num_classes
+ self.num_labels = config.num_labels
Codecov Report
@@            Coverage Diff             @@
##           develop    #5560     +/-   ##
===========================================
+ Coverage    58.76%   59.42%   +0.66%
===========================================
  Files          481      482       +1
  Lines        68058    68103      +45
===========================================
+ Hits         39997    40473     +476
+ Misses       28061    27630     -431
... and 2 files with indirect coverage changes
lgtm!
PR types
PR changes
Description