
[Luke]add PretrainedConfig and unit test #5560

Merged
merged 3 commits into PaddlePaddle:develop on Apr 7, 2023

Conversation

ZwhElliott
Contributor

PR types

PR changes

Description

@paddle-bot

paddle-bot bot commented Apr 7, 2023

Thanks for your contribution!

@ZwhElliott
Contributor Author

It seems the tests are failing because of the reformer model from another task?

Collaborator

@sijunhe sijunhe left a comment


Please merge develop; the tests should pass after that.

@@ -525,111 +425,25 @@ class LukeModel(LukePretrainedModel):
/docs/en/api/paddle/fluid/dygraph/layers/Layer_en.html>`__ subclass. Use it as a regular Paddle Layer
and refer to the Paddle documentation for all matter related to general usage and behavior.

Args:
Collaborator


Please add the LukeConfig documentation to the Args section.
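A docstring of the kind requested above might look like the following minimal sketch. This is a hypothetical stand-in, not the actual `LukeConfig` from `paddlenlp/transformers/luke/configuration.py`; the attribute names mirror the ones used elsewhere in this diff, and the defaults are illustrative.

```python
# Minimal sketch of a PretrainedConfig-style class with a documented Args
# section, as requested in the review comment above. Hypothetical stand-in,
# not the actual paddlenlp LukeConfig.


class LukeConfig:
    """Configuration for a LUKE-style model.

    Args:
        hidden_size (int, optional):
            Dimensionality of the encoder layers. Defaults to 768.
        hidden_dropout_prob (float, optional):
            Dropout probability applied to fully connected layers.
            Defaults to 0.1.
        num_labels (int, optional):
            Number of labels used by classification heads. Defaults to 2.
    """

    def __init__(self, hidden_size=768, hidden_dropout_prob=0.1, num_labels=2, **kwargs):
        self.hidden_size = hidden_size
        self.hidden_dropout_prob = hidden_dropout_prob
        self.num_labels = num_labels
        # Store any extra keyword arguments as attributes, mirroring the
        # flexible-config pattern common to PretrainedConfig subclasses.
        for key, value in kwargs.items():
            setattr(self, key, value)
```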

self.num_classes = num_classes
self.dropout = nn.Dropout(self.luke.config["hidden_dropout_prob"])
self.classifier = nn.Linear(self.luke.config["hidden_size"], num_classes)
self.num_classes = config.num_classes
Collaborator


Suggested change
self.num_classes = config.num_classes
self.num_labels = config.num_labels

self.num_classes = num_classes
self.dropout = nn.Dropout(self.luke.config["hidden_dropout_prob"])
self.classifier = nn.Linear(self.luke.config["hidden_size"] * 2, num_classes, bias_attr=False)
self.num_classes = config.num_classes
Collaborator


Suggested change
self.num_classes = config.num_classes
self.num_labels = config.num_labels

self.num_classes = num_classes
self.dropout = nn.Dropout(self.luke.config["hidden_dropout_prob"])
self.classifier = nn.Linear(self.luke.config["hidden_size"] * 3, num_classes)
self.num_classes = config.num_classes
Collaborator


Suggested change
self.num_classes = config.num_classes
self.num_labels = config.num_labels
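The three suggestions above all apply the same refactor: each head reads hyperparameters as attributes of a config object instead of indexing `self.luke.config` as a dict, and uses `num_labels` in place of `num_classes`. A simplified sketch of that pattern follows; the class and attribute names are illustrative stand-ins (the real heads build `paddle.nn` layers, omitted here to keep the example self-contained).

```python
# Illustrative sketch of the config-driven init pattern suggested above.
# Simplified stand-ins, not the actual paddlenlp LUKE heads.


class LukeConfig:
    def __init__(self, hidden_size=768, hidden_dropout_prob=0.1, num_labels=2):
        self.hidden_size = hidden_size
        self.hidden_dropout_prob = hidden_dropout_prob
        self.num_labels = num_labels


class EntityPairClassificationHead:
    def __init__(self, config):
        # Before the refactor: self.luke.config["hidden_size"] (dict lookup).
        # After: config.hidden_size (attribute access on the config object).
        self.num_labels = config.num_labels
        self.dropout_prob = config.hidden_dropout_prob
        # Entity-pair classification concatenates two entity vectors,
        # so the classifier input width is hidden_size * 2, matching the
        # nn.Linear(... * 2, ...) line in the diff above.
        self.classifier_in_features = config.hidden_size * 2
```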

@codecov

codecov bot commented Apr 7, 2023

Codecov Report

Merging #5560 (e0e6e25) into develop (951bc76) will increase coverage by 0.66%.
The diff coverage is 100.00%.

❗ Current head e0e6e25 differs from pull request most recent head 2aa4fad. Consider uploading reports for the commit 2aa4fad to get more accurate results

@@             Coverage Diff             @@
##           develop    #5560      +/-   ##
===========================================
+ Coverage    58.76%   59.42%   +0.66%     
===========================================
  Files          481      482       +1     
  Lines        68058    68103      +45     
===========================================
+ Hits         39997    40473     +476     
+ Misses       28061    27630     -431     
Impacted Files                                  Coverage Δ
paddlenlp/transformers/__init__.py              100.00% <100.00%> (ø)
paddlenlp/transformers/luke/configuration.py    100.00% <100.00%> (ø)
paddlenlp/transformers/luke/modeling.py          96.00% <100.00%> (+75.94%) ⬆️
paddlenlp/transformers/luke/tokenizer.py         72.02% <100.00%> (+56.69%) ⬆️

... and 2 files with indirect coverage changes


Collaborator

@sijunhe sijunhe left a comment


lgtm!

@sijunhe sijunhe merged commit c682b3f into PaddlePaddle:develop Apr 7, 2023
2 checks passed