
Fixes to make life easier with the nlp library #6423

Merged — 6 commits merged into master on Aug 12, 2020
Conversation

@sgugger (Collaborator) commented Aug 11, 2020

This PR adds two things to make the interface easier with the nlp library:

  • BatchEncoding stops enforcing two dimensions for every tensor, which caused problems for labels (which should be a single vector of shape [batch_size]).
  • PreTrainedTokenizerBase.pad accepts tensors as inputs, which makes it easy to use this function for data collation.

Added proper documentation and tests based on @thomwolf's initial work.
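The two bullets above can be sketched in plain Python, without the real transformers `BatchEncoding` or tokenizer. This is an illustration of the padding-based collation pattern the PR enables, under the assumption of a classification setup; the `simple_pad_collate` helper is hypothetical and not part of the library.

```python
# Hypothetical sketch of pad-based data collation: input_ids are padded to
# the batch max length, while labels stay a flat [batch_size] vector
# (no second dimension is forced onto them).

def simple_pad_collate(features, pad_token_id=0):
    """Pad variable-length input_ids; build a matching attention_mask."""
    max_len = max(len(f["input_ids"]) for f in features)
    input_ids = [
        f["input_ids"] + [pad_token_id] * (max_len - len(f["input_ids"]))
        for f in features
    ]
    attention_mask = [
        [1] * len(f["input_ids"]) + [0] * (max_len - len(f["input_ids"]))
        for f in features
    ]
    # Labels remain a single vector of shape [batch_size].
    labels = [f["label"] for f in features]
    return {"input_ids": input_ids, "attention_mask": attention_mask, "labels": labels}

batch = simple_pad_collate([
    {"input_ids": [101, 7592, 102], "label": 1},
    {"input_ids": [101, 102], "label": 0},
])
```

With the real library, `PreTrainedTokenizerBase.pad` plays the role of this helper inside a data collator, and per this PR it now also accepts tensor inputs.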

@codecov (codecov bot) commented Aug 11, 2020

Codecov Report

Merging #6423 into master will increase coverage by 2.27%.
The diff coverage is 95.45%.


@@            Coverage Diff             @@
##           master    #6423      +/-   ##
==========================================
+ Coverage   77.51%   79.79%   +2.27%     
==========================================
  Files         150      150              
  Lines       27789    27807      +18     
==========================================
+ Hits        21542    22188     +646     
+ Misses       6247     5619     -628     
Impacted Files Coverage Δ
src/transformers/pipelines.py 79.79% <ø> (+52.80%) ⬆️
src/transformers/tokenization_utils_base.py 94.16% <95.45%> (+0.28%) ⬆️
src/transformers/tokenization_albert.py 28.84% <0.00%> (-58.66%) ⬇️
src/transformers/modeling_utils.py 87.35% <0.00%> (+0.19%) ⬆️
src/transformers/modeling_tf_bert.py 96.58% <0.00%> (+0.35%) ⬆️
src/transformers/tokenization_utils.py 90.40% <0.00%> (+0.40%) ⬆️
src/transformers/generation_tf_utils.py 86.46% <0.00%> (+0.75%) ⬆️
src/transformers/generation_utils.py 96.92% <0.00%> (+0.83%) ⬆️
src/transformers/modeling_t5.py 83.33% <0.00%> (+0.94%) ⬆️
src/transformers/modeling_tf_ctrl.py 97.87% <0.00%> (+1.06%) ⬆️
... and 10 more

Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Last update f6cb0f8...8edc948.

@@ -2318,7 +2318,7 @@ def _concat_inputs_history(self, inputs: List[List[int]], histories: List[Option
     max_len = max([len(item) for item in outputs])
     outputs = [output + [self.pad_token_id] * (max_len - len(output)) for output in outputs]
     outputs = BatchEncoding(
-        {"input_ids": outputs, "attention_mask": [1] * len(outputs)}, tensor_type=self.framework
+        {"input_ids": outputs, "attention_mask": [[1] * len(outputs)]}, tensor_type=self.framework,
@sgugger (Collaborator, Author) commented Aug 11, 2020
This is the only place where the change of dimension handling in BatchEncoding.convert_to_tensors breaks something, but in this case it was a bit magical that the dimension was automatically added, so I don't think this is a serious failure.
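The diff above can be illustrated with plain lists: before the change, `convert_to_tensors` would silently promote a flat list to 2-D, so `[1] * len(outputs)` got a batch dimension for free; after the change the caller must nest it explicitly. A minimal sketch (the data values are made up for illustration):

```python
# Two already-padded example sequences standing in for `outputs`.
outputs = [[5, 6, 7], [8, 9, 7]]

# Old form: a flat 1-D list. Previously convert_to_tensors would
# auto-add a dimension; after this PR it no longer does.
flat_mask = [1] * len(outputs)

# New form from the diff: the extra nesting makes the 2-D shape explicit.
nested_mask = [[1] * len(outputs)]
```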

@LysandreJik (Member) left a comment:
LGTM, great that you added tests for all three frameworks.

@sgugger (Collaborator, Author) commented Aug 12, 2020

Merging then; we can follow up next week, when @thomwolf is back, if he has more comments.

@sgugger sgugger merged commit e9c3031 into master Aug 12, 2020
@sgugger sgugger deleted the simple-examples branch August 12, 2020 12:00
fabiocapsouza added a commit to fabiocapsouza/transformers that referenced this pull request Nov 15, 2020
3 participants