
Pin memory in Trainer by default #9857

Merged: 6 commits into master on Jan 28, 2021
Conversation

abhishekkrthakur (Member)

No description provided.

@sgugger (Collaborator) left a comment


Thanks for adding this!

Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
abhishekkrthakur and others added 2 commits January 27, 2021 20:37
Co-authored-by: Stas Bekman <stas00@users.noreply.github.com>
Co-authored-by: Stas Bekman <stas00@users.noreply.github.com>
@abhishekkrthakur abhishekkrthakur marked this pull request as ready for review January 27, 2021 19:37
@abhishekkrthakur abhishekkrthakur merged commit 25fcb5c into master Jan 28, 2021
@abhishekkrthakur abhishekkrthakur deleted the trainer_pin_memory branch January 28, 2021 07:50
@stas00 (Contributor) commented Jan 28, 2021

Could we please go through the normal PR review approval cycle? Unless I missed something and there was one.

It looks like my comment on Slack, where I suggested using a more specific command-line arg name, was missed.

I proposed one of:

  • dataloader_pin_memory
  • dl_pin_memory

But since we already have dataloader_num_workers:

            num_workers=self.args.dataloader_num_workers,
            pin_memory=self.args.pin_memory,

it should probably be dataloader_pin_memory.

This is important since there are other ways to pin memory in pytorch.
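As background (not part of the PR itself), PyTorch offers at least two ways to pin host memory: the DataLoader flag discussed here, and pinning a tensor directly. A minimal sketch, guarded so it runs on CPU-only machines:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

dataset = TensorDataset(torch.randn(8, 4), torch.zeros(8, dtype=torch.long))

# Way 1: let the DataLoader copy each batch into page-locked (pinned) host
# memory. Guarding on CUDA availability keeps the flag a no-op on CPU.
loader = DataLoader(dataset, batch_size=2, pin_memory=torch.cuda.is_available())

for features, labels in loader:
    # Pinned batches enable asynchronous host-to-device copies.
    if torch.cuda.is_available():
        features = features.to("cuda", non_blocking=True)
    break

# Way 2: pin an individual tensor explicitly (requires a CUDA device).
if torch.cuda.is_available():
    pinned = torch.randn(4).pin_memory()
    assert pinned.is_pinned()
```

Because both spellings exist, an unqualified --pin_memory flag is ambiguous about which mechanism it controls, which is the argument for the dataloader_ prefix.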


This is a general comment - not specific to this PR:

We have an ongoing issue with command-line arg naming: we name something, later realize it's not the best name, and then we're reluctant to change it for fear of breaking users' code. So let's think deeply about new arg names before we add them. Thank you!

@abhishekkrthakur (Member, Author)

@stas00 It seems I missed this message. When I opened this PR in the morning, I didn't see any comments, and @sgugger had approved the PR. For a final check, I asked @LysandreJik, who gave me the green light.

To avoid this in the future, I would request that PR-specific comments be made on the PR itself, so that the author and other reviewers can go through them and make sure everything is resolved before merging.

@stas00 (Contributor) commented Jan 28, 2021

Yes, absolutely. I guess it just fell through the cracks.

And let's have a PR description, as simple as:

This PR adds --pin_memory to trainer DataLoader and it defaults to True.
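For illustration only (the names and structure below are hypothetical, not the PR's actual diff): Trainer arguments like this are typically declared as dataclass fields and then forwarded when the DataLoader is built. A sketch using a plain dataclass as a stand-in for the real TrainingArguments:

```python
from dataclasses import dataclass, field


@dataclass
class DataLoaderArgs:
    # Hypothetical stand-in for TrainingArguments-style fields.
    dataloader_num_workers: int = 0
    # Defaulting to True mirrors the "defaults to True" note above;
    # the dataloader_ prefix follows the naming suggested in this thread.
    dataloader_pin_memory: bool = field(
        default=True,
        metadata={"help": "Whether to pin memory in the DataLoader."},
    )


args = DataLoaderArgs()

# Keyword arguments that would be forwarded to torch.utils.data.DataLoader.
dataloader_kwargs = {
    "num_workers": args.dataloader_num_workers,
    "pin_memory": args.dataloader_pin_memory,
}
print(dataloader_kwargs)  # {'num_workers': 0, 'pin_memory': True}
```

Keeping the prefix consistent with the existing dataloader_num_workers field means both options read as a pair at the call site.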

Qbiwan pushed a commit to Qbiwan/transformers that referenced this pull request Jan 31, 2021
Co-authored-by: Sylvain Gugger <35901082+sgugger@users.noreply.github.com>
Co-authored-by: Stas Bekman <stas00@users.noreply.github.com>