Conversation

asi1024
Contributor

@asi1024 asi1024 commented Nov 4, 2020

This PR implements LazyConvXd and LazyConvTransposeXd based on #44538. (cc. @emcastillo and @albanD)
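A minimal usage sketch of the new modules (the `Sequential` wiring and shapes below are illustrative, not code from the PR):

```python
import torch
import torch.nn as nn

# Lazy conv modules defer weight creation until the first forward pass,
# inferring in_channels from the input instead of requiring it up front.
net = nn.Sequential(
    nn.LazyConv2d(out_channels=16, kernel_size=3),          # in_channels inferred
    nn.ReLU(),
    nn.LazyConvTranspose2d(out_channels=3, kernel_size=3),  # in_channels inferred
)

x = torch.randn(2, 8, 32, 32)  # 8 input channels, never declared above
y = net(x)                     # first call materializes the weights
print(y.shape)                 # torch.Size([2, 3, 32, 32])
```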

@asi1024 asi1024 requested a review from apaszke as a code owner November 4, 2020 09:28
@facebook-github-bot
Contributor

Hi @asi1024!

Thank you for your pull request and welcome to our community. We require contributors to sign our Contributor License Agreement, and we don't seem to have you on file.

In order for us to review and merge your code, please sign at https://code.facebook.com/cla. If you are contributing on behalf of someone else (e.g., your employer), the individual CLA may not be sufficient and your employer may need to sign the corporate CLA.

If you have received this in error or have any questions, please contact us at cla@fb.com. Thanks!

@dr-ci

dr-ci bot commented Nov 4, 2020

💊 CI failures summary and remediations

As of commit 87f3823 (more details on the Dr. CI page):


💚 💚 Looks good so far! There are no failures yet. 💚 💚

@ejguan ejguan added the module: nn (Related to torch.nn) and module: tests (Issues related to tests, not the torch.testing module) labels Nov 4, 2020
@ejguan
Contributor

ejguan commented Nov 4, 2020

Can you please rebase this PR against the viable/strict branch to make sure all CI checks are green?
And @albanD, can you please review this PR?

@ejguan ejguan requested a review from albanD November 4, 2020 16:32
@ejguan ejguan added the triaged (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module) label Nov 4, 2020
Collaborator

@albanD albanD left a comment

LGTM!

A small update to the docs and it will be good to go. Thanks for the test refactor; it looks much better!

Collaborator

@albanD albanD left a comment

Thanks for the doc update!

Can you just fix the lint and rebase on top of master so that the CI runs and we can merge :)

@asi1024
Contributor Author

asi1024 commented Nov 22, 2020

Rebased!

@codecov

codecov bot commented Nov 22, 2020

Codecov Report

Merging #47350 (87f3823) into master (b665490) will increase coverage by 0.00%.
The diff coverage is 97.77%.

@@           Coverage Diff           @@
##           master   #47350   +/-   ##
=======================================
  Coverage   80.95%   80.95%           
=======================================
  Files        1853     1853           
  Lines      199835   199879   +44     
=======================================
+ Hits       161771   161813   +42     
- Misses      38064    38066    +2     

@albanD
Collaborator

albanD commented Nov 30, 2020

Thanks!

Contributor

@facebook-github-bot facebook-github-bot left a comment

@albanD has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

@facebook-github-bot
Contributor

@albanD merged this pull request in 492683b.

@asi1024 asi1024 deleted the lazyconv branch December 18, 2020 08:13
facebook-github-bot pushed a commit that referenced this pull request Feb 18, 2021
Summary:
Some minor improvements for the lazy modules introduced in #44538, #47350 and #51548.

This PR mainly turns the bias into an `UninitializedParameter`. Instead of creating an empty placeholder tensor first, as in
```python
self.bias = Parameter(torch.Tensor(0))
self.bias = UninitializedParameter()
```
I think it would be better to write
```python
self.register_parameter('bias', None)
self.bias = UninitializedParameter()
```
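As a side-by-side sketch (illustrative class names, not code from the PR), both patterns end up with a single registered `bias`, but the second never allocates a throwaway empty tensor:

```python
import torch
import torch.nn as nn
from torch.nn.parameter import UninitializedParameter

class OldStyle(nn.Module):
    def __init__(self):
        super().__init__()
        self.bias = nn.Parameter(torch.Tensor(0))  # dummy empty tensor...
        self.bias = UninitializedParameter()       # ...immediately overwritten

class NewStyle(nn.Module):
    def __init__(self):
        super().__init__()
        self.register_parameter('bias', None)  # reserve the name, no allocation
        self.bias = UninitializedParameter()   # fill in the registered slot

# Both modules expose exactly one 'bias' parameter of type UninitializedParameter.
print(type(OldStyle().bias).__name__, type(NewStyle().bias).__name__)
```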

In addition, I changed the constructor of `LazyBatchNorm` from
```python
self.running_mean = UninitializedBuffer()
```
to
```python
self.register_buffer('running_mean', UninitializedBuffer())
```
since the original assignment does not update the underlying `self._buffers`.
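A small sketch of the difference (illustrative classes, assuming a PyTorch version that exposes `torch.nn.parameter.UninitializedBuffer`): a plain attribute assignment never reaches `self._buffers`, so the buffer would also be missing from `state_dict()` and `named_buffers()`.

```python
import torch.nn as nn
from torch.nn.parameter import UninitializedBuffer

class PlainAssign(nn.Module):
    def __init__(self):
        super().__init__()
        self.running_mean = UninitializedBuffer()  # plain attribute, not registered

class RegisterBuffer(nn.Module):
    def __init__(self):
        super().__init__()
        self.register_buffer('running_mean', UninitializedBuffer())  # tracked in _buffers

print(list(PlainAssign()._buffers))     # []
print(list(RegisterBuffer()._buffers))  # ['running_mean']
```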

Thank you for your time reviewing this PR :)

Gentle ping: albanD, mruberry

Pull Request resolved: #52212

Reviewed By: jbschlosser

Differential Revision: D26504508

Pulled By: albanD

fbshipit-source-id: 7094d0bb4fa9e2a40a07b79d350ea12a6ebfd080
xsacha pushed a commit to xsacha/pytorch that referenced this pull request Mar 31, 2021

Labels

cla signed · Merged · module: nn (Related to torch.nn) · module: tests (Issues related to tests, not the torch.testing module) · open source · triaged (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module)
