
Conversation

@LaserBit
Contributor

What does this PR do?

Fixes #5231

Before submitting

  • Was this discussed/approved via a GitHub issue? (no need for typos and docs improvements)
  • Did you read the contributor guideline, Pull Request section?
  • Did you make sure your PR does only one thing, instead of bundling different changes together? Otherwise, we ask you to create a separate PR for every change.
  • Did you make sure to update the documentation with your changes?
  • Did you write any new necessary tests?
  • Did you verify new and existing tests pass locally with your changes?
  • If you made a notable change (that affects users), did you update the CHANGELOG?

PR review

Anyone in the community is free to review the PR once the tests have passed.
Before you start reviewing, make sure you have read the Review guidelines. In short, see the following bullet list:

  • Is this pull request ready for review? (if not, please submit in draft mode)
  • Check that all items from Before submitting are resolved
  • Make sure the title is self-explanatory and the description concisely explains the PR
  • Add labels and milestones (and optionally projects) to the PR so it can be classified; bugfixes should be included in bug-fix release milestones (m.f.X) and features should be included in (m.X.b) releases.

Did you have fun?

Make sure you had fun coding 🙃

@codecov

codecov bot commented Dec 22, 2020

Codecov Report

Merging #5232 (d48275f) into master (d5b3678) will not change coverage.
The diff coverage is n/a.

@@          Coverage Diff           @@
##           master   #5232   +/-   ##
======================================
  Coverage      93%     93%           
======================================
  Files         134     134           
  Lines        9976    9976           
======================================
  Hits         9294    9294           
  Misses        682     682           

@rohitgr7
Contributor

rohitgr7 commented Dec 22, 2020

import torch
from torch import nn
from torchvision import models
from pytorch_lightning import LightningModule


class ImagenetTransferLearning(LightningModule):
    def __init__(self):
        super().__init__()

        # init a pretrained resnet, dropping its final fc (classifier) layer
        backbone = models.resnet50(pretrained=True)
        num_filters = backbone.fc.in_features
        _layers = list(backbone.children())[:-1]
        self.feature_extractor = torch.nn.Sequential(*_layers)

        # use the pretrained model to classify cifar-10 (10 image classes)
        num_target_classes = 10
        self.classifier = nn.Linear(num_filters, num_target_classes)

    def forward(self, x):
        # keep the frozen backbone in eval mode and out of the autograd graph
        self.feature_extractor.eval()
        batch_size = x.size(0)
        with torch.no_grad():
            representations = self.feature_extractor(x).view(batch_size, -1)
        x = self.classifier(representations)
        ...

I think it should be like this.

  1. The last layer of the pretrained model is itself a classifier, so stacking a new classifier on top of it is incorrect.
  2. LightningModule puts the entire model into train mode during training, so you need to call .eval() on the frozen backbone manually on every forward pass, wrapped in no_grad to avoid gradient updates.
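Point 2 can be seen in plain PyTorch, without Lightning (a minimal sketch; the `backbone` and `head` modules here are hypothetical stand-ins for the real resnet and classifier): calling `.train()` on the whole model flips every submodule, including a "frozen" backbone, back to train mode, so `forward` must re-freeze it.

```python
import torch
from torch import nn

# A frozen backbone containing BatchNorm, whose running stats
# would drift if it were left in train mode during training.
backbone = nn.Sequential(nn.Linear(4, 4), nn.BatchNorm1d(4))
head = nn.Linear(4, 2)
model = nn.Sequential(backbone, head)

model.train()                 # what the trainer does before each step
assert backbone.training      # the "frozen" part is back in train mode

backbone.eval()               # so forward() must re-freeze it...
with torch.no_grad():         # ...and keep it out of the autograd graph
    feats = backbone(torch.randn(3, 4))

assert not backbone.training
assert not feats.requires_grad
```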

@awaelchli awaelchli added example docs Documentation related labels Dec 23, 2020
@LaserBit
Contributor Author

LaserBit commented Dec 23, 2020

@rohitgr7 Thank you !
However, just like the original sample code, your code raised RuntimeError: mat1 dim 1 must match mat2 dim 0.
I tried the CIFAR-10 dataset to check whether my own dataset was the problem, but I got the same RuntimeError: mat1 dim 1 must match mat2 dim 0.

This is my source code. Google Colab

Can you give me a solution?

@rohitgr7
Contributor

@LaserBit fixed it. Should be working now.
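For context, the mat1/mat2 mismatch above typically means the classifier's `in_features` does not match the flattened output of the feature extractor. A minimal sketch of how to infer the right size with a dummy forward pass (using a hypothetical toy backbone in place of torchvision's resnet50, whose stripped backbone yields 2048 features):

```python
import torch
from torch import nn

# Toy stand-in for a pretrained backbone. The technique is the same for
# resnet50: run a dummy batch through the feature extractor and read off
# the flattened feature size.
feature_extractor = nn.Sequential(
    nn.Conv2d(3, 8, kernel_size=3, padding=1),
    nn.AdaptiveAvgPool2d(1),
)

with torch.no_grad():
    dummy = torch.zeros(1, 3, 32, 32)
    num_filters = feature_extractor(dummy).view(1, -1).size(1)

print(num_filters)  # 8 for this toy backbone
classifier = nn.Linear(num_filters, 10)  # in_features now matches
```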

@LaserBit
Contributor Author

@rohitgr7 Thanks!
It's working fine, sir.

@rohitgr7 rohitgr7 enabled auto-merge (squash) January 4, 2021 18:57
@Borda Borda added the ready PRs ready to be merged label Jan 4, 2021
Collaborator

@Borda Borda left a comment


lgtm

@Borda Borda disabled auto-merge January 4, 2021 19:47
Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com>
@rohitgr7 rohitgr7 enabled auto-merge (squash) January 4, 2021 19:52
Collaborator

@SkafteNicki SkafteNicki left a comment


LGTM

@rohitgr7 rohitgr7 merged commit a40e3a3 into Lightning-AI:master Jan 5, 2021
Borda pushed a commit that referenced this pull request Jan 6, 2021
* Change the classifier input from 2048 to 1000.

* Update docs for Imagenet example

Thanks @rohitgr7

* Apply suggestions from code review

Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com>

Co-authored-by: Rohit Gupta <rohitgr1998@gmail.com>
Co-authored-by: Adrian Wälchli <aedu.waelchli@gmail.com>
(cherry picked from commit a40e3a3)


Development

Successfully merging this pull request may close these issues.

Typo correction in Transfer Learning sample code

6 participants