
classification with freeze_ and unfreeze_ #59

Open
yongjin-shin opened this issue Nov 2, 2020 · 2 comments

Comments


yongjin-shin commented Nov 2, 2020

Hi,
First of all, I very much appreciate your wonderful work!
Currently, I am testing the CIFAR10 classification task with your example code.
I would like to ask whether 10% test accuracy is a usual result when I call freeze_() at test time.

    # Test time
    correct, total = 0, 0
    classifier.freeze_()  # freeze once before evaluation: forward passes use deterministic weights
    with torch.no_grad():
        for data in test_loader:
            images, labels = data
            outputs = classifier(images.to(device))
            _, predicted = torch.max(outputs.data, 1)
            total += labels.size(0)
            correct += (predicted == labels.to(device)).sum().item()
    print(f'Freeze Epoch {epoch} | {100 * correct / total}% | Elapsed: {time.time() - tic:.1f}s')
    classifier.unfreeze_()  # restore weight sampling for the next training epoch

I only added classifier.freeze_() before getting the outputs.
I thought the accuracy should be similar to unfreeze_(), but it is not:
with freeze mode activated I get 10%, while unfreeze mode reaches 45% at the first epoch.
Since I only activate it at test time, the training loss keeps going down.

Best Regards,
YJ
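(For context: freeze_() is meant to make forward passes deterministic by using the learned weight means instead of sampling, so frozen accuracy should be comparable to a single sampled pass, not chance level. A minimal NumPy sketch of that contract; this is a toy layer for illustration, not blitz's actual implementation, and all names in it are made up:)

```python
import numpy as np

class BayesianLinear:
    """Toy Bayesian layer: weights are sampled as W = mu + sigma * eps.
    freeze_() makes forward passes use the mean weights (deterministic);
    unfreeze_() restores sampling. Illustrative only -- blitz's real
    implementation differs in detail."""
    def __init__(self, in_f, out_f, seed=0):
        self.rng = np.random.default_rng(seed)
        self.mu = self.rng.normal(0.0, 0.1, (in_f, out_f))
        self.rho = np.full((in_f, out_f), -3.0)  # sigma = log1p(exp(rho))
        self.frozen = False

    def freeze_(self):
        self.frozen = True

    def unfreeze_(self):
        self.frozen = False

    def forward(self, x):
        if self.frozen:
            w = self.mu  # deterministic: mean weights only
        else:
            sigma = np.log1p(np.exp(self.rho))
            w = self.mu + sigma * self.rng.normal(size=self.mu.shape)
        return x @ w

layer = BayesianLinear(4, 3)
x = np.ones((2, 4))
layer.freeze_()
a = layer.forward(x)
b = layer.forward(x)
print(np.allclose(a, b))  # frozen passes are identical
```

If frozen evaluation collapses to 10% (chance on CIFAR10) while sampled passes reach 45%, that suggests the frozen path is not actually returning the mean-weight forward pass.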

@piEsposito (Owner)

This is not usual and should be fixed by now.

@jinqijinqi

The problem is still there.
[screenshot attached]
