But in many of the paper's experiments, the labels in the unsupervised target domain overlap with the supervised ImageNet labels used for pretraining. Is it justified to pretrain on supervised ImageNet labels?
That's a nice point. We use pre-trained models because they are so popular and are used by default. Maybe you could investigate how pre-training affects the performance :)
You mentioned that the backbone network is a ResNet-50 pretrained on ImageNet.
Universal-Domain-Adaptation/net.py
Line 37 in 5d7caa9