optimizer_T in train_DA.py of the 'SHOT' method only optimizes net_T.base.parameters(), excluding net_T.bottleneck, yet the function train_net_T() in da_trainer.py puts self.net_T.bottleneck into .train() mode. Don't these conflict with each other?
Please see the PyTorch documentation for how an optimizer and the .train() call work. In short, the optimizer updates the parameters passed to it (see the definition of Parameter), while the .train() call switches modules into training mode: it allows certain state, such as the running statistics in BN layers (which are not parameters), to be updated, and it changes the behavior of layers like dropout. The two mechanisms are independent, so they do not conflict.
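A minimal sketch of this distinction, using a toy module with `.base` and `.bottleneck` submodules loosely mirroring `net_T` in SHOT (the module names and shapes here are made up for illustration): the optimizer only touches the parameters it was constructed with, while `.train()` independently enables updates to BN running statistics.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)  # determinism for the illustration

class Net(nn.Module):
    """Toy stand-in for net_T: a base layer feeding a BN bottleneck."""
    def __init__(self):
        super().__init__()
        self.base = nn.Linear(4, 4)
        self.bottleneck = nn.BatchNorm1d(4)

    def forward(self, x):
        return self.bottleneck(self.base(x))

net = Net()
# The optimizer holds ONLY base's parameters; bottleneck's learnable
# weight/bias are never updated by opt.step().
opt = torch.optim.SGD(net.base.parameters(), lr=0.1)

net.bottleneck.train()  # train mode: BN running stats WILL still update
w_before = net.bottleneck.weight.clone()
mean_before = net.bottleneck.running_mean.clone()

x = torch.randn(8, 4)
loss = net(x).sum()
opt.zero_grad()
loss.backward()
opt.step()

# Learnable BN parameters are unchanged (they were not given to the optimizer)...
print(torch.equal(w_before, net.bottleneck.weight))           # True
# ...but the running statistics changed, because .train() enabled their
# update during the forward pass -- no optimizer involved.
print(torch.equal(mean_before, net.bottleneck.running_mean))  # False
```

So excluding `net_T.bottleneck` from the optimizer while still calling `.train()` on it is coherent: the learnable affine parameters stay fixed, but BN statistics and dropout behavior follow training mode.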