
Be backward compatible with torch 1.0 (#483)

Since 1.1.0 is not that old, we should probably support
1.0.x for some time.
ottonemo authored and BenjaminBossan committed Jun 11, 2019
1 parent 815ade3 commit 51000d1e6f33e486d4586f91fd321e5e13c15ad3
Showing with 6 additions and 2 deletions.
  1. +6 −2 skorch/callbacks/lr_scheduler.py
@@ -13,7 +13,11 @@
 from torch.optim.lr_scheduler import MultiStepLR
 from torch.optim.lr_scheduler import ReduceLROnPlateau
 from torch.optim.lr_scheduler import StepLR
-from torch.optim.lr_scheduler import CyclicLR as TorchCyclicLR
+try:
+    from torch.optim.lr_scheduler import CyclicLR as TorchCyclicLR
+except ImportError:
+    # Backward compatibility with torch >= 1.0 && < 1.1
+    TorchCyclicLR = None
 from torch.optim.optimizer import Optimizer
 from skorch.callbacks import Callback
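The hunk above uses the standard optional-import idiom: attempt the import of a class that only exists in newer library versions, and fall back to `None` when it is missing. A minimal, self-contained sketch of the same pattern (`somelib` and `SomeNewClass` are placeholder names, not real modules):

```python
# Optional-import pattern: SomeNewClass is bound to the real class on
# versions that provide it, and to None on older versions, so later code
# can check its truthiness before using it.
try:
    from somelib import SomeNewClass  # placeholder module/class
except ImportError:
    # Older version of the library: mark the feature as unavailable.
    SomeNewClass = None
```

Code that consumes the name then guards on `SomeNewClass` being truthy, exactly as the patch does with `TorchCyclicLR`.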

@@ -150,7 +154,7 @@ def on_batch_end(self, net, training, **kwargs):
         ):
             self.lr_scheduler_.batch_step(self.batch_idx_)

-        if isinstance(self.lr_scheduler_, TorchCyclicLR):
+        if TorchCyclicLR and isinstance(self.lr_scheduler_, TorchCyclicLR):
             self.lr_scheduler_.step(self.batch_idx_)

         if training:
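The truthiness check added in front of `isinstance` is not cosmetic: when the optional class failed to import and was set to `None`, calling `isinstance(obj, None)` raises `TypeError`, since `None` is not a type. A small sketch of both behaviors (`MaybeClass` is a placeholder standing in for `TorchCyclicLR` on torch < 1.1):

```python
# MaybeClass plays the role of TorchCyclicLR when the import failed.
MaybeClass = None
obj = object()

# Guarded check, as in the patch: short-circuits to a falsy value,
# never reaching isinstance, so no exception is raised.
guarded = bool(MaybeClass and isinstance(obj, MaybeClass))

# Unguarded check: isinstance's second argument must be a type or a
# tuple of types, so passing None raises TypeError.
try:
    isinstance(obj, MaybeClass)
    raised = False
except TypeError:
    raised = True
```

The `and` short-circuit keeps the scheduler step a no-op on old torch versions instead of crashing the training loop.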
