Getting this error any time I try to run `learn.lr_find` or `learn.fit` with VGG16 or VGG19. Below is the traceback.
```
ValueError                                Traceback (most recent call last)
<ipython-input-6-95b405c2ed1e> in <module>()
----> 1 lrf=learn.lr_find()
      2 learn.sched.plot()

/home/james/fastai_2/courses/dl1/fastai/learner.py in lr_find(self, start_lr, end_lr, wds)
    131         """
    132         self.save('tmp')
--> 133         layer_opt = self.get_layer_opt(start_lr, wds)
    134         self.sched = LR_Finder(layer_opt, len(self.data.trn_dl), end_lr)
    135         self.fit_gen(self.model, self.data, layer_opt, 1)

/home/james/fastai_2/courses/dl1/fastai/learner.py in get_layer_opt(self, lrs, wds)
     90
     91     def get_layer_opt(self, lrs, wds):
---> 92         return LayerOptimizer(self.opt_fn, self.get_layer_groups(), lrs, wds)
     93
     94     def fit(self, lrs, n_cycle, wds=None, **kwargs):

/home/james/fastai_2/courses/dl1/fastai/layer_optimizer.py in __init__(self, opt_fn, layer_groups, lrs, wds)
     15         if len(wds)==1: wds=wds*len(layer_groups)
     16         self.layer_groups,self.lrs,self.wds = layer_groups,lrs,wds
---> 17         self.opt = opt_fn(self.opt_params())
     18
     19     def opt_params(self):

/home/james/fastai_2/courses/dl1/fastai/core.py in <lambda>(*args, **kwargs)
     63
     64 def SGD_Momentum(momentum):
---> 65     return lambda *args, **kwargs: optim.SGD(*args, momentum=momentum, **kwargs)
     66
     67 def one_hot(a,c): return np.eye(c)[a]

/home/james/anaconda3/envs/tensorflow/lib/python3.6/site-packages/torch/optim/sgd.py in __init__(self, params, lr, momentum, dampening, weight_decay, nesterov)
     54         if nesterov and (momentum <= 0 or dampening != 0):
     55             raise ValueError("Nesterov momentum requires a momentum and zero dampening")
---> 56         super(SGD, self).__init__(params, defaults)
     57
     58     def __setstate__(self, state):

/home/james/anaconda3/envs/tensorflow/lib/python3.6/site-packages/torch/optim/optimizer.py in __init__(self, params, defaults)
     40             group_set = set(group['params'])
     41             if not param_set.isdisjoint(group_set):
---> 42                 raise ValueError("some parameters appear in more than one "
     43                                  "parameter group")
     44             param_set.update(group_set)
```
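For context, the `ValueError` at the bottom of the trace comes from PyTorch's base `Optimizer` constructor, which requires the parameter groups passed to it to be disjoint. Here is a minimal, torch-free sketch of that check; the function name `check_param_groups` and the plain-Python "parameters" are illustrative only, not fastai or PyTorch API:

```python
# Simplified reproduction of the disjointness check in
# torch/optim/optimizer.py (Optimizer.__init__). Real PyTorch
# parameters are tensors; plain objects stand in for them here.
def check_param_groups(param_groups):
    param_set = set()
    for group in param_groups:
        group_set = set(group['params'])
        if not param_set.isdisjoint(group_set):
            raise ValueError("some parameters appear in more than one "
                             "parameter group")
        param_set.update(group_set)

# Two "layer groups" that share the parameter `b`, analogous to what
# happens when the model's layer groups overlap:
a, b, c = object(), object(), object()
try:
    check_param_groups([{'params': [a, b]}, {'params': [b, c]}])
except ValueError as e:
    print(e)  # some parameters appear in more than one parameter group
```

So the error suggests that the layer groups fastai builds for this model contain at least one shared parameter, which PyTorch's `SGD` refuses to accept.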
Best to discuss this on the forum. Be sure to share the code you're using to create the model.