The keyword for the learning rate is "lr". You should correct the line where you set the optimization parameters as follows:
setp(model; lr=0.001, adagrad=true)
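For context, here is a minimal sketch of how the fix slots into a training loop. It reuses the names that appear in this thread (`setp`, `test`, `softloss`, `onehot`, `model`, `trainX`/`trainY`, `testX`/`testY`); the `train` call and its keyword are assumptions about the same Knet API, not verified signatures.

```julia
using Knet

# Set per-parameter optimization options; "lr" is the learning-rate
# keyword (a misspelled keyword would leave the default step size,
# which can make the loss grow instead of shrink).
setp(model; lr=0.001, adagrad=true)

for epoch in 1:10
    # Assumed training call, mirroring the test() calls shown above.
    train(model, trainX, onehot(trainY); loss=softloss)
    @show test(model, testX, onehot(testY), softloss)
end
```

With a correctly spelled `lr`, the test loss should drop over epochs as in the output below, rather than diverge.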
Here is the result:
test(model,trainX,onehot(trainY),softloss) = 2.302583841731204
test(model,testX,onehot(testY),softloss) = 2.3025815985486386
Progress: 100% Time: 0:00:06
test(model,trainX,onehot(trainY),softloss) = 1.6269797708233549
test(model,testX,onehot(testY),softloss) = 1.6429610465456255
I tried implementing a fairly simple multi-layer perceptron in Knet.jl.
One would expect the test loss to decrease after training for a few steps, but instead it diverges to a larger value:
Trying different optimization parameters did not seem to help.