Training: Use ADAM instead of adadelta.
Brandon Amos committed Jun 3, 2016
Parent: 02b85b7 · Commit: 130edce
Showing 1 changed file with 2 additions and 2 deletions.
training/train.lua:

@@ -23,7 +23,7 @@ require 'torchx' --for concatenating the table of tensors
 paths.dofile("OpenFaceOptim.lua")


-local optimMethod = optim.adadelta
+local optimMethod = optim.adam
 local optimState = {} -- Use for other algorithms like SGD
 local optimator = OpenFaceOptim(model, optimState)


@@ -87,7 +87,7 @@ function train()
    print('\n')

    collectgarbage()

    local nnModel = model:float():clone():clearState()
    if opt.cudnn then
       cudnn.convert(nnModel, nn)
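For context, a minimal sketch of the torch/optim interface that the swapped-in method follows; this is not OpenFace code, and the toy objective, tensor size, and learningRate value are illustrative assumptions. optim.adam(opfunc, x, state) updates x in place and keeps its running estimates inside the state table:

    -- Minimal sketch of the torch/optim adam interface (toy example,
    -- not from the OpenFace sources).
    require 'torch'
    require 'optim'

    local x = torch.randn(5)                  -- parameters to optimize
    local optimState = {learningRate = 1e-3}  -- assumed setting; adam's default is 0.001

    -- Toy objective f(x) = ||x||^2, whose gradient is 2x.
    local function feval(x)
       return x:dot(x), x * 2
    end

    for i = 1, 1000 do
       optim.adam(feval, x, optimState)       -- updates x in place
    end
    print(torch.norm(x))                      -- approaches 0

Because adam, like adadelta, stashes its per-parameter state (moment estimates and a step counter) in the table it is handed, the empty local optimState = {} above works unchanged for either method, which is why only the method name needed to change in this commit.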
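The second hunk's context shows the surrounding checkpoint logic: the model is copied to CPU floats, clearState() drops the cached output/gradInput buffers so the serialized file stays small, and, when cudnn is in use, cudnn.convert(nnModel, nn) swaps cudnn layers for portable nn equivalents so the checkpoint loads without CUDA. Below is a self-contained sketch of the float/clearState/save pattern; the toy two-layer model and the 'model.t7' filename are assumptions for illustration, not OpenFace's real model or save path:

    require 'torch'
    require 'nn'

    -- Assumed toy model standing in for OpenFace's network.
    local model = nn.Sequential()
       :add(nn.Linear(10, 4))
       :add(nn.ReLU())

    model:forward(torch.randn(10))  -- populates the cached buffers

    -- CPU floats plus clearState() keep the serialized file small.
    local nnModel = model:float():clone():clearState()
    torch.save('model.t7', nnModel)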
