🎨 Fix some typos about NeuralNetwork::Optimizer
yoshoku committed May 23, 2020
1 parent 3fa073d commit af299df
Showing 2 changed files with 2 additions and 2 deletions.
lib/rumale/neural_network/adam.rb (1 addition & 1 deletion)
@@ -32,7 +32,7 @@ def initialize(learning_rate: 0.001, decay1: 0.9, decay2: 0.999)
   end

   # @!visibility private
-  # Calculate the updated weight with Nadam adaptive learning rate.
+  # Calculate the updated weight with Adam adaptive learning rate.
   #
   # @param weight [Numo::DFloat] (shape: [n_features]) The weight to be updated.
   # @param gradient [Numo::DFloat] (shape: [n_features]) The gradient for updating the weight.
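The comment corrected above documents the Adam adaptive learning rate update. For readers unfamiliar with it, a minimal sketch in Ruby follows; it assumes Numo::DFloat inputs, and the class name AdamSketch, the epsilon term 1e-8, and the instance variables are illustrative rather than the exact internals of Rumale::NeuralNetwork::Optimizer::Adam.

require 'numo/narray'

# Illustrative Adam step: keep exponential moving averages of the gradient
# and the squared gradient, correct their bias, and scale the learning rate.
class AdamSketch
  def initialize(learning_rate: 0.001, decay1: 0.9, decay2: 0.999)
    @learning_rate = learning_rate
    @decay1 = decay1
    @decay2 = decay2
    @iter = 0
  end

  # Returns the updated weight given the current weight and its gradient.
  def call(weight, gradient)
    @fst_moment ||= Numo::DFloat.zeros(weight.shape)
    @sec_moment ||= Numo::DFloat.zeros(weight.shape)
    @iter += 1
    @fst_moment = @decay1 * @fst_moment + (1.0 - @decay1) * gradient
    @sec_moment = @decay2 * @sec_moment + (1.0 - @decay2) * gradient**2
    nm_fst = @fst_moment / (1.0 - @decay1**@iter)   # bias-corrected 1st moment
    nm_sec = @sec_moment / (1.0 - @decay2**@iter)   # bias-corrected 2nd moment
    weight - @learning_rate * nm_fst / (nm_sec**0.5 + 1e-8)
  end
end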
spec/rumale/neural_network/layer/affine_spec.rb (1 addition & 1 deletion)
@@ -8,7 +8,7 @@
   let(:z) { Numo::DFloat[[1, 2, 3, 4], [5, 6, 7, 8], [9, 8, 7, 6]] }
   let(:n_inputs) { x.shape[1] }
   let(:n_outputs) { z.shape[1] }
-  let(:adam) { Rumale::Optimizer::Adam.new }
+  let(:adam) { Rumale::NeuralNetwork::Optimizer::Adam.new }
   let(:affine) { described_class.new(n_inputs: n_inputs, n_outputs: n_outputs, optimizer: adam, rng: rng.dup) }
   let(:rand_mat) { 0.01 * Rumale::Utils.rand_normal([n_inputs, n_outputs], rng.dup) }
   let(:out) { affine.forward(x)[0] }
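The spec fix above points the test at the fully qualified Rumale::NeuralNetwork::Optimizer::Adam constant. A hedged usage sketch, assuming the optimizer exposes a call(weight, gradient) method matching the @param comments in the first file (the class is marked @!visibility private, so this is illustrative only):

require 'numo/narray'
require 'rumale'

adam = Rumale::NeuralNetwork::Optimizer::Adam.new(learning_rate: 0.001)
weight = Numo::DFloat.zeros(4)                  # shape: [n_features]
gradient = Numo::DFloat[0.1, -0.2, 0.05, 0.3]   # gradient for the weight
weight = adam.call(weight, gradient)            # one Adam update step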
