Commit 35e12d4

fix: GradientDescent --> AdamOptimizer (#168)
1 parent: 17a3964

File tree

1 file changed: +1 / -1 lines changed

lab-09-4-xor_tensorboard.py

Lines changed: 1 addition & 1 deletion
@@ -45,7 +45,7 @@
     cost_summ = tf.summary.scalar("cost", cost)

 with tf.name_scope("train") as scope:
-    train = tf.train.GradientDescentOptimizer(learning_rate=learning_rate).minimize(cost)
+    train = tf.train.AdamOptimizer(learning_rate=learning_rate).minimize(cost)

 # Accuracy computation
 # True if hypothesis>0.5 else False
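
For context, here is a minimal runnable TF 1.x sketch of how the changed line sits in the lab's XOR graph. Only the cost summary and the optimizer line mirror the diff above; the data, the two-layer network, the learning_rate value, and the session loop are assumptions for illustration, not the lab file itself.

import numpy as np
import tensorflow as tf  # TF 1.x API, matching the lab series

learning_rate = 0.01  # assumed value; the lab defines its own

# XOR data, as used throughout the lab-09 series
x_data = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=np.float32)
y_data = np.array([[0], [1], [1], [0]], dtype=np.float32)

X = tf.placeholder(tf.float32, [None, 2], name="x")
Y = tf.placeholder(tf.float32, [None, 1], name="y")

# One hidden layer so the network can represent XOR
W1 = tf.Variable(tf.random_normal([2, 2]), name="weight1")
b1 = tf.Variable(tf.random_normal([2]), name="bias1")
layer1 = tf.sigmoid(tf.matmul(X, W1) + b1)

W2 = tf.Variable(tf.random_normal([2, 1]), name="weight2")
b2 = tf.Variable(tf.random_normal([1]), name="bias2")
hypothesis = tf.sigmoid(tf.matmul(layer1, W2) + b2)

# Cross-entropy cost, logged for TensorBoard as in the lab
cost = -tf.reduce_mean(Y * tf.log(hypothesis) +
                       (1 - Y) * tf.log(1 - hypothesis))
cost_summ = tf.summary.scalar("cost", cost)

with tf.name_scope("train") as scope:
    # The line this commit changes: Adam instead of plain gradient descent
    train = tf.train.AdamOptimizer(learning_rate=learning_rate).minimize(cost)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for step in range(10001):
        _, c = sess.run([train, cost], feed_dict={X: x_data, Y: y_data})
        if step % 1000 == 0:
            print(step, c)

Adam maintains per-parameter adaptive step sizes, so on a small non-convex problem like XOR it typically drives the cost down in far fewer steps than plain gradient descent at the same learning rate, which is presumably the motivation for this fix.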

0 commit comments