
Update 02 - tf.layers.py #30


Merged
merged 1 commit into from Jan 10, 2018

Conversation

@MASMAS-Studio (Contributor) commented Jan 9, 2018

While reading page 131 of the book, I came across the following code:

W1 = tf.Variable(tf.random_normal([3, 3, 1, 32], stddev=0.01))
L1 = tf.nn.conv2d(X, W1, strides=[1, 1, 1, 1], padding='SAME')
L1 = tf.nn.relu(L1)
L1 = tf.nn.max_pool(L1, ksize=[1, 2, 2, 1], strides=[1, 2, 2, 1], padding='SAME')

In the part where it is rewritten as below, it looks like setting the ReLU activation function was left out:

L1 = tf.layers.conv2d(X, 32, [3, 3])
L1 = tf.layers.max_pooling2d(L1, [2, 2], [2, 2])

Wondering whether the activation might default to ReLU when none is specified, I looked through the code at
https://github.com/tensorflow/tensorflow/blob/r1.4/tensorflow/python/layers/convolutional.py

but there is no such behavior, so I am opening this Pull Request.
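For reference, a minimal sketch of how the tf.layers version could carry the ReLU explicitly; the activation=tf.nn.relu and padding='SAME' arguments, the import, and the X placeholder shape are assumptions added here to mirror the original tf.nn code, not the book's exact text:

import tensorflow as tf

# Assumed MNIST-style input: 28x28 grayscale images, unspecified batch size.
X = tf.placeholder(tf.float32, [None, 28, 28, 1])

# tf.layers.conv2d defaults to activation=None (linear), so ReLU has to be
# passed explicitly to match the original tf.nn.conv2d + tf.nn.relu pair.
L1 = tf.layers.conv2d(X, 32, [3, 3], padding='SAME', activation=tf.nn.relu)
L1 = tf.layers.max_pooling2d(L1, [2, 2], [2, 2], padding='SAME')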

@golbin (Owner) commented Jan 10, 2018

I actually removed it on purpose to keep the code listing concise.

That said, I can see how it could cause confusion, so I will fix that part. Thank you for the PR. 👍

@golbin golbin closed this Jan 10, 2018
@golbin golbin reopened this Jan 10, 2018
@golbin golbin merged commit 00d4418 into golbin:master Jan 10, 2018