Batch norm and ReLU reversed? #7

Closed
urielka opened this Issue May 5, 2017 · 2 comments

urielka commented May 5, 2017

Hi, according to the paper (and as you posted in the README) the basic building block is: 3x3 depthwise conv -> BN -> ReLU -> 1x1 pointwise conv -> BN -> ReLU.

But if I follow the TF-Slim code and your code, I think you have depthwise -> ReLU -> BN -> pointwise -> ReLU -> BN.

The ReLU is part of the slim convolution layers.

Also, you seem to be using separable_convolution2d with num_outputs=None to get a depthwise convolution. Couldn't you just use https://www.tensorflow.org/api_docs/python/tf/nn/depthwise_conv2d instead?
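
For concreteness, a minimal sketch (TF 1.x; the tensor shape, initializer, and scope names are just illustrative) of the two ways of writing the depthwise convolution I mean:

```python
import tensorflow as tf
import tensorflow.contrib.slim as slim

inputs = tf.placeholder(tf.float32, [None, 224, 224, 32])

# 1) slim: num_outputs=None skips the pointwise stage, so only the depthwise
#    convolution is applied and the filter variable is created for you. Note
#    that slim also applies whatever activation_fn/normalizer_fn are in effect.
dw_slim = slim.separable_conv2d(inputs,
                                num_outputs=None,
                                kernel_size=[3, 3],
                                depth_multiplier=1,
                                stride=1,
                                scope='depthwise_slim')

# 2) raw op: the same depthwise convolution, but the filter variable has to be
#    created by hand.
dw_filter = tf.get_variable('depthwise_filter',
                            shape=[3, 3, 32, 1],  # [H, W, in_channels, channel_multiplier]
                            initializer=tf.truncated_normal_initializer(stddev=0.09))
dw_raw = tf.nn.depthwise_conv2d(inputs, dw_filter,
                                strides=[1, 1, 1, 1], padding='SAME')
```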

Owner

Zehaos commented May 5, 2017

Hi,

  1. Please refer to the slim arg_scope mechanism. In my implementation, ReLU is the activation function of the BN layer (see the sketch below).
     mobilenet.py#L57-L62
  2. tf.nn.depthwise_conv2d works the same way, but it needs additional variable creation code.
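
A minimal sketch of the arg_scope mechanism referred to in point 1 (the batch-norm parameter values here are illustrative, not the exact ones from this repo). Slim layers apply convolution -> normalizer_fn -> activation_fn, so with batch_norm as the normalizer and ReLU as the activation the effective order is conv -> BN -> ReLU, matching the paper:

```python
import tensorflow as tf
import tensorflow.contrib.slim as slim

def mobilenet_arg_scope(is_training=True):
    # Illustrative batch-norm parameters.
    batch_norm_params = {'is_training': is_training, 'decay': 0.997, 'epsilon': 1e-3}
    with slim.arg_scope([slim.conv2d, slim.separable_conv2d],
                        activation_fn=tf.nn.relu,       # applied last
                        normalizer_fn=slim.batch_norm,  # applied right after the convolution
                        normalizer_params=batch_norm_params) as sc:
        return sc

inputs = tf.placeholder(tf.float32, [None, 224, 224, 32])
with slim.arg_scope(mobilenet_arg_scope()):
    # 3x3 depthwise conv -> BN -> ReLU
    net = slim.separable_conv2d(inputs, None, [3, 3], depth_multiplier=1,
                                stride=1, scope='dw')
    # 1x1 pointwise conv -> BN -> ReLU
    net = slim.conv2d(net, 64, [1, 1], stride=1, scope='pw')
```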

urielka commented May 6, 2017

Understood, my bad :)

urielka closed this May 6, 2017
