Batch norm and ReLU reversed? #7
But if I follow the TF-Slim code and your code, I think what you actually get is depthwise -> ReLU -> BN -> pointwise -> ReLU -> BN. The ReLU is applied inside the slim convolution layers themselves (activation_fn defaults to tf.nn.relu), so a batch norm added after the layer ends up after the ReLU.
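A minimal sketch of the two orderings (assuming TF 1.x with tf.contrib.slim; the input shape here is just illustrative):

```python
import tensorflow as tf

slim = tf.contrib.slim

inputs = tf.placeholder(tf.float32, [None, 224, 224, 32])

# Ordering described above: the slim layer applies ReLU itself
# (activation_fn defaults to tf.nn.relu), and batch norm is added
# afterwards, giving depthwise conv -> ReLU -> BN.
depthwise = slim.separable_conv2d(inputs, num_outputs=None,
                                  kernel_size=[3, 3], depth_multiplier=1)
depthwise = slim.batch_norm(depthwise)

# BN-before-ReLU ordering: pass batch norm as normalizer_fn so the
# layer computes depthwise conv -> BN -> ReLU internally.
depthwise_bn_first = slim.separable_conv2d(
    inputs, num_outputs=None, kernel_size=[3, 3], depth_multiplier=1,
    normalizer_fn=slim.batch_norm)
```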
Also, you seem to be using separable_convolution2d with num_outputs=None just to get a depthwise convolution; couldn't you use https://www.tensorflow.org/api_docs/python/tf/nn/depthwise_conv2d directly instead?
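For reference, a sketch of what the lower-level call would look like (variable name, initializer, and shapes are illustrative, not taken from your code):

```python
import tensorflow as tf

inputs = tf.placeholder(tf.float32, [None, 224, 224, 32])

# Filter shape: [height, width, in_channels, channel_multiplier].
depthwise_filter = tf.get_variable(
    'depthwise_weights', shape=[3, 3, 32, 1],
    initializer=tf.truncated_normal_initializer(stddev=0.09))

# Plain depthwise convolution with no pointwise step and no activation,
# equivalent to slim.separable_conv2d with num_outputs=None but without
# the slim layer machinery (arg scopes, regularizers, built-in ReLU).
depthwise = tf.nn.depthwise_conv2d(inputs, depthwise_filter,
                                   strides=[1, 1, 1, 1], padding='SAME')
```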