We need a separate Activation layer because of #34. In many cases, a BatchNorm layer sits between the previous layer (which uses a linear activation) and the non-linear Activation that follows it.
It also matters for transfer learning with Keras models, since the separate Activation layer is widely used there.
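For illustration, a minimal Keras sketch of that pattern; the layer sizes and the ReLU choice are just placeholders:

```python
from tensorflow import keras
from tensorflow.keras import layers

# Common pattern: a layer with a linear (i.e. no) activation,
# then BatchNorm, then a separate non-linear Activation layer.
model = keras.Sequential([
    keras.Input(shape=(64,)),
    layers.Dense(128),            # no activation argument -> linear
    layers.BatchNormalization(),  # normalizes the pre-activations
    layers.Activation("relu"),    # the separate Activation layer
])
model.summary()
```

Supporting this means treating Activation as a layer in its own right rather than as an attribute of the preceding layer.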