---
title: ReLU / Rectified-Linear and Leaky-ReLU Layer
---
- Layer type: `ReLU`
- CPU implementation: `./src/caffe/layers/relu_layer.cpp`
- CUDA GPU implementation: `./src/caffe/layers/relu_layer.cu`
- Sample (as seen in `./models/bvlc_reference_caffenet/train_val.prototxt`)

      layer {
        name: "relu1"
        type: "ReLU"
        bottom: "conv1"
        top: "conv1"
      }
Given an input value x, the `ReLU` layer computes the output as x if x > 0 and negative_slope * x if x <= 0. When the negative slope parameter is not set, this is equivalent to the standard ReLU function max(x, 0). The layer also supports in-place computation, meaning that the bottom and the top blob can be the same to reduce memory consumption.
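To make the computation concrete, here is a minimal C++ sketch of the forward pass over a flat array of values; the `negative_slope` argument stands in for the `relu_param` field of the same name. It is an illustration only, not the actual Caffe implementation in `relu_layer.cpp`, which operates on blobs:

{% highlight cpp %}
#include <vector>

// Sketch of the ReLU forward pass. With negative_slope == 0 this
// reduces to the standard max(x, 0); a nonzero slope gives the
// leaky variant. Overwriting the input mirrors the layer's
// in-place mode, where bottom and top share the same blob.
void relu_forward(std::vector<float>& data, float negative_slope = 0.f) {
  for (float& x : data) {
    x = (x > 0) ? x : negative_slope * x;
  }
}
{% endhighlight %}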
- Parameters (`ReLUParameter relu_param`)
    - Optional
        - `negative_slope` [default 0]: specifies whether to leak the negative part by multiplying it with the slope value rather than setting it to 0 (see the Leaky ReLU sample after the parameter listing below).
- From `./src/caffe/proto/caffe.proto`:
{% highlight Protobuf %}
{% include proto/ReLUParameter.txt %}
{% endhighlight %}
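For example, the sample layer above becomes a Leaky ReLU by setting `negative_slope` inside `relu_param`; the slope value 0.1 is illustrative:

      layer {
        name: "relu1"
        type: "ReLU"
        bottom: "conv1"
        top: "conv1"
        relu_param {
          negative_slope: 0.1
        }
      }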