---
title: ReLU / Rectified-Linear and Leaky-ReLU Layer
---

# ReLU / Rectified-Linear and Leaky-ReLU Layer

Given an input value x, the ReLU layer computes the output as x if x > 0 and negative_slope * x if x <= 0. When the negative slope parameter is not set, this is equivalent to the standard ReLU function, max(x, 0). The layer also supports in-place computation, meaning the bottom and top blob may be the same, which conserves memory.
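The elementwise computation above can be sketched in plain Python (a minimal illustration, not Caffe's C++/CUDA implementation):

```python
def relu(x, negative_slope=0.0):
    # Standard ReLU when negative_slope == 0: max(x, 0).
    # Leaky ReLU otherwise: pass x through unchanged when x > 0,
    # and scale the negative part by negative_slope when x <= 0.
    return x if x > 0 else negative_slope * x
```

With the default `negative_slope = 0` this reduces to max(x, 0); a small positive slope (e.g. 0.01) "leaks" gradient through the negative side instead of zeroing it.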

Parameters

  • Parameters (ReLUParameter relu_param)
    • Optional
      • negative_slope [default 0]: specifies whether to leak the negative part by multiplying it with the slope value rather than setting it to 0.
  • From ./src/caffe/proto/caffe.proto:

{% highlight Protobuf %} {% include proto/ReLUParameter.txt %} {% endhighlight %}
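For reference, a typical in-place ReLU layer definition in a network prototxt looks like the following (layer and blob names here are illustrative):

{% highlight Protobuf %}
layer {
  name: "relu1"
  type: "ReLU"
  bottom: "conv1"
  top: "conv1"   # same as bottom: in-place computation
  relu_param {
    negative_slope: 0.01  # omit (default 0) for standard ReLU
  }
}
{% endhighlight %}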