LReLU (Leaky Rectified Linear Unit)

LReLU (Leaky Rectified Linear Unit) is an activation function used in deep learning models, particularly in convolutional neural networks (CNNs). It is similar to the ReLU (Rectified Linear Unit) activation function, but it allows a small, non-zero gradient when the input is negative. The LReLU function is defined as f(x) = max(ax, x), where a is a small constant (commonly around 0.01) that sets the slope for negative inputs.
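
To make the definition concrete, here is a minimal sketch in plain NumPy; the default slope a = 0.01 is a common choice and is an assumption here, not a value stated in the post.

```python
import numpy as np

def leaky_relu(x, a=0.01):
    """Leaky ReLU: f(x) = max(a*x, x).

    Positive inputs pass through unchanged; negative inputs are scaled
    by the small constant a, so the gradient for x < 0 is a rather
    than 0. The default a = 0.01 is a common choice (assumed here).
    Note that max(a*x, x) matches the Leaky ReLU only for 0 < a < 1.
    """
    x = np.asarray(x, dtype=float)
    return np.maximum(a * x, x)

# Example: negative inputs are damped instead of zeroed out.
print(leaky_relu([-2.0, -0.5, 0.0, 1.5]))  # [-0.02  -0.005  0.     1.5  ]
```

Deep learning frameworks also provide this directly; in PyTorch, for example, torch.nn.LeakyReLU(negative_slope=0.01) implements the same function.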