PReLU (Parametric Rectified Linear Unit)

PReLU (Parametric Rectified Linear Unit) is a variation of the ReLU activation function used in neural networks. It is called "parametric" because it has a learnable parameter that is adjusted during training, unlike the standard ReLU function. PReLU is defined as f(x) = x for x >= 0 and f(x) = alpha * x for x < 0, where alpha is a learnable parameter controlling the slope on the negative side.

Zettelkasten/Terminology Information · 2023. 2. 26.
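The piecewise definition above can be sketched in a few lines of NumPy. This is a minimal illustration of the forward pass only (in a real network, `alpha` would be updated by the optimizer alongside the other weights; `prelu` here is a hypothetical helper, not from any library):

```python
import numpy as np

def prelu(x, alpha):
    """PReLU forward pass: identity for x >= 0, slope alpha for x < 0."""
    return np.where(x >= 0, x, alpha * x)

x = np.array([-2.0, -0.5, 0.0, 1.0, 3.0])
print(prelu(x, 0.25))  # negative inputs are scaled by alpha = 0.25
```

Note that with alpha fixed at 0 this reduces to standard ReLU, and with a small fixed alpha (e.g. 0.01) it matches Leaky ReLU; PReLU differs only in that alpha is learned from the data.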