
Information
- PReLU (Parametric Rectified Linear Unit) is a variation of the ReLU activation function used in neural networks.
- It is called "parametric" because it has a learnable parameter that can be adjusted during the training process, unlike the standard ReLU function.
- The PReLU function is defined as f(x) = alpha * x for x < 0 and f(x) = x for x >= 0, where alpha is a learnable parameter.
- The parameter alpha is usually initialized to a small positive value (0.25 in the original PReLU paper; smaller values such as 0.01 are also used), so that negative inputs still pass a non-zero gradient instead of the zero gradient produced by the standard ReLU function (see the sketch after this list).
- PReLU has been shown to improve the performance of deep neural networks on tasks such as large-scale image classification, at the cost of only a small number of additional learnable parameters.
- PReLU can also be used in deep neural networks to prevent the so-called "dying ReLU" problem, where some neurons can become "dead" and stop learning due to always outputting 0 for negative inputs.
- In summary, PReLU is a versatile activation function that can improve the performance and stability of neural networks.
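Below is a minimal sketch of how PReLU can be implemented, assuming PyTorch is available. The class name SimplePReLU and the single-parameter setup are illustrative choices; PyTorch also provides a built-in nn.PReLU layer with the same behavior.

```python
import torch
import torch.nn as nn

# Minimal PReLU sketch: f(x) = x for x >= 0, f(x) = alpha * x for x < 0,
# where alpha is a learnable parameter updated by the optimizer.
class SimplePReLU(nn.Module):
    def __init__(self, num_parameters: int = 1, init: float = 0.25):
        super().__init__()
        # Registering alpha as a Parameter makes it trainable.
        self.alpha = nn.Parameter(torch.full((num_parameters,), init))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Keep positive values as-is; scale negative values by alpha.
        return torch.where(x >= 0, x, self.alpha * x)


if __name__ == "__main__":
    x = torch.tensor([-2.0, -0.5, 0.0, 1.0, 3.0])
    prelu = SimplePReLU(init=0.25)
    print(prelu(x))                  # [-0.5, -0.125, 0.0, 1.0, 3.0]
    # PyTorch's built-in equivalent:
    print(nn.PReLU(num_parameters=1, init=0.25)(x))
```

Because alpha is a regular parameter, it receives gradients during backpropagation just like the weights, which is what distinguishes PReLU from Leaky ReLU, where the negative slope is a fixed hyperparameter.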