
Information
- ReLU (Rectified Linear Unit) is a type of activation function used in neural networks, which is defined as f(x) = max(0,x).
- It returns 0 for all negative inputs and returns the input as it is for all positive inputs.
- ReLU is a popular choice of activation function due to its simplicity and its effectiveness in mitigating the vanishing gradient problem in deep neural networks.
- Vanishing gradients occur when the gradient becomes very small, making it difficult to update the weights and causing the network to stop learning.
- ReLU can help prevent vanishing gradients because its gradient is a constant 1 for all positive inputs, so gradients can flow through the network without shrinking, while it still provides the non-linearity the network needs.
- ReLU is computationally efficient and easy to implement, making it a popular choice for many machine learning applications (see the sketch below).
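
To make the definition concrete, here is a minimal NumPy sketch (not part of the original note) of ReLU and its gradient; the names `relu` and `relu_grad` are just illustrative:

```python
import numpy as np

def relu(x):
    """ReLU activation: 0 for negative inputs, x itself for positive inputs."""
    return np.maximum(0, x)

def relu_grad(x):
    """Gradient of ReLU: 0 for negative inputs, 1 for positive inputs."""
    return (x > 0).astype(float)

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(relu(x))       # negatives clipped to 0, positives passed through
print(relu_grad(x))  # 0 where x <= 0, 1 where x > 0
```

The constant gradient of 1 for positive inputs is what keeps backpropagated gradients from shrinking layer after layer, in contrast to saturating activations like sigmoid or tanh.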