ReLU (Rectified Linear Unit)
Zettelkasten/Terminology · 2023. 4. 19.

ReLU (Rectified Linear Unit) is an activation function used in neural networks, defined as f(x) = max(0, x). It returns 0 for all negative inputs and passes positive inputs through unchanged. ReLU is a popular choice of activation function because of its simplicity and its effectiveness in mitigating vanishing gradients in deep neural networks. Vanishing gradients occur when gradients shrink toward zero as they are backpropagated through many layers, slowing or stalling learning in the earlier layers; since ReLU's gradient is 1 for positive inputs, it does not saturate the way sigmoid or tanh do.
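
A minimal sketch of the definition above, using NumPy (an assumption; any array library would work the same way):

```python
import numpy as np

def relu(x):
    """ReLU activation: element-wise max(0, x)."""
    return np.maximum(0, x)

# Negative inputs map to 0; positive inputs pass through unchanged.
x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))  # [0.  0.  0.  1.5 3. ]
```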