SELU (Scaled Exponential Linear Unit)

SELU (Scaled Exponential Linear Unit) is an activation function for neural networks introduced by Klambauer et al. in 2017. SELU is self-normalizing: it preserves the mean and variance of the activations across layers, which mitigates the vanishing/exploding gradients problem. SELU is defined as a piecewise function similar to ELU, but scaled by a constant λ > 1; see the sketch below.
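Concretely, SELU(x) = λx for x > 0 and λα(e^x − 1) for x ≤ 0, with α ≈ 1.6733 and λ ≈ 1.0507 (the fixed-point constants derived in the paper). Below is a minimal NumPy sketch of that definition; in practice most frameworks ship it built in (e.g. torch.nn.SELU or tf.keras.activations.selu).

```python
import numpy as np

# Fixed-point constants from Klambauer et al. (2017), chosen so that
# zero-mean, unit-variance inputs keep zero mean and unit variance.
ALPHA = 1.6732632423543772
LAMBDA = 1.0507009873554805

def selu(x: np.ndarray) -> np.ndarray:
    """SELU: lambda * x for x > 0, lambda * alpha * (exp(x) - 1) otherwise."""
    # Clamp the exponential branch at 0 to avoid overflow warnings for
    # large positive inputs that the branch never uses anyway.
    return LAMBDA * np.where(x > 0, x, ALPHA * np.expm1(np.minimum(x, 0.0)))

# Quick check of the self-normalizing property on standard normal input:
x = np.random.randn(1_000_000)
y = selu(x)
print(y.mean(), y.var())  # both close to 0 and 1, the fixed point
```

Note that self-normalization in a full network also depends on the conditions assumed in the paper, namely LeCun-normal weight initialization and alpha dropout in place of standard dropout.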