What is: Scaled Exponential Linear Unit?
Source | Self-Normalizing Neural Networks |
Year | 2017 |
Data Source | CC BY-SA - https://paperswithcode.com |
Scaled Exponential Linear Units, or SELUs, are activation functions that induce self-normalizing properties: with appropriately initialized weights, neuron activations converge toward zero mean and unit variance as they propagate through the network's layers.
The SELU activation function is given by

$$f(x) = \lambda \begin{cases} x & \text{if } x > 0 \\ \alpha e^{x} - \alpha & \text{if } x \leq 0 \end{cases}$$

with $\lambda \approx 1.0507$ and $\alpha \approx 1.6733$.
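
A minimal NumPy sketch of the formula above (the function name `selu` and the choice of NumPy are illustrative, not from the source; the constants are the fixed values from the paper, rounded to 1.0507 and 1.6733 in the text):

```python
import numpy as np

# Fixed constants from the Self-Normalizing Neural Networks paper;
# alpha and lambda are not learned parameters.
ALPHA = 1.6732632423543772
SCALE = 1.0507009873554805  # lambda in the formula above

def selu(x):
    """Apply the SELU activation elementwise."""
    x = np.asarray(x, dtype=np.float64)
    # expm1 on the clipped input keeps the negative branch numerically
    # stable; np.where selects the identity branch for positive inputs.
    return SCALE * np.where(x > 0, x, ALPHA * np.expm1(np.minimum(x, 0.0)))

print(selu(np.array([-2.0, 0.0, 2.0])))
# Negative inputs saturate toward -SCALE * ALPHA (about -1.758);
# positive inputs are simply scaled by SCALE.
```

Most deep learning frameworks also ship SELU as a built-in activation (e.g. `torch.nn.SELU` in PyTorch), so the sketch above is mainly useful for seeing how the two branches and the fixed constants fit together.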