What is: Sigmoid Linear Unit?
| Source | Sigmoid-Weighted Linear Units for Neural Network Function Approximation in Reinforcement Learning |
| --- | --- |
| Year | 2017 |
| Data Source | CC BY-SA - https://paperswithcode.com |
**Sigmoid Linear Units**, or SiLUs, are activation functions for neural networks. The SiLU activation is computed by multiplying the input by its sigmoid:

$$\text{SiLU}(x) = x \cdot \sigma(x)$$

where $\sigma(x) = \frac{1}{1 + e^{-x}}$ is the logistic sigmoid function.
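A minimal NumPy sketch of this definition (the function name `silu` is our own; deep learning frameworks typically ship a built-in equivalent):

```python
import numpy as np

def silu(x: np.ndarray) -> np.ndarray:
    """SiLU activation: multiply each input by its sigmoid."""
    return x * (1.0 / (1.0 + np.exp(-x)))

# SiLU is approximately linear for large positive inputs and
# approaches zero for large negative inputs.
x = np.array([-5.0, -1.0, 0.0, 1.0, 5.0])
print(silu(x))  # approx [-0.0335, -0.2689, 0.0, 0.7311, 4.9665]
```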
See Gaussian Error Linear Units (GELUs), where the SiLU was originally coined, and see Sigmoid-Weighted Linear Units for Neural Network Function Approximation in Reinforcement Learning and Swish: a Self-Gated Activation Function, where the SiLU was later experimented with.