
What is: Sigmoid Linear Unit?

Source: Sigmoid-Weighted Linear Units for Neural Network Function Approximation in Reinforcement Learning
Year: 2017
Data Source: CC BY-SA (https://paperswithcode.com)

**Sigmoid Linear Units**, or SiLUs, are activation functions for neural networks. The SiLU activation is computed as the input multiplied by the sigmoid of that input: $\text{SiLU}(x) = x\sigma(x)$.
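A minimal sketch of the SiLU in plain NumPy (the function names and example values here are illustrative, not from the paper):

```python
import numpy as np

def sigmoid(x):
    # Logistic sigmoid: sigma(x) = 1 / (1 + exp(-x))
    return 1.0 / (1.0 + np.exp(-x))

def silu(x):
    # SiLU(x) = x * sigma(x)
    return x * sigmoid(x)

# The SiLU is non-monotonic: it dips slightly below zero
# for negative inputs before approaching zero.
x = np.array([-4.0, -1.0, 0.0, 1.0, 4.0])
print(silu(x))  # approx [-0.0719, -0.2689, 0.0, 0.7311, 3.9281]
```

In practice, deep learning frameworks provide this activation directly, e.g. `torch.nn.SiLU` in PyTorch.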

See Gaussian Error Linear Units (GELUs), where the SiLU was originally coined, and see Sigmoid-Weighted Linear Units for Neural Network Function Approximation in Reinforcement Learning and Swish: a Self-Gated Activation Function, where the SiLU was later experimented with.