
What is: S-shaped ReLU?

Source: Deep Learning with S-shaped Rectified Linear Activation Units
Year: 2015
Data Source: CC BY-SA - https://paperswithcode.com

The S-shaped Rectified Linear Unit, or SReLU, is an activation function for neural networks. It can learn both convex and non-convex functions, imitating the multiple function forms described by two fundamental laws in psychophysics and the neural sciences: the Weber-Fechner law and the Stevens law. Specifically, SReLU consists of three piecewise linear functions, which are formulated by four learnable parameters.

The SReLU is defined as a mapping:

$$
f(x_i) = \begin{cases}
t_i^r + a_i^r \left(x_i - t_i^r\right) & \text{if } x_i \geq t_i^r \\
x_i & \text{if } t_i^r > x_i > t_i^l \\
t_i^l + a_i^l \left(x_i - t_i^l\right) & \text{if } x_i \leq t_i^l
\end{cases}
$$

where $t_i^l$, $t_i^r$ and $a_i^l$ are learnable parameters of the network, and the subscript $i$ indicates that the SReLU can differ across channels. The parameter $a_i^r$ represents the slope of the right line for inputs above the threshold $t_i^r$. $t_i^r$ and $t_i^l$ are the thresholds in the positive and negative directions, respectively.
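As a rough illustration, here is a minimal NumPy sketch of the SReLU forward pass (the function and argument names are my own; in the paper the four parameters are trained per channel along with the rest of the network, which this standalone version does not do):

```python
import numpy as np

def srelu(x, t_l, a_l, t_r, a_r):
    """Piecewise-linear SReLU with fixed (non-learned) parameters.

    t_l, t_r: thresholds in the negative and positive directions.
    a_l, a_r: slopes of the left and right linear segments.
    """
    # Right segment: for x >= t_r, a line of slope a_r passing through (t_r, t_r).
    right = t_r + a_r * (x - t_r)
    # Left segment: for x <= t_l, a line of slope a_l passing through (t_l, t_l).
    left = t_l + a_l * (x - t_l)
    # Middle segment: identity between the two thresholds.
    return np.where(x >= t_r, right, np.where(x <= t_l, left, x))

# With slopes below 1 on both sides, the three segments trace an S-shape:
x = np.linspace(-3.0, 3.0, 7)
print(srelu(x, t_l=-1.0, a_l=0.1, t_r=1.0, a_r=0.5))
# [-1.2 -1.1 -1.   0.   1.   1.5  2. ]
```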

Source: Activation Functions