
What is: modReLU?

Source: Unitary Evolution Recurrent Neural Networks
Year: 2015
Data Source: CC BY-SA - https://paperswithcode.com

modReLU is an activation function that modifies the ReLU. It is a pointwise nonlinearity, $\sigma_{modReLU}(z) : \mathbb{C} \rightarrow \mathbb{C}$, which affects only the absolute value of a complex number, defined as:

$$\sigma_{modReLU}(z) = \begin{cases} (|z| + b)\dfrac{z}{|z|} & \text{if } |z| + b \geq 0 \\ 0 & \text{otherwise} \end{cases}$$

where $b \in \mathbb{R}$ is a bias parameter of the nonlinearity. For an $n_{h}$-dimensional hidden space we learn $n_{h}$ nonlinearity bias parameters, one per dimension.
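
Below is a minimal NumPy sketch of the definition above. The function name `modrelu`, the example bias values, and the small `eps` guard against division by zero are illustrative choices, not taken from the paper.

```python
import numpy as np

def modrelu(z: np.ndarray, b: np.ndarray, eps: float = 1e-8) -> np.ndarray:
    """Apply modReLU elementwise to a complex array z with real, per-dimension bias b."""
    magnitude = np.abs(z)          # |z|
    scale = magnitude + b          # |z| + b decides whether the unit is active
    # Keep the phase z/|z| and rescale the magnitude to |z| + b;
    # entries with |z| + b < 0 are zeroed out.
    return np.where(scale >= 0, scale * z / (magnitude + eps), 0.0)

# Example: a 3-dimensional hidden state with one learned bias per dimension.
z = np.array([1 + 1j, -0.1 + 0.2j, 0.5j])
b = np.array([-0.5, -0.5, 0.1])
print(modrelu(z, b))
```

Because the bias enters through $|z| + b$, a negative $b$ acts as a learnable threshold on the magnitude: inputs whose modulus falls below $-b$ are suppressed, while the phase of the surviving activations is left unchanged.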