
What is: Randomized Leaky Rectified Linear Units?

Source: Empirical Evaluation of Rectified Activations in Convolutional Network
Year: 2015
Data Source: CC BY-SA - https://paperswithcode.com

The Randomized Leaky Rectified Linear Unit, or RReLU, is an activation function that randomly samples the slope applied to negative activation values. It was first proposed and used in the Kaggle NDSB Competition. During training, $a_{ji}$ is a random number sampled from a uniform distribution $U(l, u)$. Formally:

$$y_{ji} = \begin{cases} x_{ji} & \text{if } x_{ji} \geq 0 \\ a_{ji} x_{ji} & \text{if } x_{ji} < 0 \end{cases}$$

where

$$a_{ji} \sim U(l, u), \quad l < u \text{ and } l, u \in [0, 1)$$
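To make the sampling step concrete, here is a minimal NumPy sketch of the training-phase behavior. The function name `rrelu_train` and the default bounds `lower=1/8`, `upper=1/3` are assumptions for illustration (chosen so that $l, u \in [0, 1)$; they happen to match PyTorch's defaults), not values prescribed by the paper.

```python
import numpy as np

def rrelu_train(x, lower=1/8, upper=1/3, rng=None):
    """Training-phase RReLU: each negative element gets its own slope a_ji ~ U(lower, upper)."""
    rng = np.random.default_rng() if rng is None else rng
    a = rng.uniform(lower, upper, size=x.shape)  # one random slope per activation
    return np.where(x >= 0, x, a * x)            # identity for x >= 0, a_ji * x otherwise
```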

In the test phase, we take the average of all the $a_{ji}$ used in training, similar to dropout, and thus set $a_{ji}$ to $\frac{l+u}{2}$ to get a deterministic result. As suggested by the NDSB competition winner, $a_{ji}$ is sampled from $U(3, 8)$; in that setting $a_{ji}$ acts as the divisor of the negative input rather than a slope in $[0, 1)$, which is consistent with the test-time formula below.

At test time, we use:

$$y_{ji} = \frac{x_{ji}}{\frac{l+u}{2}}$$
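A matching sketch of the deterministic test-phase behavior, again assuming `lower` and `upper` bound the slope directly: the random slope is simply replaced by its mean $\frac{l+u}{2}$. Under the NDSB convention above (sampling the divisor from $U(3, 8)$), the equivalent test-time slope would be $\frac{2}{3+8} \approx 0.18$.

```python
import numpy as np

def rrelu_eval(x, lower=1/8, upper=1/3):
    """Test-phase RReLU: use the expected slope (lower + upper) / 2 for negative inputs."""
    a = (lower + upper) / 2
    return np.where(x >= 0, x, a * x)
```

In practice, frameworks such as PyTorch expose this as `torch.nn.RReLU(lower, upper)`, which switches between the stochastic and deterministic behaviors automatically in `train()` and `eval()` modes.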