What is: Randomized Leaky Rectified Linear Units?
Source | Empirical Evaluation of Rectified Activations in Convolutional Network |
Year | 2015 |
Data Source | CC BY-SA - https://paperswithcode.com |
Randomized Leaky Rectified Linear Units, or RReLU, are an activation function that randomly samples the negative slope for activation values. It was first proposed and used in the Kaggle NDSB Competition. During training, $a_{ji}$ is a random number sampled from a uniform distribution $U(l, u)$. Formally:

$$y_{ji} = \begin{cases} x_{ji} & \text{if } x_{ji} \geq 0 \\ a_{ji}x_{ji} & \text{if } x_{ji} < 0 \end{cases}$$

where

$$a_{ji} \sim U(l, u), \quad l < u \quad \text{and} \quad l, u \in [0, 1)$$
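To make the training-time behaviour concrete, here is a minimal NumPy sketch of the forward pass under the $y_{ji} = a_{ji}x_{ji}$ convention above. The function name `rrelu_train` and the default bounds `lower=1/8, upper=1/3` are illustrative assumptions (common library defaults), not the paper's NDSB setting.

```python
import numpy as np

def rrelu_train(x, lower=1/8, upper=1/3, rng=None):
    """Training-time RReLU: each negative activation gets its own
    slope a_ji drawn from U(lower, upper)."""
    rng = np.random.default_rng() if rng is None else rng
    # Sample one slope per activation value, as in the formula above.
    a = rng.uniform(lower, upper, size=x.shape)
    return np.where(x >= 0, x, a * x)

# Example: a small batch of pre-activations.
x = np.array([[-2.0, -0.5, 0.0, 1.5]])
print(rrelu_train(x))
```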
In the test phase, we take the average of all the $a_{ji}$ in training, similar to dropout, and thus set $a_{ji}$ to $\frac{l+u}{2}$ to get a deterministic result. As suggested by the NDSB competition winner, $a_{ji}$ is sampled from $U(3, 8)$; in that configuration the negative slope is the reciprocal $\frac{1}{a_{ji}}$ (negative inputs are divided by $a_{ji}$), which is why the test-time expression below divides by the mean.
At test time, we use:

$$y_{ji} = \frac{x_{ji}}{\frac{l+u}{2}}$$
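A corresponding test-time sketch, under the same assumed `lower`/`upper` bounds as the training example, replaces the random slope with its expectation; note it follows the $y_{ji} = a_{ji}x_{ji}$ convention with $l, u \in [0, 1)$, whereas the NDSB $U(3, 8)$ configuration uses the reciprocal slope and hence the division shown in the equation above.

```python
import numpy as np

def rrelu_eval(x, lower=1/8, upper=1/3):
    """Test-time RReLU: use the deterministic mean slope (lower + upper) / 2."""
    a_mean = (lower + upper) / 2.0
    return np.where(x >= 0, x, a_mean * x)
```

For reference, PyTorch exposes this behaviour as `torch.nn.RReLU` (defaults `lower=1/8`, `upper=1/3`), which samples a slope per element in training mode and uses the mean slope in eval mode.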