
What is: Leaky ReLU?

Year: 2014
Data Source: CC BY-SA (https://paperswithcode.com)

Leaky Rectified Linear Unit, or Leaky ReLU, is an activation function based on the ReLU, but with a small slope for negative inputs instead of a flat (zero) slope. The slope coefficient is fixed before training, i.e. it is not learnt during training. This type of activation function is popular in tasks that may suffer from sparse gradients, for example training generative adversarial networks.
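In formula form, $f(x) = x$ for $x \ge 0$ and $f(x) = \alpha x$ otherwise, where $\alpha$ is the fixed slope coefficient. As a rough sketch, a NumPy implementation might look like the following; the `negative_slope` name and the 0.01 default are illustrative assumptions, not part of the definition above.

```python
import numpy as np

def leaky_relu(x, negative_slope=0.01):
    """Leaky ReLU: identity for non-negative inputs, a small linear slope for negative ones."""
    return np.where(x >= 0, x, negative_slope * x)

# Example: negative inputs are scaled down rather than zeroed out.
x = np.array([-2.0, -0.5, 0.0, 1.0, 3.0])
print(leaky_relu(x))  # -> [-0.02, -0.005, 0., 1., 3.]
```

Because the negative branch keeps a non-zero slope, gradients still flow for negative activations, which is the property that helps in sparse-gradient settings such as GAN training.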