What is: Parametric Exponential Linear Unit?
| | |
| --- | --- |
| Source | Parametric Exponential Linear Unit for Deep Convolutional Neural Networks |
| Year | 2016 |
| Data Source | CC BY-SA - https://paperswithcode.com |
The Parametric Exponential Linear Unit (PELU) is an activation function for neural networks. It learns a parameterization of the ELU so that the proper activation shape can be learned at each layer of a CNN.
The PELU has two additional parameters over the ELU:
$$
f(x) = \begin{cases} cx & \text{if } x > 0 \\ a\left(\exp\left(\dfrac{x}{b}\right) - 1\right) & \text{if } x \leq 0 \end{cases}
$$

where $a$, $b$, and $c > 0$. Here $c$ causes a change in the slope in the positive quadrant, $b$ controls the scale of the exponential decay, and $a$ controls the saturation in the negative quadrant.
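A minimal PyTorch sketch of the activation is given below, assuming the learnable two-parameter form from the paper in which the positive-quadrant slope is tied to $c = a/b$ so the function stays differentiable at zero (the class name, initial values, and the clamping used to keep $a$ and $b$ positive are illustrative choices, not the authors' implementation):

```python
import torch
import torch.nn as nn


class PELU(nn.Module):
    """Sketch of a PELU activation with learnable parameters a and b.

    The clamping scheme and initial values are assumptions for
    illustration, not taken from the original paper's code.
    """

    def __init__(self, init_a: float = 1.0, init_b: float = 1.0):
        super().__init__()
        # a controls saturation in the negative quadrant,
        # b controls the scale of the exponential decay.
        self.a = nn.Parameter(torch.tensor(init_a))
        self.b = nn.Parameter(torch.tensor(init_b))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Keep both parameters strictly positive during training.
        a = self.a.clamp(min=1e-4)
        b = self.b.clamp(min=1e-4)
        # Positive quadrant: linear with slope c = a / b.
        pos = (a / b) * x
        # Negative quadrant: scaled exponential saturating at -a.
        neg = a * (torch.exp(x / b) - 1.0)
        return torch.where(x > 0, pos, neg)
```

In use, one such module would be instantiated per layer (e.g. `act = PELU(); y = act(torch.randn(4, 8))`), so that each layer can learn its own activation shape.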
Source: Activation Functions