What is: Shifted Rectified Linear Unit?
Source | Trainable Activations for Image Classification |
Year | 2023 |
Data Source | CC BY-SA - https://paperswithcode.com |
The Shifted Rectified Linear Unit, or ShiLU, is a modification of the ReLU activation function that adds trainable parameters: a learned scale and a learned shift applied to the ReLU output, so ShiLU(x) = α·ReLU(x) + β, where α and β are updated during training along with the network weights.
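A minimal sketch of the idea in NumPy, assuming the common formulation ShiLU(x) = α·ReLU(x) + β with scalar trainable parameters α and β (parameter names and defaults here are illustrative, not taken from the paper's reference code):

```python
import numpy as np

def shilu(x, alpha=1.0, beta=0.0):
    """ShiLU: a scaled and shifted ReLU, alpha * max(x, 0) + beta.

    In the trainable-activation setting, alpha and beta are learned
    parameters; here they are plain arguments for illustration.
    """
    return alpha * np.maximum(x, 0.0) + beta

x = np.array([-2.0, -0.5, 0.0, 1.5])
print(shilu(x))                          # alpha=1, beta=0 reduces to plain ReLU
print(shilu(x, alpha=2.0, beta=-1.0))    # scaled and shifted variant
```

With α = 1 and β = 0 the function reduces to ordinary ReLU, which is why ShiLU can be seen as a strict generalization: training is free to recover ReLU or to move away from it.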