What is: CReLU?
Source | Understanding and Improving Convolutional Neural Networks via Concatenated Rectified Linear Units |
Year | 2016 |
Data Source | CC BY-SA - https://paperswithcode.com |
CReLU, or Concatenated Rectified Linear Units, is an activation function that preserves both positive and negative phase information while enforcing a non-saturating non-linearity. It is computed by applying ReLU to both the layer output and its negation, then concatenating the two results:

$$\text{CReLU}(x) = \left[\text{ReLU}(x),\; \text{ReLU}(-x)\right] = \left[\max(x, 0),\; \max(-x, 0)\right]$$
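A minimal PyTorch sketch of this operation (the module name `CReLU` and the channel-axis default `dim=1` are illustrative assumptions, not from the paper):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CReLU(nn.Module):
    """Concatenated ReLU: applies ReLU to both the input and its negation,
    then concatenates the results, doubling the size of the chosen axis."""

    def __init__(self, dim: int = 1):
        super().__init__()
        self.dim = dim  # assumed channel axis for (N, C, H, W) feature maps

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # [ReLU(x), ReLU(-x)]: positive phase first, negative phase second
        return torch.cat([F.relu(x), F.relu(-x)], dim=self.dim)

# Example: a batch of 16-channel feature maps becomes 32-channel after CReLU.
x = torch.randn(8, 16, 32, 32)
y = CReLU()(x)
print(y.shape)  # torch.Size([8, 32, 32, 32])
```

Note that the output has twice as many channels as the input, so any layer that follows a CReLU must be declared with the doubled channel count.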