What is: Kaiming Initialization?
| Source | Delving Deep into Rectifiers: Surpassing Human-Level Performance on ImageNet Classification |
| Year | 2015 |
| Data Source | CC BY-SA - https://paperswithcode.com |
Kaiming Initialization, or He Initialization, is an initialization method for neural networks that takes into account the non-linearity of activation functions, such as ReLU activations.
A proper initialization method should avoid exponentially reducing or magnifying the magnitudes of input signals. Through a derivation, the authors work out that the condition to prevent this is:

$$\frac{1}{2} n_l \, \text{Var}\left[w_l\right] = 1, \quad \forall l$$

where $n_l$ is the fan-in (number of input connections) of layer $l$.
This implies an initialization scheme of:

$$w_l \sim \mathcal{N}\left(0,\; 2/n_l\right)$$
That is, a zero-centered Gaussian with standard deviation $\sqrt{2/n_l}$ (the variance shown in the equation above). Biases are initialized at $0$.
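The scheme above can be sketched in a few lines of NumPy. This is a minimal illustration, not a library implementation; the function name `kaiming_normal` and the example layer sizes (512 inputs, 256 outputs) are chosen here for demonstration:

```python
import numpy as np

def kaiming_normal(fan_in, fan_out, rng=None):
    """Sample a (fan_in, fan_out) weight matrix from a zero-centered
    Gaussian with variance 2 / fan_in, i.e. std = sqrt(2 / fan_in)."""
    rng = np.random.default_rng() if rng is None else rng
    std = np.sqrt(2.0 / fan_in)
    return rng.normal(loc=0.0, scale=std, size=(fan_in, fan_out))

# Example: a layer with 512 inputs and 256 outputs.
W = kaiming_normal(512, 256, rng=np.random.default_rng(0))
b = np.zeros(256)  # biases are initialized at 0
```

Deep-learning frameworks ship this scheme built in (for example, `torch.nn.init.kaiming_normal_` in PyTorch), where the fan-in mode matches the derivation above.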