What is: Layer-Sequential Unit-Variance Initialization?
| Source | All you need is a good init |
| Year | 2015 |
| Data Source | CC BY-SA - https://paperswithcode.com |
Layer-Sequential Unit-Variance Initialization (LSUV) is a simple weight-initialization method for deep networks. The strategy involves the following two steps (see the sketch after this list):
- First, pre-initialize the weights of each convolutional or inner-product layer with orthonormal matrices.
- Second, proceed from the first to the final layer, normalizing the variance of each layer's output to one.
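Below is a minimal NumPy sketch of the procedure for a plain stack of linear layers. The function names, toy dimensions, tolerance, and iteration cap are illustrative assumptions, not the paper's reference implementation; the paper applies the same loop to convolutional layers and runs the forward pass through the full network, nonlinearities included.

```python
import numpy as np

rng = np.random.default_rng(0)

def orthonormal(fan_in, fan_out):
    """Step 1: semi-orthonormal matrix via QR decomposition of a Gaussian draw."""
    a = rng.standard_normal((max(fan_in, fan_out), min(fan_in, fan_out)))
    q, _ = np.linalg.qr(a)  # q has orthonormal columns
    return q if q.shape == (fan_in, fan_out) else q.T

def lsuv_init(layer_dims, x_batch, tol=0.05, max_iter=10):
    """Step 2: walk from the first to the last layer, rescaling each
    weight matrix until the layer's output variance is close to one."""
    weights = []
    for fan_in, fan_out in zip(layer_dims[:-1], layer_dims[1:]):
        W = orthonormal(fan_in, fan_out)
        for _ in range(max_iter):
            out = x_batch @ W            # forward pass through this layer
            var = out.var()
            if abs(var - 1.0) < tol:     # stop once variance is near unity
                break
            W = W / np.sqrt(var)         # rescale so output variance -> 1
        weights.append(W)
        x_batch = x_batch @ W            # feed normalized output onward
    return weights

# Usage: initialize a toy 3-layer net on a batch of random "data".
x = rng.standard_normal((256, 64))
ws = lsuv_init([64, 128, 128, 10], x)
```

For purely linear layers a single rescaling already lands on unit variance; the tolerance and iteration cap matter once nonlinearities sit between layers, since each rescaling then only approximately corrects the output variance.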