
What is: Layer-Sequential Unit-Variance Initialization?

Source: All you need is a good init
Year: 2015
Data Source: CC BY-SA - https://paperswithcode.com

Layer-Sequential Unit-Variance Initialization (LSUV) is a simple method for weight initialization in deep network training. The initialization strategy involves the following two steps (a code sketch follows the list):

  1. First, pre-initialize weights of each convolution or inner-product layer with orthonormal matrices.

  2. Second, proceed from the first to the final layer, normalizing the variance of the output of each layer to be equal to one.
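
The two steps map directly onto a short routine. Below is a minimal sketch in PyTorch under stated assumptions: the helper name `lsuv_init`, the tolerance `tol`, and the iteration cap `max_iters` are illustrative choices and not part of the paper; the authors' released code may differ in details.

```python
import torch
import torch.nn as nn


@torch.no_grad()
def lsuv_init(model, batch, tol=0.1, max_iters=10):
    """LSUV sketch: orthonormal pre-init, then per-layer output-variance scaling."""
    for module in model.modules():
        if isinstance(module, (nn.Conv2d, nn.Linear)):
            # Step 1: pre-initialize the weights with an orthonormal matrix.
            nn.init.orthogonal_(module.weight)

            # Step 2: capture this layer's output with a forward hook and
            # rescale the weights until the output variance is close to one.
            captured = {}
            handle = module.register_forward_hook(
                lambda m, inp, out, store=captured: store.update(value=out)
            )
            for _ in range(max_iters):
                model(batch)
                var = captured["value"].var().item()
                if abs(var - 1.0) < tol:
                    break
                # Dividing the weights by the output std shrinks/grows the
                # output variance toward one; iterate because nonlinearities
                # and biases make the effect only approximately linear.
                module.weight.data /= var ** 0.5
            handle.remove()
    return model


# Example usage on a toy model with a representative input batch
# (hypothetical shapes, chosen only for illustration):
model = nn.Sequential(nn.Conv2d(3, 16, 3), nn.ReLU(),
                      nn.Flatten(), nn.Linear(16 * 30 * 30, 10))
batch = torch.randn(64, 3, 32, 32)
lsuv_init(model, batch)
```

Because layers are visited from first to last, each layer is normalized after all of its upstream layers already produce unit-variance activations, which is what makes the single sequential pass sufficient.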