
What is: Xavier Initialization?

Year: 2000
Data Source: CC BY-SA (https://paperswithcode.com)

Xavier Initialization, also known as Glorot Initialization, is an initialization scheme for neural networks. Biases are initialized to 0, and the weights $W_{ij}$ at each layer are initialized as:

$$W_{ij} \sim U\left[-\frac{\sqrt{6}}{\sqrt{\text{fan}_{in} + \text{fan}_{out}}}, \frac{\sqrt{6}}{\sqrt{\text{fan}_{in} + \text{fan}_{out}}}\right]$$

where $U$ is a uniform distribution, $\text{fan}_{in}$ is the size of the previous layer (the number of columns in $W$), and $\text{fan}_{out}$ is the size of the current layer.
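The scheme above can be sketched in a few lines of NumPy. This is a minimal illustration, not a reference implementation; the function name and the `(fan_out, fan_in)` matrix orientation are assumptions for the example:

```python
import numpy as np

def xavier_uniform(fan_in, fan_out, rng=None):
    """Sample a (fan_out, fan_in) weight matrix from the Xavier/Glorot
    uniform distribution U[-a, a] with a = sqrt(6 / (fan_in + fan_out))."""
    rng = np.random.default_rng() if rng is None else rng
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return rng.uniform(-limit, limit, size=(fan_out, fan_in))

# Weights for a layer mapping 256 inputs to 128 outputs;
# biases are initialized to zero, as described above.
W = xavier_uniform(fan_in=256, fan_out=128)
b = np.zeros(128)
```

Deep-learning frameworks ship equivalent initializers (e.g. `torch.nn.init.xavier_uniform_` in PyTorch and `tf.keras.initializers.GlorotUniform` in TensorFlow), so in practice you would rarely write this by hand.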