What is: Weight Normalization?
| Source | Weight Normalization: A Simple Reparameterization to Accelerate Training of Deep Neural Networks |
| Year | 2016 |
| Data Source | CC BY-SA - https://paperswithcode.com |
Weight Normalization is a normalization method for training neural networks. It is inspired by batch normalization, but it is a deterministic method that does not share batch normalization's property of adding noise to the gradients. It reparameterizes each $k$-dimensional weight vector $\mathbf{w}$ in terms of a parameter vector $\mathbf{v}$ and a scalar parameter $g$, and performs stochastic gradient descent with respect to those parameters instead. Weight vectors are expressed in terms of the new parameters using:

$$\mathbf{w} = \frac{g}{\lVert\mathbf{v}\rVert}\mathbf{v}$$

where $\mathbf{v}$ is a $k$-dimensional vector, $g$ is a scalar, and $\lVert\mathbf{v}\rVert$ denotes the Euclidean norm of $\mathbf{v}$. This reparameterization has the effect of fixing the Euclidean norm of the weight vector $\mathbf{w}$: we now have $\lVert\mathbf{w}\rVert = g$, independent of the parameters $\mathbf{v}$.
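As a quick illustration, here is a minimal NumPy sketch of the reparameterization (the names `v`, `g`, and `w` mirror the symbols in the formula above and are not tied to any particular library), showing that the resulting weight vector's norm equals $g$ regardless of $\mathbf{v}$:

```python
import numpy as np

# A k-dimensional parameter vector v and a scalar parameter g (arbitrary example values).
k = 5
v = np.random.randn(k)
g = 3.0

# Weight Normalization: w = (g / ||v||) * v
w = (g / np.linalg.norm(v)) * v

# The Euclidean norm of w is fixed to g, independent of v.
print(np.linalg.norm(w))  # prints ~3.0
assert np.isclose(np.linalg.norm(w), g)
```

In practice, deep learning frameworks expose this reparameterization directly; for example, PyTorch provides `torch.nn.utils.weight_norm`, which applies the same decomposition to a module's weight parameter.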