What is: Weight Standardization?
Source | Micro-Batch Training with Batch-Channel Normalization and Weight Standardization |
Year | 2019 |
Data Source | CC BY-SA - https://paperswithcode.com |
Weight Standardization (WS) is a normalization technique that smooths the loss landscape by standardizing the weights in convolutional layers. Unlike previous normalization methods, which focus on activations, WS exploits the smoothing effect of standardizing the weights themselves, going beyond pure length-direction decoupling. Theoretically, WS reduces the Lipschitz constants of both the loss and its gradients; it thereby smooths the loss landscape and improves training.
In Weight Standardization, instead of directly optimizing the loss $\mathcal{L}$ on the original weights $\hat{W}$, we reparameterize the weights $\hat{W}$ as a function of $W$, i.e. $\hat{W} = \text{WS}(W)$, and optimize the loss $\mathcal{L}$ on $W$ by SGD:

$$\hat{W} = \Big[\, \hat{W}_{i,j} \;\Big|\; \hat{W}_{i,j} = \frac{W_{i,j} - \mu_{W_{i,\cdot}}}{\sigma_{W_{i,\cdot}}} \,\Big], \qquad y = \hat{W} * x,$$

where

$$\mu_{W_{i,\cdot}} = \frac{1}{I}\sum_{j=1}^{I} W_{i,j}, \qquad \sigma_{W_{i,\cdot}} = \sqrt{\frac{1}{I}\sum_{j=1}^{I}\big(W_{i,j} - \mu_{W_{i,\cdot}}\big)^{2} + \epsilon}.$$

Here $W \in \mathbb{R}^{O \times I}$ denotes the weights of a convolutional layer with $O$ output channels and $I$ inputs per output channel (input channels times kernel size), $*$ denotes the convolution operation, and $\epsilon$ is a small constant for numerical stability.
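As a concrete illustration, below is a minimal PyTorch sketch of a convolutional layer with Weight Standardization following the equations above; the class name `WSConv2d` and the default value of `eps` are illustrative choices, not part of the original formulation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class WSConv2d(nn.Conv2d):
    """Conv2d whose weight is standardized per output channel before each forward pass."""

    def __init__(self, *args, eps: float = 1e-5, **kwargs):
        super().__init__(*args, **kwargs)
        self.eps = eps  # small constant for numerical stability (assumed value)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # self.weight has shape (O, C_in, kH, kW); standardize over
        # I = C_in * kH * kW, i.e. all dimensions except the output channel.
        mean = self.weight.mean(dim=(1, 2, 3), keepdim=True)
        var = self.weight.var(dim=(1, 2, 3), unbiased=False, keepdim=True)
        w_hat = (self.weight - mean) / torch.sqrt(var + self.eps)
        # Convolve with the standardized weights; the standardization is
        # differentiable, so gradients flow through it during back-propagation.
        return F.conv2d(x, w_hat, self.bias, self.stride,
                        self.padding, self.dilation, self.groups)
```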
Similar to Batch Normalization, WS controls the first and second moments of the weights of each output channel individually in convolutional layers. Note that many initialization methods also initialize the weights in a similar way. Different from those methods, WS standardizes the weights in a differentiable way, which aims to normalize gradients during back-propagation. Note that we do not have any affine transformation on $\hat{W}$; this is because we assume that normalization layers such as BN or GN will normalize this convolutional layer again.
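Because WS omits an affine transform on $\hat{W}$, it is typically paired with an activation normalization layer that supplies the learnable scale and shift. A hypothetical usage of the `WSConv2d` sketch above, combined here with GroupNorm:

```python
# Hypothetical usage of the WSConv2d sketch above, paired with GroupNorm
# (which provides the affine parameters that WS itself omits).
import torch
import torch.nn as nn

block = nn.Sequential(
    WSConv2d(64, 128, kernel_size=3, padding=1, bias=False),
    nn.GroupNorm(num_groups=32, num_channels=128),
    nn.ReLU(inplace=True),
)

x = torch.randn(2, 64, 56, 56)  # (batch, channels, height, width)
y = block(x)                    # -> shape (2, 128, 56, 56)
```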