What is: Weight Demodulation?
Source | Analyzing and Improving the Image Quality of StyleGAN |
Year | 2019 |
Data Source | CC BY-SA - https://paperswithcode.com |
Weight Demodulation is an alternative to adaptive instance normalization for use in generative adversarial networks; it was introduced in StyleGAN2. The purpose of instance normalization is to remove the effect of the style scales $s$ from the statistics of the convolution's output feature maps, and weight demodulation achieves this goal more directly. Modulation scales each input feature map of the convolution according to the incoming style, which can be baked into the weights as $w'_{ijk} = s_i \cdot w_{ijk}$, where $w$ and $w'$ are the original and modulated weights, $s_i$ is the scale corresponding to the $i$-th input feature map, and $j$ and $k$ enumerate the output feature maps and the spatial footprint of the convolution. Assuming the input activations are i.i.d. random variables with unit standard deviation, the output activations after modulation and convolution have standard deviation

$$\sigma_j = \sqrt{\sum_{i,k} {w'_{ijk}}^2},$$
i.e., the outputs are scaled by the $L_2$ norm of the corresponding weights. The subsequent normalization aims to restore the outputs back to unit standard deviation. This can be achieved by scaling ("demodulating") each output feature map $j$ by $1/\sigma_j$. Alternatively, this can again be baked into the convolution weights:

$$w''_{ijk} = \frac{w'_{ijk}}{\sqrt{\sum_{i,k} {w'_{ijk}}^2 + \epsilon}},$$

where $\epsilon$ is a small constant to avoid numerical issues.
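To make the weight arithmetic concrete, here is a minimal PyTorch-style sketch of a modulated convolution with demodulation. The function name `modulated_conv2d`, the `eps` default, and the grouped-convolution batching trick are illustrative assumptions rather than the official StyleGAN2 implementation; the sketch only follows the equations above.

```python
import torch
import torch.nn.functional as F

def modulated_conv2d(x, weight, style, eps=1e-8, demodulate=True):
    """Weight (de)modulation sketch following the StyleGAN2 equations.

    x      : input activations, shape (batch, in_ch, H, W)
    weight : convolution weights w, shape (out_ch, in_ch, kh, kw)
    style  : per-sample scales s_i, shape (batch, in_ch)
    """
    batch, in_ch, H, W = x.shape
    out_ch, _, kh, kw = weight.shape

    # Modulate: w'_{ijk} = s_i * w_{ijk}, one weight tensor per sample.
    w = weight.unsqueeze(0) * style.reshape(batch, 1, in_ch, 1, 1)

    if demodulate:
        # Demodulate: divide each output feature map's weights by
        # sqrt(sum_{i,k} w'_{ijk}^2 + eps) to restore unit output std.
        sigma_inv = torch.rsqrt(w.pow(2).sum(dim=[2, 3, 4], keepdim=True) + eps)
        w = w * sigma_inv

    # Grouped-convolution trick: fold the batch into groups so each sample
    # is convolved with its own modulated weights in a single conv2d call.
    x = x.reshape(1, batch * in_ch, H, W)
    w = w.reshape(batch * out_ch, in_ch, kh, kw)
    out = F.conv2d(x, w, padding=kh // 2, groups=batch)
    return out.reshape(batch, out_ch, *out.shape[2:])
```

For example, with `x = torch.randn(4, 64, 16, 16)`, `weight = torch.randn(128, 64, 3, 3)`, and `style = torch.randn(4, 64).exp()` (hypothetical per-channel scales), `modulated_conv2d(x, weight, style)` returns a tensor of shape `(4, 128, 16, 16)` whose feature maps have roughly unit standard deviation, which is the statistical effect instance normalization was used for in the original StyleGAN.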