What is: Virtual Batch Normalization?
Source | Improved Techniques for Training GANs |
Year | 2016 |
Data Source | CC BY-SA - https://paperswithcode.com |
Virtual Batch Normalization (VBN) is a normalization method for training generative adversarial networks that extends batch normalization. Regular batch normalization makes the output of a neural network for a given input example highly dependent on the other inputs in the same minibatch. VBN avoids this problem by normalizing each example using statistics collected on a reference batch of examples, chosen once and fixed at the start of training, combined with the statistics of the example itself; the reference batch itself is normalized using only its own statistics. VBN is computationally expensive because it requires running forward propagation on two minibatches of data, so the authors use it only in the generator network.
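To make the mechanism concrete, here is a minimal PyTorch sketch under stated assumptions: each incoming example is treated as one extra "virtual" sample appended to the reference batch, so the blending weight 1/(N+1) (for a reference batch of size N), the class name VirtualBatchNorm1d, and the set_reference helper are illustrative choices, not the authors' exact implementation.

```python
import torch
import torch.nn as nn


class VirtualBatchNorm1d(nn.Module):
    """Sketch of virtual batch normalization for (batch, features) inputs.

    A fixed reference batch supplies the normalization statistics; each
    incoming example is folded in as one extra sample, so its own value
    influences but cannot dominate the statistics (Salimans et al., 2016).
    The blending scheme below is an illustrative assumption.
    """

    def __init__(self, num_features: int, eps: float = 1e-5):
        super().__init__()
        self.eps = eps
        # Learnable affine parameters, as in ordinary batch norm.
        self.gamma = nn.Parameter(torch.ones(num_features))
        self.beta = nn.Parameter(torch.zeros(num_features))
        self.register_buffer("ref_mean", torch.zeros(num_features))
        self.register_buffer("ref_mean_sq", torch.zeros(num_features))
        self.ref_size = 0  # set by set_reference()

    @torch.no_grad()
    def set_reference(self, ref_batch: torch.Tensor) -> None:
        """Collect statistics from the reference batch of activations."""
        self.ref_size = ref_batch.shape[0]
        self.ref_mean.copy_(ref_batch.mean(dim=0))
        self.ref_mean_sq.copy_((ref_batch ** 2).mean(dim=0))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        assert self.ref_size > 0, "call set_reference() first"
        # Statistics of the union {reference batch, x}: the current
        # example counts as one sample out of ref_size + 1, giving each
        # example its own (mostly reference-driven) mean and variance.
        w = 1.0 / (self.ref_size + 1)
        mean = w * x + (1.0 - w) * self.ref_mean
        mean_sq = w * x ** 2 + (1.0 - w) * self.ref_mean_sq
        var = (mean_sq - mean ** 2).clamp(min=0.0)
        x_hat = (x - mean) / torch.sqrt(var + self.eps)
        return self.gamma * x_hat + self.beta
```

Note that because the parameters of earlier layers change during training, the reference batch's activations must be recomputed (and set_reference called) alongside each real minibatch so the stored statistics stay current; this second forward pass is exactly the extra cost the entry describes.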