What is: Conditional Batch Normalization?
Source | Modulating early visual processing by language |
Year | 2017 |
Data Source | CC BY-SA - https://paperswithcode.com |
Conditional Batch Normalization (CBN) is a class-conditional variant of batch normalization. The key idea is to predict the $\gamma$ and $\beta$ of the batch normalization from an embedding - e.g. a language embedding in VQA. CBN enables the linguistic embedding to manipulate entire feature maps by scaling them up or down, negating them, or shutting them off. CBN has also been used in GANs to allow class information to affect the batch normalization parameters.
Consider a single convolutional layer with a batch normalization module for which pretrained scalars $\gamma$ and $\beta$ are available. We would like to directly predict these affine scaling parameters from, e.g., a language embedding $\mathbf{e}_{q}$. When starting the training procedure, these parameters must be close to the pretrained values to recover the original ResNet model, as a poor initialization could significantly deteriorate performance. Unfortunately, it is difficult to initialize a network to output the pretrained $\gamma$ and $\beta$. For these reasons, the authors propose to predict a change $\Delta\gamma$ and $\Delta\beta$ on the frozen original scalars, for which it is straightforward to initialize a neural network to produce an output with zero mean and small variance.
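As a concrete illustration of that initialization argument, here is a minimal sketch (assuming PyTorch; `make_delta_mlp` and its argument names are hypothetical, not from the authors' code) in which the output layer of the delta-predicting MLP is zero-initialized, so the predicted changes start at zero and the frozen pretrained scalars are recovered exactly at the start of training:

```python
import torch.nn as nn

def make_delta_mlp(embedding_dim: int, hidden_dim: int, num_channels: int) -> nn.Sequential:
    """One-hidden-layer MLP mapping an embedding to one delta per feature map."""
    mlp = nn.Sequential(
        nn.Linear(embedding_dim, hidden_dim),
        nn.ReLU(),
        nn.Linear(hidden_dim, num_channels),
    )
    # Zero-initialize the output layer so that the predicted deltas start at zero
    # and the original pretrained gamma/beta are reproduced at initialization.
    nn.init.zeros_(mlp[-1].weight)
    nn.init.zeros_(mlp[-1].bias)
    return mlp
```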
The authors use a one-hidden-layer MLP to predict these deltas from the question embedding $\mathbf{e}_{q}$ for all feature maps within the layer:

$$ \Delta\beta = \text{MLP}\left(\mathbf{e}_{q}\right), \qquad \Delta\gamma = \text{MLP}\left(\mathbf{e}_{q}\right) $$
So, given a feature map with $C$ channels, these MLPs output a vector of size $C$. We then add these predictions to the $\beta$ and $\gamma$ parameters:

$$ \hat{\beta}_{c} = \beta_{c} + \Delta\beta_{c}, \qquad \hat{\gamma}_{c} = \gamma_{c} + \Delta\gamma_{c} $$
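A short sketch of this prediction-and-addition step, again assuming PyTorch; the embedding size, hidden size, and channel count below are illustrative placeholders rather than values from the paper:

```python
import torch
import torch.nn as nn

# Illustrative sizes (not from the paper): question embedding of size 512,
# hidden layer of size 128, and a layer with C = 256 feature maps.
C, embed_dim, hidden = 256, 512, 128

# One-hidden-layer MLPs mapping the question embedding to one delta per channel.
delta_gamma_mlp = nn.Sequential(nn.Linear(embed_dim, hidden), nn.ReLU(), nn.Linear(hidden, C))
delta_beta_mlp = nn.Sequential(nn.Linear(embed_dim, hidden), nn.ReLU(), nn.Linear(hidden, C))

e_q = torch.randn(1, embed_dim)   # language (question) embedding
gamma = torch.ones(C)             # frozen pretrained scale
beta = torch.zeros(C)             # frozen pretrained shift

gamma_hat = gamma + delta_gamma_mlp(e_q).squeeze(0)   # gamma_hat_c = gamma_c + delta_gamma_c
beta_hat = beta + delta_beta_mlp(e_q).squeeze(0)       # beta_hat_c  = beta_c  + delta_beta_c
```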
Finally, these updated $\hat{\gamma}$ and $\hat{\beta}$ are used as parameters for the batch normalization:

$$ \text{BN}\left(F_{i,c,h,w} \mid \hat{\gamma}_{c}, \hat{\beta}_{c}\right) = \hat{\gamma}_{c}\,\frac{F_{i,c,h,w} - \mathbb{E}_{\mathcal{B}}\left[F_{\cdot,c,\cdot,\cdot}\right]}{\sqrt{\operatorname{Var}_{\mathcal{B}}\left[F_{\cdot,c,\cdot,\cdot}\right] + \epsilon}} + \hat{\beta}_{c} $$

The authors freeze all ResNet parameters, including $\gamma$ and $\beta$, during training. A ResNet consists of four stages of computation, each subdivided into several residual blocks. In each block, the authors apply CBN to the three convolutional layers.
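Putting the pieces together, the forward pass can be summarized with a small sketch (assuming PyTorch; the function name, placeholder running statistics, and tensor sizes are illustrative): the feature map is normalized with the frozen statistics and then scaled and shifted by the language-conditioned parameters.

```python
import torch
import torch.nn.functional as F

def conditional_batch_norm(x, gamma_hat, beta_hat, running_mean, running_var, eps=1e-5):
    """Apply batch normalization with language-conditioned affine parameters."""
    # Normalize with the frozen running statistics, without any affine transform...
    x_norm = F.batch_norm(x, running_mean, running_var, weight=None, bias=None,
                          training=False, eps=eps)
    # ...then scale and shift each channel with the conditioned parameters.
    # For simplicity, one (gamma_hat, beta_hat) pair is shared across the batch here.
    return gamma_hat.view(1, -1, 1, 1) * x_norm + beta_hat.view(1, -1, 1, 1)

# Usage with dummy inputs: a batch of feature maps of shape [N, C, H, W].
C = 256
x = torch.randn(4, C, 14, 14)
gamma_hat, beta_hat = torch.ones(C), torch.zeros(C)   # would come from the MLPs above
out = conditional_batch_norm(x, gamma_hat, beta_hat, torch.zeros(C), torch.ones(C))
```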