What is: In-Place Activated Batch Normalization?
Source | In-Place Activated BatchNorm for Memory-Optimized Training of DNNs |
Year | 2017 |
Data Source | CC BY-SA - https://paperswithcode.com |
In-Place Activated Batch Normalization, or InPlace-ABN, replaces the conventional succession of BatchNorm + Activation layers with a single plugin layer, avoiding invasive framework surgery and making it straightforward to drop into existing deep learning frameworks. It roughly halves the memory required to train modern deep networks: instead of storing intermediate results for both layers, it keeps only the activation output and recovers the quantities needed for the backward pass by inverting the (invertible) activation function.
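The memory saving hinges on the activation being invertible, so its input can be recomputed from its stored output rather than cached separately. A minimal NumPy sketch of this idea, using a leaky ReLU (the function names here are illustrative, not the library's API):

```python
import numpy as np

def leaky_relu(x, slope=0.01):
    # Forward activation; with slope > 0 this map is strictly
    # monotonic and therefore invertible.
    return np.where(x >= 0, x, slope * x)

def leaky_relu_inverse(y, slope=0.01):
    # Backward pass: recover the pre-activation from the stored
    # output alone, so the pre-activation never has to be cached.
    return np.where(y >= 0, y, y / slope)

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
y = leaky_relu(x)                 # only y is kept in memory
recovered = leaky_relu_inverse(y) # reconstructed when needed
assert np.allclose(recovered, x)
```

This is why InPlace-ABN pairs BatchNorm with leaky ReLU rather than plain ReLU, whose zeroed negative half cannot be inverted.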