What is: Gradient Checkpointing?
Source | Training Deep Nets with Sublinear Memory Cost |
Year | 2016 |
Data Source | CC BY-SA - https://paperswithcode.com |
Gradient Checkpointing is a method for reducing the memory footprint of training deep neural networks, at the cost of a small increase in computation time. Instead of storing every intermediate activation for the backward pass, it stores only a subset of them (the "checkpoints") and recomputes the missing activations segment by segment during backpropagation.
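The trade-off can be illustrated with a minimal pure-Python sketch (not the paper's implementation; all function and variable names here are illustrative). A chain of scalar `tanh` layers is run forward while storing only every k-th activation; during the backward pass, each segment's intermediate activations are recomputed from its checkpoint before its gradients are taken:

```python
import math

def forward_layer(w, a):
    # One scalar "layer": y = tanh(w * a)
    return math.tanh(w * a)

def backward_layer(w, a, y, grad_y):
    # d/da tanh(w*a) = w * (1 - y^2), d/dw tanh(w*a) = a * (1 - y^2)
    local = (1.0 - y * y) * grad_y
    return w * local, a * local  # (grad wrt input a, grad wrt weight w)

def checkpointed_grads(weights, x, k=2):
    """Gradients of the final output wrt each weight, storing only
    every k-th activation (the checkpoints) during the forward pass."""
    n = len(weights)
    # Forward: keep only the input to every k-th layer.
    checkpoints = []
    a = x
    for i, w in enumerate(weights):
        if i % k == 0:
            checkpoints.append(a)
        a = forward_layer(w, a)
    # Backward: recompute each segment's activations from its checkpoint,
    # then backpropagate through the segment.
    grads = [0.0] * n
    grad_a = 1.0  # dL/dy with L taken to be the final output itself
    for seg_start in range(((n - 1) // k) * k, -1, -k):
        seg = range(seg_start, min(seg_start + k, n))
        acts = [checkpoints[seg_start // k]]
        for i in seg:                      # recompute (the extra compute)
            acts.append(forward_layer(weights[i], acts[-1]))
        for j, i in reversed(list(enumerate(seg))):
            grad_a, gw = backward_layer(weights[i], acts[j], acts[j + 1], grad_a)
            grads[i] = gw
    return grads
```

With k checkpointing every k-th of n layers, peak activation storage drops from O(n) to roughly O(n/k + k), while each activation inside a segment is computed twice (once forward, once during recomputation). Choosing k near sqrt(n) gives the sublinear O(sqrt(n)) memory cost described in the source paper.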