What is: Gradient Sign Dropout?
Source | Just Pick a Sign: Optimizing Deep Multitask Models with Gradient Sign Dropout |
Year | 2020 |
Data Source | CC BY-SA - https://paperswithcode.com |
GradDrop, or Gradient Sign Dropout, is a probabilistic masking procedure that samples gradients at an activation layer based on their level of sign consistency. It is applied as a layer in any standard network forward pass, usually on the final layer before the prediction head to save on compute overhead and to maximize benefits during backpropagation. Below, we develop the GradDrop formalism. Throughout, $\circ$ denotes elementwise multiplication after any necessary tiling operations are completed. To implement GradDrop, we first define the Gradient Positive Sign Purity, $\mathcal{P}$, as

$$\mathcal{P} = \frac{1}{2}\left(1 + \frac{\sum_i \nabla L_i}{\sum_i |\nabla L_i|}\right)$$
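To make the definition concrete, here is a minimal NumPy sketch of $\mathcal{P}$ computed from a stack of per-task gradients at the same activation. The function name `gradient_sign_purity` and the small epsilon guard are illustrative choices, not from the paper:

```python
import numpy as np

def gradient_sign_purity(grads):
    """Gradient Positive Sign Purity: P = 0.5 * (1 + sum_i grad_i / sum_i |grad_i|).

    grads: array of shape (num_tasks, *activation_shape) holding each task's
    gradient at the same activation tensor. Returns P with the activation's
    shape, with values in [0, 1].
    """
    eps = 1e-12  # illustrative guard against 0/0 where all gradients vanish
    return 0.5 * (1.0 + grads.sum(axis=0) / (np.abs(grads).sum(axis=0) + eps))

# Two tasks that agree, conflict, and exactly cancel at successive positions:
g = np.array([[1.0, -2.0, 0.5],
              [1.0,  1.0, -0.5]])
print(gradient_sign_purity(g))  # [1.0, 0.333..., 0.5]
```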
$\mathcal{P}$ is bounded by $[0, 1]$. For multiple gradient values $\nabla L_i$ at some scalar $a$, we see that $\mathcal{P} = 1$ if $\nabla_a L_i > 0, \forall i$, while $\mathcal{P} = 0$ if $\nabla_a L_i < 0, \forall i$. Thus, $\mathcal{P}$ is a measure of how many positive gradients are present at any given value. We then form a mask $\mathcal{M}_i$ for each gradient as follows:

$$\mathcal{M}_i = \mathcal{I}\left[f(\mathcal{P}) > U_i\right] \circ \mathcal{I}\left[\nabla L_i > 0\right] + \mathcal{I}\left[f(\mathcal{P}) < U_i\right] \circ \mathcal{I}\left[\nabla L_i < 0\right]$$
for $\mathcal{I}$ the standard indicator function and $f$ some monotonically increasing function (often just the identity) that maps $[0, 1] \mapsto [0, 1]$ and is odd around $(0.5, 0.5)$. $U_i$ is a tensor composed of i.i.d. $U(0, 1)$ random variables. The mask $\mathcal{M}_i$ is then used to produce a final gradient

$$\sum_i \mathcal{M}_i \circ \nabla L_i$$
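Putting the pieces together, the following is a minimal sketch of one GradDrop step under the formulas above. The name `graddrop`, the fixed seed, and the epsilon guard are illustrative, and $f$ defaults to the identity as the text suggests. Whether the uniform tensor is shared across tasks or drawn per task is an implementation detail; following the $U_i$ subscript in the text, this sketch draws one per task:

```python
import numpy as np

def graddrop(grads, f=lambda p: p, seed=0):
    """One GradDrop step on per-task gradients of shape (num_tasks, *activation_shape)."""
    rng = np.random.default_rng(seed)
    eps = 1e-12  # illustrative guard against 0/0 where all gradients vanish
    # Gradient Positive Sign Purity at each activation position.
    P = 0.5 * (1.0 + grads.sum(axis=0) / (np.abs(grads).sum(axis=0) + eps))
    # One i.i.d. U(0, 1) tensor per task, following the U_i subscript in the text.
    U = rng.uniform(size=grads.shape)
    # Keep positive-sign entries with probability f(P), negative ones with 1 - f(P).
    M = ((f(P) > U) & (grads > 0)) | ((f(P) < U) & (grads < 0))
    # Final gradient: sum_i M_i ∘ grad_i.
    return (M * grads).sum(axis=0)

# Two tasks with conflicting signs at some positions:
g = np.array([[1.0, -2.0, 0.5],
              [1.0,  1.0, -0.5]])
print(graddrop(g))
```

Note that where all task gradients share a sign, $f(\mathcal{P})$ saturates at $0$ or $1$ and the mask keeps every gradient, so GradDrop only intervenes where tasks disagree on sign.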