What is: Weights Reset?
Source | The Weights Reset Technique for Deep Neural Networks Implicit Regularization |
Year | 2023 |
Data Source | CC BY-SA - https://paperswithcode.com |
Weights Reset is an implicit regularization procedure that periodically resets a randomly selected portion of the weights of chosen layers during training, re-drawing them from predefined probability distributions.
To formalize the Weights Reset procedure, assume that $\mathcal{B}(1 - p_l)$ is a multivariate Bernoulli distribution with parameter $1 - p_l$, and let $\mathcal{D}$ be an arbitrary distribution used for initializing model weights. At specified intervals (after a certain number of training iterations, except for the last one), a random portion of the weights of selected layers in the neural network is reset as follows:

$$\hat{W}_l = m_l \odot W_l + (1 - m_l) \odot \widetilde{W}_l, \qquad m_l \sim \mathcal{B}(1 - p_l), \quad \widetilde{W}_l \sim \mathcal{D},$$

where $\odot$ is element-wise (Hadamard) multiplication, $W_l$ are the current weights of layer $l$, $\hat{W}_l$ are the reset weights for this layer, $m_l$ is a resetting mask, $p_l$ is the resetting rate for layer $l$, and $\widetilde{W}_l$ are new random weights.
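As a concrete illustration, here is a minimal PyTorch sketch of one such reset step. The function name `weights_reset_`, the Kaiming-uniform choice for the initialization distribution $\mathcal{D}$, and the resetting rate $p = 0.2$ are illustrative assumptions, not configurations prescribed by the paper.

```python
import torch

def weights_reset_(weight: torch.Tensor, p: float,
                   init_dist=torch.nn.init.kaiming_uniform_) -> None:
    """One in-place Weights Reset step for a single layer's weight tensor.

    Each entry is kept with probability 1 - p and re-drawn from the
    initialization distribution with probability p (the resetting rate p_l).
    """
    with torch.no_grad():
        # m_l ~ B(1 - p): entries are 1 where the weight is kept, 0 where it is reset
        mask = torch.bernoulli(torch.full_like(weight, 1.0 - p))
        # \tilde{W}_l ~ D: fresh weights drawn from the initialization distribution
        new_weights = torch.empty_like(weight)
        init_dist(new_weights)
        # \hat{W}_l = m_l * W_l + (1 - m_l) * \tilde{W}_l (element-wise)
        weight.mul_(mask).add_((1.0 - mask) * new_weights)

# Example: reset roughly 20% of a linear layer's weights, as one would
# periodically during training (every k iterations, except the last)
layer = torch.nn.Linear(128, 64)
weights_reset_(layer.weight, p=0.2)
```

In a training loop, this step would be invoked on the selected layers at the chosen interval; gradients are unaffected because the reset happens under `torch.no_grad()`.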
Evidence has indicated that Weights Reset can compete with, and in some instances surpass, traditional regularization techniques.
Given the observable effects of the Weights Reset technique as the number of weights in a model grows, there is a plausible hypothesis connecting it to the Double Descent phenomenon.