What is: AdaBound?
Source | Adaptive Gradient Methods with Dynamic Bound of Learning Rate |
Year | 2019 |
Data Source | CC BY-SA - https://paperswithcode.com |
AdaBound is a variant of the Adam stochastic optimizer designed to be more robust to extreme learning rates. Dynamic bounds are employed on the learning rates, where the lower and upper bounds are initialized as zero and infinity respectively, and both smoothly converge to a constant final step size. AdaBound can be regarded as an adaptive method at the beginning of training, and it then gradually and smoothly transforms into SGD (or SGD with momentum) as the time step increases.
$$\eta_t = \text{Clip}\!\left(\frac{\alpha}{\sqrt{V_t}},\ \eta_l(t),\ \eta_u(t)\right)$$

where $\alpha$ is the initial step size, and $\eta_l(t)$ and $\eta_u(t)$ are the lower and upper bound functions respectively.
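A minimal NumPy sketch of one AdaBound-style update step is shown below. The bound schedules (driven by the hyperparameters `final_lr` and `gamma`, names assumed here) follow the commonly used form in which the lower bound starts at zero, the upper bound starts effectively unbounded, and both converge toward `final_lr`; this is an illustrative sketch, not the authors' reference implementation.

```python
import numpy as np

def adabound_step(param, grad, m, v, t,
                  alpha=1e-3, final_lr=0.1, gamma=1e-3,
                  beta1=0.9, beta2=0.999, eps=1e-8):
    """One AdaBound-style update on a NumPy parameter array.

    t is the 1-based time step; bound schedules are an assumed
    form that converges to final_lr as t grows.
    """
    # Adam-style exponential moving averages of the gradient and
    # its elementwise square
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad ** 2

    # Bias-corrected base step size, as in Adam
    step_size = alpha * np.sqrt(1 - beta2 ** t) / (1 - beta1 ** t)

    # Dynamic bounds: lower starts near 0, upper starts large,
    # and both converge smoothly to final_lr over time
    lower = final_lr * (1 - 1 / (gamma * t + 1))
    upper = final_lr * (1 + 1 / (gamma * t))

    # Clip the per-coordinate learning rate into [lower, upper],
    # then apply the momentum-style update
    lr = np.clip(step_size / (np.sqrt(v) + eps), lower, upper)
    param = param - lr * m
    return param, m, v
```

Early in training the clipping interval is wide, so the update behaves like Adam; as `t` grows the interval shrinks around `final_lr`, so the update approaches SGD with momentum and a fixed step size.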