What is: AMSGrad?
| Source | On the Convergence of Adam and Beyond | 
| Year | 2018 | 
| Data Source | CC BY-SA - https://paperswithcode.com | 
AMSGrad is a stochastic optimization method that seeks to fix a convergence issue with Adam-based optimizers. AMSGrad uses the maximum of past squared gradients, rather than their exponential moving average, to scale the parameter update:
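Concretely, for gradient $g_t$, decay rates $\beta_1, \beta_2$, and step size $\eta$, the AMSGrad updates are:

$$m_t = \beta_1 m_{t-1} + (1 - \beta_1) g_t$$

$$v_t = \beta_2 v_{t-1} + (1 - \beta_2) g_t^2$$

$$\hat{v}_t = \max(\hat{v}_{t-1}, v_t)$$

$$\theta_{t+1} = \theta_t - \frac{\eta}{\sqrt{\hat{v}_t}} m_t$$

The only difference from Adam is the $\max$ step: because $\hat{v}_t$ is non-decreasing, the effective per-coordinate learning rate can never grow between steps, which is what restores the convergence guarantee.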

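A minimal NumPy sketch of a single AMSGrad step, following the updates above (the function name and state layout are illustrative, not from the paper; a small `eps` is added to the denominator for numerical stability, as is standard in practice):

```python
import numpy as np

def amsgrad_step(theta, grad, m, v, v_hat,
                 lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One AMSGrad update; returns new parameters and optimizer state."""
    m = beta1 * m + (1 - beta1) * grad        # first-moment estimate (as in Adam)
    v = beta2 * v + (1 - beta2) * grad ** 2   # second-moment estimate (as in Adam)
    v_hat = np.maximum(v_hat, v)              # key change: running max, never decreases
    theta = theta - lr * m / (np.sqrt(v_hat) + eps)
    return theta, m, v, v_hat
```

For example, minimizing $f(x) = x^2$ from $x = 1$ drives the parameter toward zero after a few hundred steps.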