
What is: Early Stopping?

Year: 1995

Data Source: CC BY-SA - https://paperswithcode.com

Early Stopping is a regularization technique for deep neural networks that halts training when parameter updates no longer yield improvements on a validation set. In essence, we store and update the current best parameters during training, and when parameter updates fail to yield an improvement for a set number of iterations, we stop training and revert to the last best parameters. It acts as a regularizer by restricting the optimization procedure to a smaller volume of parameter space.
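
A minimal sketch of this procedure in Python (the class name `EarlyStopping` and parameters such as `patience` and `min_delta` are illustrative assumptions, not part of any particular library's API):

```python
import copy


class EarlyStopping:
    """Tracks the best validation loss seen so far and signals when to stop.

    `patience` is the number of consecutive checks without improvement
    tolerated before stopping; `min_delta` is the smallest decrease in
    validation loss that counts as an improvement.
    """

    def __init__(self, patience=5, min_delta=0.0):
        self.patience = patience
        self.min_delta = min_delta
        self.best_loss = float("inf")
        self.best_params = None
        self.bad_checks = 0

    def step(self, val_loss, params):
        """Record one validation result; return True if training should stop."""
        if val_loss < self.best_loss - self.min_delta:
            # Improvement: snapshot the current best parameters.
            self.best_loss = val_loss
            self.best_params = copy.deepcopy(params)
            self.bad_checks = 0
            return False
        self.bad_checks += 1
        return self.bad_checks >= self.patience


# Toy usage: a simulated validation-loss curve that bottoms out, then rises.
if __name__ == "__main__":
    losses = [1.0, 0.8, 0.65, 0.6, 0.62, 0.61, 0.63, 0.64, 0.66, 0.7]
    stopper = EarlyStopping(patience=3)
    for epoch, loss in enumerate(losses):
        params = {"epoch": epoch}  # stand-in for real model weights
        if stopper.step(loss, params):
            print(f"Stopped at epoch {epoch}; best loss {stopper.best_loss}")
            break
```

In a real training loop, `params` would be the model's weights (e.g. a state dict), and `stopper.best_params` would be restored after stopping so the model used for evaluation is the one that performed best on the validation set.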

Image Source: Ramazan Gençay