
What is: Dropout?

Source: Dropout: A Simple Way to Prevent Neural Networks from Overfitting
Year: 2014
Data Source: CC BY-SA - https://paperswithcode.com

Dropout is a regularization technique for neural networks that retains each unit (along with its connections) at training time with a specified probability $p$ (a common value is $p = 0.5$); equivalently, each unit is dropped with probability $1 - p$. At test time, all units are present, but with weights scaled by $p$ (i.e. $w$ becomes $pw$), so that each unit's expected output matches what it was during training.
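As an illustration, here is a minimal NumPy sketch of the two modes described above. The function names and the example activations `h` are hypothetical; note that scaling the activations by $p$ at test time is equivalent to scaling the weights, since the layer output is linear in both.

```python
import numpy as np

rng = np.random.default_rng(0)

def dropout_train(x, p):
    """Training-time dropout: retain each unit with probability p.

    Each activation is kept with probability p and zeroed with
    probability 1 - p, following the paper's convention where p
    is the retention probability.
    """
    mask = rng.random(x.shape) < p  # True -> unit retained
    return x * mask

def dropout_test(x, p):
    """Test-time dropout: all units present, outputs scaled by p.

    Scaling by p matches the expected training-time activation,
    since each unit was only active with probability p.
    """
    return x * p

# Hypothetical activations of a hidden layer
h = np.array([1.0, 2.0, 3.0, 4.0])
print(dropout_train(h, p=0.5))  # e.g. [1. 0. 3. 0.] (random mask)
print(dropout_test(h, p=0.5))   # [0.5 1.  1.5 2. ]
```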

The idea is to prevent co-adaptation, where the neural network becomes too reliant on particular connections; such reliance can be symptomatic of overfitting. Intuitively, dropout can be thought of as training an implicit ensemble of neural networks: each forward pass samples a different "thinned" sub-network, and the test-time weight scaling approximates averaging the predictions of all of them.
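In practice, deep learning frameworks handle this for you. A sketch using PyTorch's `nn.Dropout` is shown below; note that PyTorch's `p` argument is the *drop* probability rather than the retention probability, and it uses "inverted" dropout (activations are rescaled by $1/(1-p)$ during training), so no test-time scaling is needed.

```python
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Linear(784, 256),
    nn.ReLU(),
    nn.Dropout(p=0.5),  # drop each hidden unit with probability 0.5
    nn.Linear(256, 10),
)

x = torch.randn(32, 784)

model.train()  # dropout active: a random sub-network per forward pass
y_train = model(x)

model.eval()   # dropout disabled: deterministic forward pass
y_eval = model(x)
```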