What is: Recurrent Dropout?
Source | Recurrent Dropout without Memory Loss |
Year | 2016 |
Data Source | CC BY-SA - https://paperswithcode.com |
Recurrent Dropout is a regularization method for recurrent neural networks. Instead of dropping hidden-state activations, dropout is applied to the updates to LSTM memory cells (or GRU states), i.e. it drops the candidate update flowing through the input gate (LSTM) or update gate (GRU). Because only the update is masked, memory already stored in the cell is never erased, hence "without memory loss".
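The idea can be sketched in a single LSTM step. Below is a minimal NumPy illustration (function name, shapes, and the dropout rate are illustrative, not from the source): the dropout mask is applied only to the candidate update `g`, so the carried cell state `c_prev` always passes through the forget gate untouched.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step_recurrent_dropout(x, h_prev, c_prev, W, b,
                                p_drop=0.25, training=True, rng=None):
    """One LSTM step with recurrent dropout on the cell update.

    W has shape (input_dim + hidden_dim, 4 * hidden_dim); b has
    shape (4 * hidden_dim,). Only the candidate update g is masked,
    so memory stored in c_prev is never dropped.
    """
    hidden = h_prev.shape[0]
    z = np.concatenate([x, h_prev]) @ W + b
    i = sigmoid(z[:hidden])             # input gate
    f = sigmoid(z[hidden:2 * hidden])   # forget gate
    o = sigmoid(z[2 * hidden:3 * hidden])  # output gate
    g = np.tanh(z[3 * hidden:])         # candidate update

    if training:
        rng = rng or np.random.default_rng()
        # Inverted dropout: scale kept units by 1/(1 - p_drop)
        mask = rng.binomial(1, 1.0 - p_drop, size=g.shape) / (1.0 - p_drop)
        g = g * mask                    # drop the update, not the memory

    c = f * c_prev + i * g              # stored memory flows through intact
    h = o * np.tanh(c)
    return h, c
```

At test time (`training=False`) the mask is skipped entirely; with inverted dropout no extra rescaling is needed at inference.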