
What is: rnnDrop?

Year: 2015
Data Source: CC BY-SA - https://paperswithcode.com

rnnDrop is a dropout-based regularization technique for recurrent neural networks. It samples a dropout mask once per training sequence and applies that same mask at every timestep, dropping both the non-recurrent and recurrent connections of the masked hidden nodes. The accompanying figure illustrates the idea: it shows an RNN being trained with rnnDrop over three frames $(t-1, t, t+1)$ on two different training sequences from the data (denoted 'sequence1' and 'sequence2'). The black circles denote the hidden nodes randomly omitted during training, and the dotted arrows stand for the model weights connected to those omitted nodes.
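A minimal sketch of the idea in NumPy, assuming a vanilla tanh RNN cell (the function and variable names here are illustrative, not from the paper): one binary mask is drawn per sequence and reused at every timestep, so a dropped hidden node contributes nothing through either its input-to-hidden or hidden-to-hidden weights for the whole sequence.

```python
import numpy as np

rng = np.random.default_rng(0)

def rnn_drop_forward(x_seq, W_xh, W_hh, b_h, p_drop=0.3, train=True):
    """Forward pass of a vanilla RNN with rnnDrop-style masking.

    A single dropout mask is sampled once per sequence and reused at
    every timestep; zeroing a hidden unit removes both its recurrent
    and non-recurrent outgoing connections for the whole sequence.
    """
    n_hidden = b_h.shape[0]
    # Key idea: one mask for the entire sequence, not one per timestep.
    if train:
        mask = (rng.random(n_hidden) > p_drop).astype(float)
    else:
        mask = np.ones(n_hidden)
    h = np.zeros(n_hidden)
    hs = []
    for x_t in x_seq:
        h = np.tanh(W_xh @ x_t + W_hh @ h + b_h)
        h = h * mask  # same mask applied at every timestep
        hs.append(h)
    return np.stack(hs), mask

# Toy usage: a length-5 sequence of 3-dim inputs, 4 hidden units.
T, d_in, d_h = 5, 3, 4
x = rng.normal(size=(T, d_in))
W_xh = rng.normal(size=(d_h, d_in)) * 0.1
W_hh = rng.normal(size=(d_h, d_h)) * 0.1
b = np.zeros(d_h)
out, mask = rnn_drop_forward(x, W_xh, W_hh, b)
```

Because the mask is fixed within a sequence, any hidden unit it zeroes stays at zero for all timesteps of that sequence; a fresh mask is sampled for the next training sequence.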

From: RnnDrop: A Novel Dropout for RNNs in ASR by Moon et al.