What is: Temporal Activation Regularization?
Source | Revisiting Activation Regularization for Language RNNs |
Year | 2017 |
Data Source | CC BY-SA - https://paperswithcode.com |
Temporal Activation Regularization (TAR) is a type of slowness regularization for RNNs that penalizes differences between states that have been explored in the past. Formally, we minimize:

$$\beta\, L_{2}\left(h_{t} - h_{t+1}\right)$$

where $L_{2}$ is the $L_{2}$ norm, $h_{t}$ is the output of the RNN at timestep $t$, and $\beta$ is a scaling coefficient.
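The penalty above can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's implementation: the function name `tar_penalty` is hypothetical, and averaging the per-timestep norms over the sequence is an assumption (the formula itself only specifies the scaled $L_2$ norm of consecutive-state differences).

```python
import numpy as np

def tar_penalty(hidden_states, beta=2.0):
    """Temporal Activation Regularization (illustrative sketch).

    hidden_states: array of shape (seq_len, hidden_dim) holding the
    RNN outputs h_1 .. h_T. Returns beta times the mean L2 norm of
    the differences between consecutive hidden states.
    """
    # h_{t+1} - h_t for every consecutive pair of timesteps
    diffs = hidden_states[1:] - hidden_states[:-1]
    # L2 norm of each difference vector, averaged over timesteps
    return beta * np.mean(np.linalg.norm(diffs, axis=1))

# Toy usage: a slowly varying hidden sequence incurs a small penalty
h = np.linspace(0.0, 1.0, 10).reshape(10, 1)
print(tar_penalty(h))  # small value, since consecutive states barely change
```

In practice this term is added to the language-modeling loss during training, so the network is nudged toward hidden states that change slowly over time.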