What is: DropConnect?
| Source | Regularization of Neural Networks using DropConnect |
| --- | --- |
| Year | 2013 |
| Data Source | CC BY-SA - https://paperswithcode.com |
DropConnect generalizes Dropout by randomly dropping the weights rather than the activations with probability $1-p$. DropConnect is similar to Dropout as it introduces dynamic sparsity within the model, but differs in that the sparsity is on the weights $W$, rather than the output vectors of a layer. In other words, the fully connected layer with DropConnect becomes a sparsely connected layer in which the connections are chosen at random during the training stage. Note that this is not equivalent to setting $W$ to be a fixed sparse matrix during training.
For a DropConnect layer, the output is given as:

$$r = a\left(\left(M \odot W\right) v\right)$$

Here $r$ is the output of a layer, $v$ is the input to a layer, $W$ are the weight parameters, and $M$ is a binary matrix encoding the connection information, where $M_{ij} \sim \mathrm{Bernoulli}(p)$. Each element of the mask $M$ is drawn independently for each example during training, essentially instantiating a different connectivity for each example seen. Additionally, the biases are also masked out during training.
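The forward pass above can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's reference implementation: the function name `dropconnect_forward`, the choice of ReLU for the activation $a(\cdot)$, and the weight-scaling approximation at inference time (borrowed from standard Dropout practice; the paper itself proposes sampling-based inference) are all assumptions for the sake of the example.

```python
import numpy as np

def dropconnect_forward(v, W, b, p=0.5, rng=None, training=True):
    """Fully connected layer with DropConnect.

    v: input vector, W: weight matrix, b: bias vector.
    p: probability of *keeping* each weight, M_ij ~ Bernoulli(p).
    """
    if not training:
        # Inference approximation (assumption, as in Dropout):
        # scale weights and biases by the keep probability p.
        return np.maximum(0.0, p * (W @ v) + p * b)
    rng = np.random.default_rng() if rng is None else rng
    M = rng.binomial(1, p, size=W.shape)    # fresh mask per example
    m_b = rng.binomial(1, p, size=b.shape)  # biases are masked too
    u = (M * W) @ v + m_b * b               # (M ⊙ W) v, plus masked bias
    return np.maximum(0.0, u)               # a(·) = ReLU here (assumption)
```

Because the mask is redrawn inside the function, calling it twice on the same example generally yields different outputs during training, which is exactly the per-example connectivity described above.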