What is: L1 Regularization?
| Year | 1986 |
| Data Source | CC BY-SA - https://paperswithcode.com |
L1 Regularization is a regularization technique applied to the weights of a neural network. We minimize a loss function comprising both the primary loss function and a penalty on the L1 norm of the weights:

$$\mathcal{L}_{new}(w) = \mathcal{L}_{original}(w) + \lambda\left\|w\right\|_{1}$$

where $\lambda$ is a value determining the strength of the penalty. In contrast to weight decay, L1 regularization promotes sparsity; i.e. some parameters have an optimal value of zero.
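To make the penalty concrete, here is a minimal sketch (not from the original text) that fits a small linear model under an L1-penalized squared error using proximal gradient descent with soft-thresholding; the synthetic data, step size, and names such as `lam` are illustrative assumptions.

```python
import numpy as np

# Illustrative synthetic regression problem: only 3 of 10 features matter.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))
true_w = np.zeros(10)
true_w[:3] = [2.0, -1.5, 0.5]
y = X @ true_w + 0.1 * rng.normal(size=100)

lam = 0.1      # lambda: strength of the L1 penalty (assumed value)
lr = 0.01      # step size (assumed value)
w = np.zeros(10)

for _ in range(2000):
    # gradient step on the primary (mean squared error) loss
    grad = 2 * X.T @ (X @ w - y) / len(y)
    w = w - lr * grad
    # proximal (soft-thresholding) step for the lambda * ||w||_1 penalty;
    # this is what can push small weights to exactly zero
    w = np.sign(w) * np.maximum(np.abs(w) - lr * lam, 0.0)

print(np.round(w, 3))   # weights for irrelevant features typically end up at 0
```

The soft-thresholding step is one standard way to handle the non-differentiable L1 term; it is also what makes some weights land exactly at zero, which is the sparsity behaviour described above.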