What is: Seesaw Loss?
Source | Seesaw Loss for Long-Tailed Instance Segmentation |
Year | 2020 |
Data Source | CC BY-SA - https://paperswithcode.com |
Seesaw Loss is a loss function for long-tailed instance segmentation. It dynamically re-balances the gradients of positive and negative samples for each tail class with two complementary factors: a mitigation factor and a compensation factor. The mitigation factor reduces the punishment to tail categories with respect to the ratio of cumulative training instances between different categories. Meanwhile, the compensation factor increases the penalty on misclassified instances to avoid false positives for tail categories. The synergy of the two factors enables Seesaw Loss to mitigate the overwhelming punishment on tail classes while compensating for the risk of misclassification caused by the diminished penalties.
Given the class logits $\mathbf{z} = [z_1, \ldots, z_C]$ and one-hot label $\mathbf{y}$, the Seesaw loss is defined as:

$$L_{\text{seesaw}}(\mathbf{z}) = -\sum_{i=1}^{C} y_i \log(\hat{\sigma}_i), \qquad \hat{\sigma}_i = \frac{e^{z_i}}{\sum_{j \neq i}^{C} \mathcal{S}_{ij} e^{z_j} + e^{z_i}}$$

Here $\mathcal{S}_{ij}$ works as a tunable balancing factor between different classes. By a careful design of $\mathcal{S}_{ij}$, Seesaw loss adjusts the punishments on class $j$ from positive samples of class $i$. Seesaw loss determines $\mathcal{S}_{ij}$ by a mitigation factor $\mathcal{M}_{ij}$ and a compensation factor $\mathcal{C}_{ij}$, as:

$$\mathcal{S}_{ij} = \mathcal{M}_{ij} \cdot \mathcal{C}_{ij}$$
The mitigation factor $\mathcal{M}_{ij}$ decreases the penalty on tail class $j$ according to the ratio of cumulative instance numbers between tail class $j$ and head class $i$:

$$\mathcal{M}_{ij} = \begin{cases} 1, & \text{if } N_i \leq N_j \\ \left(\dfrac{N_j}{N_i}\right)^{p}, & \text{if } N_i > N_j \end{cases}$$

The compensation factor $\mathcal{C}_{ij}$ increases the penalty on class $j$ whenever an instance of class $i$ is misclassified to class $j$:

$$\mathcal{C}_{ij} = \begin{cases} 1, & \text{if } \sigma_j \leq \sigma_i \\ \left(\dfrac{\sigma_j}{\sigma_i}\right)^{q}, & \text{if } \sigma_j > \sigma_i \end{cases}$$

where $N_i$ is the cumulative number of training instances of class $i$, $\sigma_j$ is the predicted probability of class $j$, and $p$, $q$ are hyperparameters.
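The formulation above can be sketched in NumPy for a single sample. This is an illustrative, unoptimized sketch of the standard seesaw formulation, not the authors' implementation; the function name `seesaw_loss` and the defaults `p=0.8, q=2.0` are assumptions (the defaults follow the values commonly reported for the method).

```python
import numpy as np

def seesaw_loss(logits, label, cum_counts, p=0.8, q=2.0):
    """Illustrative single-sample Seesaw loss (names/defaults are assumptions).

    logits:     (C,) class scores z
    label:      ground-truth class index i
    cum_counts: (C,) cumulative instance counts N_j observed during training
    p, q:       exponents of the mitigation / compensation factors
    """
    i = label
    exp_z = np.exp(logits - logits.max())     # numerically stable exponentials
    probs = exp_z / exp_z.sum()               # ordinary softmax sigma_j

    # Mitigation factor M_ij: 1 if N_i <= N_j, else (N_j / N_i)^p,
    # i.e. shrink the negative-sample penalty on rarer classes j.
    M = np.minimum(1.0, (cum_counts / cum_counts[i]) ** p)

    # Compensation factor C_ij: 1 if sigma_j <= sigma_i, else (sigma_j / sigma_i)^q,
    # i.e. re-penalize classes the model currently confuses with class i.
    Cf = np.maximum(1.0, (probs / probs[i]) ** q)

    S = M * Cf
    S[i] = 1.0  # only off-diagonal (negative) terms are re-weighted

    # Seesaw softmax: hat_sigma_i = e^{z_i} / (sum_{j != i} S_ij e^{z_j} + e^{z_i})
    denom = (S * exp_z).sum()
    return -np.log(exp_z[i] / denom)
```

With balanced class counts and `q=0` both factors reduce to 1 and the loss falls back to ordinary cross-entropy; with imbalanced counts the mitigation factor shrinks the denominator and hence the punishment coming from tail-class negatives.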