What is: Semi-Supervised Knowledge Distillation?
Source | Semi-Supervised Domain Generalizable Person Re-Identification |
Year | 2021 |
Data Source | CC BY-SA - https://paperswithcode.com |
Semi-Supervised Knowledge Distillation (SSKD) is a form of knowledge distillation for person re-identification that exploits weakly annotated data by assigning soft pseudo labels to YouTube-Human, improving the model's generalization ability. SSKD first trains a student model (e.g. ResNet-50) and a teacher model (e.g. ResNet-101) on labeled data from multi-source domain datasets. It then adds an auxiliary classifier that imitates the teacher model's soft predictions on the unlabeled data. Meanwhile, on the labeled data, the student model is supervised both by the hard labels and by the soft labels predicted by the teacher model.
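The student's dual supervision on labeled data can be sketched as a weighted sum of a hard cross-entropy term and a temperature-softened distillation term. This is a minimal, generic distillation sketch in NumPy, not the paper's exact objective: the temperature `T`, the weighting `alpha`, and the function names are illustrative assumptions.

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax; T > 1 softens the distribution."""
    z = np.asarray(logits, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def cross_entropy(target, pred, eps=1e-12):
    """H(target, pred) between two distributions over classes."""
    return -np.sum(target * np.log(pred + eps), axis=-1)

def distillation_loss(student_logits, teacher_logits, hard_label,
                      T=4.0, alpha=0.5):
    """Hypothetical combined loss: (1 - alpha) * hard CE + alpha * soft CE.

    The soft term matches the student's T-softened prediction to the
    teacher's soft (pseudo) label; the T*T factor is the usual gradient
    rescaling from Hinton-style distillation.
    """
    n_classes = np.asarray(student_logits).shape[-1]
    one_hot = np.eye(n_classes)[hard_label]           # ground-truth hard label
    l_hard = cross_entropy(one_hot, softmax(student_logits))
    soft_target = softmax(teacher_logits, T)          # teacher soft label
    l_soft = cross_entropy(soft_target, softmax(student_logits, T)) * T * T
    return (1 - alpha) * l_hard + alpha * l_soft
```

On unlabeled data (e.g. the YouTube-Human pseudo-labeled pool), only the soft term would apply, since no hard labels exist there.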