
What is: Teacher-Tutor-Student Knowledge Distillation?

Source: Parser-Free Virtual Try-on via Distilling Appearance Flows
Year: 2021
Data Source: CC BY-SA - https://paperswithcode.com

Teacher-Tutor-Student Knowledge Distillation is a method for image-based virtual try-on. It treats the fake images produced by a parser-based method as "tutor knowledge", whose artifacts can be corrected by real "teacher knowledge" extracted from real person images in a self-supervised way. Beyond using real images as supervision, knowledge distillation is formulated in the try-on problem as distilling the appearance flows between the person image and the garment image, which finds dense correspondences between the two and enables high-quality results.
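A minimal sketch of what one such training step might look like, assuming a PyTorch setup: a student network predicts an appearance flow, the flow warps the garment onto the person, the warped result is supervised by the real person image (teacher knowledge), and the flow itself is distilled from the parser-based model. All names here (`StudentFlowNet`, `warp`, `distillation_step`, `teacher_flow`) are illustrative assumptions, not the paper's actual code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class StudentFlowNet(nn.Module):
    """Toy parser-free student: predicts a 2-channel appearance flow
    from the (fake) try-on image and the garment image."""
    def __init__(self, in_ch=6):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 2, 3, padding=1),  # (dx, dy) flow per pixel
        )

    def forward(self, person, garment):
        return self.net(torch.cat([person, garment], dim=1))

def warp(garment, flow):
    """Warp the garment with a dense appearance flow via grid_sample."""
    b, _, h, w = garment.shape
    # Base sampling grid in normalized [-1, 1] coordinates.
    ys, xs = torch.meshgrid(
        torch.linspace(-1, 1, h, device=garment.device),
        torch.linspace(-1, 1, w, device=garment.device),
        indexing="ij",
    )
    base = torch.stack([xs, ys], dim=-1).unsqueeze(0).expand(b, -1, -1, -1)
    # Assumption: the flow is an offset in normalized grid units.
    grid = base + flow.permute(0, 2, 3, 1)
    return F.grid_sample(garment, grid, align_corners=True)

def distillation_step(student, tutor_image, real_person, garment, teacher_flow):
    """One training step combining the two supervision signals:
    - tutor_image: fake try-on result from the parser-based model (tutor knowledge)
    - real_person: real person image, used as self-supervised target (teacher knowledge)
    - teacher_flow: appearance flow extracted with the parser-based teacher
    """
    flow = student(tutor_image, garment)
    warped = warp(garment, flow)
    # Correct the tutor's artifacts against the real image ...
    loss_img = F.l1_loss(warped, real_person)
    # ... and distill the teacher's appearance flow into the student.
    loss_flow = F.l1_loss(flow, teacher_flow)
    return loss_img + loss_flow

# Usage with dummy tensors:
student = StudentFlowNet()
fake = torch.rand(1, 3, 64, 64)           # tutor's fake try-on image
garment = torch.rand(1, 3, 64, 64)
real = torch.rand(1, 3, 64, 64)
t_flow = torch.rand(1, 2, 64, 64) * 0.1   # stand-in for a teacher flow
loss = distillation_step(student, fake, real, garment, t_flow)
loss.backward()
```

The key design point this sketch tries to capture is that the student never needs a human parser at inference time: it only sees images, while the parser-based pipeline contributes indirectly, through its fake outputs and its appearance flows during training.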