
What is: Online Multi-granularity Distillation?

Source: Online Multi-Granularity Distillation for GAN Compression
Year: 2021
Data Source: CC BY-SA - https://paperswithcode.com

OMGD, or Online Multi-Granularity Distillation, is a framework for learning efficient GANs. The student generator is optimized in a discriminator-free and ground-truth-free setting. The scheme trains the teacher and student alternately, promoting the two generators iteratively and progressively. The progressively optimized teacher generator warms up the student and guides its optimization direction step by step.

Specifically, the student generator $G_S$ leverages only the complementary teacher generators $G^W_T$ and $G^D_T$ for optimization, and so can be trained in a discriminator-free and ground-truth-free setting. The framework transfers concepts at different levels of granularity from the intermediate layers and the output layer to perform knowledge distillation. The whole optimization follows an online distillation scheme: $G^W_T$, $G^D_T$, and $G_S$ are optimized simultaneously and progressively.
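The training loop above can be sketched in a few lines. This is a minimal toy illustration, not the paper's implementation: scalar linear "generators" and plain squared-error losses stand in for real GAN generators, the adversarial objective, and the multi-granularity distillation losses; `omgd_sketch`, `grad_step`, and all hyperparameters are hypothetical names chosen for this example. What it does show is the online structure: in each step the two teachers are updated first, and the student is then updated toward the teachers' current outputs only, never seeing the discriminator or the ground truth.

```python
def grad_step(w, x, target, lr):
    """One gradient-descent step on the loss (w * x - target)**2 w.r.t. w."""
    pred = w * x
    return w - lr * 2 * (pred - target) * x

def omgd_sketch(steps=200, lr=0.05):
    x, y_true = 1.0, 3.0      # a single toy "training pair"
    w_teacher_wide = 0.0      # stands in for the wider teacher G^W_T
    w_teacher_deep = 0.0      # stands in for the deeper teacher G^D_T
    w_student = 0.0           # stands in for the student G_S

    for _ in range(steps):
        # Teachers are optimized first each step (in the paper, against the
        # discriminator; here, squared error to y_true is a stand-in).
        w_teacher_wide = grad_step(w_teacher_wide, x, y_true, lr)
        w_teacher_deep = grad_step(w_teacher_deep, x, y_true, lr)

        # The student sees neither the discriminator nor y_true: its only
        # targets are the teachers' current outputs (online distillation),
        # so its supervision improves progressively as the teachers do.
        t_wide = w_teacher_wide * x
        t_deep = w_teacher_deep * x
        w_student = grad_step(w_student, x, 0.5 * (t_wide + t_deep), lr)

    return w_student
```

Because the teachers are only slightly ahead of the student at every step, the student receives a gradually hardening target rather than imitating a fixed, fully trained teacher from the start, which is the "warm up and guide step by step" behavior described above.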