What is: Gaussian Mixture Variational Autoencoder?
Source | Regularizing Transformers With Deep Probabilistic Layers |
Year | 2021 |
Data Source | CC BY-SA - https://paperswithcode.com |
GMVAE, or Gaussian Mixture Variational Autoencoder, is a stochastic regularization layer for transformers. A GMVAE layer is trained on the 700-dimensional internal representation produced by the transformer's first MLP layer. For each output of that layer, the GMVAE first samples a low-dimensional latent representation from its posterior distribution, and then outputs a reconstruction sampled from its generative model, which replaces the original activation.
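The forward pass described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the weights are random stand-ins for learned encoder/decoder parameters, the mixture component is drawn uniformly instead of from learned responsibilities, and the latent noise scale is fixed; the dimensions `D=700` (first-MLP output), `L=32`, and `K=10` are assumptions except for `D`.

```python
import numpy as np

rng = np.random.default_rng(0)

D, L, K = 700, 32, 10  # input dim (first-MLP output), latent dim, mixture components

# Stand-in weights; in the actual model these are learned by training the GMVAE.
W_mu = [rng.normal(scale=0.01, size=(D, L)) for _ in range(K)]  # per-component encoders
W_dec = rng.normal(scale=0.01, size=(L, D))                     # shared decoder

def gmvae_layer(h):
    """Stochastic pass: pick a mixture component, sample a latent, reconstruct."""
    k = rng.integers(K)                 # component choice (uniform here for simplicity)
    mu = h @ W_mu[k]                    # component-specific latent mean
    z = mu + 0.1 * rng.normal(size=L)   # reparameterized latent sample (fixed std)
    return z @ W_dec                    # sampled reconstruction replaces the activation

h = rng.normal(size=D)       # one output of the first MLP layer
h_tilde = gmvae_layer(h)     # stochastic low-rank reconstruction of h
```

Because the reconstruction passes through a low-dimensional stochastic bottleneck, the layer injects structured noise into the transformer's activations, which is what gives it its regularizing effect.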