What is: FastMoE?
| Source | FastMoE: A Fast Mixture-of-Expert Training System |
| Year | 2021 |
| Data Source | CC BY-SA - https://paperswithcode.com |
**FastMoE** is a distributed Mixture-of-Experts (MoE) training system built on PyTorch and common accelerators. The system provides a hierarchical interface that allows both flexible model design and easy adaptation to different applications, such as Transformer-XL and Megatron-LM.
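To make the underlying idea concrete, below is a minimal plain-PyTorch sketch of the kind of expert routing that an MoE layer performs: a gate scores each token, the token is dispatched to its top-1 expert feed-forward network, and the expert output is scaled by the gate weight. This is only an illustration of the mechanism FastMoE accelerates and distributes; the class name `NaiveMoEFFN` and its parameters are hypothetical and are not FastMoE's actual interface.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class NaiveMoEFFN(nn.Module):
    """Illustrative top-1 gated mixture-of-experts feed-forward layer.

    Not the FastMoE API; it only sketches the routing idea that FastMoE
    implements efficiently across GPUs.
    """

    def __init__(self, d_model: int, d_hidden: int, num_experts: int):
        super().__init__()
        self.gate = nn.Linear(d_model, num_experts)  # router over experts
        self.experts = nn.ModuleList(
            nn.Sequential(
                nn.Linear(d_model, d_hidden),
                nn.GELU(),
                nn.Linear(d_hidden, d_model),
            )
            for _ in range(num_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (tokens, d_model); each token is routed to its top-1 expert.
        scores = self.gate(x)                      # (tokens, num_experts)
        expert_idx = scores.argmax(dim=-1)         # chosen expert per token
        gate_weight = F.softmax(scores, dim=-1).gather(
            -1, expert_idx.unsqueeze(-1))          # weight of the chosen expert
        out = torch.zeros_like(x)
        for e, expert in enumerate(self.experts):
            mask = expert_idx == e                 # tokens assigned to expert e
            if mask.any():
                out[mask] = expert(x[mask])
        return out * gate_weight


# Usage sketch: 16 tokens with model width 512 routed over 4 experts.
tokens = torch.randn(16, 512)
layer = NaiveMoEFFN(d_model=512, d_hidden=2048, num_experts=4)
print(layer(tokens).shape)  # torch.Size([16, 512])
```

In a distributed setting such as FastMoE's, the experts would live on different devices and the per-token dispatch would become an all-to-all communication step rather than the Python loop shown here.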
