Exploring Massively Multilingual, Massive Neural Machine Translation

We will give an overview of recent efforts toward universal translation at Google Research, from training a single translation model for 100 languages to scaling neural networks beyond 80 billion parameters with Transformers over 1000 layers deep. We will also cover the research and engineering challenges the project has tackled: multi-task learning with hundreds of tasks, learning under heavy data imbalance, trainability of very deep networks, understanding the learned representations, and cross-lingual downstream transfer.
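One of the challenges listed above, learning under heavy data imbalance, is commonly addressed in massively multilingual training with temperature-based data sampling: each language pair's share of the training mix is its raw data fraction raised to the power 1/T and renormalized. A minimal sketch (the language pairs, counts, and temperature below are illustrative, not figures from the talk):

```python
# Hypothetical per-language-pair example counts (illustrative only).
counts = {"en-fr": 1_000_000, "en-hi": 10_000, "en-yo": 1_000}

def sampling_probs(counts, temperature=5.0):
    """Temperature-based sampling over language pairs.

    Each pair's raw data fraction is raised to the power 1/T and the
    results are renormalized. T=1 reproduces the raw data distribution;
    larger T flattens it toward uniform, up-sampling low-resource pairs.
    """
    total = sum(counts.values())
    weights = {pair: (n / total) ** (1.0 / temperature)
               for pair, n in counts.items()}
    z = sum(weights.values())
    return {pair: w / z for pair, w in weights.items()}

probs = sampling_probs(counts, temperature=5.0)
```

With these illustrative counts, the low-resource pair's sampling probability rises well above its raw data fraction of about 0.1%, at the cost of seeing the high-resource pair less often.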