
Microsoft proposes MASS, a universal pre-training model that surpasses BERT and GPT on natural language generation tasks!

Researchers at Microsoft Research Asia presented a new universal pre-training method, MASS, at ICML 2019, which surpasses BERT and GPT on sequence-to-sequence natural language generation tasks. In the WMT19 machine translation competition, MASS helped Microsoft win first place in the Chinese-English and English-Lithuanian language pairs.