Why does BERT work so well, and what is it actually learning under the hood? At the AI ProCon 2019 conference, Zhang Junlin, head of Sina Weibo's machine learning team AI Lab, gave a talk titled "What Have BERT and Transformer Actually Learned?"
XLNet, a new model from CMU and Google, surpasses BERT on 20 tasks and achieves state-of-the-art results on 18 of them. Even more exciting, XLNet's training code and large pre-trained models are now openly available.
Researchers at Microsoft Research Asia have proposed a new general-purpose pre-training method, MASS, at ICML 2019, which outperforms BERT and GPT on sequence-to-sequence natural language generation tasks. In the WMT19 machine translation competition, MASS helped Microsoft's Chinese-English and English-Lithuanian submissions take first place in those language pairs.
In early October 2018, Google AI proposed a new contextual word representation: BERT features. This article introduces BERT features and briefly analyzes their interpretability.
We will use Kaggle's spam classification challenge to measure BERT's performance on multi-label text categorization.
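As a minimal sketch of what distinguishes a multi-label setup from ordinary multi-class classification (the label count, shapes, and variable names below are illustrative assumptions, not from the article): each label gets an independent sigmoid on top of the encoder's pooled output, so several labels can fire for the same text at once. During training this pairs with a per-label binary cross-entropy loss rather than a softmax cross-entropy.

```python
import numpy as np

# Sketch of a multi-label classification head. The 768-dim vector stands in
# for a pre-computed BERT pooled [CLS] output; in a real pipeline it would
# come from the encoder. The 6 labels are an illustrative assumption.

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
hidden = rng.standard_normal(768)          # stand-in for BERT's pooled [CLS] vector
W = rng.standard_normal((6, 768)) * 0.01   # one weight row per label
b = np.zeros(6)

logits = W @ hidden + b
probs = sigmoid(logits)                    # independent probability per label
predicted = (probs >= 0.5).astype(int)     # any number of labels may be active

print(predicted)
```

Note the design choice: a softmax would force the label probabilities to sum to 1 (exactly one label per text), while per-label sigmoids let the model mark a comment as, say, both "spam" and "offensive" simultaneously.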