8 text representation methods in NLP and their advantages and disadvantages
Text representation is a very basic and, at the same time, very important task in NLP. This article introduces the history of text representation and the advantages and disadvantages of each method.
What did BERT and Transformer learn?
Why does BERT work so well, and what, at the level of its underlying principles, has it actually learned? At the AI ProCon 2019 conference, Zhang Junlin, head of the machine learning team at Sina Weibo's AI Lab, gave the talk "What did BERT and Transformer learn?"
"59-page PDF" Basic Concepts of Natural Language Processing (NLP) (free download)
Easyai.tech recognizes that getting started with artificial intelligence is difficult, especially for non-technical people.
Therefore, we curate high-quality educational content from China and abroad and present it in the most accessible way possible, specifically for non-technical readers, so that everyone can understand the basic concepts of artificial intelligence.
2019's latest Transformer models: XLNet, ERNIE 2.0, and RoBERTa
Large pre-trained language models are undoubtedly the main trend in the latest natural language processing (NLP) research.
Looking back at NLP trends over the past 20 years
A look at the development trends of NLP over the past 20 years through ACL papers.
8 steps to solve 90% of NLP problems
We'll start with the simplest method and move on to more sophisticated solutions such as feature engineering, word vectors, and deep learning.
Baidu reading comprehension technology research and application
This report is divided into the following 4 parts: 1. What is machine reading comprehension? 2. Progress in reading comprehension technology. 3. Remaining challenges in reading comprehension. 4. Baidu's research work on reading comprehension technology.
Microsoft proposes MASS, a universal pre-training model that surpasses BERT and GPT on natural language generation tasks!
Researchers at Microsoft Research Asia proposed a new universal pre-training method, MASS, at ICML 2019, which surpasses BERT and GPT on sequence-to-sequence natural language generation tasks. In the WMT19 machine translation competition, MASS helped Microsoft's Chinese-English and English-Lithuanian submissions achieve first place.