Name: Yiming Cui
Type: User
Company: Joint Laboratory of HIT and iFLYTEK Research (HFL)
Bio: NLP Researcher. Mainly interested in pre-trained language models, machine reading comprehension, question answering, etc.
Twitter: KCrosner
Location: Beijing, China
Blog: http://ymcui.github.io
Yiming Cui's Projects
Chinese Version of the ACL 2020 Program Committee (PC) Blogs (ACL 2020程序委员会博文中文版)
Policies of scientific publishers and conferences towards large language models (LLMs), such as ChatGPT
Pre-Training with Whole Word Masking for Chinese BERT (中文BERT-wwm系列模型)
A Chinese Cloze-style RC Dataset: People's Daily & Children's Fairy Tale (CFT)
Pre-trained Chinese ELECTRA (中文ELECTRA预训练模型)
Chinese LLaMA & Alpaca LLMs with local CPU/GPU training and deployment (中文LLaMA&Alpaca大语言模型)
Chinese LLaMA-2 & Alpaca-2 LLMs, phase-2 project, with 64K long-context models (中文LLaMA-2 & Alpaca-2大模型二期项目)
Chinese Llama-3 LLMs developed from Meta Llama 3 (中文羊驼大模型三期项目)
Chinese Mixtral mixture-of-experts (MoE) LLMs (中文Mixtral混合专家大模型)
Chinese MobileBERT (中文MobileBERT模型)
Collections of Chinese reading comprehension datasets
Pre-Trained Chinese XLNet (中文XLNet预训练模型)
The First Evaluation Workshop on Chinese Machine Reading Comprehension (CMRC 2017)
A Span-Extraction Dataset for Chinese Machine Reading Comprehension (CMRC 2018)
A Sentence Cloze Dataset for Chinese Machine Reading Comprehension (CMRC 2019)
Cross-Lingual Machine Reading Comprehension (EMNLP 2019)
Empirical Evaluation of Current Neural Networks on Cloze-style Reading Comprehension
ExpMRC: Explainability Evaluation for Machine Reading Comprehension
LAMB Optimizer for Large Batch Training (TensorFlow version)
LERT: A Linguistically-motivated Pre-trained Language Model (语言学信息增强的预训练模型LERT)
Port of Facebook's LLaMA model in C/C++
Revisiting Pre-trained Models for Chinese Natural Language Processing (MacBERT)
Multilingual Multi-Aspect Explainability Analyses on Machine Reading Comprehension Models (iScience)
Score your NLP paper review
PERT: Pre-training BERT with Permuted Language Model
VLE: Vision-Language Encoder (VLE: 视觉-语言多模态预训练模型)
XLNet: Generalized Autoregressive Pretraining for Language Understanding