Name | Authors | Year | Notes | Links |
---|---|---|---|---|
Attention Is All You Need | A Vaswani et al. | 2017 | Transformer | paper, notion |
Deep contextualized word representations | ME Peters et al. | 2018 | ELMo | paper, notion, allennlp |
Improving Language Understanding by Generative Pre-Training | A Radford et al. | 2018 | GPT | paper, notion |
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding | J Devlin et al. | 2019 | BERT | paper, notion, github |
Language Models are Unsupervised Multitask Learners | A Radford et al. | 2019 | GPT-2 | paper, notion, github |
RoBERTa: A Robustly Optimized BERT Pretraining Approach | Y Liu et al. | 2019 | RoBERTa | paper, notion, github |
ALBERT: A Lite BERT for Self-supervised Learning of Language Representations | Z Lan et al. | 2019 | ALBERT | paper, notion, github |
BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension | M Lewis et al. | 2019 | BART | paper, notion, github |
ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators | K Clark et al. | 2020 | ELECTRA | paper, notion, github |
Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer | C Raffel et al. | 2020 | T5 | paper, notion, github |
About: Natural Language Processing (NLP) - paper review