
# PAPER SUMMARY

## BASICS

| Name | Author | Year | Notes | Links |
|------|--------|------|-------|-------|
| Attention is All You Need | A Vaswani et al. | 2017 | Transformer | paper \| notion |
| Deep contextualized word representations | ME Peters et al. | 2018 | ELMo | paper \| notion \| allennlp |
| Improving Language Understanding by Generative Pre-Training | A Radford et al. | 2018 | GPT | paper \| notion |
| BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding | J Devlin et al. | 2019 | BERT | paper \| notion \| github |
| Language Models are Unsupervised Multitask Learners | A Radford et al. | 2019 | GPT2 | paper \| notion \| github |
| RoBERTa: A Robustly Optimized BERT Pretraining Approach | Y Liu et al. | 2019 | RoBERTa | paper \| notion \| github |
| ALBERT: A Lite BERT for Self-supervised Learning of Language Representations | Z Lan et al. | 2019 | ALBERT | paper \| notion \| github |
| BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension | M Lewis et al. | 2019 | BART | paper \| notion \| github |
| ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators | K Clark et al. | 2020 | ELECTRA | paper \| notion \| github |
| Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer | C Raffel et al. | 2020 | T5 | paper \| notion \| github |

## About

Natural Language Processing (NLP) - paper review
