
🚀 [2020] Don’t Stop Pretraining: Adapt Language Models to Domains and Tasks #17

JudePark96 opened this issue Oct 3, 2020 · 0 comments

Comments

@JudePark96
Copy link
Owner

Paper: Don’t Stop Pretraining: Adapt Language Models to Domains and Tasks
Authors: Suchin Gururangan, Ana Marasović, Swabha Swayamdipta, Kyle Lo, Iz Beltagy, Doug Downey, Noah A. Smith
Link: https://www.aclweb.org/anthology/2020.acl-main.740.pdf
Venue: ACL 2020

Contents

1. What does the abstract say?

2. What are the main contributions?

3. How does it differ from previous approaches?

4. What could be proposed based on this work?

5. Which paper should be read next?
