
End-to-end BERT Q&A model tutorial on SageMaker (bring your own container).


AKO 2020: State-of-the-art NLP with MXNet

Presenters: Rachel Hu, Wen-ming Ye, Laurens ten Cate

Abstract

Implementing natural language processing (NLP) models just got simpler and faster. We will briefly introduce BERT (Bidirectional Encoder Representations from Transformers), the state-of-the-art (SOTA) NLP model, and demonstrate how it can be applied to various NLP tasks. In this chalk talk, you will learn how to implement NLP models using Apache MXNet and the GluonNLP toolkit to quickly prototype products, validate new ideas, and explore SOTA NLP. We will also show how to use GluonNLP and SageMaker to fine-tune BERT for a text classification use case and deploy the trained model. Come join us to train your NLP model onsite!

Start the lab in the [tutorial folder](https://github.com/awshlabs/Jan2020-NLPLab/tree/master/tutorial).
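For the "bring your own container" part of the lab, SageMaker expects a custom image to follow a fixed contract: it is launched as `docker run <image> train` for training jobs and `docker run <image> serve` for hosting, reads input from `/opt/ml/input`, writes model artifacts to `/opt/ml/model`, and serves HTTP on port 8080 answering `GET /ping` and `POST /invocations`. A minimal sketch of such a Dockerfile is below; the base image, pinned versions, and the `train`/`serve` script names are illustrative assumptions, not the lab's actual files.

```dockerfile
# Hypothetical base image and script names -- see the tutorial folder for
# the container actually used in the lab.
FROM python:3.7-slim

# MXNet and GluonNLP for the BERT fine-tuning and inference code.
RUN pip install --no-cache-dir mxnet gluonnlp

# SageMaker invokes the container as `docker run <image> train` (training
# jobs) or `docker run <image> serve` (hosting), so executables with those
# exact names must be on the PATH.
COPY train serve /usr/local/bin/
RUN chmod +x /usr/local/bin/train /usr/local/bin/serve

# Training data and config are mounted under /opt/ml/input, and the trained
# model must be written to /opt/ml/model. The serving process listens on
# port 8080 and responds to GET /ping and POST /invocations.
EXPOSE 8080
```

The same image can back both the training job and the hosted endpoint, which is why the `train`/`serve` dispatch happens via the container's first argument rather than separate images.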
