Transformers-Workshop 🚀

NLP-Workshop-ML-India

This repository contains the code and notebooks for the NLP Workshop organized by ML India from June 19 to July 11.

Contents

The notebook contains the material for the Transformers part of the session, which relies mainly on Transformer models.

The contents include:

  1. Encoder Decoder Architecture
  2. Disadvantages of Encoder Decoders
  3. Transformer Architectures
  4. Attention Mechanism
  5. Bahdanau and Luong Attention
  6. Self and Multi Head Attention
  7. Designing a Keras Transformer (sketched below)
  8. Extracting DistilBERT/BERT embeddings for finetuning on a classification task (sketched below)
  9. Working with input ids, tokens and attention masks for Transformer models (covered in the same sketch)
  10. Inference Tasks using different transformers
  11. BERT based QA inference (sketched below)
  12. Encoder Decoder T5 architecture for Summarization Inference (sketched below)
  13. GPT2 model for Text Generation Inference (sketched below)
  14. Encoder Decoder Electra Model for NER Inference
  15. DialogRPT Model for Text Classification Inference
  16. T5 for Text 2 Text Paraphrasing/Generation
  17. BART encoder decoder model for Zero Shot Classification (sketched below)
  18. Samples for training Transformers on downstream tasks such as Token Classification and SQuAD
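For item 7, a minimal sketch of a Transformer encoder block in Keras, assuming TensorFlow 2.x with the built-in `MultiHeadAttention` layer; the hyperparameters (`embed_dim`, `num_heads`, `ff_dim`) are illustrative placeholders, not the workshop's exact values.

```python
# Minimal Transformer encoder block in Keras (TensorFlow 2.x).
# Hyperparameters are illustrative placeholders.
import tensorflow as tf
from tensorflow.keras import layers

def transformer_encoder_block(embed_dim=128, num_heads=4, ff_dim=256, dropout=0.1):
    inputs = layers.Input(shape=(None, embed_dim))
    # Self-attention: queries, keys and values all come from the same sequence.
    attn_out = layers.MultiHeadAttention(
        num_heads=num_heads, key_dim=embed_dim // num_heads
    )(inputs, inputs)
    attn_out = layers.Dropout(dropout)(attn_out)
    x = layers.LayerNormalization(epsilon=1e-6)(inputs + attn_out)   # residual + norm
    # Position-wise feed-forward network.
    ffn = layers.Dense(ff_dim, activation="relu")(x)
    ffn = layers.Dense(embed_dim)(ffn)
    ffn = layers.Dropout(dropout)(ffn)
    outputs = layers.LayerNormalization(epsilon=1e-6)(x + ffn)       # residual + norm
    return tf.keras.Model(inputs, outputs)

block = transformer_encoder_block()
block.summary()
```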
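For items 8 and 9, a sketch of tokenizing text into input ids and attention masks and extracting DistilBERT embeddings for a downstream classifier; the checkpoint name and example sentences are assumptions for illustration.

```python
# Tokenize text into input ids / attention masks and pull DistilBERT embeddings
# (the first-token hidden state) to feed a downstream classifier.
import torch
from transformers import AutoTokenizer, AutoModel

model_name = "distilbert-base-uncased"          # assumed checkpoint for illustration
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)

texts = ["Transformers are fun.", "Attention is all you need."]
# padding/truncation produce fixed-length input_ids plus a matching attention_mask
enc = tokenizer(texts, padding=True, truncation=True, max_length=32, return_tensors="pt")
print(enc["input_ids"].shape, enc["attention_mask"].shape)

with torch.no_grad():
    out = model(**enc)
# last_hidden_state: (batch, seq_len, hidden); take the first token as a sentence embedding
embeddings = out.last_hidden_state[:, 0, :]
print(embeddings.shape)   # e.g. torch.Size([2, 768])
```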
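For item 11, a sketch of extractive question answering with the `transformers` pipeline; the SQuAD-finetuned checkpoint and the question/context pair are assumptions, not the workshop's exact example.

```python
# Question answering with a BERT-style checkpoint via the transformers pipeline.
from transformers import pipeline

# Checkpoint is an assumption; any extractive-QA model finetuned on SQuAD works.
qa = pipeline("question-answering", model="distilbert-base-cased-distilled-squad")

context = ("The workshop covers encoder-decoder models, attention mechanisms "
           "and finetuning Transformers on downstream tasks.")
result = qa(question="What does the workshop cover?", context=context)
print(result["answer"], result["score"])
```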
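For item 12, a sketch of summarization inference with the encoder-decoder T5 model; the `t5-small` checkpoint, the input text and the generation settings are illustrative assumptions.

```python
# Summarization with the encoder-decoder T5 model.
from transformers import T5Tokenizer, T5ForConditionalGeneration

tokenizer = T5Tokenizer.from_pretrained("t5-small")     # small checkpoint for illustration
model = T5ForConditionalGeneration.from_pretrained("t5-small")

article = ("Transformers replaced recurrent encoder-decoder networks by relying "
           "entirely on attention, which allows parallel processing of tokens and "
           "better modelling of long-range dependencies.")
# T5 is text-to-text: the task is selected with a prefix on the input string.
inputs = tokenizer("summarize: " + article, return_tensors="pt", truncation=True)
summary_ids = model.generate(**inputs, max_length=40, num_beams=4, early_stopping=True)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```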
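For item 13, a sketch of text generation with GPT-2; the prompt and sampling parameters are illustrative defaults rather than the workshop's settings.

```python
# Text generation with GPT-2 using top-k / nucleus sampling.
from transformers import GPT2Tokenizer, GPT2LMHeadModel

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

prompt = "Transformer models are"
inputs = tokenizer(prompt, return_tensors="pt")
output_ids = model.generate(
    **inputs,
    max_length=40,
    do_sample=True,                        # sample instead of greedy decoding
    top_k=50,
    top_p=0.95,
    pad_token_id=tokenizer.eos_token_id,   # GPT-2 has no pad token by default
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```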
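For item 17, a sketch of zero-shot classification with a BART model finetuned on MNLI; the candidate labels and input sentence are assumptions for illustration.

```python
# Zero-shot classification with the BART encoder-decoder model finetuned on MNLI.
from transformers import pipeline

classifier = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")

sequence = "The notebook shows how to finetune BERT embeddings for sentiment analysis."
candidate_labels = ["machine learning", "cooking", "sports"]   # illustrative labels
result = classifier(sequence, candidate_labels)
print(list(zip(result["labels"], result["scores"])))
```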

Guidelines

This code has been released under the Apache License. The resources for the notebooks, particularly the embedding files, are hosted on Kaggle. They can be downloaded from Kaggle for local use, or added to Kaggle notebooks via the "Add Data" tab.
