Transformer Pre-processing

This project was completed as a part of the Honors portion of the Sequence Models Course on Coursera.

Credit to DeepLearning.AI and the Coursera platform for providing the course materials and guidance.

Objective

In this notebook, my objective is to explore and understand the pre-processing methods applied to raw text before it is passed to the encoder and decoder blocks of the Transformer architecture.

By completing this assignment, I gain the skills to create visualizations of positional encodings and a clearer understanding of how they affect word embeddings. Exploring these pre-processing techniques equips me to prepare text data for the Transformer model, yielding more accurate and meaningful representations.
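The positional encodings explored in the notebook follow the standard sinusoidal scheme from the original Transformer paper, where sine is applied to even embedding dimensions and cosine to odd ones. A minimal NumPy sketch (the function name and shapes are my own illustration, not taken from the repository):

```python
import numpy as np

def positional_encoding(positions, d_model):
    """Sinusoidal positional encoding, added to word embeddings
    before they enter the encoder/decoder blocks.

    Returns an array of shape (1, positions, d_model).
    """
    pos = np.arange(positions)[:, np.newaxis]   # (positions, 1)
    i = np.arange(d_model)[np.newaxis, :]       # (1, d_model)
    # Each pair of dimensions (2k, 2k+1) shares one angular frequency.
    angle_rates = 1.0 / np.power(10000.0, (2 * (i // 2)) / np.float32(d_model))
    angle_rads = pos * angle_rates              # (positions, d_model)
    angle_rads[:, 0::2] = np.sin(angle_rads[:, 0::2])  # even indices: sine
    angle_rads[:, 1::2] = np.cos(angle_rads[:, 1::2])  # odd indices: cosine
    return angle_rads[np.newaxis, ...]

pe = positional_encoding(50, 128)
print(pe.shape)  # (1, 50, 128)
```

Because every entry is a sine or cosine, the encoding stays in [-1, 1] regardless of sequence length, which is what makes heatmap visualizations of it so readable.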

Results

[Figure: Transformer Pre-processing visualizations]
