This project was completed as part of the Honors portion of the Sequence Models course on Coursera.
Credit to DeepLearning.AI and the Coursera platform for providing the course materials and guidance.
My objective is to use the tokenizers and pre-trained models available in the HuggingFace library to fine-tune a pre-trained transformer for Named-Entity Recognition (NER). By adapting a pre-trained model to the NER task, I aim for accurate and efficient entity recognition in text data while building practical experience with modern natural language processing techniques and with applying transformer networks to real-world problems.
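The sketch below illustrates the general shape of this workflow with the standard HuggingFace `transformers` and `datasets` APIs. The checkpoint (`distilbert-base-uncased`) and the CoNLL-2003 dataset are illustrative assumptions, not necessarily the exact model or data used in the course assignment.

```python
# Minimal sketch: fine-tune a pre-trained transformer for NER (token classification).
# Model checkpoint and dataset are assumed for illustration only.
from transformers import (
    AutoTokenizer,
    AutoModelForTokenClassification,
    DataCollatorForTokenClassification,
    TrainingArguments,
    Trainer,
)
from datasets import load_dataset

checkpoint = "distilbert-base-uncased"   # assumed base model
dataset = load_dataset("conll2003")      # assumed NER dataset
label_names = dataset["train"].features["ner_tags"].feature.names

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForTokenClassification.from_pretrained(
    checkpoint, num_labels=len(label_names)
)

def tokenize_and_align(batch):
    """Tokenize pre-split words and align word-level NER tags to sub-word tokens."""
    tokenized = tokenizer(
        batch["tokens"], truncation=True, is_split_into_words=True
    )
    all_labels = []
    for i, tags in enumerate(batch["ner_tags"]):
        word_ids = tokenized.word_ids(batch_index=i)
        # Special tokens get -100 so they are ignored by the loss.
        all_labels.append([-100 if w is None else tags[w] for w in word_ids])
    tokenized["labels"] = all_labels
    return tokenized

tokenized_ds = dataset.map(tokenize_and_align, batched=True)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="ner-finetune"),
    train_dataset=tokenized_ds["train"],
    eval_dataset=tokenized_ds["validation"],
    data_collator=DataCollatorForTokenClassification(tokenizer),
    tokenizer=tokenizer,
)
trainer.train()
```

The key step is aligning word-level NER tags to the sub-word tokens produced by the tokenizer; labeling special tokens with -100 keeps them out of the loss computation.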