BERT (Bidirectional Encoder Representations from Transformers) is a model designed to pre-train deep bidirectional representations from unlabeled text.
This article walks through how to use BERT for sentiment analysis.
After importing the libraries and loading the dataset from the file, I cleaned the data by removing symbols that could interfere with tokenization.
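A cleaning step of this kind can be sketched with the standard library's `re` module. This is a minimal, hypothetical example, not the article's exact code; the `clean_text` helper and the specific symbols it strips are assumptions for illustration:

```python
import re

def clean_text(text):
    """Hypothetical cleaning helper: strips URLs, HTML tags, and stray
    symbols that may interfere with tokenization, then normalizes
    whitespace. The article's actual cleaning steps may differ."""
    text = re.sub(r"https?://\S+", " ", text)          # remove URLs
    text = re.sub(r"<[^>]+>", " ", text)               # remove HTML tags
    text = re.sub(r"[^A-Za-z0-9\s.,!?']", " ", text)   # drop odd symbols
    text = re.sub(r"\s+", " ", text).strip()           # collapse whitespace
    return text

print(clean_text("Great movie!! <br/> *** https://example.com #loved_it"))
```

Keeping basic punctuation (periods, commas, exclamation marks) is a deliberate choice here, since BERT's tokenizer handles punctuation and it can carry sentiment signal.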
The rest of the article can be found here: https://nwosunneoma.medium.com/sentiment-analysis-with-bert-b4fa916a353c.
The dataset is hosted on Google Drive, as the CSV file was too large to upload: https://drive.google.com/drive/folders/10h5xzIk-z-n3m94Rjb5q6ttjazjN3FaK?usp=sharing