Khanh Vu · Changling Li · Zihan Zhu · Xin Chen
Group: Attention is all you need
Department of Computer Science, ETH Zurich, Switzerland
Large language models have revolutionized many fields, showcasing their capacity for understanding natural language. Tweet sentiment analysis is a challenging and valuable task in this context, given its direct relevance to social media analysis. In this report, we present a method based on an ensemble of BERT (Bidirectional Encoder Representations from Transformers) models and show that it achieves strong performance on tweet sentiment analysis compared to several standard baselines. We conduct a comprehensive suite of experiments and discuss the critical role of data preprocessing in improving model performance. Our findings offer insights toward the development of more robust and efficient sentiment analysis models.
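The ensembling idea described above can be sketched with simple majority voting over per-model predictions. This is a minimal illustration, not the report's exact implementation; the function name, tie-breaking rule, and label encoding are assumptions for the example.

```python
from collections import Counter

def ensemble_predict(predictions_per_model):
    """Majority voting over per-model predictions (hypothetical sketch).

    `predictions_per_model` is a list of prediction lists, one per model;
    each inner list holds one label per tweet. Ties are broken by the
    first-encountered label (Counter.most_common insertion order).
    """
    n_tweets = len(predictions_per_model[0])
    final = []
    for i in range(n_tweets):
        votes = [preds[i] for preds in predictions_per_model]
        final.append(Counter(votes).most_common(1)[0][0])
    return final

# Three hypothetical models voting on four tweets (1 = positive, 0 = negative):
model_a = [1, 0, 1, 1]
model_b = [1, 1, 0, 1]
model_c = [0, 0, 1, 1]
print(ensemble_predict([model_a, model_b, model_c]))  # → [1, 0, 1, 1]
```

In practice each inner list would come from a separately fine-tuned BERT model; averaging predicted class probabilities instead of hard votes is a common alternative.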
See the full report.
We include the implementations of the baselines and our method in the following sections.
The preprocessed data can be downloaded here; remember to decompress it and place it under src/data/.
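For reference, a typical tweet-preprocessing pipeline looks like the sketch below. This is a hypothetical illustration of common cleaning steps (lowercasing, replacing URLs and mentions, stripping hashtag markers); the released preprocessed data may have been produced with different steps.

```python
import re

def preprocess_tweet(text: str) -> str:
    """Hypothetical tweet-cleaning steps; the actual pipeline used for
    the released data may differ."""
    text = text.lower()
    text = re.sub(r"https?://\S+", "<url>", text)   # replace URLs with a token
    text = re.sub(r"@\w+", "<user>", text)          # replace user mentions
    text = re.sub(r"#(\w+)", r"\1", text)           # keep hashtag word, drop '#'
    text = re.sub(r"\s+", " ", text).strip()        # collapse whitespace
    return text

print(preprocess_tweet("Check THIS out @bob https://t.co/xyz #NLP!"))
# → check this out <user> <url> nlp!
```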
- Machine learning baselines: see src/ML/.
- Deep learning baselines: see src/DL/.
- BERT fine-tuning: see src/bert-fine_tuning/.
Contact Khanh Vu, Changling Li, Zihan Zhu, or Xin Chen with questions, comments, or bug reports.