Our proposed SubTST model is a novel approach to combining latent topic information with Transformer-based models. By integrating topic-based information into word representations, the model enhances text representations with external features. Through this careful combination, our work offers a promising way to inject topic information into the context-based representations of pre-trained language models. Moreover, by operating on the same lexical units in both the topic model and the Transformer-based language model, our approach shows potential in terms of both performance and computational complexity.