# SubTST

Our proposed SubTST model is a novel approach for combining latent topic information with Transformer-based models. By integrating topic-based information into word representations, the model enhances text representations with external features. This combination offers a promising way to inject topic information into the contextual representations produced by pre-trained language models. Moreover, because the topic model and the Transformer-based language model operate over the same lexical units, the approach is promising in terms of both performance and complexity.

Paper
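The README does not show how the topic vector is combined with the Transformer representations. As a minimal sketch of one plausible scheme, the snippet below broadcasts a sentence-level topic distribution (e.g. from an LDA-style model) to every token and concatenates it with the token embeddings; the dimensions and the concatenation strategy are illustrative assumptions, not the paper's exact method.

```python
import numpy as np

# Hypothetical dimensions, chosen for illustration only
seq_len, hidden_dim, num_topics = 4, 8, 3

rng = np.random.default_rng(0)
# Stand-ins for real model outputs:
token_embeddings = rng.normal(size=(seq_len, hidden_dim))  # from a Transformer encoder
topic_dist = rng.dirichlet(np.ones(num_topics))            # from a topic model (sums to 1)

# Broadcast the sentence-level topic vector to every token, then concatenate
topic_per_token = np.tile(topic_dist, (seq_len, 1))        # (seq_len, num_topics)
fused = np.concatenate([token_embeddings, topic_per_token], axis=1)

print(fused.shape)  # → (4, 11): each token now carries topic features
```

The fused representations could then be fed to a downstream classifier or pooling layer; the actual SubTST fusion may differ.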

## Run

```sh
sh run.sh
```