Public repository of paper " Learning to combine classifiers outputs with the transformer for text classification"

Buguemar/Transformer_as_ensemble

Transformer_as_ensemble

Text classification is a well-explored task that covers a considerable range of problems. However, one of its main difficulties is learning from data with class imbalance, i.e., datasets with only a few examples in some classes, which often represent the most interesting cases for the task. In this setting, text classifiers overfit particular classes and show poor performance. To address this problem, we propose a scheme that combines the outputs of different classifiers by encoding them in the encoder of a transformer. By also feeding in a BERT encoding of each example, the encoder learns a joint representation of the text and the classifiers' outputs. These encodings are used to train a new text classifier. We also introduce a data augmentation technique that allows the representation learning task to proceed without over-fitting the encoding to a particular class.
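The core idea can be illustrated with a toy sketch: the probability vectors produced by several base classifiers, together with an embedding of the text itself, are treated as the input tokens of a transformer encoder, and the pooled encoding feeds a new classifier head. The sketch below is NOT the paper's implementation: it uses a single randomly initialized self-attention layer in NumPy, a random vector as a stand-in for the BERT encoding, and hypothetical names (`self_attention`, `base_outputs`) chosen only for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(tokens, d_k):
    """Single-head scaled dot-product self-attention over (seq_len, d_model) tokens.
    Weights are random here; in practice they would be learned."""
    d_model = tokens.shape[1]
    Wq = rng.standard_normal((d_model, d_k)) * 0.1
    Wk = rng.standard_normal((d_model, d_k)) * 0.1
    Wv = rng.standard_normal((d_model, d_k)) * 0.1
    Q, K, V = tokens @ Wq, tokens @ Wk, tokens @ Wv
    attn = softmax(Q @ K.T / np.sqrt(d_k), axis=-1)
    return attn @ V  # (seq_len, d_k)

n_classes = 3

# Outputs of four base classifiers for one example (each a class-probability vector).
base_outputs = [softmax(rng.standard_normal(n_classes)) for _ in range(4)]

# Stand-in for a BERT encoding of the text, projected to the same dimension
# as the classifier outputs so all tokens share one width.
text_embedding = rng.standard_normal(n_classes)

# One token for the text plus one token per base classifier: (5, n_classes).
tokens = np.vstack([text_embedding] + base_outputs)

# Encoder learns a joint representation of text and classifier outputs.
encoded = self_attention(tokens, d_k=8)
joint = encoded.mean(axis=0)  # mean-pooled joint representation, shape (8,)

# New classifier head trained on the joint representation (random weights here).
probs = softmax(joint @ (rng.standard_normal((8, n_classes)) * 0.1))
```

In a real system the attention weights and the final head would be trained jointly, the text token would come from a pretrained BERT model, and the paper's data augmentation would be applied during that training.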
