TensorFlow implementation of the "EGL-word" method from the AAAI 2017 paper "Active Discriminative Text Representation Learning"
Active-Learning-for-Neural-Networks

This repository re-implements the "EGL-word" method proposed in the AAAI 2017 paper "Active Discriminative Text Representation Learning" in TensorFlow (the original implementation was in Theano). It compares the "EGL-word" method against two baseline methods, "random" and "entropy", on a sentiment analysis dataset.
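To make the three acquisition strategies concrete, here is a minimal, framework-free sketch of how such scores are typically computed. The function names and the pure-Python formulation are illustrative assumptions, not the repository's actual API; in the real "EGL-word" method the gradient norms come from backpropagating through the CNN for each hypothetical label.

```python
import math

def entropy_score(probs):
    """Uncertainty of one prediction: Shannon entropy of its
    predicted class-probability distribution (higher = more uncertain)."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def egl_score(probs, grad_norms):
    """Expected gradient length (illustrative): the gradient norm the model
    would incur under each possible label, weighted by the predicted
    probability of that label. grad_norms[c] is assumed to be the norm of
    the loss gradient if the example were labeled with class c."""
    return sum(p * g for p, g in zip(probs, grad_norms))

def select_batch(scores, k):
    """Pick indices of the k highest-scoring (most informative) unlabeled
    examples; the 'random' baseline would instead sample indices uniformly."""
    return sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)[:k]
```

For example, a maximally uncertain binary prediction `[0.5, 0.5]` receives a higher entropy score than a confident one like `[0.9, 0.1]`, so entropy-based selection queries the former first.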

First, run `python process_data.py path/to/pre-trained-embedding` to generate the processed data.

Then run `python active_learning.py --AL_method=active-learning-method`, where `active-learning-method` is one of `random`, `entropy`, or `EGL`.

The following figure shows the average learning curve over five runs of 10-fold cross-validation.

Learning Curve