Text Segmentation as a Supervised Learning Task

This repository contains the code and supplementary materials required to train and evaluate the model described in the paper Text Segmentation as a Supervised Learning Task.

Download required resources

wiki-727K, wiki-50 datasets:

https://www.dropbox.com/sh/k3jh0fjbyr0gw0a/AADzAd9SDTrBnvs1qLCJY5cza?dl=0

word2vec:

https://drive.google.com/a/audioburst.com/uc?export=download&confirm=zrin&id=0B7XkCwpI5KDYNlNUTTlSS21pQmM

Fill in the relevant paths in configgenerator.py and execute the script (the git repository already includes the Choi dataset).
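A minimal sketch of what the generated configuration might look like. The key names below (`word2vec_path`, `wiki_dataset_path`, `choi_dataset_path`) and the paths are illustrative assumptions, not the script's actual schema; consult configgenerator.py for the real keys.

```python
import json

# Hypothetical configuration entries; the real key names are defined
# by configgenerator.py, and the paths must point at your local copies
# of the downloaded resources.
config = {
    "word2vec_path": "/data/word2vec/GoogleNews-vectors.bin",
    "wiki_dataset_path": "/data/wiki_727",
    "choi_dataset_path": "data/choi",
}

# Write the configuration to disk for the training/evaluation scripts.
with open("config.json", "w") as f:
    json.dump(config, f, indent=2)
```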

Creating an environment:

```
conda create -n textseg python=2.7 numpy scipy gensim ipython
source activate textseg
pip install http://download.pytorch.org/whl/cu80/torch-0.3.0-cp27-cp27mu-linux_x86_64.whl
pip install tqdm pathlib2 segeval tensorboard_logger flask flask_wtf nltk
pip install pandas xlrd xlsxwriter termcolor
```

How to run the training process?

```
python run.py --help
```

Example:

```
python run.py --cuda --model max_sentence_embedding --wiki
```

How to evaluate a trained model (on the wiki-727 or Choi dataset)?

```
python test_accuracy.py --help
```

Example:

```
python test_accuracy.py --cuda --model <path_to_model> --wiki
```
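Segmentation accuracy in this line of work is typically reported with the Pk metric (the repository depends on the segeval package for this). As a rough illustration of what Pk measures, here is a minimal pure-Python sketch, not the repository's implementation: segmentations are given as segment masses (sentence counts per segment), and the metric is the fraction of sliding-window probes, k sentences apart, on which reference and hypothesis disagree about being in the same segment.

```python
def _masses_to_labels(masses):
    """Expand segment masses, e.g. [2, 3] -> [0, 0, 1, 1, 1]."""
    labels = []
    for seg_idx, mass in enumerate(masses):
        labels.extend([seg_idx] * mass)
    return labels

def pk(ref_masses, hyp_masses, k=None):
    """Pk error: probe pairs of positions k apart and count disagreements
    on whether the two positions fall in the same segment."""
    ref = _masses_to_labels(ref_masses)
    hyp = _masses_to_labels(hyp_masses)
    if k is None:
        # Conventional choice: half the mean reference segment length.
        k = max(1, round(len(ref) / len(ref_masses) / 2))
    total = len(ref) - k
    errors = 0
    for i in range(total):
        same_ref = ref[i] == ref[i + k]
        same_hyp = hyp[i] == hyp[i + k]
        errors += same_ref != same_hyp
    return errors / float(total)
```

Lower is better: a perfect segmentation scores 0.0, e.g. `pk([5, 5], [5, 5])` returns 0.0, while `pk([5, 5], [10])` (hypothesis misses the boundary) returns 0.25.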

How to create a new Wikipedia dataset:

```
python wiki_processor.py --input <input> --temp <temp_files_folder> --output <output_folder> --train <ratio> --test <ratio>
```

--input is the full path to the Wikipedia dump, --temp is the folder used for temporary files, and --output is the folder where the newly generated Wikipedia dataset is written.
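The --train and --test flags split the generated articles into train/test/dev portions by ratio. A minimal sketch of such a ratio split (illustrative only; the function name and signature here are assumptions, not the repository's actual logic in wiki_processor.py):

```python
import random

def split_dataset(articles, train_ratio, test_ratio, seed=0):
    """Shuffle articles and split into train/test/dev by the given ratios;
    whatever remains after train and test becomes the dev set."""
    rng = random.Random(seed)  # fixed seed for a reproducible split
    shuffled = list(articles)
    rng.shuffle(shuffled)
    n_train = int(len(shuffled) * train_ratio)
    n_test = int(len(shuffled) * test_ratio)
    train = shuffled[:n_train]
    test = shuffled[n_train:n_train + n_test]
    dev = shuffled[n_train + n_test:]
    return train, test, dev

# e.g. --train 0.8 --test 0.1 leaves 10% for dev
train, test, dev = split_dataset(range(100), train_ratio=0.8, test_ratio=0.1)
```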

The Wikipedia dump can be downloaded from the following URL:

https://dumps.wikimedia.org/enwiki/latest/enwiki-latest-pages-articles.xml.bz2
