Toxic Comment Classification Challenge
The task was to classify online comments into 6 categories: toxic, severe_toxic, obscene, threat, insult, identity_hate.
Private score: 0.9826 / Public score: 0.9833
A fastText pre-trained embedding is used in this project.
Please download the word-vector file (crawl-300d-2M.vec) from the fastText site
and copy it into the data folder.
└── data
    ├── crawl-300d-2M.vec
    ├── sample_submission.csv
    ├── test.csv
    └── train.csv
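The loading code itself is not shown here; as a minimal sketch, the `.vec` text format (a header line `<vocab_size> <dim>`, then one token followed by its floats per line) can be parsed like this. The function name and the `limit` parameter are hypothetical, not taken from this repository:

```python
import numpy as np

def load_fasttext_vectors(path, limit=None):
    """Parse a fastText .vec text file (e.g. crawl-300d-2M.vec)
    into a {word: np.ndarray} dict.

    The first line holds "<vocab_size> <dim>"; each following line
    is "<token> <f1> <f2> ... <f_dim>". `limit` optionally caps how
    many words are read (useful for quick experiments)."""
    vectors = {}
    with open(path, encoding="utf-8") as f:
        n_words, dim = map(int, f.readline().split())
        for i, line in enumerate(f):
            if limit is not None and i >= limit:
                break
            parts = line.rstrip().split(" ")
            vec = np.asarray(parts[1:], dtype="float32")
            if vec.shape[0] == dim:  # skip malformed lines
                vectors[parts[0]] = vec
    return vectors
```

The resulting dict can then be used to fill an embedding matrix indexed by the tokenizer's vocabulary.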
Training locally
$ python main.py --output_file_path submission_result.csv
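For reference, a sketch of how the `--output_file_path` CSV could be assembled with pandas. The column names follow the competition's sample_submission.csv; the helper function and the dummy probabilities are illustrative, not code from this repository:

```python
import numpy as np
import pandas as pd

LABELS = ["toxic", "severe_toxic", "obscene", "threat", "insult", "identity_hate"]

def make_submission(ids, probs, out_path="submission_result.csv"):
    """Write per-comment probabilities for the 6 labels in the
    submission format (an id column followed by 6 label columns)."""
    sub = pd.DataFrame(np.asarray(probs), columns=LABELS)
    sub.insert(0, "id", ids)
    sub.to_csv(out_path, index=False)
    return sub

# Example with dummy predictions for two comments.
sub = make_submission(
    ["a1", "b2"],
    [[0.9, 0.1, 0.2, 0.0, 0.3, 0.05],
     [0.1, 0.0, 0.0, 0.0, 0.1, 0.00]],
)
```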
Dependencies
- python 3
- keras
- numpy
- matplotlib
- tensorflow
- pandas
- scikit-learn