# BERT for RACE

By: Chenglei Si (River Valley High School)

## Implementation

This work builds on the PyTorch implementation of BERT (https://github.com/huggingface/pytorch-pretrained-BERT). I adapted the original BERT model to multiple-choice machine reading comprehension; the sketch below outlines the adaptation.
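As a rough illustration (a minimal sketch, not the repo's exact code), the adaptation follows the standard multiple-choice BERT setup: each (passage, question + option) pair is encoded separately, a linear layer scores each pooled `[CLS]` representation, and a softmax over the four option scores selects the answer. The class name `BertForRace` and the details below are my assumptions.

```python
import torch
import torch.nn as nn
from pytorch_pretrained_bert.modeling import BertModel

class BertForRace(nn.Module):
    def __init__(self, bert_model='bert-base-uncased', num_choices=4):
        super(BertForRace, self).__init__()
        self.num_choices = num_choices
        self.bert = BertModel.from_pretrained(bert_model)
        self.dropout = nn.Dropout(0.1)
        # One score per (passage, question + option) sequence
        self.classifier = nn.Linear(self.bert.config.hidden_size, 1)

    def forward(self, input_ids, token_type_ids, attention_mask, labels=None):
        # input_ids: (batch, num_choices, seq_len) -> fold choices into the batch dim
        flat_ids = input_ids.view(-1, input_ids.size(-1))
        flat_types = token_type_ids.view(-1, token_type_ids.size(-1))
        flat_mask = attention_mask.view(-1, attention_mask.size(-1))
        _, pooled = self.bert(flat_ids, flat_types, flat_mask,
                              output_all_encoded_layers=False)
        # (batch * num_choices, 1) -> (batch, num_choices): softmax over options
        logits = self.classifier(self.dropout(pooled)).view(-1, self.num_choices)
        if labels is not None:
            return nn.CrossEntropyLoss()(logits, labels)
        return logits
```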

## Environment

The code is tested with Python 3.6 and PyTorch 1.0.0.

## Usage

1. Download the RACE dataset and unzip it. The default dataset directory is `./RACE` (a data-loading sketch follows this list).
2. Run `./run.sh`.
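For reference, here is a minimal sketch of reading the data, assuming the standard RACE distribution layout (`RACE/{train,dev,test}/{high,middle}/*.txt`, one JSON object per file with `article`, `questions`, `options`, and `answers` fields); the actual loader in `run_race.py` may differ.

```python
import glob
import json
import os

def read_race_split(data_dir, split):
    examples = []
    for level in ('high', 'middle'):
        for path in glob.glob(os.path.join(data_dir, split, level, '*.txt')):
            with open(path, 'r', encoding='utf-8') as f:
                data = json.load(f)
            # Each file holds one passage with several questions
            for q, opts, ans in zip(data['questions'],
                                    data['options'],
                                    data['answers']):
                examples.append({
                    'article': data['article'],
                    'question': q,
                    'options': opts,               # four candidate answers
                    'label': ord(ans) - ord('A'),  # 'A'..'D' -> 0..3
                })
    return examples

train_examples = read_race_split('./RACE', 'train')
```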

## Hyperparameters

I did some tuning and found the following hyperparameters to work reasonably well:

BERT_base: batch size 32, learning rate 5e-5, 3 training epochs

BERT_large: batch size 8, learning rate 1e-5 (DO NOT SET IT TOO LARGE), 2 training epochs
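For concreteness, an optimizer set-up matching the BERT_base settings above might look like the following. It uses `BertAdam` from pytorch-pretrained-bert; the warmup fraction is my assumption, and the repo's actual schedule in `run.sh` may differ.

```python
from pytorch_pretrained_bert.optimization import BertAdam

# BERT_base settings from above: batch size 32, 3 training epochs
num_train_steps = int(len(train_examples) / 32) * 3
optimizer = BertAdam(model.parameters(),  # `model` as in the earlier sketch
                     lr=5e-5,             # 1e-5 for BERT_large
                     warmup=0.1,          # assumed: linear warmup over first 10% of steps
                     t_total=num_train_steps)
```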

## Results

Accuracy (%):

| Model | RACE | RACE-M | RACE-H |
|-------|------|--------|--------|
| BERT_base | 65.0 | 71.7 | 62.3 |
| BERT_large | 67.9 | 75.6 | 64.7 |

You can compare them with other results on the leaderboard.

BERT_large achieves the best result to date (Jan 2019). Looking forward to new models that can beat BERT!

## More Details

I have written a short report (BERT_RACE.pdf, in this repo) describing the details.
