TensorFlow implementation of Attention-over-Attention Neural Networks for Reading Comprehension


kihyunwon/attention-over-attention


Attention-over-Attention Neural Networks for Reading Comprehension

TensorFlow implementation of Attention-over-Attention Neural Networks.
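The core of the attention-over-attention model can be sketched in a few lines of NumPy. The sketch below follows the mechanism described in the paper this repository implements: a pairwise matching matrix between contextual document and query embeddings, a column-wise softmax (document-level attention per query word), a row-wise softmax averaged over document positions (query-level attention), their product, and a final attention-sum over repeated document words. Shapes, variable names, and the toy inputs are illustrative assumptions, not this repository's actual code.

```python
import numpy as np

def softmax(x, axis):
    # Numerically stable softmax along the given axis.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(0)
doc_len, query_len, hidden = 6, 3, 4            # toy sizes (assumptions)
h_doc = rng.normal(size=(doc_len, hidden))      # stand-in for BiGRU document states
h_query = rng.normal(size=(query_len, hidden))  # stand-in for BiGRU query states

M = h_doc @ h_query.T        # pairwise matching scores, shape (doc_len, query_len)
alpha = softmax(M, axis=0)   # per-query-word attention over the document (columns sum to 1)
beta = softmax(M, axis=1)    # per-document-word attention over the query (rows sum to 1)
beta_avg = beta.mean(axis=0) # averaged query-level attention, shape (query_len,)
s = alpha @ beta_avg         # attended document-level attention, shape (doc_len,)

# Attention-sum: a candidate answer's score is the sum of s over every
# position where that word occurs in the document.
doc_words = ["the", "cat", "sat", "on", "the", "mat"]  # toy document (assumption)
p = {}
for w, si in zip(doc_words, s):
    p[w] = p.get(w, 0.0) + si
```

Because each column of `alpha` and the vector `beta_avg` are probability distributions, the final scores `s` also sum to one, so the attention-sum step directly yields answer probabilities.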

Prerequisites

Python 3 and TensorFlow.

Usage

First, download the DeepMind Q&A Dataset from here, and untar cnn.tgz and dailymail.tgz into the data directory:
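The extraction step might look like the following, assuming the two archives were downloaded into the working directory (the data path is taken from the instruction above; the exact layout the scripts expect is an assumption):

```shell
# Create the data directory the preprocessing script reads from
mkdir -p data
# Extract each archive if it has been downloaded
for f in cnn.tgz dailymail.tgz; do
  if [ -f "$f" ]; then
    tar -xzf "$f" -C data
  fi
done
```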

Then run the pre-processing code with:

$ ./prepare-rc.sh

To train a model with cnn dataset:

$ python3 main.py --dataset cnn -t

To test an existing model (in progress):

$ python3 main.py --dataset cnn
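The two commands above imply a small command-line interface: `--dataset` selects the corpus and `-t` toggles training versus evaluation. A minimal argparse sketch consistent with those invocations (flag names and defaults beyond what the commands show are assumptions, not the actual `main.py`):

```python
import argparse

def build_parser():
    # Flags inferred from the usage examples above; details are assumptions.
    parser = argparse.ArgumentParser(description="Attention-over-Attention reader")
    parser.add_argument("--dataset", choices=["cnn", "dailymail"], default="cnn",
                        help="which DeepMind Q&A corpus to use")
    parser.add_argument("-t", "--train", action="store_true",
                        help="train a new model; omit to test an existing one")
    return parser

# Mirrors: python3 main.py --dataset cnn -t
args = build_parser().parse_args(["--dataset", "cnn", "-t"])
```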

Credit

The code for pre-processing, shuffling, and loading the dataset is adapted from IBM's Attention Sum Reader implementation.

Results (in progress)
