Neural Variational Document Model

TensorFlow implementation of Neural Variational Inference for Text Processing (Miao et al., 2016).


This implementation contains:

  1. Neural Variational Document Model
    • A variational inference framework for a generative model of text
    • Combines a stochastic document representation with a bag-of-words generative model
  2. Neural Answer Selection Model (in progress)
    • A variational inference framework for a conditional generative model of text
    • Combines LSTM embeddings with an attention mechanism to capture the semantics between question and answer
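The core of the NVDM is a simple pipeline: an inference network maps a bag-of-words vector to the mean and log-variance of a Gaussian latent code, a sample is drawn via the reparameterization trick, and a softmax decoder scores every word in the vocabulary; the training objective is the ELBO (reconstruction term minus KL divergence to a standard normal prior). The following is a minimal NumPy sketch of that forward pass, not the repo's TensorFlow code; all weight names and sizes here are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# Illustrative sizes: vocabulary, encoder hidden layer, latent dimension.
V, H, Z = 2000, 500, 50
W_enc = rng.normal(0, 0.05, (V, H))   # inference-network weights
W_mu  = rng.normal(0, 0.05, (H, Z))   # projects hidden state to mean
W_lv  = rng.normal(0, 0.05, (H, Z))   # projects hidden state to log-variance
R     = rng.normal(0, 0.05, (Z, V))   # decoder: latent code -> word logits
b     = np.zeros(V)

def nvdm_elbo(x):
    """x: (batch, V) bag-of-words count matrix. Returns per-document ELBO and KL."""
    h = np.tanh(x @ W_enc)                       # inference network q(z|x)
    mu, log_var = h @ W_mu, h @ W_lv
    eps = rng.standard_normal(mu.shape)
    z = mu + np.exp(0.5 * log_var) * eps         # reparameterization trick
    log_p = np.log(softmax(z @ R + b))           # log p(word | z)
    recon = (x * log_p).sum(axis=1)              # bag-of-words log-likelihood
    kl = 0.5 * (mu**2 + np.exp(log_var) - log_var - 1).sum(axis=1)
    return recon - kl, kl

x = rng.poisson(0.01, (8, V)).astype(float)      # a fake mini-batch of documents
elbo, kl = nvdm_elbo(x)
```

The KL term has the closed form for a diagonal Gaussian against a standard normal prior, so no sampling is needed for it; only the reconstruction term uses the sampled `z`.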

Prerequisites

Usage

To train a model with the Penn Treebank (PTB) dataset:

$ python main.py --dataset ptb

To test an existing model:

$ python main.py --dataset ptb --forward_only True
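The two commands above suggest the command-line interface that main.py exposes. As an illustration only, here is a hypothetical argparse sketch of those flags; the actual script (which predates TF 1.0) may well define them with tf.app.flags instead, and the boolean handling shown for `--forward_only` is an assumption:

```python
import argparse

# Hypothetical re-creation of the flags shown in the usage examples.
parser = argparse.ArgumentParser(description="Train or evaluate the NVDM")
parser.add_argument("--dataset", default="ptb",
                    help="name of the dataset to use, e.g. ptb")
parser.add_argument("--forward_only", type=lambda s: s.lower() == "true",
                    default=False,
                    help="if True, skip training and only run inference")

# Parsing the test-mode invocation from the usage example above:
args = parser.parse_args(["--dataset", "ptb", "--forward_only", "True"])
```

Note that plain `type=bool` would treat any non-empty string (including `"False"`) as true, hence the explicit string comparison.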

Results

Training details of NVDM. The best result was achieved with one-shot (joint) updates of the inference and generative networks, rather than alternating updates.
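To make the distinction concrete: a one-shot (joint) update steps both parameter groups on every iteration, while an alternating scheme freezes one group while stepping the other. The toy loop below illustrates the mechanics of the two schedules on a simple quadratic objective; it is not the repo's training code, and the objective and learning rate are made up for illustration:

```python
# Toy comparison of joint vs. alternating gradient updates on a
# separable quadratic with an "encoder" parameter te and a "decoder"
# parameter td (minimum at te=1, td=-2).
def loss(te, td):
    return (te - 1.0) ** 2 + (td + 2.0) ** 2

def grad(te, td):
    return 2 * (te - 1.0), 2 * (td + 2.0)

lr = 0.1

# One-shot (joint) updates: both parameter groups move every step.
te, td = 5.0, 5.0
for _ in range(100):
    ge, gd = grad(te, td)
    te, td = te - lr * ge, td - lr * gd
joint_loss = loss(te, td)

# Alternating updates: freeze one group while stepping the other.
te, td = 5.0, 5.0
for i in range(100):
    ge, gd = grad(te, td)
    if i % 2 == 0:
        te -= lr * ge      # encoder step, decoder frozen
    else:
        td -= lr * gd      # decoder step, encoder frozen
alt_loss = loss(te, td)
```

With the same step budget, the alternating schedule applies only half as many updates to each group, which is one intuition for why joint updates can converge faster in practice.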

(TensorBoard scalar and histogram summaries from training)

Author

Taehoon Kim / @carpedm20