CYAN-RNN

This repository enables the reproduction of the experiments described in the article:

Yongqing Wang, Huawei Shen, Shenghua Liu, Jinhua Gao and Xueqi Cheng. Cascade dynamics modeling with attention-based recurrent neural network. The 26th International Joint Conference on Artificial Intelligence (IJCAI-17). Melbourne, Australia, 2017.

A demo of this project is available on my personal website.


Content

  • Requirements
  • Usage
  • Data
  • Specification

Requirements

  • maven==3.*
  • jdk==1.8
  • (optional) Eclipse

If you use Eclipse, install its Maven plugin (recent Eclipse releases already ship with the Maven plugin).

back to top

Usage

Install the project

git clone git@github.com:Allen517/cyanrnn_project.git
cd cyanrnn_project
mvn clean install

(Optional) Compile and package with Eclipse

  • Choose File > Import and select "Existing Maven Projects"
  • Click "Browse" and choose "cyanrnn_project"
  • Click "Finish"
  • Export a "Runnable JAR file":
    • Right-click the main class and choose "Export"
    • Choose "Runnable JAR file"
    • Click "Finish" to complete the export

Running

Move the runnable JAR (e.g., a JAR file called "cyanrnn.jar") into the cyanrnn_project directory, then run:

java -jar cyanrnn.jar config_cyanrnn_hsoftmax
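
The argument (config_cyanrnn_hsoftmax above) presumably names the configuration file for the run. As rough orientation, a config-driven entry point of this kind often looks like the minimal sketch below; the class name and config keys are hypothetical illustrations, not the project's actual API.

import java.io.FileInputStream;
import java.io.IOException;
import java.util.Properties;

// Hypothetical sketch of a config-driven entry point; the real project
// defines its own configuration format and keys.
public class LauncherSketch {
    public static void main(String[] args) throws IOException {
        if (args.length < 1) {
            System.err.println("Usage: java -jar cyanrnn.jar <config_file>");
            System.exit(1);
        }
        Properties config = new Properties();
        try (FileInputStream in = new FileInputStream(args[0])) {
            config.load(in);
        }
        // The keys below are assumptions for illustration only.
        int hiddenSize = Integer.parseInt(config.getProperty("hidden_size", "32"));
        String trainFile = config.getProperty("train_file", "data/train.txt");
        System.out.printf("hidden_size=%d, train_file=%s%n", hiddenSize, trainFile);
    }
}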

back to top

Data

back to top

Specification

The architecture of the "src" directory:

  • main.java.com.kingwang.netattrnn
    • baselines
      • evals
        • RNNModelEvals.java: implementation of RNN validation in the training process
      • rnn
        • RNN.java: main process of RNN
    • batchderv: when a minibatch is finished, batchderv averages the derivatives across all batches (illustrated in the second sketch after this list)
      • BatchDerivative.java: interface of BatchDerivative
      • impl
        • AttBatchDerivative.java: for the attention layer
        • AttWithCovBatchDerivative.java: for the attention layer with coverage
        • GRUBatchDerivative.java: for GRU (RNN)
        • InputBatchDerivative.java: for the input layer
        • LSTMBatchDerivative.java: for LSTM (RNN)
        • OutputBatchDerivative.java: for the output layer
        • OutputBatchWithHSoftmaxDerivative.java: for the output layer with hierarchical softmax
        • OutputBatchWithOnlyTimeDerivative.java: for the output layer (only calculating the generation of activated times)
        • OutputBatchWithTimeDerivative.java: for the output layer with hierarchical softmax (calculating the generation of activated times and activated users)
    • cells
      • Cell.java: interface of RNN layers (illustrated in the first sketch after this list)
      • Operator.java: basic operators for RNN layers
      • baselines
        • rnn/impl: implementation of RNN
      • impl: implementation of CYAN-RNN and CYAN-RNN(cov)
    • main: main procedure of CYAN-RNN
    • comm/utils: common utilities
    • cons: constants
    • dataset: implementation of dataset loading
    • evals: implementation of CYAN-RNN and CYAN-RNN(cov) validation in the training process
    • utils: common utilities for RNN, CYAN-RNN, and CYAN-RNN(cov)
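
To make the cells package concrete, here is a minimal sketch of what a layer interface in the spirit of Cell.java could look like. The method names and signatures are assumptions for illustration, not the project's actual API.

// Hypothetical sketch of a layer interface in the spirit of Cell.java;
// the project's actual interface may differ.
public interface Cell {
    // Forward pass: map this layer's input vector to its output vector.
    double[] forward(double[] input);

    // Backward pass: given the gradient w.r.t. the output, accumulate
    // parameter gradients and return the gradient w.r.t. the input.
    double[] backward(double[] gradOutput);

    // Apply the accumulated (batch-averaged) gradients to the parameters.
    void update(double learningRate);
}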
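
Similarly, a minimal sketch of the batch-derivative averaging performed by the batchderv classes, under the same caveat that all names here are hypothetical: per-example gradients are accumulated, then divided by the number of examples before the parameter update.

import java.util.Arrays;

// Hypothetical sketch of batch-averaged derivatives in the spirit of
// the batchderv classes: sum per-example gradients, then average.
public class BatchDerivativeSketch {
    private final double[] gradSum;   // running sum of per-example gradients
    private int count;                // number of examples accumulated

    public BatchDerivativeSketch(int numParams) {
        this.gradSum = new double[numParams];
    }

    // Add one example's gradient to the running sum.
    public void accumulate(double[] grad) {
        for (int i = 0; i < grad.length; i++) {
            gradSum[i] += grad[i];
        }
        count++;
    }

    // Average over the batch and reset for the next minibatch.
    public double[] averageAndReset() {
        double[] avg = new double[gradSum.length];
        for (int i = 0; i < gradSum.length; i++) {
            avg[i] = gradSum[i] / Math.max(count, 1);
        }
        Arrays.fill(gradSum, 0.0);
        count = 0;
        return avg;
    }
}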

back to top
