This repository enables the reproduction of the experiments described in the article:
Yongqing Wang, Huawei Shen, Shenghua Liu, Jinhua Gao and Xueqi Cheng. Cascade dynamics modeling with attention-based recurrent neural network. The 26th International Joint Conference on Artificial Intelligence (IJCAI-17). Melbourne, Australia, 2017.
A demo of this project is available on my personal website.
- Install the project
- (Optional) Compile and package with Eclipse
- The architecture of the "src" directory
- (optional) Eclipse
You should install the Maven plugin in Eclipse (recent versions ship with the Maven plugin pre-installed).
Install the project
```
git clone git@github.com:Allen517/cyanrnn_project.git
cd cyanrnn_project
mvn clean install
```
(Optional) Compile and package with Eclipse
- Import the project: choose "Import" and select "Existing Maven Projects"
- Click "Browse" and choose "cyanrnn_project"
- Click "Finish"
- Export a "Runnable JAR file"
  - Right-click on the main procedure
  - Choose "Runnable JAR file"
- Move the runnable JAR (e.g., a jar file named "cyanrnn.jar") into the cyanrnn_project directory
- Run it with a configuration file, e.g.:

```
java -jar cyanrnn.jar config_cyanrnn_hsoftmax
```
- Please check the instructions above
The architecture of the "src" directory
- RNNModelEvals.java: implementation of RNN validation during the training process
- RNN.java: main process of RNN
- batchderv: when a minibatch is finished, averages the derivatives over all batches
  - BatchDerivative.java: interface of batch derivatives
  - AttBatchDerivative.java: for the attention layer
  - AttWithCovBatchDerivative.java: for the attention layer with coverage
  - GRUBatchDerivative.java: for GRU (RNN)
  - InputBatchDerivative.java: for the input layer
  - LSTMBatchDerivative.java: for LSTM (RNN)
  - OutputBatchDerivative.java: for the output layer
  - OutputBatchWithHSoftmaxDerivative.java: for the output layer with hierarchical softmax
  - OutputBatchWithOnlyTimeDerivative.java: for the output layer (only calculating the generation of the activated time)
  - OutputBatchWithTimeDerivative.java: for the output layer with hierarchical softmax (calculating the generation of activated times and activated users)
- Cell.java: interface of RNN layers
- Operator.java: basic operators for RNN layers
- rnn/impl: implementation of RNN
- impl: implementation of CYAN-RNN and CYAN-RNN(cov)
- main: main procedure of CYAN-RNN
- comm/utils: common utilities
- dataset: implementation of dataset loading
- evals: implementation of CYAN-RNN and CYAN-RNN(cov) validation during the training process
- utils: common utilities for RNN, CYAN-RNN, and CYAN-RNN(cov)
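The "batchderv" package is described as averaging derivatives once a minibatch is finished. A minimal sketch of that idea is shown below; the class and method names here are hypothetical illustrations, not the repository's actual API:

```java
import java.util.Arrays;

/**
 * Sketch of batch-derivative averaging: accumulate per-example gradients
 * during a minibatch, then average them once the batch is finished.
 * (Hypothetical names; not the repository's actual classes.)
 */
public class BatchDerivativeSketch {
    private final double[] sum; // running sum of gradients for one parameter vector
    private int count;          // number of examples seen in the current minibatch

    public BatchDerivativeSketch(int dim) {
        this.sum = new double[dim];
    }

    /** Accumulate one example's gradient into the running sum. */
    public void accumulate(double[] grad) {
        for (int i = 0; i < sum.length; i++) {
            sum[i] += grad[i];
        }
        count++;
    }

    /** When the minibatch is finished: return the averaged gradient and reset state. */
    public double[] averageAndReset() {
        double[] avg = new double[sum.length];
        for (int i = 0; i < sum.length; i++) {
            avg[i] = sum[i] / count;
        }
        Arrays.fill(sum, 0.0);
        count = 0;
        return avg;
    }

    public static void main(String[] args) {
        BatchDerivativeSketch d = new BatchDerivativeSketch(2);
        d.accumulate(new double[]{1.0, 2.0});
        d.accumulate(new double[]{3.0, 4.0});
        System.out.println(Arrays.toString(d.averageAndReset())); // [2.0, 3.0]
    }
}
```

Each concrete derivative class in "batchderv" presumably applies this accumulate-then-average pattern to the gradients of its own layer (input, GRU/LSTM, attention, output).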