# HTML: Hierarchical Transformer-based Multi-task Learning for Volatility Prediction

If you find this repository helpful for your research, please cite the following papers:

Linyi Yang, Jiazheng Li, Ruihai Dong, Yue Zhang, and Barry Smyth. NumHTML: Numeric-Oriented Hierarchical Transformer Model for Multi-task Financial Forecasting. In Proceedings of the AAAI Conference on Artificial Intelligence, 2022.

```bibtex
@inproceedings{yang2022numhtml,
  title={NumHTML: Numeric-Oriented Hierarchical Transformer Model for Multi-task Financial Forecasting},
  author={Yang, Linyi and Li, Jiazheng and Dong, Ruihai and Zhang, Yue and Smyth, Barry},
  booktitle={Proceedings of the AAAI Conference on Artificial Intelligence},
  volume={36},
  number={10},
  pages={11604--11612},
  year={2022}
}
```

Linyi Yang, Tin Lok James Ng, Barry Smyth, and Ruihai Dong. HTML: Hierarchical Transformer-based Multi-task Learning for Volatility Prediction. In Proceedings of The Web Conference 2020.

```bibtex
@inproceedings{yang2020html,
  title={HTML: Hierarchical Transformer-based Multi-task Learning for Volatility Prediction},
  author={Yang, Linyi and Ng, Tin Lok James and Smyth, Barry and Dong, Ruihai},
  booktitle={Proceedings of The Web Conference 2020},
  pages={441--451},
  year={2020}
}
```

## Dataset

The token-level transformer relies on a pre-trained transformer model, which can be downloaded from here.
The raw earnings call dataset is available from [Qin and Yang, ACL-19].
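As a rough illustration of how a pre-trained transformer can produce the token-level representations used here, below is a minimal sketch assuming the Hugging Face `transformers` package with `bert-base-uncased` as the backbone; the backbone choice, pooling strategy, and example sentence are illustrative assumptions, not this repository's exact pipeline.

```python
# Minimal sketch: embed one earnings-call sentence with a pre-trained
# transformer. Assumes `pip install torch transformers`; the backbone
# (`bert-base-uncased`) and mean pooling are illustrative choices only.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
encoder = AutoModel.from_pretrained("bert-base-uncased")

sentence = "Revenue grew twelve percent year over year."
inputs = tokenizer(sentence, return_tensors="pt", truncation=True)

with torch.no_grad():
    outputs = encoder(**inputs)

# Mean-pool the token hidden states into a single sentence vector;
# the repository's actual code may pool differently (e.g., [CLS]).
sentence_vec = outputs.last_hidden_state.mean(dim=1)  # shape: (1, 768)
```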

## Model

We provide the code and data used for the paper. Our HTML model consists of a token-level transformer and a sentence-level transformer, both of which can be found in the Model directory. We also provide experimental code for the multi-task and single-task settings; a sketch of the hierarchy follows below.
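The following PyTorch snippet is a minimal sketch of that hierarchy, not the repository's exact implementation: pooled sentence vectors from the token-level transformer are fed to a sentence-level transformer encoder, and one regression head per prediction horizon realizes the multi-task setting. All class names, dimensions, layer counts, and the horizon choices are illustrative assumptions.

```python
# Hedged sketch of the two-level (token -> sentence) architecture with
# multi-task regression heads. Sizes and horizons are assumptions.
import torch
import torch.nn as nn

class SentenceLevelHTML(nn.Module):
    def __init__(self, d_model=768, n_heads=8, n_layers=2,
                 horizons=(3, 7, 15, 30)):
        super().__init__()
        layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        # One regression head per prediction horizon (multi-task setting);
        # keeping a single head would correspond to the single-task setting.
        self.heads = nn.ModuleDict(
            {f"vol_{h}d": nn.Linear(d_model, 1) for h in horizons})

    def forward(self, sentence_vecs):
        # sentence_vecs: (batch, num_sentences, d_model), e.g. pooled
        # outputs of the token-level transformer, one vector per sentence.
        hidden = self.encoder(sentence_vecs)
        doc_vec = hidden.mean(dim=1)  # pool sentences into a call-level vector
        return {name: head(doc_vec).squeeze(-1)
                for name, head in self.heads.items()}

model = SentenceLevelHTML()
calls = torch.randn(4, 20, 768)  # 4 calls, 20 sentence vectors each
preds = model(calls)             # dict of per-horizon volatility predictions
```

In the multi-task setting, the per-horizon losses would be summed (possibly with weights) into a single training objective; the single-task scripts in this repository instead train on one target horizon at a time.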

## Contact

Feel free to email any questions or queries to linyi.yang@insight-centre.org. Thanks for reading.
