(TMM 2018) Cross-Modality Microblog Sentiment Prediction via Bi-Layer Multimodal Hypergraph Learning

Code of Bi-Layer Multimodal Hypergraph Learning

Bi-Layer Multimodal Hypergraph Learning (BiMHGL) is an advanced version of Multimodal Hypergraph Learning (Multi-HGL).

Brief Explanation

  • run_CV_gridsearch.m: the entry point for transductive learning, evaluation, and inference
  • BiHG_learning2.m: the core of bi-layer multimodal hypergraph learning, which can be reused or extended for other tasks. It covers the initialization of hyperedge weights, fixing W, and the alternating optimization of the main variables (f, W, g, and F) across the two layers.
  • preprocess*.m: pre-processing code for the data (the data is sensitive and cannot be released, so please refer to these scripts to pre-process your own data)
  • mPara.mStarExp, mPara.mLamda, mPara.mMu, mPara.mProbSigmaWeight, mPara.Alpha, mPara.mLamda2, and mPara.mMu2 are the main hyper-parameters (please refer to the paper); mPara.mStarExp and mPara.mMu2 are the most important.
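A minimal configuration sketch of the hyper-parameters listed above. The field names come from this README, but the values are placeholders (not the tuned settings from the paper), and the call signature of BiHG_learning2 is an assumption; check run_CV_gridsearch.m for the actual calling convention.

```matlab
% Hypothetical sketch: populate the hyper-parameter struct before learning.
% Field names follow the README; values below are illustrative placeholders.
mPara = struct();
mPara.mStarExp         = 2;     % one of the two most important parameters
mPara.mLamda           = 0.1;
mPara.mMu              = 0.1;
mPara.mProbSigmaWeight = 1.0;
mPara.Alpha            = 0.5;
mPara.mLamda2          = 0.1;   % second-layer counterpart of mLamda
mPara.mMu2             = 0.01;  % the other key parameter; tune this first

% Assumed interface (see BiHG_learning2.m for the real one): alternately
% optimizes f, W, g, and F given the features and partial labels y.
% [f, W, g, F] = BiHG_learning2(features, y, mPara);
```

In practice, run_CV_gridsearch.m sweeps such settings via grid search with cross-validation rather than fixing them by hand.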

Citing BiMHGL

If you find BiMHGL code useful in your research, please consider citing:

```
@article{ji2018cross,
  title={Cross-modality microblog sentiment prediction via bi-layer multimodal hypergraph learning},
  author={Ji, Rongrong and Chen, Fuhai and Cao, Liujuan and Gao, Yue},
  journal={IEEE Transactions on Multimedia},
  year={2018}
}
```