Multilingual Rumor Detection in Social Media Conversations

This repository contains the code for the paper "MUSCAT: Multilingual Rumor Detection in Social Media Conversations" (IEEE BigData 2022). It implements the MUSCAT model and includes scripts for several baselines.

Approach

MUSCAT model architecture (figure).

Data

Our experiments use the PHEME, Twitter16, and SEAR datasets. The data can be downloaded from the following links:
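For reference, the MUSCAT training script in the Scripts section reads data from ./rumor_data/<task>/<language>/<split>/. A minimal sketch of that layout is shown below, assuming numbered split folders; the split count and the file names inside each split folder are assumptions, not specified here.

rumor_data/
└── pheme4cls/       # task name (${i} in the MUSCAT script)
    └── en/          # language split (${LANG})
        ├── 0/       # data split index (${k})
        └── ...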

Scripts

To run TD-RvNN, execute the following script:

OBJ=PHEME   # dataset name
LANG=en     # language split
i=0         # fold index
python torch_model/Main_TD_RvNN.py --obj $OBJ --lang $LANG --fold $i --epochs 300 &
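For example, all cross-validation folds can be launched in parallel as sketched below; the fold count of 5 is an assumption and should be adjusted to the dataset's split configuration.

for i in 0 1 2 3 4; do
    python torch_model/Main_TD_RvNN.py --obj $OBJ --lang $LANG --fold $i --epochs 300 &
done
wait   # block until all fold runs finish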

To run BiGCN, execute the following script:

LANG=en   # language split
python ./Process/getTwittergraph.py PHEME $LANG         # build the conversation graphs
python ./model/Twitter/BiGCN_Twitter.py PHEME 10 $LANG  # train/evaluate BiGCN (the second argument is the run count)
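As a sketch, the same two steps can be repeated for the other datasets used in the paper; the Twitter16 and SEAR identifiers below are assumptions about how the preprocessing scripts name them.

for OBJ in PHEME Twitter16 SEAR; do
    python ./Process/getTwittergraph.py $OBJ $LANG
    python ./model/Twitter/BiGCN_Twitter.py $OBJ 10 $LANG
done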

To run MUSCAT, execute the following script:

MODEL=bert-base-multilingual-cased     # multilingual BERT encoder
EXP_SETTING=coupled-hierarchical-attn  # experiment setting (coupled hierarchical attention)
LANG=en        # language split
i=pheme4cls    # task / dataset name
k=0            # data split index (assumed value; set to the split being run)
OUT_DIR=1      # output directory version tag (assumed value)

python run_rumor_opt.py --data_dir ./rumor_data/${i}/${LANG}/${k}/ --train_batch_size 16 \
--task_name ${i} --output_dir ./output_v${OUT_DIR}/${i}_rumor_output_${k}/ --bert_model $MODEL \
--do_train --do_eval --learning_rate 5e-5 --max_tweet_num 17 --max_tweet_length 30 \
--exp_setting $EXP_SETTING  # optionally add --use_longformer
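A minimal sketch of sweeping MUSCAT over multiple data splits is shown below, assuming the splits under ./rumor_data/${i}/${LANG}/ are numbered 0-4 (the number of splits is an assumption).

for k in 0 1 2 3 4; do
    python run_rumor_opt.py --data_dir ./rumor_data/${i}/${LANG}/${k}/ --train_batch_size 16 \
    --task_name ${i} --output_dir ./output_v${OUT_DIR}/${i}_rumor_output_${k}/ --bert_model $MODEL \
    --do_train --do_eval --learning_rate 5e-5 --max_tweet_num 17 --max_tweet_length 30 \
    --exp_setting $EXP_SETTING
done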

To cite

@inproceedings{awal2022muscat,
  title={MUSCAT: Multilingual Rumor Detection in Social Media Conversations},
  author={Awal, Md Rabiul and Nguyen, Minh Dang and Lee, Roy Ka-Wei and Choo, Kenny Tsu Wei},
  booktitle={2022 IEEE International Conference on Big Data (Big Data)},
  pages={455--464},
  year={2022},
  organization={IEEE}
}
