
Can this be applied to a Chinese corpus? #14

Closed
WenTingTseng opened this issue Apr 19, 2020 · 2 comments

Comments

@WenTingTseng

WenTingTseng commented Apr 19, 2020

Hello,
I am currently working on an experiment that predicts the relation between two given entities. May I ask whether your code can also be applied to a Chinese corpus? Also, is there Chinese training data similar to FB15K?

Separately, which conference or journal was this paper submitted to?
Thank you

@yao8839836
Owner

yao8839836 commented Apr 19, 2020

@WenTingTseng

Hello,

When invoking the code, just switch the pre-trained BERT model from the English bert-base-uncased or bert-base-cased to the Chinese bert-base-chinese. That is, in the command

python3 run_bert_relation_prediction.py \
  --task_name kg \
  --do_train \
  --do_eval \
  --do_predict \
  --data_dir ./data/XXX \
  --bert_model bert-base-cased \
  --max_seq_length 25 \
  --train_batch_size 32 \
  --learning_rate 5e-5 \
  --num_train_epochs 20.0 \
  --output_dir ./output_FB15K/ \
  --gradient_accumulation_steps 1 \
  --eval_batch_size 512

pass:

--bert_model bert-base-chinese instead of --bert_model bert-base-cased
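A minimal sketch of the adapted invocation, keeping all other flags from the command above (the data directory ./data/XXX stays as the original placeholder; ./output_chinese/ is a hypothetical output directory chosen here for illustration):

```shell
# Same command as above, but with the Chinese pre-trained BERT model.
python3 run_bert_relation_prediction.py \
  --task_name kg \
  --do_train \
  --do_eval \
  --do_predict \
  --data_dir ./data/XXX \
  --bert_model bert-base-chinese \
  --max_seq_length 25 \
  --train_batch_size 32 \
  --learning_rate 5e-5 \
  --num_train_epochs 20.0 \
  --output_dir ./output_chinese/ \
  --gradient_accumulation_steps 1 \
  --eval_batch_size 512
```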

For Chinese knowledge graph data, see: https://github.com/ownthink/KnowledgeGraphData

The paper was previously submitted to a conference but was not accepted; we are now preparing to resubmit it to an IEEE Transactions journal, possibly PAMI or TKDE.

@WenTingTseng
Author

OK, understood.

Thank you
