Hello, I'm currently running an experiment in which, given two entities, I want to predict the relation between them. Could your code also be applied to a Chinese corpus? And is there Chinese training data similar to FB15K?
Also, which conference or journal was this paper submitted to? Thanks.
@WenTingTseng
Hello,
When running the code, just replace the pretrained English BERT model (bert-base-uncased or bert-base-cased) with the Chinese one, bert-base-chinese. That is, in the command
python3 run_bert_relation_prediction.py --task_name kg --do_train --do_eval --do_predict --data_dir ./data/XXX --bert_model bert-base-cased --max_seq_length 25 --train_batch_size 32 --learning_rate 5e-5 --num_train_epochs 20.0 --output_dir ./output_FB15K/ --gradient_accumulation_steps 1 --eval_batch_size 512
use:
--bert_model bert-base-chinese instead of --bert_model bert-base-cased
For a Chinese knowledge graph, see: https://github.com/ownthink/KnowledgeGraphData
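For clarity, the full command with the model swapped might look like the sketch below. All flags other than --bert_model are copied from the English example above; the data directory placeholder (./data/XXX) and the output directory name are kept as-is and would need to be adapted to your Chinese dataset.

```shell
# Same invocation as the English example, with the pretrained
# model switched to the Chinese BERT checkpoint.
python3 run_bert_relation_prediction.py \
  --task_name kg \
  --do_train --do_eval --do_predict \
  --data_dir ./data/XXX \
  --bert_model bert-base-chinese \
  --max_seq_length 25 \
  --train_batch_size 32 \
  --learning_rate 5e-5 \
  --num_train_epochs 20.0 \
  --output_dir ./output_FB15K/ \
  --gradient_accumulation_steps 1 \
  --eval_batch_size 512
```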
The paper was previously submitted to a conference but was not accepted; we are now preparing to submit it to an IEEE Transactions journal, possibly PAMI or TKDE.
Got it, understood.
Thank you!