
What is the purpose of the fine-tuning samples, and how are they used? #44

Open
moyaojunshi opened this issue Jul 13, 2021 · 1 comment

Comments

@moyaojunshi
```
$ cd bert_modified
$ python create_data.py -f /path/to/training/data/file
$ python create_tf_record.py --input_file correct.txt --wrong_input_file wrong.txt --output_file tf_examples.tfrecord --vocab_file ../model/pre-trained/vocab.txt
```
After the steps above, I obtained the fine-tuning samples in tf_examples.tfrecord. I would like to know what these fine-tuning samples are for and how to use them.

@VinMing
VinMing commented Nov 12, 2021

This is explained in the README:

Then, all you need to do is to continue training the pretrained model following the same commands for pretraining described in GitHub repo for BERT except using the pretrained model as the initial checkpoint.
Put the checkpoint files of the fine-tuned model under model/fine-tuned/. Then, set "fine-tuned" in faspell_configs.json to the path of the fine-tuned model.

See the "Pre-training with BERT" section of Google's BERT README, adjust the parameters to your needs, and you will get model.ckpt and the related checkpoint files. Just put them under model/fine-tuned.
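The continued-pretraining step described above can be sketched as follows. This is a minimal example, not the project's exact command: the flags follow run_pretraining.py from google-research/bert, and all paths and hyperparameter values are placeholders to adjust to your own setup.

```shell
# Continue pretraining from the released BERT checkpoint, feeding it the
# fine-tuning samples produced by create_tf_record.py above.
# Paths and hyperparameters below are placeholders, not the project's defaults.
python run_pretraining.py \
  --input_file=tf_examples.tfrecord \
  --output_dir=model/fine-tuned \
  --do_train=True \
  --do_eval=True \
  --bert_config_file=model/pre-trained/bert_config.json \
  --init_checkpoint=model/pre-trained/bert_model.ckpt \
  --train_batch_size=32 \
  --max_seq_length=128 \
  --max_predictions_per_seq=20 \
  --num_train_steps=10000 \
  --learning_rate=2e-5
```

The key difference from ordinary BERT pretraining is --init_checkpoint: it starts from the released pre-trained model rather than from scratch, so the run fine-tunes BERT on the spelling-error samples. The resulting model.ckpt-* files in --output_dir are what go under model/fine-tuned/.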
