
Chinese-BERT-wwm fine-tuning #18

Closed
nietao2 opened this issue Jul 3, 2019 · 3 comments

Comments

nietao2 commented Jul 3, 2019

When fine-tuning with Chinese-BERT-wwm, do I need to use LTP for word segmentation?

ymcui (Owner) commented Jul 3, 2019

No, it is not needed. Just like regular BERT, a char-level tokenizer is used.
Whole Word Masking only affects how tokens are masked during pre-training; it does not change the input format for downstream tasks.
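To illustrate, here is a minimal sketch of how the tokenization looks for downstream use, assuming the checkpoint is loaded through Hugging Face transformers (the model id `hfl/chinese-bert-wwm` is used here for illustration):

```python
from transformers import BertTokenizer

# Load the standard BERT tokenizer shipped with the checkpoint
# (no LTP or any external word segmenter is involved).
tokenizer = BertTokenizer.from_pretrained("hfl/chinese-bert-wwm")

# Chinese text is split into individual characters, exactly as with
# the original Google BERT Chinese tokenizer.
print(tokenizer.tokenize("使用全词掩码预训练"))
# ['使', '用', '全', '词', '掩', '码', '预', '训', '练']
```

Word-level segmentation is only consulted during pre-training to decide which whole words to mask together; the fine-tuning input pipeline is unchanged.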

nietao2 (Author) commented Jul 3, 2019

Thank you!

nietao2 closed this as completed Jul 3, 2019
guotong1988 commented

Why is word segmentation not needed when fine-tuning?
