When fine-tuning with Chinese-BERT-wwm, is word segmentation with LTP required?
No, it is not. It works the same way as standard BERT, using a character-level tokenizer. Whole Word Masking only affects how tokens are masked during pre-training; it does not change the input format for downstream tasks.
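For reference, here is a minimal sketch of what this looks like with the Hugging Face transformers library (the checkpoint name hfl/chinese-bert-wwm is an assumption; substitute whatever checkpoint or local path you actually use). Raw, unsegmented text goes straight into the tokenizer, which splits it into individual characters, so no LTP step is involved at fine-tuning time:

```python
# Sketch: tokenizing raw Chinese text for fine-tuning.
# Assumes the Hugging Face transformers library and the
# "hfl/chinese-bert-wwm" checkpoint (an assumption, not prescribed
# by this thread); any Chinese BERT checkpoint behaves the same way.
from transformers import BertTokenizer

tokenizer = BertTokenizer.from_pretrained("hfl/chinese-bert-wwm")

# No pre-segmentation: the tokenizer splits the sentence into
# single characters, exactly as with vanilla Chinese BERT.
tokens = tokenizer.tokenize("使用语言模型来预测下一个词的概率。")
print(tokens)
# ['使', '用', '语', '言', '模', '型', '来', '预', '测',
#  '下', '一', '个', '词', '的', '概', '率', '。']
```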
Thanks!
Why is word segmentation unnecessary when fine-tuning?