
Swap MeCab tokenizer with SentencePiece: possible? #14

Closed
sachaarbonel opened this issue Jul 18, 2020 · 2 comments

Comments

@sachaarbonel

Hi @cl-tohoku, I wanted to get my hands dirty with your model to fine-tune a POS model. When visiting your model card, I wanted to try the model out using the recently released Hosted Inference API from Hugging Face, but I got this error: ⚠️ This model could not be loaded by the inference API. ⚠️ Error loading tokenizer: No module named 'MeCab' ModuleNotFoundError("No module named 'MeCab'"). Correct me if I'm wrong, but wouldn't it be possible to swap out the MeCab-based tokenizer for SentencePiece using these pretrained weights?

@singletongue
Collaborator

Thank you for the information regarding the Hosted inference API.
I'm sorry, but it may take some time to address the issue.

Since the tokenization used in the distributed models is not compatible with that of SentencePiece, simply swapping the tokenizer would not work well.
The pretrained weights you mentioned could be utilized, but the BERT model would also need to be fine-tuned to the new tokenization.

A possible solution would be to train a model with a new tokenizer from scratch.
We may release such models in the future, but we cannot guarantee the release.
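To illustrate the incompatibility @singletongue describes, here is a minimal, self-contained sketch (the vocabularies and token IDs below are toy assumptions, not the actual model's): a MeCab + WordPiece pipeline and a SentencePiece model trained on the same text generally segment a string differently and assign different IDs, so the rows of the pretrained embedding matrix no longer correspond to the pieces a new tokenizer produces.

```python
# Toy vocabularies (hypothetical IDs) illustrating why swapping tokenizers
# invalidates pretrained weights: the same surface string maps to
# different pieces and different IDs under each scheme.

# Vocabulary as a MeCab + WordPiece pipeline might learn it
mecab_wordpiece_vocab = {"[UNK]": 0, "東京": 1, "大学": 2}

# Vocabulary as a SentencePiece model might learn it on the same corpus
sentencepiece_vocab = {"<unk>": 0, "▁東京大学": 1, "東": 2, "京大": 3}

def encode(tokens, vocab, unk):
    """Map each token to its vocabulary ID, falling back to the unknown token."""
    return [vocab.get(t, vocab[unk]) for t in tokens]

# MeCab segments "東京大学" into two morphemes; SentencePiece keeps one piece.
ids_a = encode(["東京", "大学"], mecab_wordpiece_vocab, "[UNK]")
ids_b = encode(["▁東京大学"], sentencepiece_vocab, "<unk>")

print(ids_a)  # [1, 2]
print(ids_b)  # [1]
# Row 1 of the pretrained embedding matrix means "東京" under the first
# scheme but "▁東京大学" under the second, so the learned weights no
# longer line up with the new tokenizer's pieces without retraining.
```

This is why fine-tuning alone is not enough: the embedding table is keyed to the original vocabulary, and a new tokenizer effectively shuffles those keys.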

@sachaarbonel
Author

Thanks for the clarification; I wasn't sure whether we needed to retrain the model with the new tokenizer. In the meantime, I'll conduct my fine-tuning experiments with the ALBERT Japanese model.
