Using finetuning scripts with BERT #47

Closed
dungtn opened this issue Feb 10, 2020 · 1 comment

dungtn commented Feb 10, 2020

Hi 👋,

Thank you for the great work!

I'm trying to replicate the BERT baseline for the downstream tasks. Is it possible to load a pre-trained BERT model instead of the pre-trained ERNIE model in the fine-tuning code? If not, can you provide some pointers to the code you used for the baseline?

I pointed --ernie-model to the pre-trained BERT model, but I got this error:

Traceback (most recent call last):
  File "code/run_typing.py", line 573, in <module>
    main()
  File "code/run_typing.py", line 511, in main
    train_examples, label_list, args.max_seq_length, tokenizer_label, tokenizer, args.threshold)
  File "code/run_typing.py", line 168, in convert_examples_to_features
    tokens_a, entities_a = tokenizer_label.tokenize(ex_text_a, [h])
AttributeError: 'NoneType' object has no attribute 'tokenize'

Please let me know if I'm missing something here.

Thank you!
June

zzy14 commented Apr 21, 2020

Yes, you can change the config file to load the pre-trained BERT model. Specifically, you need to set all layer types to "sim".
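For reference, here is a minimal sketch (not from the repo) of what that config edit could look like, assuming the checkpoint directory has a JSON config with a list of per-layer types; the key name "layer_types" and the file path are assumptions, so check the actual config file shipped with the checkpoint:

import json

config_path = "ernie_base/ernie_config.json"  # hypothetical path; adjust to your checkpoint

with open(config_path) as f:
    config = json.load(f)

# Force every layer to the plain "sim" type so no knowledge-fusion layers
# (which would expect ERNIE's entity inputs) remain in the model.
# NOTE: "layer_types" is an assumed key name; verify it against your config.
config["layer_types"] = ["sim"] * len(config["layer_types"])

with open(config_path, "w") as f:
    json.dump(config, f, indent=2)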

zzy14 closed this as completed Jun 14, 2020