Is there a way to fine-tune SPECTER directly instead of training from SciBERT? It seems that the format of SPECTER's model weights is different from SciBERT's.
How do I fine-tune SPECTER on classification tasks, the way I would SciBERT?
The model that is on Hugging Face should be easily fine-tunable, just like SciBERT.
You can follow the instructions at https://huggingface.co/docs/transformers/training, but use allenai/specter as the pre-trained model name instead of bert-base-uncased.
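A minimal sketch of what that looks like, assuming a binary sequence-classification task; the toy texts, label count, and training arguments are placeholders, not something prescribed by SPECTER:

```python
from transformers import (
    AutoTokenizer,
    AutoModelForSequenceClassification,
    Trainer,
    TrainingArguments,
)
from datasets import Dataset

# Load SPECTER exactly as you would SciBERT/BERT -- only the
# pre-trained model name changes.
model_name = "allenai/specter"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

# Toy dataset; replace with your own texts and labels.
data = Dataset.from_dict({
    "text": [
        "An example paper title. An example abstract.",
        "Another title. Another abstract.",
    ],
    "label": [0, 1],
})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, padding="max_length", max_length=512)

data = data.map(tokenize, batched=True)

args = TrainingArguments(
    output_dir="specter-finetuned",
    num_train_epochs=3,
    per_device_train_batch_size=8,
)

Trainer(model=model, args=args, train_dataset=data).train()
```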
What does a custom training dataset have to look like? I understand from the repo that metadata.json contains the title, abstract, and id of each paper,
but I don't understand what data.json does. Does it hold the positive and negative example papers for each paper?
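For context, this is roughly what I understand a metadata.json entry to contain; the exact field names here are my guess from the repo description, not something I've verified:

```python
# My guess at the shape of a metadata.json entry (title + abstract + id
# per paper, keyed by paper id); field names may differ in the repo.
metadata = {
    "paper_001": {
        "paper_id": "paper_001",
        "title": "An example paper title",
        "abstract": "An example abstract.",
    },
}
```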