
Fine-tuning OAG-BERT #5

Closed
yuzhimanhua opened this issue Aug 31, 2022 · 2 comments

Comments

@yuzhimanhua

Hello,

Thank you for your great work and for releasing the model!

May I ask whether the CogDL package supports fine-tuning OAG-BERT (e.g., adding a layer on top of OAG-BERT to perform [CLS] classification or sequence labeling)? If we implement this additional layer in PyTorch, will the backward pass also update the parameters of OAG-BERT? Thanks!

@Xiao9905 (Member) commented Sep 1, 2022

@yuzhimanhua Hi,

Yes, OAG-BERT is implemented in plain PyTorch, so you can treat it the same way you would any other language model from Hugging Face Transformers.
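For anyone landing here later, a minimal sketch of the [CLS]-classification setup, assuming the backbone returned by CogDL behaves like a Hugging Face-style BERT whose first output is the per-token hidden states, with a 768-dimensional hidden size. The `oagbert()` loader import, its argument, and the forward signature below are assumptions based on the CogDL README and should be checked against your installed version:

```python
import torch
import torch.nn as nn
from cogdl.oag import oagbert  # assumed import path per CogDL README; verify for your version

tokenizer, backbone = oagbert("oagbert-v2")  # assumed loader call


class OagBertClassifier(nn.Module):
    """Hypothetical wrapper: a linear head on top of the [CLS] representation."""

    def __init__(self, backbone, hidden_size=768, num_labels=2):
        super().__init__()
        self.backbone = backbone
        self.head = nn.Linear(hidden_size, num_labels)

    def forward(self, input_ids, attention_mask):
        # Assumes the backbone returns per-token hidden states as its first output.
        hidden_states = self.backbone(
            input_ids=input_ids, attention_mask=attention_mask
        )[0]
        cls_repr = hidden_states[:, 0]  # representation of the [CLS] token
        return self.head(cls_repr)      # classification logits


model = OagBertClassifier(backbone)

# Passing model.parameters() to the optimizer covers both the new head and the
# OAG-BERT backbone, so loss.backward() + optimizer.step() update both.
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
criterion = nn.CrossEntropyLoss()

# Dummy batch for illustration; real inputs would come from the tokenizer.
input_ids = torch.randint(0, 1000, (4, 32))
attention_mask = torch.ones(4, 32, dtype=torch.long)
labels = torch.randint(0, 2, (4,))

logits = model(input_ids, attention_mask)
loss = criterion(logits, labels)
loss.backward()
optimizer.step()
```

To freeze the backbone and train only the new head, pass only `model.head.parameters()` to the optimizer (or set `requires_grad = False` on the backbone's parameters).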

Xiao9905 closed this as completed Sep 1, 2022
@yuzhimanhua (Author)

Got it. Thank you for the prompt response!
