
I want to know if I can fine-tune directly on the GPT-GNN model? #15

Closed
yangxia605 opened this issue Sep 27, 2020 · 3 comments

Comments

@yangxia605

Hi @acbull,
[image attachment]

Does that mean I must take some categories for pre-training and then use other categories for fine-tuning?
I wonder if I can fine-tune the GPT-GNN model directly, like fine-tuning a pre-trained BERT: just modify the data to match the input format and then fine-tune for the downstream task.

Thanks.

@acbull (Owner) commented Sep 27, 2020

Of course, you can directly pre-train and fine-tune on the same graph. The reason I adopted that setup in the experiments is to make the task harder and more realistic (in a real setting, we sometimes cannot see the fine-tuning data during pre-training).

@acbull (Owner) commented Sep 27, 2020

Since you mentioned BERT: the pre-training corpus and the fine-tuning corpus for BERT usually come from different domains, yet BERT still shows really impressive generalization ability. That is why we evaluate our model in three different transfer settings. But of course, you can also fine-tune directly on the same graph.
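
For what it's worth, the "pre-train and fine-tune on the same graph" workflow can be sketched roughly as below. This is a minimal, hypothetical PyTorch illustration; the encoder class, file names, and task head are placeholders for this sketch, not the repo's actual HGT classes or training scripts.

```python
import torch
import torch.nn as nn

class GNNEncoder(nn.Module):
    """Stand-in for the GNN encoder (GPT-GNN uses an HGT in the paper)."""
    def __init__(self, in_dim, hid_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, hid_dim)  # placeholder for message-passing layers

    def forward(self, node_feats):
        return torch.relu(self.lin(node_feats))

# Stage 1: self-supervised generative pre-training on the full graph
encoder = GNNEncoder(in_dim=128, hid_dim=256)
# ... run the attribute/edge generation pre-training loop here ...
torch.save(encoder.state_dict(), "pretrained_gnn.pt")

# Stage 2: fine-tune on the *same* graph for a downstream task
encoder.load_state_dict(torch.load("pretrained_gnn.pt"))
num_classes = 10  # hypothetical number of downstream labels
classifier = nn.Linear(256, num_classes)
optimizer = torch.optim.AdamW(
    list(encoder.parameters()) + list(classifier.parameters()), lr=1e-4
)
# ... standard supervised loop: loss = cross_entropy(classifier(encoder(x)), y) ...
```

The key point of the sketch is simply that the encoder weights saved after pre-training are reloaded and updated together with the new task head, with both stages reading the same graph.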

@yangxia605 (Author)

I am very grateful for your prompt and patient answer.
