Bug for BertPrompt series code? #15
Hi, I notice that the BertPrompt model does not use the [CLS] token with its linear classification head. I'll try to explain with the following code on toy inputs: say input_ids has shape [8, 32] and pre_seq_len is 3; then inputs_embeds will have shape [8, 35, 768]. I'll comment the shapes of the key variables in the code and then state my concern.
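A minimal sketch of the forward pass being described, written from the shapes above (the class name and wiring below are assumptions for illustration, not the repo's exact code):

```python
import torch
import torch.nn as nn
from transformers import BertModel

class BertPromptSketch(nn.Module):
    """Illustrative soft-prompt model; not the repo's actual BertPromptForSequenceClassification."""

    def __init__(self, pre_seq_len=3, hidden_size=768, num_labels=2):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-uncased")
        self.pre_seq_len = pre_seq_len
        self.prompt_embeddings = nn.Embedding(pre_seq_len, hidden_size)
        self.classifier = nn.Linear(hidden_size, num_labels)

    def forward(self, input_ids, attention_mask):
        # input_ids: [8, 32], attention_mask: [8, 32]
        batch_size = input_ids.size(0)
        raw_embeds = self.bert.embeddings.word_embeddings(input_ids)      # [8, 32, 768]
        prompt_ids = torch.arange(self.pre_seq_len, device=input_ids.device)
        prompts = self.prompt_embeddings(prompt_ids)                      # [3, 768]
        prompts = prompts.unsqueeze(0).expand(batch_size, -1, -1)         # [8, 3, 768]
        # Prompts are prepended, so [CLS] is pushed from position 0 to position 3.
        inputs_embeds = torch.cat([prompts, raw_embeds], dim=1)           # [8, 35, 768]
        prompt_mask = torch.ones(
            batch_size, self.pre_seq_len,
            dtype=attention_mask.dtype, device=attention_mask.device,
        )
        attention_mask = torch.cat([prompt_mask, attention_mask], dim=1)  # [8, 35]
        outputs = self.bert(inputs_embeds=inputs_embeds, attention_mask=attention_mask)
        # pooler_output is built from position 0 of the final hidden states,
        # which is now the first *prompt* token, not [CLS].
        pooled_output = outputs.pooler_output                             # [8, 768]
        return self.classifier(pooled_output)                             # [8, 2]
```

With pre_seq_len = 3, the pooled representation fed to the classifier comes from a prompt position rather than [CLS], which is the behavior in question.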
I wonder, is P-tuning v2 compared with soft prompt tuning (as the baseline)? But the token used in the classification head for the latter is not [CLS]. Is that expected?

Comments

Hi @tangzhy, are you asking about the code for P-tuning v2 or the baseline v1?

Hi @Xiao9905, I'm asking about the code for P-tuning v2. I came across the code of …

@Xiao9905 Yes, I think the trick used for …

@tangzhy Hi, thank you for reporting this. We have fixed the problem in our released code.
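For reference, one plausible shape of such a fix (a hedged sketch under my own assumptions, not confirmed against the repo's actual change) is to slice off the prompt positions before pooling, so the pooled vector comes from the real [CLS] token:

```python
import torch

def pool_after_prompts(last_hidden_state, pre_seq_len, pooler_dense):
    """Drop the prepended prompt positions so pooling uses the real [CLS] token.

    Hypothetical helper, assuming prompts were concatenated in front of the
    input embeddings as in the sketch above.
    """
    # last_hidden_state: [batch, pre_seq_len + seq_len, hidden], e.g. [8, 35, 768]
    sequence_output = last_hidden_state[:, pre_seq_len:, :]  # [8, 32, 768]; [CLS] back at position 0
    first_token = sequence_output[:, 0]                      # [8, 768], the real [CLS]
    return torch.tanh(pooler_dense(first_token))             # BERT-style pooler: dense + tanh
```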