
How to switch to P-Tuning v2 #630

Closed
jiahuanluo opened this issue Jun 26, 2023 · 2 comments
Labels: solved

Comments

@jiahuanluo

We can find P-Tuning v2 mentioned in `peft/README.md` (line 29 at commit 8af8dbd):

> 2. Prefix Tuning: [Prefix-Tuning: Optimizing Continuous Prompts for Generation](https://aclanthology.org/2021.acl-long.353/), [P-Tuning v2: Prompt Tuning Can Be Comparable to Fine-tuning Universally Across Scales and Tasks](https://arxiv.org/pdf/2110.07602.pdf)

But how can I switch to P-Tuning v2?

@pacman100 (Contributor)

Hello, those are implemented together. P-Tuning v2 introduced an optional reparameterization of the prompt tokens, which you can control via the `prefix_projection` option of `PrefixTuningConfig`. Its other contribution was the ability to work without verbalizers, using a linear classification head for NLU tasks, whereas the Prefix-Tuning paper focused on NLG and did not cover this.

So, both are supported via the same `PrefixEncoder` PEFT method.
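
Below is a minimal sketch (not from the thread) of how one might configure this in PEFT: the `prefix_projection` flag on `PrefixTuningConfig` toggles the optional MLP reparameterization of the prompt tokens mentioned above. The model checkpoint and hyperparameter values are illustrative assumptions.

```python
# Hedged sketch: prefix tuning / P-Tuning v2 for an NLU task with PEFT.
# The model checkpoint and hyperparameters below are illustrative, not from the thread.
from transformers import AutoModelForSequenceClassification
from peft import PrefixTuningConfig, TaskType, get_peft_model

base_model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)

peft_config = PrefixTuningConfig(
    task_type=TaskType.SEQ_CLS,   # NLU task with a linear classification head (no verbalizer)
    num_virtual_tokens=20,        # number of prefix (prompt) tokens
    prefix_projection=False,      # False: use the prompt embeddings directly;
                                  # True: reparameterize them through an MLP (prefix projection)
)

model = get_peft_model(base_model, peft_config)
model.print_trainable_parameters()  # only the prefix parameters (and the head) are trainable
```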

@pacman100 added the solved label on Jun 27, 2023
@github-actions (bot)

This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread.

@github-actions bot closed this as completed on Aug 4, 2023