
to_out.0 in lora doesn't work #4

Closed
gaoren002 opened this issue Jun 5, 2024 · 1 comment

Comments

gaoren002 commented Jun 5, 2024

When the LoRA model is passed to peft (0.10.0), peft doesn't handle the `to_out.0` layer, so we can't train it. In detail: `to_out` is a Sequential, and the code in peft tries to set attribute `0` on that Sequential. But this only attaches a new attribute to the Sequential object; the real "0" layer lives in the Sequential's `layers` attribute and is never replaced by the LoRA module. The snippet below demonstrates the mismatch.
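
A minimal sketch of the failure mode, assuming Jittor's `Sequential` keeps its children in a `layers` dict keyed by integer index (as described above); `lora_layer` here is a plain `Linear` standing in as a hypothetical placeholder for peft's LoRA-wrapped module:

```python
import jittor.nn as nn

to_out = nn.Sequential(nn.Linear(320, 320), nn.Dropout(0.0))

# What peft effectively does when it replaces the child named "0":
lora_layer = nn.Linear(320, 320)  # hypothetical stand-in for the LoRA-wrapped layer
setattr(to_out, "0", lora_layer)

# On a Jittor Sequential this only attaches a new attribute; forward() still
# runs the original module stored in to_out.layers.
print(getattr(to_out, "0") is lora_layer)  # True  -- a new attribute was set
print(to_out.layers[0] is lora_layer)      # False -- the real layer was not replaced
```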

To solve this problem, I can only modify peft so that it adapts to the model passed by JDiffusion; a sketch of that workaround follows.
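
A hypothetical sketch of the peft-side workaround (`replace_child` is an illustrative helper, not peft's actual API): when the parent module is a Jittor `Sequential`, swap the entry in its `layers` dict instead of relying on `setattr`:

```python
def replace_child(parent, child_name, new_module):
    # Illustrative helper: what peft's module replacement would need to do
    # for a Jittor Sequential, whose children live in a `layers` dict.
    layers = getattr(parent, "layers", None)
    if isinstance(layers, dict):
        # Jittor Sequential keys may be integers, so map "0" -> 0.
        key = int(child_name) if child_name.isdigit() else child_name
        if key in layers:
            layers[key] = new_module
            return
    # Fall back to normal attribute assignment (sufficient for torch modules).
    setattr(parent, child_name, new_module)
```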

gaoren002 (Author) commented

The bug has been fixed in the Jittor repo.
