Can llama load the parameters of multiple LoRA models? #46

Open
ziwang-com opened this issue May 21, 2023 · 0 comments
Comments

@ziwang-com (Owner)

Facico/Chinese-Vicuna#19

This is not supported yet. Simply adding the weights of multiple LoRA models together does not give good results.
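For reference, a minimal PyTorch sketch of what "simply adding the weights" means: averaging the low-rank updates (B @ A) of independently trained adapters and applying the sum onto the frozen base weights. The function and data layout below are illustrative assumptions, not an existing API.

```python
import torch

def merge_lora_deltas(adapters, weights):
    """Naively combine the delta weights (B @ A) of several LoRA adapters.

    adapters: list of dicts mapping layer name -> (A, B) low-rank factors,
              where A has shape (r, in_features) and B (out_features, r).
    weights:  per-adapter mixing coefficients, e.g. [0.5, 0.5].
    This is the "simple stacking" warned about above: the adapters were
    trained independently, so their summed updates are not guaranteed to
    interact well on the shared base model.
    """
    merged = {}
    for adapter, w in zip(adapters, weights):
        for name, (A, B) in adapter.items():
            delta = w * (B @ A)                      # low-rank update for this layer
            merged[name] = merged.get(name, 0) + delta
    return merged

# Usage (illustrative): add each merged delta onto the frozen base weight.
# for name, delta in merge_lora_deltas([lora1, lora2], [0.5, 0.5]).items():
#     base_state[name] += delta
```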

That said, LoRA adapters should be combinable in an MoE-like fashion, merging several LoRA models into one. This is a promising architecture, and presumably many researchers are already working on it. The idea is essentially the same as AdapterFusion (AdapterFusion: Non-Destructive Task Composition for Transfer Learning); the principle is simple, and the Stable Diffusion community does a lot of this kind of thing.

To implement it, the following ideas could serve as a starting point (a sketch of the soft variant follows the list):
1. Hard MoE: for each sentence, dynamically select which LoRA weights to use.
2. Soft MoE: for each sentence, compute an attention weight over the different LoRAs and fuse them.
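A minimal sketch of the soft-MoE / AdapterFusion-style idea (2), assuming the standard LoRA factorization delta W = B @ A over a single linear layer; the class, its gating network, and all shapes are illustrative assumptions, not part of any existing library.

```python
import torch
import torch.nn as nn

class SoftLoRAFusion(nn.Module):
    """Attention-weighted fusion of several LoRA adapters on one linear layer.

    Each adapter i contributes a low-rank update B_i @ A_i; a small gating
    network scores the adapters from the layer input and mixes their outputs,
    in the spirit of AdapterFusion / soft MoE.
    """

    def __init__(self, base: nn.Linear, loras, scale=1.0):
        super().__init__()
        self.base = base                                     # frozen pretrained layer
        self.As = nn.ParameterList([nn.Parameter(a) for a, _ in loras])   # (r, in)
        self.Bs = nn.ParameterList([nn.Parameter(b) for _, b in loras])   # (out, r)
        self.gate = nn.Linear(base.in_features, len(loras))  # gating network
        self.scale = scale

    def forward(self, x):
        out = self.base(x)                                   # frozen path
        # One score per adapter per token, normalized with softmax.
        alpha = torch.softmax(self.gate(x), dim=-1)          # (..., n_loras)
        for i, (A, B) in enumerate(zip(self.As, self.Bs)):
            delta = (x @ A.T) @ B.T * self.scale             # this adapter's LoRA update
            out = out + alpha[..., i:i + 1] * delta          # weighted fusion
        return out

# Example (illustrative): fuse two rank-8 adapters over a 4096 -> 4096 projection.
# layer = SoftLoRAFusion(nn.Linear(4096, 4096),
#                        [(torch.zeros(8, 4096), torch.zeros(4096, 8)),
#                         (torch.zeros(8, 4096), torch.zeros(4096, 8))])
```

The hard-MoE variant (1) would replace the softmax mixing with an argmax over the gate scores, so only one adapter's update is applied per sentence.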

These are all interesting ideas, but we do not support them yet.
