
llora/convert.py not working #63

Closed
vierzwei opened this issue Dec 10, 2023 · 2 comments
@vierzwei

I could not get the file llora/convert.py to work.
There is a typo with `n_heads`. It should be:

```python
if "n_kv_heads" not in config:
    config["n_kv_heads"] = config["n_heads"]
if "head_dim" not in config:
    config["head_dim"] = config["dim"] // config["n_heads"]
```

instead of `config["n_kv_heads"] = n_heads;` etc.

But even after fixing this, the resulting file did not work. The convert file in /llama/ did work.
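For context, the fix described above can be sketched as a small helper that fills in the missing config keys. This is a minimal illustration, assuming `config` is a plain dict loaded from the checkpoint's params file; the function name `fill_config_defaults` is hypothetical and not from the repository:

```python
def fill_config_defaults(config):
    # Checkpoints without grouped-query attention omit n_kv_heads;
    # default it to n_heads (the standard multi-head case).
    if "n_kv_heads" not in config:
        config["n_kv_heads"] = config["n_heads"]
    # head_dim, when absent, is derived from the model width.
    if "head_dim" not in config:
        config["head_dim"] = config["dim"] // config["n_heads"]
    return config

# Example: a 4096-wide model with 32 heads
cfg = fill_config_defaults({"dim": 4096, "n_heads": 32})
print(cfg["n_kv_heads"], cfg["head_dim"])  # → 32 128
```

The buggy version referenced `n_heads` as a bare variable rather than reading it from the `config` dict, which raises a `NameError` at conversion time.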

@awni
Member

awni commented Dec 10, 2023

Sorry about that, I pushed a fix that should be merged soon: #64

Could you tell me what else didn't work for you? (Stack trace / ...). After fixing the conversion, it runs fine for me.

@awni awni self-assigned this Dec 11, 2023
@awni
Member

awni commented Dec 12, 2023

Comment if there is still an issue and I will reopen.

@awni awni closed this as completed Dec 12, 2023