
Unable to combine Vicuna’s delta weight and original weight #53

Closed
hayleeXinyi opened this issue Apr 19, 2023 · 7 comments

Comments

@hayleeXinyi

hayleeXinyi commented Apr 19, 2023

Has anyone else had the same problem as me?

[screenshot of the error]

I followed PrepareVicuna.md and downloaded llama-13b-hf with

git clone https://huggingface.co/decapoda-research/llama-13b-hf

But the problem happened at the last step.
Could this be the cause of the problem? (But I am running it on Ubuntu.)

[screenshot]
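For context, the last step in PrepareVicuna.md merges the weights with FastChat's apply_delta. A minimal sketch of that invocation, with placeholder paths (flag names may differ slightly between FastChat versions):

# apply the Vicuna delta on top of the base LLaMA-13B weights (paths are placeholders)
python3 -m fastchat.model.apply_delta \
    --base /path/to/llama-13b-hf \
    --target /path/to/output/vicuna-13b \
    --delta lmsys/vicuna-13b-delta-v1.1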

@TsuTikgiau
Collaborator

Hello! Were these files downloaded successfully? If you cannot download them via git lfs, you can also download them manually.
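One way to do that check, sketched here under the assumption that git-lfs is installed and the repo was cloned into the default llama-13b-hf directory, is to re-fetch any shards that are still LFS pointer stubs:

cd llama-13b-hf
git lfs install
git lfs pull    # re-downloads any weight shards that are still small pointer files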

@hayleeXinyi
Author

Thanks for answering! I then downloaded the files with the following command, and it no longer shows an error:
git-lfs clone https://huggingface.co/decapoda-research/llama-13b-hf
But it still does not merge the weights.
Could it be that the HF-converted version of the 13B model that this project uses is not the same as the one in https://huggingface.co/decapoda-research/llama-13b-hf?
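A quick sanity check, as a sketch: list the shard sizes and confirm they are real weight files. Each pytorch_model-*.bin shard of the 13B model should be on the order of gigabytes; files of only a few hundred bytes are un-fetched git-lfs pointers.

ls -lh llama-13b-hf/pytorch_model-*.bin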

@TsuTikgiau
Collaborator

What errors did you receive when merging the weights?

@one-mystery

one-mystery commented Apr 19, 2023

I have the same problem.
Windows 11, WSL2, 64 GB of memory.
[screenshot of the error]

WSL sometimes disconnects, with errors like:
[Process exited with code 11 (0x0000000b)]
[Process exited with code 4294967295 (0xffffffff)]
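For the WSL2 case, one possible mitigation (a sketch; the values are only examples) is to raise the memory and swap limits in %UserProfile%\.wslconfig and then restart WSL with wsl --shutdown:

[wsl2]
memory=56GB   # example: leave some RAM for Windows itself
swap=64GB     # example: extra swap lets the merge spill to disk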

@dazzle-me

Most likely you don't have enough RAM (the conversion succeeds with 128 GB).
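If RAM is the limit, FastChat's apply_delta also has a low-memory mode, assuming your installed version supports the flag (a sketch, with placeholder paths):

python3 -m fastchat.model.apply_delta \
    --base /path/to/llama-13b-hf \
    --target /path/to/output/vicuna-13b \
    --delta lmsys/vicuna-13b-delta-v1.1 \
    --low-cpu-mem    # splits the weights into smaller chunks to keep peak RAM low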

@hayleeXinyi
Author

hayleeXinyi commented Apr 20, 2023

Most likely you don't have enough RAM (the conversion succeeds with 128 GB).

I only have around 64 GB of RAM, which means I won't be able to run this on my machine? Is there any possible solution?

@hayleeXinyi
Author

Thank you so much! I solved this problem by expanding the virtual memory.
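For anyone hitting the same limit, a sketch of expanding virtual memory on Ubuntu with a temporary swap file (the 64G size is only an example):

sudo fallocate -l 64G /swapfile
sudo chmod 600 /swapfile
sudo mkswap /swapfile
sudo swapon /swapfile    # active until reboot; remove with: sudo swapoff /swapfile && sudo rm /swapfile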
