
FEAT: new model: mini-cpm-llama3-v-2.5 #1577

Merged
merged 7 commits into from
Jun 5, 2024

Conversation

@Minamiyama (Contributor) opened the pull request:

[demo video attachment: 6.2.mp4]

@XprobeBot XprobeBot added this to the v0.11.4 milestone Jun 2, 2024
@qinxuye qinxuye requested a review from amumu96 June 3, 2024 02:11
@amumu96 (Contributor) commented Jun 3, 2024:

Great job! @Minamiyama
I checked out your branch and tried to run this model... how did you solve this error?
[screenshot]
It seems that when the model is imported as transformers_modules.MiniCPM-Llama3-V-2.5-pytorch-8b, Python treats MiniCPM-Llama3-V-2 as the module name and cannot find it.

@Minamiyama (Contributor, Author) replied:

> It seems that when the model is imported as transformers_modules.MiniCPM-Llama3-V-2.5-pytorch-8b, Python treats MiniCPM-Llama3-V-2 as the module name and cannot find it.

I found the problem: the model name cannot contain "2.5"; it has to be something like "2_5" instead, and I've fixed it.
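For context, this is a general Python constraint rather than anything specific to this PR: `.` is the package separator in a module path, so a directory name containing a dot gets split into nonexistent submodules when transformers loads remote code under `transformers_modules`. A minimal sketch of the fix described above, with a hypothetical helper name (not the PR's actual code):

```python
# "." is Python's package separator, so the module path
# "transformers_modules.MiniCPM-Llama3-V-2.5-pytorch-8b" is parsed as a
# submodule "5-pytorch-8b" inside a package "...MiniCPM-Llama3-V-2",
# which does not exist. Replacing dots avoids the problem.
def sanitize_dir_name(name: str) -> str:
    """Hypothetical helper: make a model name safe as a module path component."""
    return name.replace(".", "_")

print(sanitize_dir_name("MiniCPM-Llama3-V-2.5-pytorch-8b"))
# MiniCPM-Llama3-V-2_5-pytorch-8b
```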

@Minamiyama (Contributor, Author) added:

but the model name in the UI model card is shown as "MiniCPM-Llama3-V-2 5"; how can we make it display "MiniCPM-Llama3-V-2.5" without this error? 🤣

@amumu96 (Contributor) commented Jun 3, 2024:

> but the model name in the UI model card is shown as "MiniCPM-Llama3-V-2 5"; how can we make it display "MiniCPM-Llama3-V-2.5" without this error? 🤣

How about replacing _ or - with . in the model name when reading from and writing to the cache?
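The suggestion above could be sketched roughly like this, with hypothetical helpers (not the actual xinference cache code): store the cache directory with dots replaced, then map back for display. The naive reverse mapping assumes the only underscore to restore sits between version digits (e.g. "2_5"):

```python
import re

def to_cache_name(model_name: str) -> str:
    # Dots would break the transformers_modules import path, so the
    # cache directory is written with underscores instead.
    return model_name.replace(".", "_")

def to_display_name(cache_name: str) -> str:
    # Hypothetical reverse mapping for the UI model card: restore a dot
    # only between digits, so "2_5" becomes "2.5" while other
    # underscores in the name are left alone.
    return re.sub(r"(\d)_(\d)", r"\1.\2", cache_name)

print(to_cache_name("MiniCPM-Llama3-V-2.5"))    # MiniCPM-Llama3-V-2_5
print(to_display_name("MiniCPM-Llama3-V-2_5"))  # MiniCPM-Llama3-V-2.5
```

The round trip is lossy in general (a name that legitimately contains "2_5" would also be displayed as "2.5"), which is likely why the maintainers deferred the display fix.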

@Minamiyama (Contributor, Author) replied:

> How about replacing _ or - with . in the model name when reading from and writing to the cache?

I have no idea where to do that 😭

@qinxuye qinxuye merged commit 9444e93 into xorbitsai:main Jun 5, 2024
12 checks passed
@qinxuye (Contributor) commented Jun 5, 2024:

Thanks! As for the dot in the name, we will fix it later.
