
[Feature Request] Support InternLM Deploy #168

Closed
vansinhu opened this issue Sep 12, 2023 · 6 comments

Labels
enhancement (New feature or request) · Upstream (Tracking an issue in llama.cpp)

Comments

@vansinhu

Dear LLamaSharp developer,

Greetings! I am vansinhu, a community developer and volunteer at InternLM. InternLM is a large language model similar to Llama 2, and we look forward to InternLM being supported in LLamaSharp. If there are any challenges or questions regarding support for InternLM, please feel free to join our Discord discussion at https://discord.gg/gF9ezcmtM3.

Best regards,
vansinhu

@martindevans (Member)

Is InternLM supported by llama.cpp? If it is then we probably already support it!

@geffzhang

InternLM/InternLM#258

@martindevans added the Upstream (Tracking an issue in llama.cpp) and enhancement (New feature or request) labels on Nov 8, 2023
@martindevans (Member)

ggerganov/llama.cpp#4283

@martindevans (Member)

ggerganov/llama.cpp#5184 was merged 4 days ago, so #479 should bring InternLM support to LLamaSharp.

@martindevans (Member)

0.10.0 has just been released, which should include InternLM support at long last!
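
For readers landing here from search: the sketch below shows how an InternLM model converted to GGUF could be loaded through the usual LLamaSharp pattern. It is not taken from this thread; the model path is a placeholder, and the exact API shape (ModelParams, LLamaWeights, InteractiveExecutor, InferAsync) is recalled from the 0.10.x-era API and may differ slightly between LLamaSharp versions.

```csharp
using System;
using LLama;
using LLama.Common;

// Placeholder path to an InternLM model converted to GGUF format.
var parameters = new ModelParams("models/internlm-chat.Q4_K_M.gguf")
{
    ContextSize = 2048
};

// Load the weights once, then create an inference context from them.
using var weights = LLamaWeights.LoadFromFile(parameters);
using var context = weights.CreateContext(parameters);
var executor = new InteractiveExecutor(context);

// Stream tokens for a short prompt.
await foreach (var token in executor.InferAsync(
    "Hello, who are you?",
    new InferenceParams { MaxTokens = 64 }))
{
    Console.Write(token);
}
```

Because InternLM support lives in llama.cpp itself, nothing InternLM-specific should be needed on the LLamaSharp side beyond pointing at a GGUF file.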
