
Add RoPE Scaling params from llamacpp #8422

Merged

merged 1 commit into langchain-ai:master on Jul 28, 2023

Conversation

@imjwang (Contributor) commented on Jul 28, 2023

Description:
This PR adds the parameters from llama-cpp-python that support RoPE scaling (a usage sketch follows below).
@hwchase17, @baskaryan

Sources:
Paper and explanation:
https://kaiokendev.github.io/context
llama.cpp discussion:
ggerganov/llama.cpp#1965
Supports models like:
https://huggingface.co/conceptofmind/LLongMA-2-13b

[Screenshot of the llama_cpp.Llama API reference, from https://llama-cpp-python.readthedocs.io/en/latest/api-reference/#llama_cpp.Llama]

  • lint and format


@dosubot dosubot bot added the 🤖:improvement Medium size change to existing code to handle new use-cases label Jul 28, 2023
@imjwang imjwang changed the title add param to llamacpp Add RoPE Scaling params from llamacpp Jul 28, 2023
@baskaryan baskaryan added the lgtm PR looks good. Use to confirm that a PR is ready for merging. label Jul 28, 2023
@baskaryan (Collaborator) commented:
thanks @imjwang!

@baskaryan baskaryan merged commit e0de62f into langchain-ai:master Jul 28, 2023
23 checks passed