
Update _llama_cpp.py to use new bindings #665

Merged · 1 commit · Feb 28, 2024

Conversation

nking-1
Contributor

@nking-1 nking-1 commented Feb 28, 2024

The newest version of llama_cpp bindings breaks a few lines of code due to changed variable names and function signatures. I've attempted to make a fix that will be backwards compatible for people using the older version of the llama_cpp library while also working with the new one.

This PR will fix issue #659
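A minimal sketch of the backwards-compatible pattern described above (this is illustrative, not the actual PR diff; `init_batch` is a hypothetical helper name, and the fallback-on-`TypeError` structure is an assumption):

```python
def init_batch(llama_cpp, n_batch, n_ctx):
    """Create a llama.cpp batch in a way that tolerates signature changes
    across llama-cpp-python releases (sketch, not the PR's exact code)."""
    try:
        # Positional call: works regardless of what the parameters are named.
        return llama_cpp.llama_batch_init(n_batch, 0, n_ctx)
    except TypeError:
        # Hypothetical fallback to an older keyword-based signature.
        return llama_cpp.llama_batch_init(n_tokens=n_batch, embd=0, n_seq_max=n_ctx)
```

Calling positionally first sidesteps keyword-name churn entirely, which is why the PR's one-line change is enough on its own.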

@Harsha-Nori
Collaborator

Harsha-Nori commented Feb 28, 2024

Thanks for this fix @Nking92! Our CI pipelines can't test this, but we should verify the backwards compatibility in an env with, say, pip install llama-cpp-python==0.2.47. Do you mind downgrading your llama-cpp-python and testing the fix in your environment? I'll do the same on my end.

Given how frequently llama-cpp-python (and the underlying llama.cpp) introduces breaking changes, we may want to consider pinning versions in our setup.py.
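A version pin like the one suggested could look like the following hypothetical setup.py excerpt (the exact bounds are illustrative, not the project's actual choice; 0.2.47 is simply the version tested in this thread):

```python
# Hypothetical setup.py fragment: pin llama-cpp-python to a tested range
# so upstream breaking changes don't silently break this package.
from setuptools import setup

setup(
    name="guidance",
    extras_require={
        # Lower bound is the version tested in this PR; upper bound is a guess
        # meant to exclude the next potentially-breaking minor series.
        "llamacpp": ["llama-cpp-python>=0.2.47,<0.3.0"],
    },
)
```

An upper bound trades automatic upgrades for stability; it has to be bumped manually after each new upstream release is validated.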

@slundberg FYI

@nking-1
Contributor Author

nking-1 commented Feb 28, 2024

Thanks Harsha, I tested with llama-cpp-python==0.2.47 this morning without any issues.

- self.batch = llama_cpp.llama_batch_init(n_tokens=n_batch, embd=0, n_seq_max=n_ctx)
+ self.batch = llama_cpp.llama_batch_init(n_batch, 0, n_ctx)
Collaborator
Per llama-cpp-python, I think the function signature here didn't change -- did we figure out why there was an issue with using named args instead of positional ones here?

https://github.com/abetlen/llama-cpp-python/blob/8c71725d5345b2ba325e6d28209875057c5873e6/llama_cpp/llama_cpp.py#L1645
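One plausible answer to the question above (speculation, not confirmed against llama-cpp-python's history): if an older release used different parameter names, a keyword call written for the new names raises TypeError against the old signature, while a positional call is name-agnostic. A self-contained illustration with made-up function names:

```python
# Illustration only -- these are not llama-cpp-python's real functions.
# The old/new parameter names are hypothetical stand-ins.

def batch_init_old(n_tokens, embd, n_ctx):        # hypothetical old names
    return (n_tokens, embd, n_ctx)

def batch_init_new(n_tokens, embd, n_seq_max):    # hypothetical new names
    return (n_tokens, embd, n_seq_max)

# Positional calls work against both signatures.
assert batch_init_old(8, 0, 512) == batch_init_new(8, 0, 512)

# A keyword call written for the new names fails against the old signature.
try:
    batch_init_old(n_tokens=8, embd=0, n_seq_max=512)
except TypeError:
    pass  # "unexpected keyword argument 'n_seq_max'"
```

This would explain why switching to positional arguments fixes the breakage even though the current upstream signature looks unchanged.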

Collaborator

In any case, changing this seems to work -- just curious about why! LGTM

@Harsha-Nori Harsha-Nori merged commit 3a20d2a into guidance-ai:main Feb 28, 2024
5 of 10 checks passed