AttributeError: 'NoneType' object has no attribute 'llama_batch_free' #660

Closed
Fahmie23 opened this issue Feb 27, 2024 · 4 comments

@Fahmie23

This is my code

[screenshot of the code]

This is the error that occurs when I try to execute the file.
[screenshot of the error]

@hudson-ai
Collaborator

I've noticed this as well. It seems to be an issue with how certain objects are garbage collected at interpreter shutdown. I think it happens when llama_cpp gets garbage collected before the actual model does -- does adding del model to the end of your file remove the error? (Not a real fix, just a question whose answer may help debug this.)
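
For illustration, a minimal sketch of what that would look like (the import and model path are assumptions, not the original poster's code):

from guidance import models

model = models.LlamaCpp("path/to/model.gguf")  # hypothetical model path
# ... use the model ...
del model  # force cleanup to run now, while llama_cpp's globals are still alive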

@paulbkoch
Collaborator

I think the underlying issue is this line in llama_cpp:

https://github.com/abetlen/llama-cpp-python/blob/08e910f7a7e3657cf210ab8633a6994e1bde7164/llama_cpp/llama_cpp.py#L121

Executed in:

https://github.com/abetlen/llama-cpp-python/blob/08e910f7a7e3657cf210ab8633a6994e1bde7164/llama_cpp/llama_cpp.py#L1684-L1687

Python gives no guarantees during process shutdown about the order in which objects are destroyed. As it tears things down, it can set existing references to None. So, if the lib object above happens to be destroyed first, the call to getattr will receive None for the lib parameter. The error raised by a call to getattr(None, "llama_batch_free") matches the one observed in this issue: AttributeError: 'NoneType' object has no attribute 'llama_batch_free'.
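
For reference, a minimal, self-contained sketch of that failure mode (no llama_cpp involved, just the shutdown behavior described above):

lib = None  # what the module-level library global can look like during interpreter shutdown
getattr(lib, "llama_batch_free")
# AttributeError: 'NoneType' object has no attribute 'llama_batch_free'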

I haven't verified this, but I will submit a PR to llama-cpp if it checks out.

@paulbkoch
Collaborator

@Fahmie23, what version do you get when you do:

import llama_cpp
print(llama_cpp.__version__)

I know I've seen an exception similar to the one you reported on an older version of llama_cpp, but I can't get it to repro now on newer versions. I would expect this error to occur fairly randomly, though, so I can't tell whether that's because it's been fixed or because I just haven't happened to hit it.

@paulbkoch paulbkoch self-assigned this Mar 21, 2024
paulbkoch added a commit that referenced this issue Apr 2, 2024
…ng process shutdown (#734)

llama_batch_free is failing during Python shutdown, when the interpreter
can set objects to None non-deterministically. I believe Guidance is
properly checking for None in the objects that can be accessed from
Guidance, but llama-cpp-python is not, and the native library object in
llama-cpp-python is being set to None before it can be called. Since
we're only dealing with memory deallocation here, and since the process
is being killed anyway, we can solve this on the Guidance side by not
calling llama_batch_free during process termination.
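
For illustration, a minimal sketch of that approach (not the actual Guidance code; the wrapper class and names are hypothetical):

import sys

class _BatchHolder:
    def __init__(self, batch, free_fn):
        self._batch = batch      # hypothetical handle returned by the native library
        self._free_fn = free_fn  # e.g. llama_cpp.llama_batch_free

    def __del__(self):
        # Skip the native free during interpreter shutdown: the OS reclaims the
        # memory anyway, and the library global may already have been set to None.
        if sys.is_finalizing() or self._free_fn is None:
            return
        self._free_fn(self._batch)
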
@paulbkoch
Collaborator

I believe PR #734 should resolve this issue, but since I wasn't getting the error in the first place, I can't confirm. Closing for now, but if anyone continues to see the error in the next release, please re-open this issue.
