
fix for issue #660 where the call to llama_batch_free is failing during process shutdown #734

Merged
merged 2 commits into guidance-ai:main from llama_batch_free-failure on Apr 2, 2024

Conversation

paulbkoch
Collaborator

llama_batch_free fails during Python shutdown because the interpreter can set module-level objects to None in a non-deterministic order. I believe Guidance properly checks for None in the objects it can access, but llama-cpp-python does not, and its handle to the native library is being set to None before llama_batch_free can be called. Since we're only dealing with memory deallocation here, and the process is being killed anyway, we can solve this on the Guidance side by not calling llama_batch_free during process termination.
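A minimal sketch of the pattern described above (not the exact code from this PR): the hypothetical `_BatchHolder` wrapper skips the free when the interpreter is finalizing. It assumes the llama-cpp-python low-level bindings `llama_cpp.llama_batch_init` / `llama_cpp.llama_batch_free` and uses `sys.is_finalizing()` to detect shutdown.

```python
import sys

import llama_cpp  # llama-cpp-python low-level bindings (assumed available)


class _BatchHolder:
    """Hypothetical wrapper owning a llama_batch; illustrative only."""

    def __init__(self, n_tokens: int, embd: int = 0, n_seq_max: int = 1):
        self.batch = llama_cpp.llama_batch_init(n_tokens, embd, n_seq_max)

    def __del__(self):
        # During interpreter shutdown, module globals (including the native
        # library handle inside llama_cpp) may already have been set to None,
        # so calling into the native library can fail. The process is exiting
        # anyway, so skipping the free only leaks memory the OS reclaims.
        if sys is None or sys.is_finalizing():
            return
        if getattr(self, "batch", None) is not None:
            llama_cpp.llama_batch_free(self.batch)
            self.batch = None
```

On the shutdown path the batch is deliberately not freed; that trade-off is acceptable here because the allocation only lives until the process exits.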

…to the interpreter setting objects to None in a non-deterministic order during shutdown
@paulbkoch paulbkoch self-assigned this Apr 2, 2024
@codecov-commenter

codecov-commenter commented Apr 2, 2024

Codecov Report

All modified and coverable lines are covered by tests ✅

Project coverage is 65.57%. Comparing base (94744ea) to head (8665e75).
Report is 1 commit behind head on main.


Additional details and impacted files
@@            Coverage Diff             @@
##             main     #734      +/-   ##
==========================================
- Coverage   69.17%   65.57%   -3.60%     
==========================================
  Files          53       53              
  Lines        3951     3957       +6     
==========================================
- Hits         2733     2595     -138     
- Misses       1218     1362     +144     


@paulbkoch paulbkoch merged commit c345d19 into guidance-ai:main Apr 2, 2024
80 checks passed
@paulbkoch paulbkoch deleted the llama_batch_free-failure branch April 2, 2024 05:50