
monkey_patch_llama_cpp_python is not needed #6201

Closed
1 task done
kimjaewon96 opened this issue Jul 5, 2024 · 1 comment
Labels
bug Something isn't working

Comments

@kimjaewon96
Contributor

Describe the bug

I'm a bit busy to do a pull request.
The current version (v1.9) does not work with llama.cpp; I solved it by commenting out lines 54-55 in llama_cpp_python_hijack.py:

    # if return_lib is not None:
    #     monkey_patch_llama_cpp_python(return_lib)

My guess is that a recent llama-cpp-python update already turned the patched function into a generator, so we no longer need to patch it ourselves.
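If commenting the lines out feels too blunt, the call could instead be guarded with a generator check. This is only a sketch, assuming return_lib exposes the llama_cpp module and its Llama class:

    import inspect

    if return_lib is not None:
        # Recent llama-cpp-python versions already implement generate as a
        # generator function, so only apply the patch when it is still a
        # plain function (assumed behavior, not verified against every version).
        if not inspect.isgeneratorfunction(return_lib.Llama.generate):
            monkey_patch_llama_cpp_python(return_lib)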

Is there an existing issue for this?

  • I have searched the existing issues

Reproduction

    # if return_lib is not None:
    #     monkey_patch_llama_cpp_python(return_lib)

Screenshot

No response

Logs

included

System Info

Windows python==3.11
kimjaewon96 added the bug label on Jul 5, 2024
@GralchemOz
Contributor

This solution fixed the issue for me; without it, any attempt to use llama.cpp would result in "RecursionError: maximum recursion depth exceeded in comparison".
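For reference, a minimal, self-contained sketch (illustrative names, not the project's actual code) of how applying a monkey patch like this twice can end in that RecursionError: the second application saves the wrapper itself as the "original", so the wrapper ends up calling itself:

    class Llama:
        def generate(self):
            return "token"

    def monkey_patch(cls):
        # On a second call, cls.generate is already my_generate, so the
        # wrapper is saved as its own "original" and calls itself forever.
        cls.original_generate = cls.generate

        def my_generate(self):
            return self.original_generate()

        cls.generate = my_generate

    monkey_patch(Llama)
    monkey_patch(Llama)  # patch applied twice, e.g. on a second library lookup
    Llama().generate()   # RecursionError: maximum recursion depth exceeded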
