
adding the llm-gpt4all models breaks the python app. #28

Open
jwhowa opened this issue Apr 5, 2024 · 2 comments

Comments


jwhowa commented Apr 5, 2024

I installed llm without any problem, assigned my OpenAI key, and am able to talk to GPT-4 fine; see the output of my `llm models` command:

```
OpenAI Chat: gpt-3.5-turbo (aliases: 3.5, chatgpt)
OpenAI Chat: gpt-3.5-turbo-16k (aliases: chatgpt-16k, 3.5-16k)
OpenAI Chat: gpt-4 (aliases: 4, gpt4)
OpenAI Chat: gpt-4-32k (aliases: 4-32k)
OpenAI Chat: gpt-4-1106-preview
OpenAI Chat: gpt-4-0125-preview
OpenAI Chat: gpt-4-turbo-preview (aliases: gpt-4-turbo, 4-turbo, 4t)
OpenAI Completion: gpt-3.5-turbo-instruct (aliases: 3.5-instruct, chatgpt-instruct)
```

But when I run `llm install llm-gpt4all`, it appears to work, yet llm is now broken. `llm models` gives the following error:

```
(bob) [pazgsai01|~/bob:$] llm models list
Traceback (most recent call last):
  File "/home/myuser/bob/bob/bin/llm", line 8, in <module>
    sys.exit(cli())
<snip>
FileNotFoundError: Model file does not exist: PosixPath('/home/myuser/.cache/gpt4all/mistral-7b-openorca.gguf2.Q4_0.gguf')
```

Whether I try this on Windows or Linux, I get the exact same error message.

If I uninstall llm-gpt4all, llm works just fine talking to OpenAI. It seems like the integration that enumerates and downloads the local models is broken?

Any help appreciated! -Jim


slhck commented Apr 15, 2024

Same here, also under macOS.

```
➜ llm models list
Traceback (most recent call last):
  File "/opt/homebrew/bin/llm", line 8, in <module>
    sys.exit(cli())
             ^^^^^
  File "/opt/homebrew/Cellar/llm/0.13.1_1/libexec/lib/python3.12/site-packages/click/core.py", line 1157, in __call__
    return self.main(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/Cellar/llm/0.13.1_1/libexec/lib/python3.12/site-packages/click/core.py", line 1078, in main
    rv = self.invoke(ctx)
         ^^^^^^^^^^^^^^^^
  File "/opt/homebrew/Cellar/llm/0.13.1_1/libexec/lib/python3.12/site-packages/click/core.py", line 1688, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/Cellar/llm/0.13.1_1/libexec/lib/python3.12/site-packages/click/core.py", line 1688, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/Cellar/llm/0.13.1_1/libexec/lib/python3.12/site-packages/click/core.py", line 1434, in invoke
    return ctx.invoke(self.callback, **ctx.params)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/Cellar/llm/0.13.1_1/libexec/lib/python3.12/site-packages/click/core.py", line 783, in invoke
    return __callback(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/Cellar/llm/0.13.1_1/libexec/lib/python3.12/site-packages/llm/cli.py", line 799, in models_list
    for model_with_aliases in get_models_with_aliases():
                              ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/Cellar/llm/0.13.1_1/libexec/lib/python3.12/site-packages/llm/__init__.py", line 80, in get_models_with_aliases
    pm.hook.register_models(register=register)
  File "/opt/homebrew/Cellar/llm/0.13.1_1/libexec/lib/python3.12/site-packages/pluggy/_hooks.py", line 501, in __call__
    return self._hookexec(self.name, self._hookimpls.copy(), kwargs, firstresult)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/Cellar/llm/0.13.1_1/libexec/lib/python3.12/site-packages/pluggy/_manager.py", line 119, in _hookexec
    return self._inner_hookexec(hook_name, methods, kwargs, firstresult)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/Cellar/llm/0.13.1_1/libexec/lib/python3.12/site-packages/pluggy/_callers.py", line 138, in _multicall
    raise exception.with_traceback(exception.__traceback__)
  File "/opt/homebrew/Cellar/llm/0.13.1_1/libexec/lib/python3.12/site-packages/pluggy/_callers.py", line 102, in _multicall
    res = hook_impl.function(*args)
          ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/Cellar/llm/0.13.1_1/libexec/lib/python3.12/site-packages/llm_gpt4all.py", line 57, in register_models
    models.sort(
  File "/opt/homebrew/Cellar/llm/0.13.1_1/libexec/lib/python3.12/site-packages/llm_gpt4all.py", line 59, in <lambda>
    not model.is_installed(),
        ^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/Cellar/llm/0.13.1_1/libexec/lib/python3.12/site-packages/llm_gpt4all.py", line 179, in is_installed
    GPT4All.retrieve_model(
  File "/opt/homebrew/Cellar/llm/0.13.1_1/libexec/lib/python3.12/site-packages/llm_gpt4all.py", line 38, in retrieve_model
    return _GPT4All.retrieve_model(model_name, model_path, allow_download, verbose)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/homebrew/Cellar/llm/0.13.1_1/libexec/lib/python3.12/site-packages/gpt4all/gpt4all.py", line 309, in retrieve_model
    raise FileNotFoundError(f"Model file does not exist: {model_dest!r}")
FileNotFoundError: Model file does not exist: PosixPath('/Users/werner/.cache/gpt4all/mistral-7b-openorca.gguf2.Q4_0.gguf')
```
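Reading the traceback: the crash happens while `register_models` sorts the plugin's model list, because the sort key calls `is_installed()`, which in turn calls `GPT4All.retrieve_model()`, and that raises `FileNotFoundError` for any model not yet downloaded instead of reporting "not installed". A minimal sketch of that failure mode and a defensive variant (the function bodies and paths below are simplified stand-ins for illustration, not the plugin's actual code):

```python
from pathlib import Path


def retrieve_model(model_path: Path) -> Path:
    # Stand-in for the gpt4all retrieve_model behavior seen in the
    # traceback: a missing file raises rather than returning a status.
    if not model_path.exists():
        raise FileNotFoundError(f"Model file does not exist: {model_path!r}")
    return model_path


def is_installed(model_path: Path) -> bool:
    # Defensive version: treat "file missing" as "not installed" so the
    # exception never escapes into the sort key used by `llm models`.
    try:
        retrieve_model(model_path)
        return True
    except FileNotFoundError:
        return False


# A missing model file is now reported as not installed instead of crashing.
print(is_installed(Path("/nonexistent/.cache/gpt4all/mistral-7b-openorca.gguf2.Q4_0.gguf")))
```

With a guard like this, models that are not yet downloaded simply sort after the installed ones, which is what the sort key in `register_models` appears to intend.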


slhck commented Apr 15, 2024

This is fixed in #27 actually …
