
Downloading whl in v2.13-fix #3340

Closed
Portagoras opened this issue May 2, 2024 · 1 comment

Describe the issue you are experiencing

Links still point to the v0.2.13 release versions, which you have deleted, so you can't really set up the addon at the moment.

What operating system image do you use?

generic-x86-64 (Generic UEFI capable x86-64 systems)

What version of Home Assistant Operating System is installed?

Not relevant

Did you upgrade the Operating System?

Yes

Steps to reproduce the issue

  1. Install the HACS Addon (fresh install)
  2. Try to set it up
  3. See a 404 error as it tries to acquire the following file: 'https://github.com/acon96/home-llm/releases/download/v0.2.13/llama_cpp_python-0.2.64-cp312-cp312-musllinux_1_2_x86_64.whl'
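
The failing fetch can be reproduced outside Home Assistant. A hypothetical sketch (the URL is taken verbatim from the error above; the variable names are illustrative, not part of the integration):

```shell
# Reproduce the failing fetch outside Home Assistant (hypothetical sketch).
# WHEEL_URL is the exact URL the integration attempts, per the error message.
WHEEL_URL='https://github.com/acon96/home-llm/releases/download/v0.2.13/llama_cpp_python-0.2.64-cp312-cp312-musllinux_1_2_x86_64.whl'

# The release tag is the path segment before the wheel filename;
# a 404 here means that tag's assets are gone from the repository.
RELEASE_TAG=$(basename "$(dirname "$WHEEL_URL")")
echo "release tag: $RELEASE_TAG"

# Running `pip install "$WHEEL_URL"` at this point fails with
# "HTTP error 404 while getting ..." exactly as in the Supervisor logs.
```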

Anything in the Supervisor logs that might be useful for us?

Not really

Anything in the Host logs that might be useful for us?

The following entries, in order:
Logger: homeassistant.util.package
Source: util/package.py:123
First occurred: 12:50:02 AM (5 occurrences)
Last logged: 12:55:27 AM
Unable to install package https://github.com/acon96/home-llm/releases/download/v0.2.13/llama_cpp_python-0.2.64-cp312-cp312-musllinux_1_2_x86_64.whl:
ERROR: HTTP error 404 while getting https://github.com/acon96/home-llm/releases/download/v0.2.13/llama_cpp_python-0.2.64-cp312-cp312-musllinux_1_2_x86_64.whl
ERROR: Could not install requirement llama-cpp-python==0.2.64 from https://github.com/acon96/home-llm/releases/download/v0.2.13/llama_cpp_python-0.2.64-cp312-cp312-musllinux_1_2_x86_64.whl because of HTTP error 404 Client Error: Not Found for url: https://github.com/acon96/home-llm/releases/download/v0.2.13/llama_cpp_python-0.2.64-cp312-cp312-musllinux_1_2_x86_64.whl for URL https://github.com/acon96/home-llm/releases/download/v0.2.13/llama_cpp_python-0.2.64-cp312-cp312-musllinux_1_2_x86_64.whl

Logger: custom_components.llama_conversation.utils
Source: custom_components/llama_conversation/utils.py:118
integration: LLaMA Conversation (documentation)
First occurred: 12:50:02 AM (5 occurrences)
Last logged: 12:55:27 AM
Error installing llama-cpp-python. Could not install the binary wheels from GitHub for platform: x86_64, python version: 3.12. Please manually build or download the wheels and place them in the `/config/custom_components/llama_conversation` directory. Make sure that you download the correct .whl file for your platform and python version from the GitHub releases page.

Logger: custom_components.llama_conversation.config_flow
Source: custom_components/llama_conversation/config_flow.py:364
integration: LLaMA Conversation (documentation)
First occurred: 12:50:02 AM (5 occurrences)
Last logged: 12:55:27 AM
Failed to install wheel: False
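
The utils log above suggests a manual workaround: placing a matching wheel in `/config/custom_components/llama_conversation`. A minimal sketch of checking that a downloaded wheel matches the platform and Python version from the logs (assumes the standard PEP 427 wheel filename layout and ignores the optional build tag; `wheel_tags` is a hypothetical helper, not part of the integration):

```python
def wheel_tags(filename: str) -> dict:
    """Split a wheel filename into its name/version/tag components.

    PEP 427 layout: {name}-{version}-{python tag}-{abi tag}-{platform tag}.whl
    (the optional build tag is not handled in this sketch).
    """
    stem = filename[: -len(".whl")]
    name, version, python_tag, abi_tag, platform_tag = stem.split("-")
    return {
        "name": name,
        "version": version,
        "python": python_tag,
        "abi": abi_tag,
        "platform": platform_tag,
    }

tags = wheel_tags("llama_cpp_python-0.2.64-cp312-cp312-musllinux_1_2_x86_64.whl")

# For a Home Assistant container on generic-x86-64 with Python 3.12,
# the wheel must carry cp312 tags and an x86_64 musl platform tag.
assert tags["python"] == "cp312"
assert tags["platform"].endswith("x86_64")
```

If either assertion fails for a wheel you downloaded, it is built for a different interpreter or platform and the integration would reject it the same way.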

System information

Not relevant

Additional information

Not really, should be clear.

Portagoras added the bug label on May 2, 2024
agners (Member) commented on May 3, 2024

Links still point to the v0.2.13 release versions, which you have deleted.

https://github.com/acon96/home-llm/ is a third-party repository over which we have no control. Please report the deletion in that repository.

  1. Install the HACS Addon

This issue tracker is for Operating System issues themselves. Please report the wrong path to the repository of the HACS add-on.

agners closed this as not planned on May 3, 2024