
This build of LocalAI doesn't work with models from the compatible models list #30

Open
FlattusBlastus opened this issue Apr 4, 2024 · 6 comments

Comments

@FlattusBlastus

some backends(stablediffusion, tts) require LocalAI compiled with GO_TAGS

Sad, sad panda faces ensue. Any chance for a fix?

@KyTDK

KyTDK commented Apr 6, 2024

Having the same issue.

@KyTDK

KyTDK commented Apr 6, 2024

2:33AM INF [/build/backend/python/autogptq/run.sh] Attempting to load
2:33AM INF Loading model 'gpt-3.5-turbo' with backend /build/backend/python/autogptq/run.sh
2:33AM INF [/build/backend/python/autogptq/run.sh] Fails: grpc process not found: /tmp/localai/backend_data/backend-assets/grpc/build/backend/python/autogptq/run.sh. some backends(stablediffusion, tts) require LocalAI compiled with GO_TAGS
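For context, the error suggests the container was built without the Go build tags those backends need. A minimal sketch of rebuilding LocalAI from source with the tags enabled might look like this (repo URL and tag names are assumptions based on LocalAI's public build documentation, not something confirmed in this thread):

```shell
# Sketch only: rebuild LocalAI from source with the stablediffusion and tts
# backends compiled in. Requires Go and the LocalAI build dependencies.
git clone https://github.com/mudler/LocalAI
cd LocalAI
make GO_TAGS="stablediffusion tts" build
```

Whether this resolves the specific `grpc process not found` failure above would still need to be verified against the image actually shipped by this project.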

@amongesimpasta

Same issue here.

@Hdw007

Hdw007 commented Apr 11, 2024

I have the same issue. The container was just set up according to the documentation, and I'm getting the same error as above.

@szaimen
Owner

szaimen commented May 31, 2024

Hi, can you check if it works now after I changed the docker tag to v2.16.0-aio-cpu with #41 and pushed a new container update?

@FlattusBlastus
Author

FlattusBlastus commented May 31, 2024 via email

5 participants