@octalxx Thank you for the logs. A quick search of the Ollama Discord turned up a similar issue reported by others: ollama/ollama#644

The underlying cause appears to be ggml-org/llama.cpp#1583 -- llama.cpp binaries built with AVX enabled assume the CPU supports AVX, and crash with an illegal-instruction error on CPUs that do not.
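If you want to confirm whether your CPU is affected, a minimal sketch (assuming a Linux host that lists CPU flags in /proc/cpuinfo; the helper name `cpu_has_avx` is hypothetical, not part of llama.cpp or Ollama):

```python
def cpu_has_avx(cpuinfo_text: str) -> bool:
    """Return True if the 'flags' line of /proc/cpuinfo lists the 'avx' token."""
    for line in cpuinfo_text.splitlines():
        if line.startswith("flags"):
            # Tokens after the colon are individual CPU feature flags.
            return "avx" in line.split(":", 1)[1].split()
    return False


if __name__ == "__main__":
    with open("/proc/cpuinfo") as f:
        print("AVX supported" if cpu_has_avx(f.read()) else "AVX NOT supported")
```

If this reports no AVX support, a llama.cpp build compiled without AVX (or a generic/no-AVX release binary, where available) should avoid the crash.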

Answer selected by JayNakrani
This discussion was converted from issue #6 on December 01, 2023 21:50.