Issues: ollama/ollama


Not able to download models from ollama behind proxy (labels: bug)
#7522 opened Nov 6, 2024 by anshika1234
About the OLLAMA_SCHED_SPREAD env variable: how to load a model on two GPUs (labels: bug, needs more info, nvidia)
#7511 opened Nov 5, 2024 by Kouuh
Expose DRY and XTC parameters (labels: feature request)
#7504 opened Nov 5, 2024 by p-e-w
Tencent-Hunyuan-Large-MoE-389B-A52B (labels: model request)
#7503 opened Nov 5, 2024 by vYLQs6
llama slows down a lot on the second and subsequent runs (labels: bug, nvidia)
#7497 opened Nov 4, 2024 by vertikalm
Errors when running on Mac (labels: bug)
#7495 opened Nov 4, 2024 by shan23chen
langchain-python-rag-document example not working (labels: bug)
#7492 opened Nov 4, 2024 by gsportelli
CUDA runner fails to build correctly without CUDA_PATH set (labels: bug, build, windows)
#7491 opened Nov 4, 2024 by auiphc
How do I enable streaming output? Thanks (labels: question)
#7489 opened Nov 4, 2024 by NXL333
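For the streaming question above (#7489): Ollama's /api/generate endpoint streams newline-delimited JSON objects by default, each carrying a "response" text fragment and a "done" flag. A minimal sketch of collecting such a stream into a full reply (the sample chunks below are illustrative, not real server output):

```python
import json

def collect_stream(ndjson_lines):
    """Accumulate "response" fragments from Ollama-style NDJSON chunks."""
    text = []
    for line in ndjson_lines:
        if not line.strip():
            continue
        chunk = json.loads(line)
        text.append(chunk.get("response", ""))
        if chunk.get("done"):
            break
    return "".join(text)

# Illustrative chunks shaped like /api/generate streaming output
sample = [
    '{"response": "Hel", "done": false}',
    '{"response": "lo!", "done": true}',
]
print(collect_stream(sample))  # Hello!
```

In a real client you would iterate over the HTTP response body line by line (e.g. `requests.post(..., stream=True).iter_lines()`); passing `"stream": false` in the request body disables streaming and returns a single JSON object instead.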
Packaging ollama: make including ROCm libraries in the dist optional (labels: build, feature request)
#7483 opened Nov 3, 2024 by breezerider
HIP_VISIBLE_DEVICES vs ROCR_VISIBLE_DEVICES (labels: amd, bug, needs more info)
#7480 opened Nov 3, 2024 by nathan-skynet
Issue with reinstalling Ollama: "Killed" error on ollama serve (labels: bug, needs more info)
#7478 opened Nov 3, 2024 by hosein97
Submitting 4 images to an Ollama vision model generates a large amount of log output without any response (labels: bug, needs more info)
#7477 opened Nov 3, 2024 by delubee
Cannot generate id_ed25519: read-only file system (labels: bug)
#7471 opened Nov 2, 2024 by duhow
Dual-GPU token generation bug on 0.3.15 that did not exist on 0.3.13 (labels: amd, bug, windows)
#7461 opened Nov 1, 2024 by calmingaura
Mistake: ollama run llama3_8b_chat_uncensored_q4_0 (labels: bug)
#7458 opened Oct 31, 2024 by 1015g