
Big cursor when using Alpaca with scaling different from 100% #325

Closed
MaledictYtb opened this issue Sep 23, 2024 · 7 comments
Labels
bug Something isn't working

Comments

@MaledictYtb

MaledictYtb commented Sep 23, 2024

Describe the bug
When launching the app with any scaling different from 100%, the cursor is bigger in alpaca.

I'm using Fedora Kinoite with the flatpak version.

Expected behavior
Normal cursor size

Screenshots
125% : [screenshot of oversized cursor]
100% : [screenshot of normal cursor]

Debugging information

INFO	[main.py | main] Alpaca version: 2.0.3
INFO	[connection_handler.py | request] GET : http://127.0.0.1/api/tags
ERROR	[model_widget.py | update_local_list] HTTPConnectionPool(host='127.0.0.1', port=80): Max retries exceeded with url: /api/tags (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7fa7a1588950>: Failed to establish a new connection: [Errno 111] Connexion refusée'))
ERROR	[window.py | connection_error] Connection error
INFO	[connection_handler.py | start] Starting Alpaca's Ollama instance...
INFO	[connection_handler.py | start] Started Alpaca's Ollama instance
2024/09/23 16:03:07 routes.go:1125: INFO server config env="map[CUDA_VISIBLE_DEVICES: GPU_DEVICE_ORDINAL: HIP_VISIBLE_DEVICES: HSA_OVERRIDE_GFX_VERSION: OLLAMA_DEBUG:true OLLAMA_FLASH_ATTENTION:false OLLAMA_HOST:http://127.0.0.1:11435 OLLAMA_INTEL_GPU:false OLLAMA_KEEP_ALIVE:5m0s OLLAMA_LLM_LIBRARY: OLLAMA_MAX_LOADED_MODELS:0 OLLAMA_MAX_QUEUE:512 OLLAMA_MODELS:/home/maledict/.var/app/com.jeffser.Alpaca/data/.ollama/models OLLAMA_NOHISTORY:false OLLAMA_NOPRUNE:false OLLAMA_NUM_PARALLEL:0 OLLAMA_ORIGINS:[http://localhost https://localhost http://localhost:* https://localhost:* http://127.0.0.1 https://127.0.0.1 http://127.0.0.1:* https://127.0.0.1:* http://0.0.0.0 https://0.0.0.0 http://0.0.0.0:* https://0.0.0.0:* app://* file://* tauri://*] OLLAMA_RUNNERS_DIR: OLLAMA_SCHED_SPREAD:false OLLAMA_TMPDIR: ROCR_VISIBLE_DEVICES:]"
time=2024-09-23T16:03:07.808+02:00 level=INFO source=images.go:753 msg="total blobs: 0"
time=2024-09-23T16:03:07.808+02:00 level=INFO source=images.go:760 msg="total unused blobs removed: 0"
time=2024-09-23T16:03:07.808+02:00 level=INFO source=routes.go:1172 msg="Listening on 127.0.0.1:11435 (version 0.3.9)"
INFO	[connection_handler.py | start] client version is 0.3.9
INFO	[connection_handler.py | request] GET : http://127.0.0.1:11435/api/tags
time=2024-09-23T16:03:07.809+02:00 level=INFO source=payload.go:30 msg="extracting embedded files" dir=/home/maledict/.var/app/com.jeffser.Alpaca/cache/tmp/ollama/ollama688138191/runners
time=2024-09-23T16:03:07.809+02:00 level=DEBUG source=payload.go:182 msg=extracting variant=cpu file=build/linux/x86_64/cpu/bin/libggml.so.gz
time=2024-09-23T16:03:07.809+02:00 level=DEBUG source=payload.go:182 msg=extracting variant=cpu file=build/linux/x86_64/cpu/bin/libllama.so.gz
time=2024-09-23T16:03:07.809+02:00 level=DEBUG source=payload.go:182 msg=extracting variant=cpu file=build/linux/x86_64/cpu/bin/ollama_llama_server.gz
time=2024-09-23T16:03:07.809+02:00 level=DEBUG source=payload.go:182 msg=extracting variant=cpu_avx file=build/linux/x86_64/cpu_avx/bin/libggml.so.gz
time=2024-09-23T16:03:07.809+02:00 level=DEBUG source=payload.go:182 msg=extracting variant=cpu_avx file=build/linux/x86_64/cpu_avx/bin/libllama.so.gz
time=2024-09-23T16:03:07.809+02:00 level=DEBUG source=payload.go:182 msg=extracting variant=cpu_avx file=build/linux/x86_64/cpu_avx/bin/ollama_llama_server.gz
time=2024-09-23T16:03:07.809+02:00 level=DEBUG source=payload.go:182 msg=extracting variant=cpu_avx2 file=build/linux/x86_64/cpu_avx2/bin/libggml.so.gz
time=2024-09-23T16:03:07.809+02:00 level=DEBUG source=payload.go:182 msg=extracting variant=cpu_avx2 file=build/linux/x86_64/cpu_avx2/bin/libllama.so.gz
time=2024-09-23T16:03:07.809+02:00 level=DEBUG source=payload.go:182 msg=extracting variant=cpu_avx2 file=build/linux/x86_64/cpu_avx2/bin/ollama_llama_server.gz
time=2024-09-23T16:03:07.809+02:00 level=DEBUG source=payload.go:182 msg=extracting variant=cuda_v11 file=build/linux/x86_64/cuda_v11/bin/libggml.so.gz
time=2024-09-23T16:03:07.809+02:00 level=DEBUG source=payload.go:182 msg=extracting variant=cuda_v11 file=build/linux/x86_64/cuda_v11/bin/libllama.so.gz
time=2024-09-23T16:03:07.809+02:00 level=DEBUG source=payload.go:182 msg=extracting variant=cuda_v11 file=build/linux/x86_64/cuda_v11/bin/ollama_llama_server.gz
time=2024-09-23T16:03:07.809+02:00 level=DEBUG source=payload.go:182 msg=extracting variant=cuda_v12 file=build/linux/x86_64/cuda_v12/bin/libggml.so.gz
time=2024-09-23T16:03:07.809+02:00 level=DEBUG source=payload.go:182 msg=extracting variant=cuda_v12 file=build/linux/x86_64/cuda_v12/bin/libllama.so.gz
time=2024-09-23T16:03:07.809+02:00 level=DEBUG source=payload.go:182 msg=extracting variant=cuda_v12 file=build/linux/x86_64/cuda_v12/bin/ollama_llama_server.gz
time=2024-09-23T16:03:07.809+02:00 level=DEBUG source=payload.go:182 msg=extracting variant=rocm_v60102 file=build/linux/x86_64/rocm_v60102/bin/libggml.so.gz
time=2024-09-23T16:03:07.809+02:00 level=DEBUG source=payload.go:182 msg=extracting variant=rocm_v60102 file=build/linux/x86_64/rocm_v60102/bin/libllama.so.gz
time=2024-09-23T16:03:07.813+02:00 level=DEBUG source=payload.go:182 msg=extracting variant=rocm_v60102 file=build/linux/x86_64/rocm_v60102/bin/ollama_llama_server.gz
time=2024-09-23T16:03:18.602+02:00 level=DEBUG source=payload.go:71 msg="availableServers : found" file=/home/maledict/.var/app/com.jeffser.Alpaca/cache/tmp/ollama/ollama688138191/runners/cpu/ollama_llama_server
time=2024-09-23T16:03:18.602+02:00 level=DEBUG source=payload.go:71 msg="availableServers : found" file=/home/maledict/.var/app/com.jeffser.Alpaca/cache/tmp/ollama/ollama688138191/runners/cpu_avx/ollama_llama_server
time=2024-09-23T16:03:18.602+02:00 level=DEBUG source=payload.go:71 msg="availableServers : found" file=/home/maledict/.var/app/com.jeffser.Alpaca/cache/tmp/ollama/ollama688138191/runners/cpu_avx2/ollama_llama_server
time=2024-09-23T16:03:18.602+02:00 level=DEBUG source=payload.go:71 msg="availableServers : found" file=/home/maledict/.var/app/com.jeffser.Alpaca/cache/tmp/ollama/ollama688138191/runners/cuda_v11/ollama_llama_server
time=2024-09-23T16:03:18.602+02:00 level=DEBUG source=payload.go:71 msg="availableServers : found" file=/home/maledict/.var/app/com.jeffser.Alpaca/cache/tmp/ollama/ollama688138191/runners/cuda_v12/ollama_llama_server
time=2024-09-23T16:03:18.602+02:00 level=DEBUG source=payload.go:71 msg="availableServers : found" file=/home/maledict/.var/app/com.jeffser.Alpaca/cache/tmp/ollama/ollama688138191/runners/rocm_v60102/ollama_llama_server
time=2024-09-23T16:03:18.602+02:00 level=INFO source=payload.go:44 msg="Dynamic LLM libraries [cpu cpu_avx cpu_avx2 cuda_v11 cuda_v12 rocm_v60102]"
time=2024-09-23T16:03:18.602+02:00 level=DEBUG source=payload.go:45 msg="Override detection logic by setting OLLAMA_LLM_LIBRARY"
time=2024-09-23T16:03:18.602+02:00 level=DEBUG source=sched.go:105 msg="starting llm scheduler"
time=2024-09-23T16:03:18.602+02:00 level=INFO source=gpu.go:200 msg="looking for compatible GPUs"
time=2024-09-23T16:03:18.602+02:00 level=DEBUG source=gpu.go:86 msg="searching for GPU discovery libraries for NVIDIA"
time=2024-09-23T16:03:18.602+02:00 level=DEBUG source=gpu.go:468 msg="Searching for GPU library" name=libcuda.so*
time=2024-09-23T16:03:18.602+02:00 level=DEBUG source=gpu.go:491 msg="gpu library search" globs="[/app/lib/ollama/libcuda.so* /app/lib/libcuda.so* /usr/lib/x86_64-linux-gnu/GL/default/lib/libcuda.so* /usr/lib/x86_64-linux-gnu/openh264/extra/libcuda.so* /usr/lib/x86_64-linux-gnu/openh264/extra/libcuda.so* /usr/lib/sdk/llvm15/lib/libcuda.so* /usr/lib/x86_64-linux-gnu/GL/default/lib/libcuda.so* /usr/lib/ollama/libcuda.so* /app/plugins/AMD/lib/ollama/libcuda.so* /usr/local/cuda*/targets/*/lib/libcuda.so* /usr/lib/*-linux-gnu/nvidia/current/libcuda.so* /usr/lib/*-linux-gnu/libcuda.so* /usr/lib/wsl/lib/libcuda.so* /usr/lib/wsl/drivers/*/libcuda.so* /opt/cuda/lib*/libcuda.so* /usr/local/cuda/lib*/libcuda.so* /usr/lib*/libcuda.so* /usr/local/lib*/libcuda.so*]"
time=2024-09-23T16:03:18.604+02:00 level=DEBUG source=gpu.go:525 msg="discovered GPU libraries" paths=[]
time=2024-09-23T16:03:18.604+02:00 level=DEBUG source=gpu.go:468 msg="Searching for GPU library" name=libcudart.so*
time=2024-09-23T16:03:18.604+02:00 level=DEBUG source=gpu.go:491 msg="gpu library search" globs="[/app/lib/ollama/libcudart.so* /app/lib/libcudart.so* /usr/lib/x86_64-linux-gnu/GL/default/lib/libcudart.so* /usr/lib/x86_64-linux-gnu/openh264/extra/libcudart.so* /usr/lib/x86_64-linux-gnu/openh264/extra/libcudart.so* /usr/lib/sdk/llvm15/lib/libcudart.so* /usr/lib/x86_64-linux-gnu/GL/default/lib/libcudart.so* /usr/lib/ollama/libcudart.so* /app/plugins/AMD/lib/ollama/libcudart.so* /home/maledict/.var/app/com.jeffser.Alpaca/cache/tmp/ollama/ollama688138191/runners/cuda*/libcudart.so* /usr/local/cuda/lib64/libcudart.so* /usr/lib/x86_64-linux-gnu/nvidia/current/libcudart.so* /usr/lib/x86_64-linux-gnu/libcudart.so* /usr/lib/wsl/lib/libcudart.so* /usr/lib/wsl/drivers/*/libcudart.so* /opt/cuda/lib64/libcudart.so* /usr/local/cuda*/targets/aarch64-linux/lib/libcudart.so* /usr/lib/aarch64-linux-gnu/nvidia/current/libcudart.so* /usr/lib/aarch64-linux-gnu/libcudart.so* /usr/local/cuda/lib*/libcudart.so* /usr/lib*/libcudart.so* /usr/local/lib*/libcudart.so*]"
time=2024-09-23T16:03:18.605+02:00 level=DEBUG source=gpu.go:525 msg="discovered GPU libraries" paths="[/app/lib/ollama/libcudart.so.12.4.99 /app/lib/ollama/libcudart.so.11.3.109]"
cudaSetDevice err: 35
time=2024-09-23T16:03:18.606+02:00 level=DEBUG source=gpu.go:537 msg="Unable to load cudart" library=/app/lib/ollama/libcudart.so.12.4.99 error="your nvidia driver is too old or missing.  If you have a CUDA GPU please upgrade to run ollama"
cudaSetDevice err: 35
time=2024-09-23T16:03:18.607+02:00 level=DEBUG source=gpu.go:537 msg="Unable to load cudart" library=/app/lib/ollama/libcudart.so.11.3.109 error="your nvidia driver is too old or missing.  If you have a CUDA GPU please upgrade to run ollama"
time=2024-09-23T16:03:18.607+02:00 level=WARN source=amd_linux.go:59 msg="ollama recommends running the https://www.amd.com/en/support/linux-drivers" error="amdgpu version file missing: /sys/module/amdgpu/version stat /sys/module/amdgpu/version: no such file or directory"
time=2024-09-23T16:03:18.607+02:00 level=DEBUG source=amd_linux.go:102 msg="evaluating amdgpu node /sys/class/kfd/kfd/topology/nodes/0/properties"
time=2024-09-23T16:03:18.607+02:00 level=DEBUG source=amd_linux.go:127 msg="detected CPU /sys/class/kfd/kfd/topology/nodes/0/properties"
time=2024-09-23T16:03:18.607+02:00 level=DEBUG source=amd_linux.go:102 msg="evaluating amdgpu node /sys/class/kfd/kfd/topology/nodes/1/properties"
time=2024-09-23T16:03:18.607+02:00 level=DEBUG source=amd_linux.go:217 msg="mapping amdgpu to drm sysfs nodes" amdgpu=/sys/class/kfd/kfd/topology/nodes/1/properties vendor=4098 device=5688 unique_id=0
time=2024-09-23T16:03:18.607+02:00 level=DEBUG source=amd_linux.go:251 msg=matched amdgpu=/sys/class/kfd/kfd/topology/nodes/1/properties drm=/sys/class/drm/card1/device
time=2024-09-23T16:03:18.607+02:00 level=INFO source=amd_linux.go:274 msg="unsupported Radeon iGPU detected skipping" id=0 total="512.0 MiB"
time=2024-09-23T16:03:18.607+02:00 level=INFO source=amd_linux.go:360 msg="no compatible amdgpu devices detected"
time=2024-09-23T16:03:18.607+02:00 level=INFO source=gpu.go:347 msg="no compatible GPUs were discovered"
time=2024-09-23T16:03:18.607+02:00 level=INFO source=types.go:107 msg="inference compute" id=0 library=cpu variant=avx2 compute="" driver=0.0 name="" total="14.9 GiB" available="9.8 GiB"
[GIN] 2024/09/23 - 16:03:18 | 200 |     230.464µs |       127.0.0.1 | GET      "/api/tags"

@MaledictYtb MaledictYtb added the bug Something isn't working label Sep 23, 2024
@Jeffser
Owner

Jeffser commented Sep 23, 2024

I believe this is a bug with all (or most) GTK apps on KDE. Could you test a different GTK app, like GNOME Calculator?

@Xathros1

I believe this is a bug with all (or most) GTK apps on KDE. Could you test a different GTK app, like GNOME Calculator?

Yes, this isn't an issue specific to Alpaca. It seems to affect all GTK4 apps.

@Jeffser
Owner

Jeffser commented Sep 23, 2024

I will close this issue since I have confirmation that it isn't an Alpaca problem. Sorry, I can't help with this.

@Jeffser Jeffser closed this as completed Sep 23, 2024
@MaledictYtb
Author

Well, it's really strange, since I don't seem to have this problem with GNOME Calculator.

@Xathros1

Xathros1 commented Sep 24, 2024

Well, it's really strange, since I don't seem to have this problem with GNOME Calculator.

GNOME Calculator doesn't use GTK4, I think.

This is actually a GTK4 bug/regression. It's triggered when you have global scaling AND a cursor theme whose "nominal size" (the size you choose in the Cursor Theme KCM) differs from its image sizes.

E.g. if you run xcursor-viewer /usr/share/icons/breeze_cursors/cursors/, you'll see info like "Nominal size: 24. Image size: 32x32.".

But for xcursor-viewer /usr/share/icons/Adwaita/cursors/, it will be "Nominal size: 24. Image size: 24x24.".
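The effect of that mismatch can be sketched with a rough back-of-the-envelope model (this is an illustrative sketch, not the actual GTK code; the function names are made up for the example): if the image size ends up scaled as though it were the nominal size the user picked, a Breeze cursor comes out oversized while Adwaita stays correct.

```python
def rendered_cursor_px(image_px: int, scale: float) -> float:
    """Illustrative model of the regression: the theme's image size
    is scaled as if it were the user's chosen nominal size."""
    return image_px * scale

def expected_cursor_px(nominal_px: int, scale: float) -> float:
    """What the user actually asked for: nominal size times scaling."""
    return nominal_px * scale

# Breeze: nominal 24, 32x32 images -> at 125% it renders ~40 px, not 30 px.
print(rendered_cursor_px(32, 1.25))   # 40.0
print(expected_cursor_px(24, 1.25))   # 30.0
# Adwaita: nominal 24, 24x24 images -> both agree at 30 px.
print(rendered_cursor_px(24, 1.25))   # 30.0
```

At 100% scaling the two values coincide for any theme, which matches the bug only appearing with non-default scaling.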

So it can be worked around by using a cursor theme like Adwaita.
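On KDE Plasma, that switch can also be done from a terminal with `plasma-apply-cursortheme` (shipped with Plasma since 5.23; the exact theme name available on your system may differ, so list them first):

```shell
# List the cursor themes installed on the system
plasma-apply-cursortheme --list-themes

# Switch to Adwaita, whose nominal and image sizes match
plasma-apply-cursortheme Adwaita
```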

There is already a submitted fix, but it hasn't been merged yet:

https://gitlab.gnome.org/GNOME/gtk/-/merge_requests/7722

@KaKi87

KaKi87 commented Nov 6, 2024

Hello,
Any news on this?
Thanks

@MaledictYtb
Author

The fix has been merged into GTK. We just need to wait for an update of GTK.
