Name and Version
llama-server WebUI
version: 6980 (299f5d7)
built with Ubuntu clang version 21.0.0 (++20250702083358+a75587d2718f-1exp120250702083509.1007) for x86_64-pc-linux-gnu
Operating systems
No response
Which llama.cpp modules do you know to be affected?
llama-server
Command line
Problem description & steps to reproduce
I am using the WebUI through llama-server with various models, e.g. Qwen3-30B-A3B-Instruct-2507, to summarize articles.
While text is being generated by the local model, I start reading.
The summaries I read often span several screenfuls.
When the model stops generating, the WebUI auto-scrolls to the bottom of the generated page.
This massively disrupts my reading; I have to scroll back up and find the place where I was.
Asking for clarification often generates another two to three screen pages. When the answer finishes, my reading flow is again interrupted by the WebUI auto-scrolling to the end, forcing me to scroll back to find my last position.
Please add an option to disable auto-scroll. At most, scroll once, at the exact moment the prompt is submitted, and leave reading/positioning to the user after that, especially if the user has already used the scroll wheel.
Using the scroll wheel (or PgUp/PgDn) to change the scroll position indicates that the user wants a position other than the one the WebUI picks. That should be honored, or there should be a setting to turn off any WebUI-initiated scrolling.
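A common pattern that matches this request is "stick to bottom only while the reader is already at the bottom": auto-scroll on new tokens only if the user has not scrolled away. A minimal, framework-agnostic sketch of the decision logic (the names and threshold here are illustrative assumptions, not taken from the llama.cpp WebUI code):

```typescript
// Scroll geometry of the chat container, as reported by the DOM.
interface ScrollState {
  scrollTop: number;    // current scroll offset from the top
  clientHeight: number; // visible viewport height
  scrollHeight: number; // total content height
}

// Tolerance so small sub-pixel or rounding offsets still count as "at bottom".
const BOTTOM_THRESHOLD_PX = 40;

// True if the remaining distance to the bottom is within the threshold.
function isAtBottom(s: ScrollState): boolean {
  return s.scrollHeight - (s.scrollTop + s.clientHeight) <= BOTTOM_THRESHOLD_PX;
}

// Called on every streamed chunk: auto-scroll only if the reader has not
// manually scrolled away and is still at (or near) the bottom. A manual
// scroll away from the bottom disables auto-scroll until the user returns
// to the bottom or submits a new prompt.
function shouldAutoScroll(s: ScrollState, userHasScrolledUp: boolean): boolean {
  return !userHasScrolledUp && isAtBottom(s);
}
```

With this logic, a reader parked mid-document is never yanked to the bottom when generation finishes, while a reader who stays at the bottom keeps following the stream.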
Thanks for consideration.
First Bad Commit
No response