Implement backpressure #106
Thanks for filing the issue. Now, talking about it:
I don't think this can be done as described. If we "consume" the buffer as the user scrolls down the page, it would be impossible to show the same data again if the user decides to scroll back up. I knew this problem would surely arise someday, and I have yet to figure out the correct solution. Here are some of the ideas I have thought of:
I also believe that the solution isn't just one of these but rather a mixture of bits and pieces from all of them. Anyway, if you could share more about how exactly you are using minus, that would be quite helpful. If you want to contribute, you are more than welcome to open a PR. Comment below with any suggestions you have. EDIT: Gave numbers to each point
You are correct that you cannot scroll back up to the start if we limit the pager's buffer. However, if the input file is big enough, that's the only reasonable thing we can do, no? I tried. However, I still think this feature is a good idea for my use case.
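The trade-off discussed above can be made concrete with a small sketch. This is not minus's actual internal structure, just a hypothetical illustration: if the buffer is capped and old lines are evicted as new ones arrive, scrolling back to evicted content is no longer possible.

```rust
use std::collections::VecDeque;

/// A scrollback buffer capped at `cap` lines (hypothetical illustration,
/// not minus's real data structure). Pushing past the cap evicts the
/// oldest line, so earlier content cannot be shown again on scroll-up.
struct BoundedBuffer {
    lines: VecDeque<String>,
    cap: usize,
}

impl BoundedBuffer {
    fn new(cap: usize) -> Self {
        Self { lines: VecDeque::with_capacity(cap), cap }
    }

    /// Appends a line, returning the evicted line if the cap was reached.
    fn push(&mut self, line: String) -> Option<String> {
        let evicted = if self.lines.len() == self.cap {
            self.lines.pop_front()
        } else {
            None
        };
        self.lines.push_back(line);
        evicted
    }
}

fn main() {
    let mut buf = BoundedBuffer::new(2);
    assert_eq!(buf.push("line 1".to_string()), None);
    assert_eq!(buf.push("line 2".to_string()), None);
    // The third push evicts "line 1": scrolling back to it is impossible.
    assert_eq!(buf.push("line 3".to_string()), Some("line 1".to_string()));
}
```

This is exactly why capping the buffer only works if the user accepts losing access to the oldest lines.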
As I mentioned above,
I think this could work, but
I spent last night giving thought to each of these ideas, which is why I am replying quite late today. I know the 64KB approach of (3) is another solution for applications, but I fear that it might require more knowledge of the data; I want minus to know about the data only quantitatively. (1) can make the application code quite repetitive: you have to call whatever your fetch function is in every key/mouse binding that moves the screen down. I think the best option would be to implement (2) and have a hook like
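Option (2) with a hook could look something like the sketch below. Everything here is hypothetical (the `FetchMore` trait, the `maybe_fetch` helper, and the `MARGIN` threshold are illustration only, not part of minus's API): the pager asks the application for more lines whenever the view nears the end of the buffered data, so the application does not have to repeat fetch calls in every keybinding.

```rust
/// Hypothetical hook: the pager calls `fetch_more` when the user scrolls
/// near the end of the buffered data, pulling input on demand instead of
/// the application pushing the whole input up front.
trait FetchMore {
    /// Returns the next chunk of lines, or `None` when input is exhausted.
    fn fetch_more(&mut self) -> Option<Vec<String>>;
}

/// Toy data source backed by pre-chunked lines.
struct FileChunks {
    remaining: Vec<Vec<String>>,
}

impl FetchMore for FileChunks {
    fn fetch_more(&mut self) -> Option<Vec<String>> {
        if self.remaining.is_empty() {
            None
        } else {
            Some(self.remaining.remove(0))
        }
    }
}

/// Sketch of the pager-side logic: top up the buffer once the bottom of
/// the view comes within `MARGIN` lines of the buffered data's end.
fn maybe_fetch(buffer: &mut Vec<String>, view_bottom: usize, src: &mut dyn FetchMore) {
    const MARGIN: usize = 50; // hypothetical look-ahead threshold
    if buffer.len().saturating_sub(view_bottom) < MARGIN {
        if let Some(chunk) = src.fetch_more() {
            buffer.extend(chunk);
        }
    }
}

fn main() {
    let mut src = FileChunks {
        remaining: vec![vec!["a".to_string(), "b".to_string()]],
    };
    let mut buf: Vec<String> = vec![];
    maybe_fetch(&mut buf, 0, &mut src);
    assert_eq!(buf.len(), 2);
}
```

The appeal of this shape is that the scroll-handling code inside the pager calls the hook in one place, rather than the application wiring a fetch into each key/mouse binding as in option (1).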
I've maybe got an idea for this: what if
Could you describe it a bit more?
I pushed some commits, specifically c0475fd and 4a015a8, which greatly improve text append throughput. I will post some concrete benchmarks tomorrow. EDIT: Here are the benchmarks after the changes: test result: ok. 0 passed; 0 failed; 63 ignored; 1 measured; 0 filtered out; finished in 842.37s. I wasn't able to get benchmarks from before these changes, as the benchmark process just didn't finish even in 4 hrs. It still takes about 2.7 secs, so there should be a lot to improve upon. I have created #127 to track performance improvements all over the project.
@FlipB can you do this test again against the latest main and post the results? |
@arijit79 Sorry for the late reply; I forgot to respond to this.
I think it's clearer if we use an image here. My idea is the following:
It's alright. @TornaxO7, I really appreciate your efforts. I have created a branch called
Is your feature request related to a problem? Please describe.
Minus becomes unresponsive when attempting to page big files (gigabytes).
Describe the solution you'd like
It seems like the pager's buffer is unbounded and keeps growing in an attempt to fit the entire input. Instead, the pager ought to limit the buffer to e.g. 1000 lines and block `write_str` when full. Scrolling the output consumes the buffer, allowing new lines to be written by `write_str`.
Additional context
It spends virtually all its time in `PagerState::append_str`, pegging the CPU at 100%.
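The bounded-buffer-plus-blocking-write behavior requested above can be sketched with a bounded channel from the standard library. This is an illustration of the backpressure mechanism, not minus's implementation: `sync_channel` blocks the sender once the configured number of unconsumed lines is reached, which is exactly the "block `write_str` when full" semantics.

```rust
use std::sync::mpsc::sync_channel;
use std::thread;

fn main() {
    // A bounded channel of 1000 lines: `send` blocks once 1000 lines sit
    // unconsumed, so the producer cannot outrun the pager unboundedly.
    let (tx, rx) = sync_channel::<String>(1000);

    let producer = thread::spawn(move || {
        for i in 0..5000 {
            // Blocks when the buffer is full instead of growing without bound.
            tx.send(format!("line {i}")).unwrap();
        }
        // Dropping `tx` here ends the consumer's loop below.
    });

    // The "pager" side: consuming a line frees a slot for the producer.
    let mut consumed = 0;
    for _line in rx {
        consumed += 1;
    }
    producer.join().unwrap();
    assert_eq!(consumed, 5000);
}
```

Note the thread's caveat still applies: once a line is consumed from such a channel it is gone, so this mechanism alone does not preserve scroll-back.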