Don't read unlimited data from files #19095
Conversation
No problems.
I still haven't looked through the code, but I have a general comment. Something feels unnatural to me about artificially imposing restrictions such as a maximum size for the file being read. I doubt there are objective criteria that allow such a restriction to be set statically in the general case, so there is a high probability that sooner or later it will lead to application usage problems, the likes of which appear from time to time (for example, qBittorrent cannot load some

@Chocobo1
Currently qbt isn't resilient enough when the file being read is malicious/malformed; for example, a file linked to `/dev/zero` can generate infinite data. I feel that qbt can behave better in such situations. Instead of trying to exhaust reading it and causing qbt itself to halt forever (which is a kind of denial-of-service attack), it can simply stop when it is too much. libtorrent has such limits too.
First of all, I think it is worth it, and living in the modern world I have high expectations of qbt being resilient to denial-of-service situations.
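To illustrate the idea of stopping instead of reading forever, here is a minimal sketch of a bounded file read. This is not qBittorrent's actual code; `readFileBounded` and its signature are hypothetical, and the technique is simply "read at most limit + 1 bytes and treat anything beyond the limit as an error", which defeats endless sources such as `/dev/zero`:

```cpp
#include <cstddef>
#include <fstream>
#include <optional>
#include <string>

// Hypothetical helper: read a whole file, but never consume more than
// `maxSize` bytes. A path that produces endless data (e.g. a symlink to
// /dev/zero) is rejected instead of hanging the caller.
std::optional<std::string> readFileBounded(const std::string &path, const std::size_t maxSize)
{
    std::ifstream file {path, std::ios::binary};
    if (!file)
        return std::nullopt;

    std::string data;
    data.resize(maxSize + 1);  // one extra byte to detect oversized input
    file.read(data.data(), static_cast<std::streamsize>(data.size()));
    const auto bytesRead = static_cast<std::size_t>(file.gcount());
    if (bytesRead > maxSize)
        return std::nullopt;  // more data than allowed: treat as an error

    data.resize(bytesRead);
    return data;
}
```

The key point is that the read itself is bounded; the check does not rely on asking the filesystem for a size beforehand, which special files like `/dev/zero` report misleadingly.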
Force-pushed from e6ca21b to 8f92295
PR updated, comments addressed. I no longer strongly insist on having a limited size, but I would still recommend providing one.
In fact, I am most concerned with how to avoid obvious nonsense, such as the situation I mentioned above, where qBittorrent is unable to load a correct file created by itself. This could be handled by dynamically increasing the limit in such cases.
Which issue? Are you sure it is caused by qbt limits, or is it about libtorrent limits?
What kind of 'other files'?
Then there will be >10 settings/limits, one for each file read. Do you consider that practical? Or could some of them share the same setting?
The one I'm talking about is caused by libtorrent limits. I just gave it as an example of problems of this kind.

These are files that are not produced by qBittorrent itself, so we cannot adjust their limits dynamically, and the only convenient way is to allow users to configure their limits.
We can start with those whose hardcoded limits turn out to be the most questionable. In addition, we may not initially provide a UI for these settings.
I don't mind if there are only a few of them. But you will need to pinpoint each read instance and the settings key name.
Well, I finally have the code review done.
```cpp
file.close();
}
else
const int fileMaxSize = 10 * 1024 * 1024;
```
I believe we could avoid making this one configurable in the near future if we make it dependent on "max articles per feed".
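The suggestion above could look something like the following sketch. Everything here is illustrative, not the PR's actual code: the function name, the per-article byte estimate, and the minimum floor are all assumptions; the point is only that the file-size cap scales with the existing "max articles per feed" setting instead of being a fixed constant:

```cpp
#include <algorithm>

// Assumed average serialized size of one feed article; not a measured value.
constexpr long long estimatedBytesPerArticle = 8 * 1024;
// Never let the derived limit drop below a sane floor.
constexpr long long minFeedFileSize = 1 * 1024 * 1024;

// Hypothetical helper: derive the feed-file read limit from the
// "max articles per feed" setting rather than hardcoding it.
long long feedFileSizeLimit(const int maxArticlesPerFeed)
{
    const long long scaled =
        static_cast<long long>(maxArticlesPerFeed) * estimatedBytesPerArticle;
    return std::max(minFeedFileSize, scaled);
}
```

With a default of, say, 100 articles, the derived limit would stay at the 1 MiB floor, while a user who raises the article count automatically gets a proportionally larger cap, avoiding the "qBittorrent cannot load a file it wrote itself" failure mode discussed above.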
PR updated, comments addressed.
Force-pushed from 343e801 to cf48894
It now guards against reading infinite files such as `/dev/zero`, and most reads are bounded by a (lax) limit. As a side effect, more checks are done when reading a file, and overall the reading procedure is more robust.