
Don't read unlimited data from files #19095

Merged · 1 commit · Jun 14, 2023

Conversation
Conversation

Chocobo1 (Member) commented Jun 5, 2023

As a side effect, more checks are done when reading a file, and the overall reading procedure is more robust.

thalieht (Contributor) commented Jun 5, 2023

No problems.

glassez (Member) commented Jun 6, 2023

I still haven't looked through the code, but I have a general comment.

Something about this idea of artificially imposing restrictions, such as a maximum size for the file being read, feels unnatural to me. I doubt there are objective criteria that would allow such a restriction to be set statically in the general case, so there is a high probability that sooner or later it will lead to application usage problems of the kind that appear from time to time (for example, qBittorrent being unable to load some .fastresume files it created itself earlier).

@Chocobo1
Since you are proposing such a PR, you have probably delved into this issue, so please explain: what is the main purpose of introducing such restrictions? And is it worth it, given the possible drawbacks?

Review comments (outdated, resolved): src/base/utils/fs.h, src/base/utils/fs.cpp
Chocobo1 (Member, Author) commented Jun 6, 2023

> Since you are proposing such a PR, you have probably delved into this issue, so please explain: what is the main purpose of introducing such restrictions?

Currently qbt isn't resilient enough when the file being read is malicious/malformed; for example, a file linked to /dev/zero can generate infinite data. I feel that qbt can behave better in such situations: instead of trying to read it to exhaustion and halting forever (which is a kind of denial-of-service attack), it can simply stop when the data is too large. libtorrent has such limits too.
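The idea can be sketched in plain C++ (an illustrative helper, not the actual qBittorrent code; the function name and limit are assumptions):

```cpp
#include <cassert>
#include <fstream>
#include <optional>
#include <string>
#include <vector>

// Illustrative sketch: read at most `maxSize` bytes from `path` and fail
// if more data remains, so an endless file such as /dev/zero cannot make
// the caller loop (or allocate) forever.
std::optional<std::vector<char>> readFileCapped(const std::string &path, const std::size_t maxSize)
{
    std::ifstream file {path, std::ios::binary};
    if (!file)
        return std::nullopt;  // open failed

    // Request one byte more than the limit: if we actually receive it,
    // the file is over the cap and the read is rejected.
    std::vector<char> data(maxSize + 1);
    file.read(data.data(), static_cast<std::streamsize>(data.size()));
    const auto bytesRead = static_cast<std::size_t>(file.gcount());

    if (bytesRead > maxSize)
        return std::nullopt;  // file exceeds the allowed limit

    data.resize(bytesRead);
    return data;
}
```

The key point is that the read itself is bounded up front, rather than reading everything first and checking the size afterwards.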

> And is it worth it, given the possible drawbacks?

First of all, I think it is (worth it); living in the modern world, I have high expectations of qbt being resilient to denial-of-service situations.
Probably the main concern is the size limit, and IMO it can simply be raised when it gets in the way. The limits could also be exposed in the UI if needed (in another PR, of course).
How do you feel about the impact of such drawbacks? From what I see on the issue tracker, reports of being denied by a size limit are scarce, probably well under 0.5% of all issues. Sure, it sometimes causes inconvenience, but that is far better than leaving open a chance to bring down qbt entirely.

@Chocobo1 changed the title from "Don't read unlimited size from files" to "Don't read unlimited data from files" on Jun 6, 2023
@Chocobo1 force-pushed the read branch 5 times, most recently from e6ca21b to 8f92295 on June 7, 2023 05:32
Chocobo1 (Member, Author) commented Jun 7, 2023

PR updated, comments addressed. I no longer strongly insist on having a limited size, though I would still recommend providing one. -1 (infinite) is allowed but must be explicitly requested.
Since I've already proposed some values, I'm open to discussing which ones are unrealistic to limit and might run into problems.
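The "-1 means unlimited, but only when explicitly requested" convention could look roughly like this (the names are illustrative, not the actual qBittorrent API):

```cpp
#include <cassert>
#include <cstdint>
#include <limits>

// Illustrative sentinel: callers must pass a limit explicitly;
// -1 deliberately opts out of any cap.
constexpr std::int64_t Unlimited = -1;

// Resolve the byte budget for a read: a non-negative limit is used as-is,
// while Unlimited maps to the largest representable size (effectively no cap).
std::int64_t readBudget(const std::int64_t requestedLimit)
{
    return (requestedLimit == Unlimited)
        ? std::numeric_limits<std::int64_t>::max()
        : requestedLimit;
}
```

Because there is no default argument, every call site has to state its limit, and an unbounded read is a visible, deliberate choice rather than an accident.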

glassez (Member) commented Jun 7, 2023

> How do you feel about the impact of such drawbacks? From what I see on the issue tracker, reports of being denied by a size limit are scarce, probably well under 0.5% of all issues. Sure, it sometimes causes inconvenience, but that is far better than leaving open a chance to bring down qbt entirely.

In fact, I am most concerned with how to avoid obvious nonsense, such as the situation I mentioned above, where qBittorrent is not able to load a correct file it created itself. This could be handled by dynamically increasing the limit in such cases.
As for other files, relying on a low percentage of limit-related problems is not very friendly behavior. Of course, we are unlikely to find limits that would suit everyone in the long run. But "raise it when it gets in the way" is not the best solution either, especially considering our situation with producing releases. So I believe that all such "limits for good" should be at least configurable.

Chocobo1 (Member, Author) commented Jun 7, 2023

> In fact, I am most concerned with how to avoid obvious nonsense, such as the situation I mentioned above, where qBittorrent is not able to load a correct file it created itself.

Which issue? Are you sure it is caused by qbt limits, or is it about libtorrent limits?

> As for other files, relying on a low percentage of limit-related problems is not very friendly behavior.

What kind of 'other files'?

> So I believe that all such "limits for good" should be at least configurable.

Then there will be >10 settings/limits, one for each file read. Do you consider that practical? Or could some of them share the same setting?

glassez (Member) commented Jun 7, 2023

> Which issue? Are you sure it is caused by qbt limits, or is it about libtorrent limits?

The one I'm talking about is caused by libtorrent limits. I just gave it as an example of this kind of problem.

> What kind of 'other files'?

These are files not produced by qBittorrent itself, so we cannot adjust their limits dynamically; the only way to make this convenient is to allow users to configure the limits.

glassez (Member) commented Jun 7, 2023

> Then there will be >10 settings/limits, one for each file read. Do you consider that practical? Or could some of them share the same setting?

We can start with those whose hardcoded limits turn out to be the most questionable. In addition, we don't have to provide a UI for these settings initially.
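A hidden, UI-less setting of this kind could be sketched as follows (the settings key, default value, and types are assumptions for illustration, not actual qBittorrent code):

```cpp
#include <cassert>
#include <cstdint>
#include <map>
#include <string>

// Minimal stand-in for a key/value settings store.
struct SettingsStorage
{
    std::map<std::string, std::int64_t> values;

    // Return the stored value, or `defaultValue` when the key is absent.
    std::int64_t loadValue(const std::string &key, const std::int64_t defaultValue) const
    {
        const auto it = values.find(key);
        return (it != values.end()) ? it->second : defaultValue;
    }
};

// The limit keeps a sensible hardcoded default but can be overridden by
// editing the config file directly, with no UI work required.
std::int64_t fastresumeMaxSize(const SettingsStorage &settings)
{
    constexpr std::int64_t defaultLimit = std::int64_t(100) * 1024 * 1024;  // 100 MiB, hypothetical
    return settings.loadValue("BitTorrent/Session/MaxFastresumeSize", defaultLimit);
}
```

Power users who hit the limit can raise it immediately, without waiting for a release that adds a preferences page.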

Chocobo1 (Member, Author) commented Jun 8, 2023

> We can start with those whose hardcoded limits turn out to be the most questionable. In addition, we don't have to provide a UI for these settings initially.

I don't mind if there are only a few of them. But you will need to pinpoint which read call sites and the settings key names.

glassez (Member) commented Jun 10, 2023

> I don't mind if there are only a few of them. But you will need to pinpoint which read call sites and the settings key names.

Well, I finally have the code review done.
The only limit I really worry about is the fastresume file size.

Review comments (outdated, resolved): src/base/bittorrent/sessionimpl.cpp (two threads)
Inline review comment on `const int fileMaxSize = 10 * 1024 * 1024;`:

I believe we could avoid making this one configurable in the near future if we make it depend on "max articles per feed".
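Deriving the cap from the existing "max articles per feed" setting could look like this (the per-article byte bound is a made-up value for illustration):

```cpp
#include <cassert>

// Illustrative: instead of a hardcoded 10 MiB cap for the feed data file,
// scale the limit with the configured per-feed article count.
int feedFileMaxSize(const int maxArticlesPerFeed)
{
    const int assumedBytesPerArticle = 10 * 1024;  // hypothetical upper bound per serialized article
    return maxArticlesPerFeed * assumedBytesPerArticle;
}
```

Under this assumed bound, a feed configured for 1024 articles gets the same 10 MiB cap as the hardcoded constant, and the limit grows or shrinks with the user's article setting.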

Other review threads (resolved): src/base/torrentfileswatcher.cpp (two threads), src/gui/aboutdialog.cpp, src/gui/rss/automatedrssdownloader.cpp, src/gui/uithemesource.cpp, src/webui/webapplication.cpp, test/testutilsio.cpp
Chocobo1 (Member, Author):

PR updated, addressed comments.

Review comments (outdated, resolved): src/base/utils/io.cpp (three threads)
@Chocobo1 force-pushed the read branch 5 times, most recently from 343e801 to cf48894 on June 12, 2023 09:32
Commit message:
It now guards against reading infinite files such as `/dev/zero`,
and most reads are bounded by a (lax) limit.
As a side effect, more checks are done when reading a file and
the overall reading procedure is more robust.
@Chocobo1 Chocobo1 merged commit 79ca2e1 into qbittorrent:master Jun 14, 2023
18 checks passed
@Chocobo1 Chocobo1 deleted the read branch June 14, 2023 05:38
@Chocobo1 Chocobo1 added this to the 4.6.0 milestone Jun 14, 2023