
High memory usage leads to fork errors #7

Closed

ntg opened this issue Nov 18, 2011 · 5 comments

ntg commented Nov 18, 2011

Sorry about the format of this bug report; it's kind of vague because I don't have much detail to give. This seems to be an isolated occurrence, but it looks serious enough to report in case others are having similar problems.

Quick baseline:
rtorrent/libtor svn1272, Ubuntu 11.04, 16GB of RAM

Description:
On my machine with 16GB of RAM, rtorrent is the only thing running, and it somehow got itself into a spot where it was using 50.1% (8.0001GB) of resident memory (as shown by top). At that point it started spitting out fork errors (ERR_NOMEM) every time it tried to spawn a scheduled process (which happens a few times a minute). I didn't notice this until 6 hours later, at which point I had to kill and restart rtorrent to free the RAM. In that entire time, rtorrent continued with its data transfers at ~2MB/s up and down, but all other scheduled activity stopped due to fork errors. Presumably the forks were failing because the kernel has to account for a child as large as the parent, but...
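To make that failure mode concrete, here's a minimal standalone C sketch (not rtorrent code, just my theory of what happened) of one way fork() returns ENOMEM: strict overcommit accounting (vm.overcommit_memory=2). For what it's worth, in that mode with the default overcommit_ratio of 50 and no swap, the commit limit on a 16GB box is exactly 8GB, which would line up with the 50% ceiling I saw--though whether my box was actually in that mode is speculation.

```c
/* Sketch (assumption, not rtorrent code): under strict overcommit
 * (vm.overcommit_memory=2), fork() of a process holding a large private
 * mapping can fail with ENOMEM, because the kernel must reserve commit
 * charge for the child's potential copy of every private page, even
 * though the pages would actually be shared copy-on-write. */
#include <errno.h>
#include <stdio.h>
#include <string.h>
#include <sys/mman.h>
#include <sys/wait.h>
#include <unistd.h>

int main(void) {
    size_t len = 8UL << 30; /* 8 GiB, sized to mirror the report */
    char *p = mmap(NULL, len, PROT_READ | PROT_WRITE,
                   MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
    if (p == MAP_FAILED) { perror("mmap"); return 1; }
    memset(p, 1, len); /* touch the pages so they are really committed */

    pid_t pid = fork();
    if (pid < 0) {
        /* With less than 8 GiB of commit headroom remaining, this prints
         * "fork: Cannot allocate memory" (ENOMEM). */
        fprintf(stderr, "fork: %s\n", strerror(errno));
        return 1;
    }
    if (pid == 0)
        _exit(0);
    waitpid(pid, NULL, 0);
    return 0;
}
```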

Questions:

  1. Why would rtorrent require 8GB in the first place? I could understand if there were high speeds and thousands of buffers involved, but transfer speeds were slow--only 2MB/s up and down--with 17 active torrents. If it makes a difference, the total size of the active torrents was about 500GB, the size of ALL seeding torrents was over a terabyte, and the active torrents included 4 slow downloads totalling 100GB, downloading at a combined 2MB/s. rtorrent reported memory usage of ~150MB, so there must have been almost 8GB of memory-mapped files. I'm afraid that running memstat slipped my mind until after I killed/restarted rtorrent, but 8GB seems extremely high no matter what it was trying to do. I lowered the bufsize to 256k and maxOpenFiles to 4, and stopped all torrents except the downloads, but there was no change in memory usage, so I just shut down rtorrent and restarted.

  2. Has rtorrent always forked the scheduled processes this way? If so, then rtorrent could never have used more than half the available memory, which seems like a huge waste (and it should also have caused problems for me and many other people). It also means that only one forked process can safely run at a time, or the memory requirement multiplies with each concurrent fork. Either way, would it be possible to fix this by running just a small controller as the main program, which would delegate all the memmaps and scheduled processes to separately-forked child processes (see the sketch after these questions)? That way the main program and all of its forks would be tiny, although I assume it could take a fair amount of rewriting to structure it this way (I haven't looked at the code yet).

  3. Although I didn't get to run memstat, is there anything that seems likely to be causing this? Could the memmapped regions be leaking instead of getting freed, or is there another likely explanation? I've watched the resident memory, and in normal use the RAM usage occasionally jumps up to ~2GB when I'm downloading above 50MB/s (400Mbps)...but within seconds it flushes the data and the RAM usage returns to just a couple hundred MB. This time it seems to have kept rising until it hit 8GB, at which point it couldn't rise any further (or apparently even drop), and it just got stuck until I shut down rtorrent.

  4. ...Or is it maybe something that's just configured wrong with my setup, and not rtorrent at all?
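Re: question 2, here's a rough sketch of the lightweight-spawning idea, assuming plain fork()+exec() is what rtorrent does today (I haven't checked the code). posix_spawn() runs an external command without the kernel having to account for a full copy of the parent's address space; run_scheduled is a made-up helper name, not anything in rtorrent.

```c
/* Sketch (assumption, not rtorrent's actual implementation): spawn a
 * scheduled external command with posix_spawn() instead of fork()+exec(). */
#include <spawn.h>
#include <stdio.h>
#include <sys/wait.h>
#include <unistd.h>

extern char **environ;

/* Hypothetical helper: run a command and return its exit status. */
int run_scheduled(const char *path, char *const argv[]) {
    pid_t pid;
    /* posix_spawn typically uses vfork()/CLONE_VM internally, so it can
     * succeed even when the parent is too large to fork() outright. */
    int err = posix_spawn(&pid, path, NULL, NULL, argv, environ);
    if (err != 0) {
        fprintf(stderr, "posix_spawn failed: %d\n", err);
        return -1;
    }
    int status;
    waitpid(pid, &status, 0);
    return WIFEXITED(status) ? WEXITSTATUS(status) : -1;
}

int main(void) {
    char *const argv[] = { "echo", "scheduled task ran", NULL };
    return run_scheduled("/bin/echo", argv);
}
```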

Again, sorry for the report with lots of talk and few facts, but I'm hoping this description is enough to give an idea of what happened.

rakshasa (Owner) commented

Try again with the latest git version, as there have been some fixes to background execution of programs.

Also, it might be that the available address space is limited by the kernel to 8GB while rtorrent is configured to try to use more (try setting 'max_memory_usage=7G').

ntg (Author) commented Nov 18, 2011

I'll set max_memory_usage to 7G and build from head, but I expect to have trouble reproducing this. I've never seen it happen before, and I've used rtorrent on machines with as little as 400MB (where I allowed rtorrent itself to eat up all the RAM; there were OOM errors but no fork errors back then with 0.8.6).

I don't think that 8GB is significant apart from the fact that it's half the available RAM. ulimit -l shows 64-bit addressable memory, and rtorrent was already using more than 8GB of address space. The forks only failed because the system couldn't allocate another 8.01GB to match the parent process, so I don't think the kernel was limiting anything, but I'll still set it to 7GB.

Changing the execution of background processes will help with the fork errors if these conditions occur again, but it would also let the other main issue go unchecked: why would rtorrent even get to 8GB in the first place when it was doing so little? That amount of activity would usually use no more than 500MB of RAM (and usually a lot less). I'll be sure to run memstat if this happens again, but since I can't force this bug to happen, I have to play with theories right now.

BTW, memory usage in rtorrent has always been kind of foggy to me; I was never entirely sure what number rtorrent is reporting or exactly how to figure out where the RAM was going. What's the best way to determine the memory used by a) rtorrent itself; b) memory-mapped files; and c) disk buffers?

rakshasa (Owner) commented

It's actually 'ulimit -m' you're supposed to use.

If the memory shows up in ps or top as VSZ / VIRT, it is the address space used, which has no direct connection to actual memory usage. Under RSS / RES you have the memory the rtorrent process itself uses, not counting mapped files, shared libraries, and the like.
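If you want to see where those two numbers come from, a small sketch assuming Linux, which exposes them in /proc/self/status as VmSize and VmRSS:

```c
/* Sketch, assuming Linux: print the same two numbers top shows as
 * VIRT (VmSize, address space) and RES (VmRSS, resident set) for the
 * current process by reading /proc/self/status. */
#include <stdio.h>
#include <string.h>

int main(void) {
    FILE *f = fopen("/proc/self/status", "r");
    if (!f) { perror("fopen"); return 1; }
    char line[256];
    while (fgets(line, sizeof line, f)) {
        /* VmSize: = address space (VSZ/VIRT); VmRSS: = resident (RSS/RES) */
        if (strncmp(line, "VmSize:", 7) == 0 ||
            strncmp(line, "VmRSS:", 6) == 0)
            fputs(line, stdout);
    }
    fclose(f);
    return 0;
}
```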

I don't know of any easy way to display the amount of memory used by the mapped files, except through the mincore() system call.
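Something along these lines would work (a rough sketch, with a made-up filename) to report how many pages of a mapped file are currently resident:

```c
/* Sketch, assuming Linux: use mincore() to count how many pages of a
 * mapped file are resident in RAM. The filename is hypothetical. */
#include <fcntl.h>
#include <stdio.h>
#include <stdlib.h>
#include <sys/mman.h>
#include <sys/stat.h>
#include <unistd.h>

int main(void) {
    const char *path = "some-torrent-piece.dat"; /* made-up example file */
    int fd = open(path, O_RDONLY);
    if (fd < 0) { perror("open"); return 1; }

    struct stat st;
    if (fstat(fd, &st) < 0) { perror("fstat"); return 1; }

    void *map = mmap(NULL, st.st_size, PROT_READ, MAP_SHARED, fd, 0);
    if (map == MAP_FAILED) { perror("mmap"); return 1; }

    long psz = sysconf(_SC_PAGESIZE);
    size_t pages = (st.st_size + psz - 1) / psz;
    unsigned char *vec = malloc(pages);
    if (!vec) { perror("malloc"); return 1; }

    /* mincore fills vec[i]; bit 0 is set if page i is resident. */
    if (mincore(map, st.st_size, vec) < 0) { perror("mincore"); return 1; }

    size_t resident = 0;
    for (size_t i = 0; i < pages; i++)
        resident += vec[i] & 1;

    printf("%zu of %zu pages resident (%.1f%%)\n",
           resident, pages, 100.0 * resident / pages);
    free(vec);
    munmap(map, st.st_size);
    close(fd);
    return 0;
}
```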

ntg (Author) commented Dec 6, 2011

Not much point in keeping this open now, since it's only happened to me once in 2 years of using multiple rtorrent versions. I'll close it for now; if I find more info later, I'll either re-open this or file a new ticket.

jenkins101 commented

Hi,

I am running into this too, when trying to use ipv4_filter.load with a 4.5MB file from iblocklist (the level1 block list).

Memory usage is high: around 1GB on a 2GB system with more or less nothing else running.

libtorrent 0.13.3 and rtorrent 0.9.3.

Everything works fine without the ipv4_filter.
