Out of memory - possible memory leak #60

Closed · Zomis opened this issue Feb 3, 2015 · 8 comments

Zomis (Owner) commented Feb 3, 2015

The Tomcat server has started running out of memory. I believe this began quite recently, possibly with the GitHub API requests, the StackExchange API requests, or the refactoring of the daily updates. Either way, try to locate this bug.
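
One way to start narrowing it down, assuming the bot runs on a standard JDK and the machine is reachable while memory is filling up, is to take a class histogram of the live process (the PID below is a placeholder):

```sh
# Print the classes holding the most heap in the running JVM.
# A class whose instance count keeps growing between samples is a leak suspect.
jmap -histo:live <pid> | head -n 25
```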

Zomis added the bug label on Feb 3, 2015

skiwi2 (Contributor) commented Feb 28, 2015

Do you have any idea of the date or commit at which this issue started happening?

Zomis (Owner) commented Feb 28, 2015

It was definitely working at the beginning of 2015. I think it started happening in the middle of January; it could possibly be related to when I switched from Windows to Linux. I noticed yesterday that the whole machine runs low on memory when this issue occurs.

What I am sure of is that it broke somewhere between 6161c9b and 3d92844.

skiwi2 (Contributor) commented Feb 28, 2015

Does the amount of RAM used by this application increase steadily over time?
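
One simple way to check that, assuming you can add a small scheduled task somewhere in the bot's startup (the class name and interval below are only illustrative), would be to log heap usage periodically and watch whether the used size keeps climbing:

```java
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryMXBean;
import java.lang.management.MemoryUsage;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

// Illustrative heap-usage logger; not part of the actual bot code.
public class HeapUsageLogger {
    public static void start() {
        MemoryMXBean memory = ManagementFactory.getMemoryMXBean();
        ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
        scheduler.scheduleAtFixedRate(() -> {
            MemoryUsage heap = memory.getHeapMemoryUsage();
            // A "used" value that grows steadily across many samples,
            // even after garbage collections, points at a leak.
            System.out.printf("heap used: %d MB, committed: %d MB%n",
                    heap.getUsed() / (1024 * 1024),
                    heap.getCommitted() / (1024 * 1024));
        }, 0, 5, TimeUnit.MINUTES);
    }
}
```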

skiwi2 (Contributor) commented Feb 28, 2015

Do you also have an example of a stack trace that the OOME left behind?

While the cause is not necessarily in that specific method, it could still give some hints.
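
If no trace was recorded, the JVM can at least be told to save the heap at the moment of the failure; a minimal setup, assuming a HotSpot JVM started through Tomcat's setenv.sh (the path below is a placeholder), would be:

```sh
# In $CATALINA_BASE/bin/setenv.sh (assumed location):
# write a heap dump whenever an OutOfMemoryError is thrown, so it can later be
# inspected with a tool such as Eclipse MAT or jvisualvm.
export CATALINA_OPTS="$CATALINA_OPTS \
  -XX:+HeapDumpOnOutOfMemoryError \
  -XX:HeapDumpPath=/tmp/duga-oom.hprof"
```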

Zomis (Owner) commented Feb 28, 2015

I'll have to investigate the RAM usage; I am not sure about that.

There is no stack trace, unfortunately.

Zomis (Owner) commented Mar 1, 2015

This might have been fixed. The bot has survived for more than one day so far. Leaving the issue open for a little longer in case it is not fixed.

Zomis (Owner) commented Apr 26, 2015

This is not fixed. It is confirmed not to be caused by the comment scanning, though.

Zomis (Owner) commented May 28, 2015

Duga has been up and running now for ten days without the need to restart her. http://chat.stackexchange.com/transcript/20298?m=21702930#21702930
