forbidden requests cause elasticsearch to fullGC #302

Closed
viewsite opened this Issue Dec 28, 2017 · 3 comments


viewsite commented Dec 28, 2017

Hello.
I installed readonlyrest-1.16.14_es5.6.4.zip. When requests without permission to the cluster were forbidden by the plugin, I noticed (via the task management API) that their tasks remained in the cluster, and the number of leftover tasks grows as more HTTP requests are forbidden. After a few days I found that millions of tasks had accumulated in memory, causing continuous full GC.

  • command:

curl "localhost:9200/_tasks?pretty&filter_path=nodes.*.tasks.*.action" | grep action | sort | uniq -c

  • result:

236 "action" : "cluster:monitor/health"
8191197 "action" : "cluster:monitor/main"

  • log:
    [2017-12-28T12:14:44,526][WARN ][o.e.m.j.JvmGcMonitorService] [1498146454000002009] [gc][old][240145][905] duration [16.8s], collections [1]/[17s], total [16.8s]/[2.1h], memory [29.1gb]->[12.1gb]/[29.9gb], all_pools {[young] [416.5mb]->[3.8mb]/[665.6mb]}{[survivor] [83.1mb]->[0b]/[83.1mb]}{[old] [28.6gb]->[12.1gb]/[29.1gb]}

    This is easy to reproduce: just keep sending forbidden requests to the cluster and then check the cluster's tasks.
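The grep | sort | uniq -c pipeline above can also be expressed as a short Python sketch. The nested nodes.*.tasks.*.action layout follows the filtered _tasks response used in the curl command; the sample payload below is made up for illustration only.

```python
from collections import Counter

def count_task_actions(tasks_response):
    """Count tasks per action in a _tasks API response
    (equivalent to the grep | sort | uniq -c pipeline above)."""
    counts = Counter()
    for node in tasks_response.get("nodes", {}).values():
        for task in node.get("tasks", {}).values():
            counts[task["action"]] += 1
    return counts

# Hypothetical sample shaped like the filtered _tasks output.
sample = {
    "nodes": {
        "node-1": {
            "tasks": {
                "node-1:1": {"action": "cluster:monitor/main"},
                "node-1:2": {"action": "cluster:monitor/main"},
                "node-1:3": {"action": "cluster:monitor/health"},
            }
        }
    }
}

print(count_task_actions(sample))
```

On a healthy cluster the counts stay small and stable; with this bug, the cluster:monitor/main count grows without bound as forbidden requests keep arriving.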

@viewsite viewsite changed the title from forbidden request cause elasticsearch to fullGC to forbidden requests cause elasticsearch to fullGC Dec 28, 2017

Owner

sscarduzio commented Dec 28, 2017

Hi @viewsite, thanks for reporting this. It's a regression: we found and fixed this issue months ago, before a couple of major refactoring sessions that must have reintroduced the behaviour.

Just to let you know, I'm on this right now. Will update soon with progress. 👍

Owner

sscarduzio commented Dec 29, 2017

@viewsite this is now fixed; I'll release a new version this weekend!


viewsite commented Jan 2, 2018

Great!
