kv filter caused Logstash to crash #48

Closed
yidingalan opened this Issue May 24, 2017 · 8 comments

yidingalan commented May 24, 2017

  • Version: Logstash and Elasticsearch 5.4.0;
    Filebeat 5.4.0 (amd64), libbeat 5.4.0
  • Operating System: Mac OS X Yosemite
  • Config File:
filter {
  kv { 
    trim_key => "<>\[\],`\."
    remove_field => ["\\%{some_field}", "{%{some_field}"]
    include_brackets => false
  }
}
  • Sample Data:

time=2017-05-24T18:23:02.619100+00:00 severity=INFO pid=7078 method=GET path=/projects/20432/report format=html controller=reports action=show status=200 duration=2304.09 view=2172.47 db=89.51 time=2017-05-24 18:23:00 +0000 ip=216.16.231.94 host=demo.central.miovision.com params={"download_token"=>"1495650173", "report"=>{"format"=>"travel_time", "study_observation_set_ids"=>["", "57", "56", "55", ""], "bin_size"=>"3600"}, "project_id"=>"20432"}

  • Steps to Reproduce: I just upgraded our ELK stack to 5.4, and we got a timeout error when we tried to publish events:

2017/05/24 15:20:00.468812 sync.go:85: ERR Failed to publish events caused by: read tcp 10.0.3.150:56617->167.114.251.27:19130: i/o timeout
2017/05/24 15:20:00.468844 single.go:91: INFO Error publishing events (retrying): read tcp 10.0.3.150:56617->167.114.251.27:19130: i/o timeout
2017/05/24 15:20:17.127813 metrics.go:39: INFO Non-zero metrics in the last 30s: libbeat.logstash.publish.read_errors=1 libbeat.logstash.publish.write_errors=1 libbeat.logstash.published_but_not_acked_events=4

It looks like something is wrong with the kv filter; when we looked at the error log from Logstash, we got this:

[[main]>worker15] ERROR logstash.pipeline - Exception in pipelineworker, the pipeline stopped processing new events, please check your filter configuration and restart Logstash. {"exception"=>"-1", "backtrace"=>["java.util.ArrayList.elementData(ArrayList.java:418)", "java.util.ArrayList.remove(ArrayList.java:495)", "org.logstash.FieldReference.parse(FieldReference.java:37)", "org.logstash.PathCache.cache(PathCache.java:37)", "org.logstash.PathCache.isTimestamp(PathCache.java:30)", "org.logstash.ext.JrubyEventExtLibrary$RubyEvent.ruby_set_field(JrubyEventExtLibrary.java:122)", "org.logstash.ext.JrubyEventExtLibrary$RubyEvent$INVOKER$i$2$0$ruby_set_field.call(JrubyEventExtLibrary$RubyEvent$INVOKER$i$2$0$ruby_set_field.gen)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:202)", "rubyjit.LogStash::Filters::KV$$filter_ed7fe6468ef7edb99f1600863745e9b6ad8a191d1173230247.block_1$RUBY$file(/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-kv-4.0.0/lib/logstash/filters/kv.rb:326)", "rubyjit$LogStash::Filters::KV$$filter_ed7fe6468ef7edb99f1600863745e9b6ad8a191d1173230247$block_1$RUBY$file.call(rubyjit$LogStash::Filters::KV$$filter_ed7fe6468ef7edb99f1600863745e9b6ad8a191d1173230247$block_1$RUBY$file)", "org.jruby.runtime.CompiledBlock19.yield(CompiledBlock19.java:135)", "org.jruby.runtime.Block.yield(Block.java:142)", "org.jruby.RubyHash$13.visit(RubyHash.java:1355)", "org.jruby.RubyHash.visitLimited(RubyHash.java:648)", "org.jruby.RubyHash.visitAll(RubyHash.java:634)", "org.jruby.RubyHash.iteratorVisitAll(RubyHash.java:1306)", "org.jruby.RubyHash.each_pairCommon(RubyHash.java:1351)", "org.jruby.RubyHash.each19(RubyHash.java:1342)", "org.jruby.RubyHash$INVOKER$i$0$0$each19.call(RubyHash$INVOKER$i$0$0$each19.gen)", "org.jruby.runtime.callsite.CachingCallSite.callBlock(CachingCallSite.java:143)", "org.jruby.runtime.callsite.CachingCallSite.callIter(CachingCallSite.java:154)", "rubyjit.LogStash::Filters::KV$$filter_ed7fe6468ef7edb99f1600863745e9b6ad8a191d1173230247.file(/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-kv-4.0.0/lib/logstash/filters/kv.rb:326)", "rubyjit.LogStash::Filters::KV$$filter_ed7fe6468ef7edb99f1600863745e9b6ad8a191d1173230247.file(/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-kv-4.0.0/lib/logstash/filters/kv.rb)", "org.jruby.internal.runtime.methods.JittedMethod.call(JittedMethod.java:201)", "org.jruby.runtime.callsite.CachingCallSite.callBlock(CachingCallSite.java:177)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:183)", "rubyjit.LogStash::Filters::Base$$do_filter_8e8403dcfdf01a35ffca12ed35ec4e79455489071173230247.file(/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:145)", "rubyjit.LogStash::Filters::Base$$do_filter_8e8403dcfdf01a35ffca12ed35ec4e79455489071173230247.file(/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb)", "org.jruby.internal.runtime.methods.JittedMethod.call(JittedMethod.java:201)", "org.jruby.runtime.callsite.CachingCallSite.callBlock(CachingCallSite.java:177)", "org.jruby.runtime.callsite.CachingCallSite.callIter(CachingCallSite.java:188)", "rubyjit.LogStash::Filters::Base$$multi_filter_6c0dea8219a042f89f5a5b41e60697f8088a7c451173230247.block_0$RUBY$file(/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:164)", 
"rubyjit$LogStash::Filters::Base$$multi_filter_6c0dea8219a042f89f5a5b41e60697f8088a7c451173230247$block_0$RUBY$file.call(rubyjit$LogStash::Filters::Base$$multi_filter_6c0dea8219a042f89f5a5b41e60697f8088a7c451173230247$block_0$RUBY$file)", "org.jruby.runtime.CompiledBlock19.yield(CompiledBlock19.java:135)", "org.jruby.runtime.Block.yield(Block.java:142)", "org.jruby.RubyArray.eachCommon(RubyArray.java:1606)", "org.jruby.RubyArray.each(RubyArray.java:1613)", "org.jruby.RubyArray$INVOKER$i$0$0$each.call(RubyArray$INVOKER$i$0$0$each.gen)", "org.jruby.runtime.callsite.CachingCallSite.callBlock(CachingCallSite.java:143)", "org.jruby.runtime.callsite.CachingCallSite.callIter(CachingCallSite.java:154)", "rubyjit.LogStash::Filters::Base$$multi_filter_6c0dea8219a042f89f5a5b41e60697f8088a7c451173230247.file(/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:161)", "rubyjit.LogStash::Filters::Base$$multi_filter_6c0dea8219a042f89f5a5b41e60697f8088a7c451173230247.file(/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb)", "org.jruby.internal.runtime.methods.JittedMethod.call(JittedMethod.java:181)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:168)", "rubyjit.LogStash::FilterDelegator$$multi_filter_43640ebf68de601b56cb618392ab9de0b4f8c58a1173230247.file(/usr/share/logstash/logstash-core/lib/logstash/filter_delegator.rb:43)", "rubyjit.LogStash::FilterDelegator$$multi_filter_43640ebf68de601b56cb618392ab9de0b4f8c58a1173230247.file(/usr/share/logstash/logstash-core/lib/logstash/filter_delegator.rb)", "org.jruby.internal.runtime.methods.JittedMethod.call(JittedMethod.java:181)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:168)", "org.jruby.ast.CallOneArgNode.interpret(CallOneArgNode.java:57)", "org.jruby.ast.DAsgnNode.interpret(DAsgnNode.java:110)", "org.jruby.ast.NewlineNode.interpret(NewlineNode.java:105)", "org.jruby.ast.BlockNode.interpret(BlockNode.java:71)", "org.jruby.evaluator.ASTInterpreter.INTERPRET_BLOCK(ASTInterpreter.java:112)", "org.jruby.runtime.Interpreted19Block.evalBlockBody(Interpreted19Block.java:206)", "org.jruby.runtime.Interpreted19Block.yield(Interpreted19Block.java:194)", "org.jruby.runtime.Interpreted19Block.call(Interpreted19Block.java:125)", "org.jruby.runtime.Block.call(Block.java:101)", "org.jruby.RubyProc.call(RubyProc.java:300)", "org.jruby.internal.runtime.methods.ProcMethod.call(ProcMethod.java:64)", "org.jruby.internal.runtime.methods.DynamicMethod.call(DynamicMethod.java:210)", "org.jruby.internal.runtime.methods.DynamicMethod.call(DynamicMethod.java:206)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:168)", "rubyjit.LogStash::Pipeline$$filter_batch_2b9b0fb1ba7f58d36d479ab8384717f4c3fd58e51173230247.block_0$RUBY$file(/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:370)", "rubyjit$LogStash::Pipeline$$filter_batch_2b9b0fb1ba7f58d36d479ab8384717f4c3fd58e51173230247$block_0$RUBY$file.call(rubyjit$LogStash::Pipeline$$filter_batch_2b9b0fb1ba7f58d36d479ab8384717f4c3fd58e51173230247$block_0$RUBY$file)", "org.jruby.runtime.CompiledBlock19.yield(CompiledBlock19.java:159)", "org.jruby.runtime.CompiledBlock19.call(CompiledBlock19.java:87)", "org.jruby.runtime.Block.call(Block.java:101)", "org.jruby.RubyProc.call(RubyProc.java:300)", "org.jruby.RubyProc.call19(RubyProc.java:281)", "org.jruby.RubyProc$INVOKER$i$0$0$call19.call(RubyProc$INVOKER$i$0$0$call19.gen)", "org.jruby.internal.runtime.methods.DynamicMethod.call(DynamicMethod.java:210)", 
"org.jruby.internal.runtime.methods.DynamicMethod.call(DynamicMethod.java:206)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:168)", "rubyjit.LogStash::Util::WrappedSynchronousQueue::ReadBatch$$each_39448e3b53418e12c6a9b40cc27889a1b2905b7d1173230247.block_0$RUBY$file(/usr/share/logstash/logstash-core/lib/logstash/util/wrapped_synchronous_queue.rb:224)", "rubyjit$LogStash::Util::WrappedSynchronousQueue::ReadBatch$$each_39448e3b53418e12c6a9b40cc27889a1b2905b7d1173230247$block_0$RUBY$file.call(rubyjit$LogStash::Util::WrappedSynchronousQueue::ReadBatch$$each_39448e3b53418e12c6a9b40cc27889a1b2905b7d1173230247$block_0$RUBY$file)", "org.jruby.runtime.CompiledBlock19.yield(CompiledBlock19.java:135)", "org.jruby.runtime.Block.yield(Block.java:142)", "org.jruby.RubyHash$13.visit(RubyHash.java:1355)", "org.jruby.RubyHash.visitLimited(RubyHash.java:648)", "org.jruby.RubyHash.visitAll(RubyHash.java:634)", "org.jruby.RubyHash.iteratorVisitAll(RubyHash.java:1306)", "org.jruby.RubyHash.each_pairCommon(RubyHash.java:1351)", "org.jruby.RubyHash.each19(RubyHash.java:1342)", "org.jruby.RubyHash$INVOKER$i$0$0$each19.call(RubyHash$INVOKER$i$0$0$each19.gen)", "org.jruby.runtime.callsite.CachingCallSite.callBlock(CachingCallSite.java:143)", "org.jruby.runtime.callsite.CachingCallSite.callIter(CachingCallSite.java:154)", "rubyjit.LogStash::Util::WrappedSynchronousQueue::ReadBatch$$each_39448e3b53418e12c6a9b40cc27889a1b2905b7d1173230247.file(/usr/share/logstash/logstash-core/lib/logstash/util/wrapped_synchronous_queue.rb:223)", "rubyjit.LogStash::Util::WrappedSynchronousQueue::ReadBatch$$each_39448e3b53418e12c6a9b40cc27889a1b2905b7d1173230247.file(/usr/share/logstash/logstash-core/lib/logstash/util/wrapped_synchronous_queue.rb)", "org.jruby.internal.runtime.methods.JittedMethod.call(JittedMethod.java:161)", "org.jruby.runtime.callsite.CachingCallSite.callBlock(CachingCallSite.java:143)", "org.jruby.runtime.callsite.CachingCallSite.callIter(CachingCallSite.java:154)", "rubyjit.LogStash::Pipeline$$filter_batch_2b9b0fb1ba7f58d36d479ab8384717f4c3fd58e51173230247.chained_0_rescue_1$RUBY$SYNTHETIC__file__(/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:369)", "rubyjit.LogStash::Pipeline$$filter_batch_2b9b0fb1ba7f58d36d479ab8384717f4c3fd58e51173230247.file(/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb)", "rubyjit.LogStash::Pipeline$$filter_batch_2b9b0fb1ba7f58d36d479ab8384717f4c3fd58e51173230247.file(/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb)", "org.jruby.internal.runtime.methods.JittedMethod.call(JittedMethod.java:181)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:168)", "org.jruby.ast.FCallOneArgNode.interpret(FCallOneArgNode.java:36)", "org.jruby.ast.NewlineNode.interpret(NewlineNode.java:105)", "org.jruby.ast.BlockNode.interpret(BlockNode.java:71)", "org.jruby.ast.WhileNode.interpret(WhileNode.java:131)", "org.jruby.ast.NewlineNode.interpret(NewlineNode.java:105)", "org.jruby.ast.BlockNode.interpret(BlockNode.java:71)", "org.jruby.evaluator.ASTInterpreter.INTERPRET_METHOD(ASTInterpreter.java:74)", "org.jruby.internal.runtime.methods.InterpretedMethod.call(InterpretedMethod.java:225)", "org.jruby.internal.runtime.methods.DefaultMethod.call(DefaultMethod.java:219)", "org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:202)", "org.jruby.ast.FCallTwoArgNode.interpret(FCallTwoArgNode.java:38)", "org.jruby.ast.NewlineNode.interpret(NewlineNode.java:105)", "org.jruby.ast.BlockNode.interpret(BlockNode.java:71)", 
"org.jruby.evaluator.ASTInterpreter.INTERPRET_BLOCK(ASTInterpreter.java:112)", "org.jruby.runtime.Interpreted19Block.evalBlockBody(Interpreted19Block.java:206)", "org.jruby.runtime.Interpreted19Block.yield(Interpreted19Block.java:194)", "org.jruby.runtime.Interpreted19Block.call(Interpreted19Block.java:125)", "org.jruby.runtime.Block.call(Block.java:101)", "org.jruby.RubyProc.call(RubyProc.java:300)", "org.jruby.RubyProc.call(RubyProc.java:230)", "org.jruby.internal.runtime.RubyRunnable.run(RubyRunnable.java:99)", "java.lang.Thread.run(Thread.java:748)"]}
Exception in thread "[main]>worker15" java.lang.ArrayIndexOutOfBoundsException: -1
at java.util.ArrayList.elementData(ArrayList.java:418)
at java.util.ArrayList.remove(ArrayList.java:495)
at org.logstash.FieldReference.parse(FieldReference.java:37)
at org.logstash.PathCache.cache(PathCache.java:37)
at org.logstash.PathCache.isTimestamp(PathCache.java:30)
at org.logstash.ext.JrubyEventExtLibrary$RubyEvent.ruby_set_field(JrubyEventExtLibrary.java:122)
at org.logstash.ext.JrubyEventExtLibrary$RubyEvent$INVOKER$i$2$0$ruby_set_field.call(JrubyEventExtLibrary$RubyEvent$INVOKER$i$2$0$ruby_set_field.gen)
at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:202)
at rubyjit.LogStash::Filters::KV$$filter_ed7fe6468ef7edb99f1600863745e9b6ad8a191d1173230247.block_1$RUBY$file(/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-kv-4.0.0/lib/logstash/filters/kv.rb:326)
at rubyjit$LogStash::Filters::KV$$filter_ed7fe6468ef7edb99f1600863745e9b6ad8a191d1173230247$block_1$RUBY$file.call(rubyjit$LogStash::Filters::KV$$filter_ed7fe6468ef7edb99f1600863745e9b6ad8a191d1173230247$block_1$RUBY$file)
at org.jruby.runtime.CompiledBlock19.yield(CompiledBlock19.java:135)
at org.jruby.runtime.Block.yield(Block.java:142)
at org.jruby.RubyHash$13.visit(RubyHash.java:1355)
at org.jruby.RubyHash.visitLimited(RubyHash.java:648)
at org.jruby.RubyHash.visitAll(RubyHash.java:634)
at org.jruby.RubyHash.iteratorVisitAll(RubyHash.java:1306)
at org.jruby.RubyHash.each_pairCommon(RubyHash.java:1351)
at org.jruby.RubyHash.each19(RubyHash.java:1342)
at org.jruby.RubyHash$INVOKER$i$0$0$each19.call(RubyHash$INVOKER$i$0$0$each19.gen)
at org.jruby.runtime.callsite.CachingCallSite.callBlock(CachingCallSite.java:143)
at org.jruby.runtime.callsite.CachingCallSite.callIter(CachingCallSite.java:154)
at rubyjit.LogStash::Filters::KV$$filter_ed7fe6468ef7edb99f1600863745e9b6ad8a191d1173230247.file(/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-kv-4.0.0/lib/logstash/filters/kv.rb:326)
at rubyjit.LogStash::Filters::KV$$filter_ed7fe6468ef7edb99f1600863745e9b6ad8a191d1173230247.file(/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-kv-4.0.0/lib/logstash/filters/kv.rb)
at org.jruby.internal.runtime.methods.JittedMethod.call(JittedMethod.java:201)
at org.jruby.runtime.callsite.CachingCallSite.callBlock(CachingCallSite.java:177)
at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:183)
at rubyjit.LogStash::Filters::Base$$do_filter_8e8403dcfdf01a35ffca12ed35ec4e79455489071173230247.file(/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:145)
at rubyjit.LogStash::Filters::Base$$do_filter_8e8403dcfdf01a35ffca12ed35ec4e79455489071173230247.file(/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb)
at org.jruby.internal.runtime.methods.JittedMethod.call(JittedMethod.java:201)
at org.jruby.runtime.callsite.CachingCallSite.callBlock(CachingCallSite.java:177)
at org.jruby.runtime.callsite.CachingCallSite.callIter(CachingCallSite.java:188)
at rubyjit.LogStash::Filters::Base$$multi_filter_6c0dea8219a042f89f5a5b41e60697f8088a7c451173230247.block_0$RUBY$file(/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:164)
at rubyjit$LogStash::Filters::Base$$multi_filter_6c0dea8219a042f89f5a5b41e60697f8088a7c451173230247$block_0$RUBY$file.call(rubyjit$LogStash::Filters::Base$$multi_filter_6c0dea8219a042f89f5a5b41e60697f8088a7c451173230247$block_0$RUBY$file)
at org.jruby.runtime.CompiledBlock19.yield(CompiledBlock19.java:135)
at org.jruby.runtime.Block.yield(Block.java:142)
at org.jruby.RubyArray.eachCommon(RubyArray.java:1606)
at org.jruby.RubyArray.each(RubyArray.java:1613)
at org.jruby.RubyArray$INVOKER$i$0$0$each.call(RubyArray$INVOKER$i$0$0$each.gen)
at org.jruby.runtime.callsite.CachingCallSite.callBlock(CachingCallSite.java:143)
at org.jruby.runtime.callsite.CachingCallSite.callIter(CachingCallSite.java:154)
at rubyjit.LogStash::Filters::Base$$multi_filter_6c0dea8219a042f89f5a5b41e60697f8088a7c451173230247.file(/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:161)
at rubyjit.LogStash::Filters::Base$$multi_filter_6c0dea8219a042f89f5a5b41e60697f8088a7c451173230247.file(/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb)
at org.jruby.internal.runtime.methods.JittedMethod.call(JittedMethod.java:181)
at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:168)
at rubyjit.LogStash::FilterDelegator$$multi_filter_43640ebf68de601b56cb618392ab9de0b4f8c58a1173230247.file(/usr/share/logstash/logstash-core/lib/logstash/filter_delegator.rb:43)
at rubyjit.LogStash::FilterDelegator$$multi_filter_43640ebf68de601b56cb618392ab9de0b4f8c58a1173230247.file(/usr/share/logstash/logstash-core/lib/logstash/filter_delegator.rb)
at org.jruby.internal.runtime.methods.JittedMethod.call(JittedMethod.java:181)
at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:168)
at org.jruby.ast.CallOneArgNode.interpret(CallOneArgNode.java:57)
at org.jruby.ast.DAsgnNode.interpret(DAsgnNode.java:110)
at org.jruby.ast.NewlineNode.interpret(NewlineNode.java:105)
at org.jruby.ast.BlockNode.interpret(BlockNode.java:71)
at org.jruby.evaluator.ASTInterpreter.INTERPRET_BLOCK(ASTInterpreter.java:112)
at org.jruby.runtime.Interpreted19Block.evalBlockBody(Interpreted19Block.java:206)
at org.jruby.runtime.Interpreted19Block.yield(Interpreted19Block.java:194)
at org.jruby.runtime.Interpreted19Block.call(Interpreted19Block.java:125)
at org.jruby.runtime.Block.call(Block.java:101)
at org.jruby.RubyProc.call(RubyProc.java:300)
at org.jruby.internal.runtime.methods.ProcMethod.call(ProcMethod.java:64)
at org.jruby.internal.runtime.methods.DynamicMethod.call(DynamicMethod.java:210)
at org.jruby.internal.runtime.methods.DynamicMethod.call(DynamicMethod.java:206)
at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:168)
at rubyjit.LogStash::Pipeline$$filter_batch_2b9b0fb1ba7f58d36d479ab8384717f4c3fd58e51173230247.block_0$RUBY$file(/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:370)
at rubyjit$LogStash::Pipeline$$filter_batch_2b9b0fb1ba7f58d36d479ab8384717f4c3fd58e51173230247$block_0$RUBY$file.call(rubyjit$LogStash::Pipeline$$filter_batch_2b9b0fb1ba7f58d36d479ab8384717f4c3fd58e51173230247$block_0$RUBY$file)
at org.jruby.runtime.CompiledBlock19.yield(CompiledBlock19.java:159)
at org.jruby.runtime.CompiledBlock19.call(CompiledBlock19.java:87)
at org.jruby.runtime.Block.call(Block.java:101)
at org.jruby.RubyProc.call(RubyProc.java:300)
at org.jruby.RubyProc.call19(RubyProc.java:281)
at org.jruby.RubyProc$INVOKER$i$0$0$call19.call(RubyProc$INVOKER$i$0$0$call19.gen)
at org.jruby.internal.runtime.methods.DynamicMethod.call(DynamicMethod.java:210)
at org.jruby.internal.runtime.methods.DynamicMethod.call(DynamicMethod.java:206)
at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:168)
at rubyjit.LogStash::Util::WrappedSynchronousQueue::ReadBatch$$each_39448e3b53418e12c6a9b40cc27889a1b2905b7d1173230247.block_0$RUBY$file(/usr/share/logstash/logstash-core/lib/logstash/util/wrapped_synchronous_queue.rb:224)
at rubyjit$LogStash::Util::WrappedSynchronousQueue::ReadBatch$$each_39448e3b53418e12c6a9b40cc27889a1b2905b7d1173230247$block_0$RUBY$file.call(rubyjit$LogStash::Util::WrappedSynchronousQueue::ReadBatch$$each_39448e3b53418e12c6a9b40cc27889a1b2905b7d1173230247$block_0$RUBY$file)
at org.jruby.runtime.CompiledBlock19.yield(CompiledBlock19.java:135)
at org.jruby.runtime.Block.yield(Block.java:142)
at org.jruby.RubyHash$13.visit(RubyHash.java:1355)
at org.jruby.RubyHash.visitLimited(RubyHash.java:648)
at org.jruby.RubyHash.visitAll(RubyHash.java:634)
at org.jruby.RubyHash.iteratorVisitAll(RubyHash.java:1306)
at org.jruby.RubyHash.each_pairCommon(RubyHash.java:1351)
at org.jruby.RubyHash.each19(RubyHash.java:1342)
at org.jruby.RubyHash$INVOKER$i$0$0$each19.call(RubyHash$INVOKER$i$0$0$each19.gen)
at org.jruby.runtime.callsite.CachingCallSite.callBlock(CachingCallSite.java:143)
at org.jruby.runtime.callsite.CachingCallSite.callIter(CachingCallSite.java:154)
at rubyjit.LogStash::Util::WrappedSynchronousQueue::ReadBatch$$each_39448e3b53418e12c6a9b40cc27889a1b2905b7d1173230247.file(/usr/share/logstash/logstash-core/lib/logstash/util/wrapped_synchronous_queue.rb:223)
at rubyjit.LogStash::Util::WrappedSynchronousQueue::ReadBatch$$each_39448e3b53418e12c6a9b40cc27889a1b2905b7d1173230247.file(/usr/share/logstash/logstash-core/lib/logstash/util/wrapped_synchronous_queue.rb)
at org.jruby.internal.runtime.methods.JittedMethod.call(JittedMethod.java:161)
at org.jruby.runtime.callsite.CachingCallSite.callBlock(CachingCallSite.java:143)
at org.jruby.runtime.callsite.CachingCallSite.callIter(CachingCallSite.java:154)
at rubyjit.LogStash::Pipeline$$filter_batch_2b9b0fb1ba7f58d36d479ab8384717f4c3fd58e51173230247.chained_0_rescue_1$RUBY$SYNTHETIC__file__(/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:369)
at rubyjit.LogStash::Pipeline$$filter_batch_2b9b0fb1ba7f58d36d479ab8384717f4c3fd58e51173230247.file(/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb)
at rubyjit.LogStash::Pipeline$$filter_batch_2b9b0fb1ba7f58d36d479ab8384717f4c3fd58e51173230247.file(/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb)
at org.jruby.internal.runtime.methods.JittedMethod.call(JittedMethod.java:181)
at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:168)
at org.jruby.ast.FCallOneArgNode.interpret(FCallOneArgNode.java:36)
at org.jruby.ast.NewlineNode.interpret(NewlineNode.java:105)
at org.jruby.ast.BlockNode.interpret(BlockNode.java:71)
at org.jruby.ast.WhileNode.interpret(WhileNode.java:131)
at org.jruby.ast.NewlineNode.interpret(NewlineNode.java:105)
at org.jruby.ast.BlockNode.interpret(BlockNode.java:71)
at org.jruby.evaluator.ASTInterpreter.INTERPRET_METHOD(ASTInterpreter.java:74)
at org.jruby.internal.runtime.methods.InterpretedMethod.call(InterpretedMethod.java:225)
at org.jruby.internal.runtime.methods.DefaultMethod.call(DefaultMethod.java:219)
at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:202)
at org.jruby.ast.FCallTwoArgNode.interpret(FCallTwoArgNode.java:38)
at org.jruby.ast.NewlineNode.interpret(NewlineNode.java:105)
at org.jruby.ast.BlockNode.interpret(BlockNode.java:71)
at org.jruby.evaluator.ASTInterpreter.INTERPRET_BLOCK(ASTInterpreter.java:112)
at org.jruby.runtime.Interpreted19Block.evalBlockBody(Interpreted19Block.java:206)
at org.jruby.runtime.Interpreted19Block.yield(Interpreted19Block.java:194)
at org.jruby.runtime.Interpreted19Block.call(Interpreted19Block.java:125)
at org.jruby.runtime.Block.call(Block.java:101)
at org.jruby.RubyProc.call(RubyProc.java:300)
at org.jruby.RubyProc.call(RubyProc.java:230)
at org.jruby.internal.runtime.RubyRunnable.run(RubyRunnable.java:99)
at java.lang.Thread.run(Thread.java:748)

This never happened with our old version of ELK. Any idea what's wrong with our kv filter? Or is this just a bug in 5.4?

jordansissel (Contributor) commented May 25, 2017

Summarizing the stack trace:

[[main]>worker15] ERROR logstash.pipeline - Exception in pipelineworker, the pipeline stopped processing new events, please check your filter configuration and restart Logstash. {"exception"=>"-1", "backtrace"=>["java.util.ArrayList.elementData(ArrayList.java:418)", 
...

 Exception in thread "[main]>worker15" java.lang.ArrayIndexOutOfBoundsException: -1
at java.util.ArrayList.elementData(ArrayList.java:418)
at java.util.ArrayList.remove(ArrayList.java:495)
at org.logstash.FieldReference.parse(FieldReference.java:37)
at org.logstash.PathCache.cache(PathCache.java:37)
at org.logstash.PathCache.isTimestamp(PathCache.java:30)
at org.logstash.ext.JrubyEventExtLibrary$RubyEvent.ruby_set_field(JrubyEventExtLibrary.java:122)
...
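
Reading the trace: the "-1" is ArrayList.remove(int) being handed a negative index inside FieldReference.parse, which happens when the field name being set is an empty string. A minimal Java sketch of that failure mode (a reconstruction under that assumption, not the actual Logstash source):

import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class FieldReferenceCrashSketch {
    public static void main(String[] args) {
        // An empty key, e.g. one produced by the kv filter after trimming.
        String reference = "";

        // Split the reference into path segments and drop the empty ones;
        // for "" this leaves an empty list.
        List<String> path = new ArrayList<>(Arrays.asList(reference.split("[\\[\\]]")));
        path.removeIf(String::isEmpty);

        // remove(size - 1) == remove(-1); on the Java 8 runtime from the
        // report, ArrayList.elementData(-1) then throws
        // java.lang.ArrayIndexOutOfBoundsException: -1 -- the same
        // elementData/remove frames as the trace above.
        String key = path.remove(path.size() - 1);
    }
}
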
jordansissel (Contributor) commented May 25, 2017

I haven't tried to reproduce yet, but I agree this is something strange. It appears to be a bug, but I am not sure what is causing the problem.

yidingalan commented May 25, 2017

@jordansissel so I've been testing for a bit, and I think the issue is caused by trim_key.

Once I removed trim_key from the kv filter, Logstash doesn't seem to crash anymore. I don't know which of the characters in our trim_key config is causing the issue, though.

jsvd (Contributor) commented May 26, 2017

I can't seem to reproduce this on 5.4.0 with your sample data + config:

% cat cfg
input { stdin {} }
filter {
  kv {
    trim_key => "<>\[\],`\."
    remove_field => ["\\%{some_field}", "{%{some_field}"]
    include_brackets => false
  }
}
output { stdout { codec => rubydebug } }
/tmp/logstash-5.4.0 % echo 'time=2017-05-24T18:23:02.619100+00:00 severity=INFO pid=7078 method=GET path=/projects/20432/report format=html controller=reports action=show status=200 duration=2304.09 view=2172.47 db=89.51 time=2017-05-24 18:23:00 +0000 ip=216.16.231.94 host=demo.central.miovision.com params={"download_token"=>"1495650173", "report"=>{"format"=>"travel_time", "study_observation_set_ids"=>["", "57", "56", "55", ""], "bin_size"=>"3600"}, "project_id"=>"20432"}' | bin/logstash -f cfg 
{
                         "severity" => "INFO",
                       "controller" => "reports",
                           "method" => "GET",
                               "ip" => "216.16.231.94",
                           "format" => "html",
                   "\"project_id\"" => ">\"20432\"}",
                              "pid" => "7078",
                       "\"report\"" => ">{\"format\"=>\"travel_time\",",
                          "message" => "time=2017-05-24T18:23:02.619100+00:00 severity=INFO pid=7078 method=GET path=/projects/20432/report format=html controller=reports action=show status=200 duration=2304.09 view=2172.47 db=89.51 time=2017-05-24 18:23:00 +0000 ip=216.16.231.94 host=demo.central.miovision.com params={\"download_token\"=>\"1495650173\", \"report\"=>{\"format\"=>\"travel_time\", \"study_observation_set_ids\"=>[\"\", \"57\", \"56\", \"55\", \"\"], \"bin_size\"=>\"3600\"}, \"project_id\"=>\"20432\"}",
                           "params" => "{\"download_token\"=>\"1495650173\",",
    "\"study_observation_set_ids\"" => ">[\"\",",
                         "duration" => "2304.09",
                             "path" => "/projects/20432/report",
                             "view" => "2172.47",
                       "@timestamp" => 2017-05-26T16:49:13.296Z,
                         "@version" => "1",
                             "host" => "demo.central.miovision.com",
                           "action" => "show",
                     "\"bin_size\"" => ">\"3600\"},",
                             "time" => [
        [0] "2017-05-24T18:23:02.619100+00:00",
        [1] "2017-05-24"
    ],
                               "db" => "89.51",
                           "status" => "200"
}

yidingalan commented May 26, 2017

@jsvd It doesn't always crash. I'm also not 100% sure that this is the specific message causing the issue; it could be other messages in our logs.

samjgalbraith commented Aug 23, 2017

I'm getting the same issue sporadically with the following pipeline. We've managed to reproduce it now with the log entry:

{"": "Exception on /v1/classify [POST]", "timestamp": "2017-08-24 02:48:00.564Z", "service": "mp-sps", "level": "ERROR"}

Looks like it can't handle an empty key in a JSON-structured log.

We're on 5.4.0.

input {
    beats {
        port => 5043
        ssl => true
        ssl_certificate => "..."
        ssl_key => "..."
    }
}

filter {
    if [type] == "tm-services-log" {
        json {
            source => "message"
        }
        # The @timestamp field is the time the record was picked up. timestamp is when it was written, which is better.
        date {
            match => ["timestamp", "ISO8601"]
            target => "@timestamp"
        }
        # Remove the surplus timestamp from above and some other things that are useless or misleading.
        mutate { remove_field => [ "@version", "_type", "beat", "host", "input_type", "offset", "source", "tags", "timestamp" ] }
    }
}

output {
    if [type] == "tm-services-log" {
        if [service] and [env_type] {
            elasticsearch {
                hosts => ["..."]
                ssl => true
                user => "..."
                password => "..."
                index => "tm-services-%{service}-%{env_type}-%{+xxxx.ww}"
            }
        } else {
            elasticsearch {
                hosts => ["..."]
                ssl => true
                user => "..."
                password => "..."
                index => "tm-services-unknown-%{+xxxx.ww}"
            }
        }
    }
}
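
For what it's worth, the kv filter isn't required to hit this: any path that sets a field whose name is the empty string should do it. The json filter sets one event field per parsed key, and the document above contains the key "", so the filter effectively calls event.set("", value). Reusing the hypothetical parse logic from the sketch earlier in the thread (again, not Logstash's actual code):

import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class EmptyJsonKeySketch {
    // Same hypothetical parse logic as the earlier sketch.
    static String lastPathElement(String reference) {
        List<String> path = new ArrayList<>(Arrays.asList(reference.split("[\\[\\]]")));
        path.removeIf(String::isEmpty);
        return path.remove(path.size() - 1); // AIOOBE: -1 when reference is ""
    }

    public static void main(String[] args) {
        // The "" key from {"": "Exception on /v1/classify [POST]", ...}:
        lastPathElement(""); // java.lang.ArrayIndexOutOfBoundsException: -1
    }
}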

joshuar (Member) commented Oct 13, 2017

Indeed this looks to be caused by a null or missing key. For example, with the following simple pipeline:

input {
  stdin {}
}

filter {
    kv {
      value_split => ":"
      field_split => ","
      trim_key => " "
    }
}

output {
  stdout {
    codec => "rubydebug"
  }
}

This message would trigger it:

foo:2, :blah, bar:sgasgd
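
With value_split ":" and trim_key " ", the " :blah" chunk yields a key of " " that trims down to the empty string, which then hits the FieldReference.parse("") failure sketched earlier in the thread. A rough Java sketch of the splitting and trimming (an approximation, not the plugin's actual code):

public class EmptyKeyFromTrimSketch {
    public static void main(String[] args) {
        // One chunk of "foo:2, :blah, bar:sgasgd" after field_split ",".
        String chunk = " :blah";

        // value_split ":" -- key is " ", value is "blah".
        String[] kv = chunk.split(":", 2);

        // trim_key " " strips spaces from both ends of the key, leaving "".
        String key = kv[0].replaceAll("\\A[ ]+|[ ]+\\z", "");

        // event.set("", "blah") then reaches FieldReference.parse("") and
        // the ArrayIndexOutOfBoundsException: -1 from the original report.
        System.out.println("trimmed key is empty: " + key.isEmpty()); // true
    }
}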

As a workaround, the following mutate filter before the kv filter can catch values with a missing/null key and place them into a single values_missing_keys field:

    mutate {
      gsub => ["pwnie_message", ",\s+:",",values_missing_keys:"]
    }

I note the issue is non-fatal to Logstash: it still appears to be able to process subsequent events, but the event which triggered the exception appears to be lost.

jsvd (Contributor) commented Aug 29, 2018

This has been solved in 5.6.11 and 6.1.1 in core by elastic/logstash@cdb4f0c.
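
For anyone stuck on an older version, one shape a guard can take is to refuse to pop a path segment from an empty list. A hypothetical defensive variant of the earlier parse sketches (an illustration only; see the linked commit for the actual change):

import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class GuardedParseSketch {
    static String lastPathElement(String reference) {
        List<String> path = new ArrayList<>(Arrays.asList(reference.split("[\\[\\]]")));
        path.removeIf(String::isEmpty);
        if (path.isEmpty()) {
            // Report a bad field reference instead of crashing the pipeline
            // worker with ArrayIndexOutOfBoundsException: -1.
            throw new IllegalArgumentException("Invalid field reference: '" + reference + "'");
        }
        return path.remove(path.size() - 1);
    }

    public static void main(String[] args) {
        System.out.println(lastPathElement("[foo][bar]")); // prints "bar"
        lastPathElement(""); // IllegalArgumentException, not AIOOBE
    }
}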

jsvd closed this Aug 29, 2018
