I get this after upgrading to the latest grok filter:
```
[2018-06-09T00:21:47,250][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"/usr/share/logstash/modules/netflow/configuration"}
[2018-06-09T00:21:47,259][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"/usr/share/logstash/modules/fb_apache/configuration"}
[2018-06-09T00:21:47,590][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2018-06-09T00:21:47,796][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"6.2.4"}
[2018-06-09T00:21:47,956][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2018-06-09T00:21:50,090][INFO ][logstash.pipeline        ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2018-06-09T00:21:50,322][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2018-06-09T00:21:50,325][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
[2018-06-09T00:21:50,489][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2018-06-09T00:21:50,529][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2018-06-09T00:21:50,529][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
[2018-06-09T00:21:50,535][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2018-06-09T00:21:50,538][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2018-06-09T00:21:50,545][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://localhost:9200"]}
[2018-06-09T00:21:50,702][ERROR][logstash.pipeline        ] Error registering plugin {:pipeline_id=>"main", :plugin=>"#<LogStash::FilterDelegator:0x56b6146 @metric_events_out=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 - name: out value:0, @metric_events_in=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 - name: in value:0, @metric_events_time=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 - name: duration_in_millis value:0, @id=\"bad3ef91a4a713af20e15b5f872cf70a9cb37544e4cccc9195af22e86b738e01\", @klass=LogStash::Filters::Grok, @metric_events=#<LogStash::Instrument::NamespacedMetric:0x3193091d @metric=#<LogStash::Instrument::Metric:0x53bc1c96 @collector=#<LogStash::Instrument::Collector:0x7ae16f64 @agent=nil, @metric_store=#<LogStash::Instrument::MetricStore:0x24de502f @store=#<Concurrent::Map:0x00000000000fbc entries=4 default_proc=nil>, @structured_lookup_mutex=#<Mutex:0xd2c6e48>, @fast_lookup=#<Concurrent::Map:0x00000000000fc0 entries=195 default_proc=nil>>>>, @namespace_name=[:stats, :pipelines, :main, :plugins, :filters, :bad3ef91a4a713af20e15b5f872cf70a9cb37544e4cccc9195af22e86b738e01, :events]>, @filter=<LogStash::Filters::Grok patterns_dir=>[\"/etc/logstash/conf.d/patterns\"], match=>{\"message\"=>\"%{DHCPD}\"}, id=>\"bad3ef91a4a713af20e15b5f872cf70a9cb37544e4cccc9195af22e86b738e01\", enable_metric=>true, periodic_flush=>false, patterns_files_glob=>\"*\", break_on_match=>true, named_captures_only=>true, keep_empty_captures=>false, tag_on_failure=>[\"_grokparsefailure\"], timeout_millis=>30000, tag_on_timeout=>\"_groktimeout\">>", :error=>"pattern %{DHCPD} not defined", :thread=>"#<Thread:0x4fac4934@/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:247 run>"}
[2018-06-09T00:21:50,887][ERROR][logstash.pipeline        ] Pipeline aborted due to error {:pipeline_id=>"main", :exception=>#<Grok::PatternError: pattern %{DHCPD} not defined>, :backtrace=>["/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/jls-grok-0.11.4/lib/grok-pure.rb:123:in `block in compile'", "org/jruby/RubyKernel.java:1292:in `loop'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/jls-grok-0.11.4/lib/grok-pure.rb:93:in `compile'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-filter-grok-4.0.3/lib/logstash/filters/grok.rb:281:in `block in register'", "org/jruby/RubyArray.java:1734:in `each'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-filter-grok-4.0.3/lib/logstash/filters/grok.rb:275:in `block in register'", "org/jruby/RubyHash.java:1343:in `each'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-filter-grok-4.0.3/lib/logstash/filters/grok.rb:270:in `register'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:342:in `register_plugin'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:353:in `block in register_plugins'", "org/jruby/RubyArray.java:1734:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:353:in `register_plugins'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:731:in `maybe_setup_out_plugins'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:363:in `start_workers'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:290:in `run'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:250:in `block in start'"], :thread=>"#<Thread:0x4fac4934@/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:247 run>"}
[2018-06-09T00:21:50,910][ERROR][logstash.agent           ] Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: LogStash::PipelineAction::Create/pipeline_id:main, action_result: false", :backtrace=>nil}
```
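For context: `Grok::PatternError: pattern %{DHCPD} not defined` means the grok filter could not find a definition named `DHCPD` in the built-in patterns or in any file loaded from `patterns_dir` (here `/etc/logstash/conf.d/patterns`). `DHCPD` is not a pattern that ships with Logstash, so it must come from a custom patterns file in that directory that matches `patterns_files_glob` (default `*`) and is readable by the logstash user. A minimal sketch of what such a file might contain (the pattern body below is hypothetical, shown only to illustrate the `NAME regex` format):

```
# /etc/logstash/conf.d/patterns/dhcpd  -- hypothetical custom patterns file
# One pattern per line: NAME, a space, then the regex (grok syntax allowed).
DHCPD %{SYSLOGTIMESTAMP:timestamp} %{SYSLOGHOST:hostname} dhcpd(?:\[%{POSINT:pid}\])?: %{GREEDYDATA:dhcp_message}
```

With a file like this in place, the existing filter config visible in the log (`patterns_dir => ["/etc/logstash/conf.d/patterns"]`, `match => { "message" => "%{DHCPD}" }`) should resolve `%{DHCPD}` at pipeline startup instead of aborting.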