This repository has been archived by the owner on Mar 22, 2024. It is now read-only.

Dashboard on Kibana 7.0.1 #328

Closed
NightWalker70 opened this issue May 16, 2019 · 16 comments

Comments

@NightWalker70

Hello,
I am new to the ELK stack and just trying to get ElastiFlow working on a CentOS 7 server. I have the ELK stack working and NetFlow data being fed into the system. The elastiflow index is created with data. However, I am unable to import the JSON files using the Kibana UI import feature. I am just curious what the best way is to get the dashboards and index patterns.
Also, I am trying to import elastiflow.kibana.7.0.x. and I am using elastiflow-3.5.0.

Thanks

@NightWalker70
Author

Also, this is the error I am seeing in the Logstash log file while importing the JSON files.

[2019-05-15T21:22:12,604][ERROR][logstash.outputs.elasticsearch] An unknown error occurred sending a bulk request to Elasticsearch. We will retry indefinitely {:error_message=>"bignum too big to convert into long'", :error_class=>"LogStash::Json::GeneratorError", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/json.rb:27:in jruby_dump'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.0.2-java/lib/logstash/outputs/elasticsearch/http_client.rb:119:in block in bulk'", "org/jruby/RubyArray.java:2577:in map'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.0.2-java/lib/logstash/outputs/elasticsearch/http_client.rb:119:in block in bulk'", "org/jruby/RubyArray.java:1792:in each'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.0.2-java/lib/logstash/outputs/elasticsearch/http_client.rb:117:in bulk'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.0.2-java/lib/logstash/outputs/elasticsearch/common.rb:286:in safe_bulk'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.0.2-java/lib/logstash/outputs/elasticsearch/common.rb:191:in submit'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.0.2-java/lib/logstash/outputs/elasticsearch/common.rb:159:in retrying_submit'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.0.2-java/lib/logstash/outputs/elasticsearch/common.rb:38:in multi_receive'", "org/logstash/config/ir/compiler/OutputStrategyExt.java:118:in multi_receive'", "org/logstash/config/ir/compiler/AbstractOutputDelegatorExt.java:101:in multi_receive'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:235:in block in start_workers'"]}
(The same error repeats at 21:22:25, 21:23:16, 21:23:29, 21:24:20, and 21:24:33; those five entries are identical to the one above.)

@robcowart
Owner

I am a bit confused. Are you trying to import dashboards, or do you have a Logstash problem?

Importing dashboards is done in the Kibana Management app, under Saved Objects.

The likely cause of the logstash error is discussed here... #330
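For anyone who prefers the command line: Kibana 6.x/7.x also exposes a dashboard import endpoint. A minimal sketch, assuming Kibana is on localhost:5601 and that the file's JSON structure matches what the endpoint expects (the Saved Objects UI format and this API's payload format differ in some versions):

# POST the dashboard JSON to Kibana's import API (the kbn-xsrf header is required)
curl -X POST "http://localhost:5601/api/kibana/dashboards/import" \
  -H "kbn-xsrf: true" \
  -H "Content-Type: application/json" \
  -d @elastiflow.kibana.7.0.x.json

If the call succeeds, the response should list the saved objects that were created.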

@lintuxido

lintuxido commented Jun 18, 2019

Hello robcowart,
I am also facing the same issue.
I am on CentOS 7 using elastiflow-3.5.0 and trying to import elastiflow.kibana.7.0.x, but the import fails. I tried the other dashboards as well, and only the 6.3.x dashboard JSON would upload.
Even the 6.3.x dashboard shows errors like "Error in visualization: aggs is undefined". I can see NetFlow data arriving using tcpdump, but I cannot understand why it is not visible in Kibana.

@robcowart
Owner

Which version of the Elastic Stack?

Please provide additional information... screenshots, logs, etc.

@lintuxido

lintuxido commented Jun 18, 2019

Elastic Stack is "7.1.1"

(screenshots attached: 2019-06-18 13_41_34-Kibana, 2019-06-18 13_41_08-Kibana)

Attached error logs as a text file:
logstash_error.txt

@robcowart
Owner

You have attached a Logstash log. Importing Kibana dashboards has nothing to do with Logstash. You need to provide Kibana and Elasticsearch logs.
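On a standard package install, those logs can usually be pulled like this (paths assume systemd defaults; the Elasticsearch log file name depends on the cluster name):

# Kibana logs go to journald on systemd installs; Elasticsearch writes to its log directory
journalctl -u kibana --since "1 hour ago"
tail -n 200 /var/log/elasticsearch/elasticsearch.log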

@robcowart
Owner

Also, in the log you provided there is a clear error: java.lang.OutOfMemoryError: Java heap space

Please follow the installation instructions in INSTALL.md exactly. It covers setting the JVM heap size as required.
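For reference, the heap is configured in Logstash's jvm.options file (typically /etc/logstash/jvm.options on a package install). A minimal sketch; the 4g value here is an assumption, so use whatever INSTALL.md specifies for your setup:

# /etc/logstash/jvm.options -- set initial and maximum heap to the same value
-Xms4g
-Xmx4g

Restart Logstash afterwards (e.g. systemctl restart logstash) for the change to take effect.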

@lintuxido

OK, I will increase it. Meanwhile, please find the Elasticsearch and Kibana logs attached.
elasticsearch.log

kibana.txt

@robcowart
Owner

I just imported the dashboards into 7.1.1 without any issue.

@lintuxido

Which dashboard file from the kibana directory should I use for the import?
I just increased the heap size but am still unable to import the dashboard. :(

@robcowart
Owner

You tried to import the correct file.

Increasing the heap size of Logstash will not help you import the dashboards. Logstash has nothing to do with the dashboards. You should start with a completely clean Elasticsearch and Kibana instance, and try the import.
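One way to get Kibana back to a clean state without reinstalling is to delete its saved-objects index and let Kibana recreate it on restart. A sketch, assuming a disposable single-node setup; note that this deletes all existing saved objects:

# Wipe Kibana's saved-objects index (.kibana_1 plus its alias in 7.x), then restart Kibana
curl -X DELETE "http://localhost:9200/.kibana*"
sudo systemctl restart kibana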

@lintuxido

Is there any other way to import it from the command line? I tried on a new installation of CentOS 7, but it still shows the same error, and I don't know what the issue is. Yesterday I tried it on Ubuntu 18.04, following all the instructions you provided, but that also failed to import the dashboard.

@lintuxido

Elasticsearch is now showing the index:

curl http://localhost:9200/_cat/indices
yellow open elastiflow-3.5.0-2019.06.18 IGIJuWvQS7qefCYZecIlfQ 3 1 27070 0 8.2mb 8.2mb

But it is showing as yellow.

[2019-06-18T16:47:49,408][INFO ][o.e.c.r.a.AllocationService] [sapmon] updating number_of_replicas to [0] for indices [.kibana_1]
[2019-06-18T16:47:49,492][INFO ][o.e.c.r.a.AllocationService] [sapmon] Cluster health status changed from [YELLOW] to [GREEN] (reason: [shards started [[.kibana_1][0]] ...]).
[2019-06-18T16:59:39,988][INFO ][o.e.c.m.MetaDataIndexTemplateService] [sapmon] adding template [elastiflow-3.5.0] for index patterns [elastiflow-3.5.0-*]
[2019-06-18T16:59:57,235][INFO ][o.e.c.m.MetaDataMappingService] [sapmon] [.kibana_1/As-59nzuSpOjQhSqN2_1bg] update_mapping [_doc]
[2019-06-18T17:01:43,505][INFO ][o.e.c.m.MetaDataMappingService] [sapmon] [.kibana_1/As-59nzuSpOjQhSqN2_1bg] update_mapping [_doc]
[2019-06-18T17:01:59,343][INFO ][o.e.c.m.MetaDataCreateIndexService] [sapmon] [elastiflow-3.5.0-2019.06.18] creating index, cause [auto(bulk api)], templates [elastiflow-3.5.0], shards [3]/[1], mappings [_doc]
[2019-06-18T17:01:59,607][INFO ][o.e.c.m.MetaDataMappingService] [sapmon] [elastiflow-3.5.0-2019.06.18/IGIJuWvQS7qefCYZecIlfQ] update_mapping [_doc]
[2019-06-18T17:02:11,197][INFO ][o.e.c.m.MetaDataMappingService] [sapmon] [.kibana_1/As-59nzuSpOjQhSqN2_1bg] update_mapping [_doc]
[2019-06-18T17:02:25,785][INFO ][o.e.c.m.MetaDataMappingService] [sapmon] [.kibana_1/As-59nzuSpOjQhSqN2_1bg] update_mapping [_doc]
[2019-06-18T17:02:40,081][INFO ][o.e.c.m.MetaDataMappingService] [sapmon] [.kibana_1/As-59nzuSpOjQhSqN2_1bg] update_mapping [_doc]
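For context: on a single-node cluster, a yellow index simply means its replica shards have nowhere to be allocated; the data itself is fine. A sketch to clear the status by dropping replicas, assuming a single node where replicas are not needed:

# Set replicas to 0 for the ElastiFlow indices so the cluster can go green
curl -X PUT "http://localhost:9200/elastiflow-3.5.0-*/_settings" \
  -H "Content-Type: application/json" \
  -d '{"index": {"number_of_replicas": 0}}'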

@lintuxido

I am only able to import the 6.3.x dashboard successfully. Now it is working fine. 👍
Please check the other dashboards, as I suspect new installations are not able to import the new dashboards.

@robcowart
Owner

I just tested again with 7.1.1 and all is fine. Other users also have no problems. I have no idea what is causing your issue.

@lintuxido

I found the issue.
The dashboard files I was trying to import were HTML-formatted files (most likely saved from the rendered GitHub page rather than the raw JSON).
I copied the raw code into a plain text file, imported that, and the import succeeded.
Thanks for your support.
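For anyone hitting the same problem: downloading the raw file directly avoids accidentally saving the rendered HTML page. A sketch, assuming the file lives under kibana/ on the master branch of the robcowart/elastiflow repository (adjust the path and file name if they differ):

# Fetch the raw JSON, not the rendered GitHub page
curl -LO "https://raw.githubusercontent.com/robcowart/elastiflow/master/kibana/elastiflow.kibana.7.0.x.json"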
