How to get the raw JSON for searches, visualizations and dashboards? #22

Closed
Jeeppler opened this issue Apr 3, 2019 · 10 comments

Jeeppler commented Apr 3, 2019

How can I get the raw JSON for searches, visualizations and dashboards to be able to use them in this module?

What do I have to do if I want to export existing searches, visualizations and dashboards in the correct format? Is there a specific API endpoint I can use?


Jeeppler commented Apr 4, 2019

I used the export functionality in the Kibana UI to export a search and got the following document:

[
  {
    "_id": "Metricbeat-Docker",
    "_type": "search",
    "_source": {
      "title": "Metricbeat Docker",
      "description": "",
      "hits": 0,
      "columns": [
        "_source"
      ],
      "sort": [
        "@timestamp",
        "desc"
      ],
      "version": 1,
      "kibanaSavedObjectMeta": {
        "searchSourceJSON": "{\"filter\":[],\"highlight\":{\"fields\":{\"*\":{}},\"fragment_size\":2147483647,\"post_tags\":[\"@/kibana-highlighted-field@\"],\"pre_tags\":[\"@kibana-highlighted-field@\"],\"require_field_match\":false},\"index\":\"metricbeat-*\",\"query\":{\"query\":{\"query_string\":{\"analyze_wildcard\":true,\"query\":\"metricset.module:docker\"}},\"language\":\"lucene\"}}"
      }
    }
  }
]

Then I created an elasticsearch_kibana_object like this:

resource "elasticsearch_kibana_object" "metricbeat_docker_metricbeat_docker" {
  body = <<EOF
  [
    {
      "_id": "Metricbeat-Docker",
      "_type": "search",
      "_source": {
        "title": "Metricbeat Docker",
        "description": "",
        "hits": 0,
        "columns": [
          "_source"
        ],
        "sort": [
          "@timestamp",
          "desc"
        ],
        "version": 1,
        "kibanaSavedObjectMeta": {
          "searchSourceJSON": "{\"filter\":[],\"highlight\":{\"fields\":{\"*\":{}},\"fragment_size\":2147483647,\"post_tags\":[\"@/kibana-highlighted-field@\"],\"pre_tags\":[\"@kibana-highlighted-field@\"],\"require_field_match\":false},\"index\":\"metricbeat-*\",\"query\":{\"query\":{\"query_string\":{\"analyze_wildcard\":true,\"query\":\"metricset.module:docker\"}},\"language\":\"lucene\"}}"
        }
      }
    }
  ]
EOF
}

The result is the following error:

* elasticsearch_kibana_object.metricbeat_docker_metricbeat_docker: 1 error(s) occurred:

* elasticsearch_kibana_object.metricbeat_docker_metricbeat_docker: elastic: Error 400 (Bad Request): Rejecting mapping update to [.kibana] as the final mapping would have more than 1 type: [search, doc] [type=illegal_argument_exception]

How do I properly export a search, visualization or dashboard so that I can create a Kibana object using this provider?

My Elasticsearch version is 6.4.


Jeeppler commented Apr 4, 2019

I found the explanation why I cannot just replay exported objects: since Elasticsearch 6 an index can only have a single mapping type, and the .kibana index already uses the type doc, so documents indexed with _type search or visualization are rejected. However, I still do not have a solution for how to get raw JSON that I can replay against this provider.


phillbaker commented Apr 10, 2019

@kheldar in Kibana, you should be able to go to http://localhost:9200/_plugin/kibana/app/kibana#/management/kibana/objects (or use the correct domain for your ES cluster) and use the export functionality to export existing objects.

@Jeeppler

@phillbaker yes, I tried to export the object via Kibana, but I get this error message:

Rejecting mapping update to [.kibana] as the final mapping would have more than 1 type: [search, doc] [type=illegal_argument_exception]

I ran into the same issue with a visualization. I had to modify the exported visualization. My objective is to have the same Metricbeat dashboards, visualizations and searches for different Kubernetes clusters. The current behaviour of Metricbeat is to overwrite existing dashboards, visualizations and searches unless they have a different name and index.

Here is the current working code:

cpu-usage.tpl

[
  {
    "_id": "visualization:metricbeat-docker-cpu-usage-${cluster_name}",
    "_type": "doc",
    "_source": {
      "type": "visualization",
      "visualization": {
        "title": "CPU usage [Metricbeat Docker - ${cluster_name}]",
        "visState": "{  \"type\": \"area\",  \"listeners\": {},  \"params\": {    \"scale\": \"linear\",    \"seriesParams\": [      {\"showCircles\": true,\"show\": \"true\",\"type\": \"area\",\"interpolate\": \"linear\",\"mode\": \"stacked\",\"drawLinesBetweenPoints\": true,\"valueAxis\": \"ValueAxis-1\",\"data\": {  \"id\": \"1\",  \"label\": \"Count\"}      }    ],    \"yAxis\": {},    \"smoothLines\": true,    \"categoryAxes\": [      {\"style\": {},\"scale\": {  \"type\": \"linear\"},\"show\": true,\"title\": {},\"labels\": {  \"truncate\": 100,  \"show\": true},\"position\": \"bottom\",\"type\": \"category\",\"id\": \"CategoryAxis-1\"      }    ],    \"legendPosition\": \"top\",    \"addTimeMarker\": false,    \"interpolate\": \"linear\",    \"addLegend\": true,    \"shareYAxis\": true,    \"grid\": {      \"style\": {\"color\": \"#eee\"      },      \"categoryLines\": false    },    \"mode\": \"stacked\",    \"defaultYExtents\": false,    \"setYExtents\": false,    \"addTooltip\": true,    \"valueAxes\": [      {\"style\": {},\"scale\": {  \"type\": \"linear\",  \"mode\": \"normal\"},\"name\": \"LeftAxis-1\",\"show\": true,\"title\": {  \"text\": \"Count\"},\"labels\": {  \"filter\": false,  \"rotate\": 0,  \"truncate\": 100,  \"show\": true},\"position\": \"left\",\"type\": \"value\",\"id\": \"ValueAxis-1\"      }    ],    \"times\": []  },  \"aggs\": [    {      \"params\": {\"field\": \"docker.cpu.total.pct\",\"customLabel\": \"Total CPU time\",\"percents\": [  75]      },      \"type\": \"percentiles\",      \"enabled\": true,      \"id\": \"1\",      \"schema\": \"metric\"    },    {      \"params\": {\"customInterval\": \"2h\",\"field\": \"@timestamp\",\"interval\": \"auto\",\"min_doc_count\": 1,\"extended_bounds\": {}      },      \"type\": \"date_histogram\",      \"enabled\": true,      \"id\": \"2\",      \"schema\": \"segment\"    },    {      \"params\": {\"orderBy\": \"1.75\",\"field\": \"docker.container.name\",\"customLabel\": \"Container name\",\"order\": \"desc\",\"size\": 5      },      \"type\": \"terms\",      \"enabled\": true,      \"id\": \"3\",      \"schema\": \"group\"    }  ],  \"title\": \"CPU usage [Metricbeat Docker - ${cluster_name}]\"}",
        "uiStateJSON": "{}",
        "description": "",
        "version": 1,
        "kibanaSavedObjectMeta": {
          "searchSourceJSON": "{\"filter\":[],\"highlight\":{\"fields\":{\"*\":{}},\"fragment_size\":2147483647,\"post_tags\":[\"@/kibana-highlighted-field@\"],\"pre_tags\":[\"@kibana-highlighted-field@\"],\"require_field_match\":false},\"index\":\"${index}\",\"query\":{\"query\":{\"query_string\":{\"analyze_wildcard\":true,\"query\":\"metricset.module:docker AND metricset.name:cpu\"}},\"language\":\"lucene\"}}"
        }
      }
    }
  }
]

The template_file data source and the elasticsearch_kibana_object resource in main.tf:

data "template_file" "metricbeat_docker_cpu_usage" {
  template = "${file("metricbeat/docker/cpu-usage.tpl")}"
  vars = {
    index = "${var.index}"
    cluster_name = "${var.cluster_name}"
  }
}

resource "elasticsearch_kibana_object" "metricbeat_docker_cpu_usage" {
  body = "${data.template_file.metricbeat_docker_cpu_usage.rendered}"
}

Here is the same Kibana object exported via the Kibana export functionality, with placeholders:

cpu-usage-export.json

[
  {
    "_id": "Docker-CPU-usage-${CLUSTER_NAME}",
    "_type": "visualization",
    "_source": {
      "description": "",
      "kibanaSavedObjectMeta": {
        "searchSourceJSON": "{\"filter\":[],\"highlight\":{\"fields\":{\"*\":{}},\"fragment_size\":2147483647,\"post_tags\":[\"@/kibana-highlighted-field@\"],\"pre_tags\":[\"@kibana-highlighted-field@\"],\"require_field_match\":false},\"index\":\"${INDEX}\",\"query\":{\"query_string\":{\"analyze_wildcard\":true,\"query\":\"metricset.module:docker AND metricset.name:cpu\"}}}"
      },
      "title": "CPU usage [Metricbeat Docker - ${CLUSTER_NAME}]",
      "uiStateJSON": "{}",
      "version": 1,
      "visState": "{\"type\": \"area\", \"listeners\": {}, \"params\": {\"scale\": \"linear\", \"seriesParams\": [{\"showCircles\": true, \"show\": \"true\", \"type\": \"area\", \"interpolate\": \"linear\", \"mode\": \"stacked\", \"drawLinesBetweenPoints\": true, \"valueAxis\": \"ValueAxis-1\", \"data\": {\"id\": \"1\", \"label\": \"Count\"}}], \"yAxis\": {}, \"smoothLines\": true, \"categoryAxes\": [{\"style\": {}, \"scale\": {\"type\": \"linear\"}, \"show\": true, \"title\": {}, \"labels\": {\"truncate\": 100, \"show\": true}, \"position\": \"bottom\", \"type\": \"category\", \"id\": \"CategoryAxis-1\"}], \"legendPosition\": \"top\", \"addTimeMarker\": false, \"interpolate\": \"linear\", \"addLegend\": true, \"shareYAxis\": true, \"grid\": {\"style\": {\"color\": \"#eee\"}, \"categoryLines\": false}, \"mode\": \"stacked\", \"defaultYExtents\": false, \"setYExtents\": false, \"addTooltip\": true, \"valueAxes\": [{\"style\": {}, \"scale\": {\"type\": \"linear\", \"mode\": \"normal\"}, \"name\": \"LeftAxis-1\", \"show\": true, \"title\": {\"text\": \"Count\"}, \"labels\": {\"filter\": false, \"rotate\": 0, \"truncate\": 100, \"show\": true}, \"position\": \"left\", \"type\": \"value\", \"id\": \"ValueAxis-1\"}], \"times\": []}, \"aggs\": [{\"params\": {\"field\": \"docker.cpu.total.pct\", \"customLabel\": \"Total CPU time\", \"percents\": [75]}, \"type\": \"percentiles\", \"enabled\": true, \"id\": \"1\", \"schema\": \"metric\"}, {\"params\": {\"customInterval\": \"2h\", \"field\": \"@timestamp\", \"interval\": \"auto\", \"min_doc_count\": 1, \"extended_bounds\": {}}, \"type\": \"date_histogram\", \"enabled\": true, \"id\": \"2\", \"schema\": \"segment\"}, {\"params\": {\"orderBy\": \"1.75\", \"field\": \"docker.container.name\", \"customLabel\": \"Container name\", \"order\": \"desc\", \"size\": 5}, \"type\": \"terms\", \"enabled\": true, \"id\": \"3\", \"schema\": \"group\"}], \"title\": \"CPU usage [Metricbeat Docker - ${CLUSTER_NAME}]\"}"
    },
    "_meta": {
      "savedObjectVersion": 2
    }
  }
]

To replace the placeholders, sed can be used.
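
Alternatively, since the placeholders are already in Terraform interpolation syntax, the exported file could presumably be rendered with the same template_file approach instead of sed (untested sketch; the variable names simply match the placeholders above):

data "template_file" "metricbeat_docker_cpu_usage_export" {
  # Render the exported JSON, substituting the INDEX and CLUSTER_NAME placeholders.
  template = "${file("metricbeat/docker/cpu-usage-export.json")}"

  vars = {
    INDEX        = "${var.index}"
    CLUSTER_NAME = "${var.cluster_name}"
  }
}

As noted above, though, the exported format itself still does not load into Elasticsearch 6 without first being converted to the doc format.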

The key part here is the beginning of the file:

cpu-usage.tpl

    "_id": "visualization:metricbeat-docker-cpu-usage-${cluster_name}",
    "_type": "doc",
    "_source": {
      "type": "visualization",
      "visualization": {

and

cpu-usage-export.json

    "_id": "Docker-CPU-usage-${CLUSTER_NAME}",
    "_type": "visualization",
    "_source": {
      "description": "",

As you can see, the types are different: cpu-usage.tpl has _type set to doc, whereas the export has _type set to visualization. In addition, cpu-usage.tpl has an extra type attribute inside _source and nests all of the details in a visualization object. In cpu-usage-export.json both are missing and _source starts directly with the visualization fields.

I figured out how to format cpu-usage.tpl by looking at your example code test_visualization_v6 in the README.md. Good documentation pays off 😉.


phillbaker commented Apr 11, 2019 via email

@Jeeppler

No, your test_visualization_v6 works well with Kibana version 6. However, if I export objects from Kibana 6 using the Web UI, I get output which I cannot simply feed into this provider. I figured out how to do it for one visualization, based on the test_visualization_v6 example from the README.md, but it was trial and error and hardly repeatable.

Furthermore, I have no idea what the correct format for searches and dashboards is, because there are no examples in the README.md.

In addition, the Web UI export format seems to be incompatible with the API's internal format. Here is the answer from the forum:

At this time the import logic is in the web UI, it reads the file, transforms the objects and calls the create API one object at a time.

As a workaround to get your file working with the bulk_create API, you will have to replicate the front end logic and transform the attributes to what is expected by the bulk_create API.

[...]

We are currently working on a server side import and export API that would accept files directly but in the meantime this is the best workaround I can think of if the web UI import isn't suitable.

source: https://discuss.elastic.co/t/saved-objects-api-use-example/168742/2

My questions are:

  1. What is the proper format for Kibana v6 for both searches and dashboards? A README.md example would be nice. (My untested guess for a search, extrapolated from the visualization above, is shown below.)

  2. Is there an Elasticsearch or Kibana API call to get the correct JSON format for visualizations, searches and dashboards for this provider? The export function in the Web UI clearly does not provide the correct format.
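
To illustrate question 1, here is my untested guess for a saved search, extrapolated from the doc pattern in cpu-usage.tpl and the fields of the search exported at the top of this issue (the id and searchSourceJSON are just examples):

resource "elasticsearch_kibana_object" "metricbeat_docker_search" {
  body = <<EOF
[
  {
    "_id": "search:metricbeat-docker",
    "_type": "doc",
    "_source": {
      "type": "search",
      "search": {
        "title": "Metricbeat Docker",
        "description": "",
        "hits": 0,
        "columns": ["_source"],
        "sort": ["@timestamp", "desc"],
        "version": 1,
        "kibanaSavedObjectMeta": {
          "searchSourceJSON": "{\"filter\":[],\"index\":\"metricbeat-*\",\"query\":{\"query\":{\"query_string\":{\"analyze_wildcard\":true,\"query\":\"metricset.module:docker\"}},\"language\":\"lucene\"}}"
        }
      }
    }
  }
]
EOF
}

I cannot tell whether this is actually what the provider and Elasticsearch 6 expect, hence the question.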

@phillbaker

Odd. There's a test in the code using the format from the Kibana export which is passing:

body = <<EOF
[
  {
    "_id": "response-time-percentile",
    "_type": "visualization",
    "_source": {
      "title": "Total response time percentiles",
      "visState": "{\"title\":\"Total response time percentiles\",\"type\":\"line\",\"params\":{\"addTooltip\":true,\"addLegend\":true,\"legendPosition\":\"right\",\"showCircles\":true,\"interpolate\":\"linear\",\"scale\":\"linear\",\"drawLinesBetweenPoints\":true,\"radiusRatio\":9,\"times\":[],\"addTimeMarker\":false,\"defaultYExtents\":false,\"setYExtents\":false},\"aggs\":[{\"id\":\"1\",\"enabled\":true,\"type\":\"percentiles\",\"schema\":\"metric\",\"params\":{\"field\":\"app.total_time\",\"percents\":[50,90,95]}},{\"id\":\"2\",\"enabled\":true,\"type\":\"date_histogram\",\"schema\":\"segment\",\"params\":{\"field\":\"@timestamp\",\"interval\":\"auto\",\"customInterval\":\"2h\",\"min_doc_count\":1,\"extended_bounds\":{}}},{\"id\":\"3\",\"enabled\":true,\"type\":\"terms\",\"schema\":\"group\",\"params\":{\"field\":\"system.syslog.program\",\"size\":5,\"order\":\"desc\",\"orderBy\":\"_term\"}}],\"listeners\":{}}",
      "uiStateJSON": "{}",
      "description": "",
      "version": 1,
      "kibanaSavedObjectMeta": {
        "searchSourceJSON": "{\"index\":\"filebeat-*\",\"query\":{\"query_string\":{\"query\":\"*\",\"analyze_wildcard\":true}},\"filter\":[]}"
      }
    }
  }
]
EOF


Jeeppler commented Apr 29, 2019

@phillbaker Kibana exported all objects in the Kibana v5 format, even though I used Kibana 6 (Elasticsearch v6).

@wloczykij2

Hi,
I get the objects directly from the .kibana index and it works for me.

@phillbaker

@wloczykij2's comment is correct: once past ES v5, the way to get raw JSON for Kibana objects is directly from the index. Documented this a bit in a53e1fd. Closing this issue; I will open a separate one for switching to the Kibana API instead of reading the index directly.
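
As a rough sketch (this just uses Terraform's http data source against a local cluster; the object id is only an example), the raw document can be pulled straight out of the index and inspected like this:

data "http" "cpu_usage_raw" {
  # In the 6.x .kibana index there is a single doc type and ids look like "visualization:<name>".
  url = "http://localhost:9200/.kibana/doc/visualization:metricbeat-docker-cpu-usage-prod"
}

output "cpu_usage_raw_json" {
  value = "${data.http.cpu_usage_raw.body}"
}

The returned document then needs to be trimmed to its _id, _type and _source fields and wrapped in a JSON array before it can be used as the body of an elasticsearch_kibana_object.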
