
[Dashboards] Recreate the data-exploration tutorial dashboard with the Dashboards API#6337

Open
florent-leborgne wants to merge 6 commits into main from docs-issue-5822-tutorial-api

Conversation


@florent-leborgne florent-leborgne commented May 6, 2026

Summary

Closes the remaining checklist items of #5822 by adding a single collapsible "Recreate this dashboard with one API call" dropdown at the end of Step 3 in explore-analyze/kibana-data-exploration-learning-tutorial.md. The first checklist item (the Create programmatically overview page) was completed by #5927.

The dropdown contains one curl POST to /api/dashboards that recreates the finished tutorial dashboard, including the optional panels suggested across Step 3 (extra metrics, response size over time, pie, treemap). Eleven explicit <n> code callouts anchor each JSON panel to the tutorial sub-step that introduced it; the numbered list after the curl block highlights the non-obvious API patterns (breakdown_by, reference_lines as a separate layer, ES|QL column references in xy x/y, rows vs metrics in ES|QL data tables, pie/treemap with metrics + group_by).

The page from #5927 already covers the API payload dropdown on the Create dashboard page (item 2 of #5822) via the existing New ways to create dashboards tip, so no edit is needed on create-dashboard.md.

This supersedes #5770, which was drafted before the spec restructure landed.
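For readers skimming this PR, the call shape is roughly the following (a sketch only: the /api/dashboards endpoint is the one documented here, the headers are standard Kibana API conventions, and the full 11-panel payload lives in the tutorial dropdown, not in this snippet):

```shell
# Placeholder values; the real payload follows the dashboards-api-spec.
curl -X POST "${KIBANA_URL}/api/dashboards" \
  -H "Authorization: ApiKey ${API_KEY}" \
  -H "Content-Type: application/json" \
  -H "kbn-xsrf: true" \
  -d @dashboard.json
```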

Verification

The Dashboards API is in technical preview (x-state: Technical Preview; added in 9.4.0), so the embedded payload was rebuilt against the current dashboards-api-spec (commit 84120e3); the old draft used the pre-restructure schema (type: lens, dataset, operation-style x/y) which the API no longer accepts.

The current payload was POSTed end-to-end against a live serverless instance. HTTP 201, 11 panels created, panel set matches a fresh export from a manually-built tutorial dashboard. Schema details verified directly against openapi/kibana-openapi.yaml:

  • panel-level type vis with config.type (not lens)
  • data_source with data_view_spec or esql (not dataset)
  • ES|QL chart layers reference query result columns via x.column / y[].column
  • xy bar default is bar_stacked (matches Lens)
  • reference lines as a sibling layer of type reference_lines with thresholds
  • metric secondary with time_shift + compare
  • ES|QL data_table with rows (categorical) and metrics (values)
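As an illustration of those patterns, a single ES|QL xy panel with a reference line might look like this. The field names (`type`, `config.type`, `data_source`, `esql`, `x.column`, `y[].column`, `reference_lines`, `thresholds`) come from the bullets above; the exact nesting around them is illustrative, not copied from the verified payload:

```json
{
  "type": "vis",
  "config": {
    "type": "xy",
    "layers": [
      {
        "data_source": {
          "esql": { "query": "FROM kibana_sample_data_logs | STATS requests = COUNT(*) BY host" }
        },
        "x": { "column": "host" },
        "y": [ { "column": "requests" } ]
      },
      {
        "type": "reference_lines",
        "thresholds": [ { "value": 100 } ]
      }
    ]
  }
}
```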

Maintenance

To keep the example honest as the schema evolves, this PR also adds:

  1. An HTML maintenance comment above the dropdown noting the spec source, last-verified commit, and pointer to the verification script.
  2. .github/scripts/verify-dashboards-api-example.py, a stdlib-only Python script that extracts the curl JSON from the markdown, strips <n> callouts, POSTs it against ${KIBANA_URL} with ${API_KEY}, asserts HTTP 201 + the expected panel count, then deletes the test dashboard.
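The extract-and-strip step can be sketched like this (a stdlib-only sketch, not the script's actual code: the function name, the regex, and the assumption that the curl body is single-quoted are all illustrative):

```python
import json
import re

def extract_payload(markdown: str) -> dict:
    """Pull the JSON body out of the first fenced curl block and strip
    docs-builder <n> code callouts so the result parses cleanly."""
    match = re.search(r"```[^\n]*\n(.*?)```", markdown, re.DOTALL)
    if match is None:
        raise SystemExit("no fenced code block found in the markdown")
    block = match.group(1)
    # Assumes the curl body is single-quoted, as in the docs: -d '{ ... }'
    body = block[block.index("'{") + 1 : block.rindex("}'") + 1]
    # Callout markers like <1> render as badges but are not valid JSON.
    body = re.sub(r"<\d+>", "", body)
    return json.loads(body)
```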

Anyone editing the example can run:

KIBANA_URL=… API_KEY=… python3 .github/scripts/verify-dashboards-api-example.py

The script is designed to be promoted to a workflow_dispatch GitHub Action against a CI Kibana later if we want continuous drift detection.
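If promoted, that workflow could look roughly like this (entirely hypothetical: the workflow file, job layout, and secret names below do not exist in the repo yet):

```yaml
# Hypothetical .github/workflows/verify-dashboards-api-example.yml
name: verify-dashboards-api-example
on:
  workflow_dispatch:
jobs:
  verify:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: python3 .github/scripts/verify-dashboards-api-example.py
        env:
          KIBANA_URL: ${{ secrets.CI_KIBANA_URL }}
          API_KEY: ${{ secrets.CI_KIBANA_API_KEY }}
```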

Test plan

  • Preview renders the dropdown collapsed by default at the end of Step 3
  • applies_to badge on the dropdown shows preview / 9.4+
  • Code callouts <1> through <11> render as styled badges aligned with the corresponding panel titles/labels
  • All links resolve: dashboards/create-dashboards-programmatically.md, the in-page #explore-data-in-discover anchor, the external Dashboards API reference
  • python3 .github/scripts/verify-dashboards-api-example.py exits 0 against a Kibana 9.4+ instance with the sample logs dataset installed
  • Vale passes

Made with Cursor

florent-leborgne and others added 2 commits May 6, 2026 20:40
Adds a collapsible "Recreate this dashboard with one API call" dropdown at
the end of Step 3 of the Kibana data exploration learning tutorial. The
dropdown contains a single curl POST to /api/dashboards that recreates the
finished tutorial dashboard, including the optional panels suggested in
each sub-step (extra metrics, response size over time, pie, treemap).

Eleven explicit code callouts anchor each panel to the tutorial sub-step
that introduced it. The numbered list after the curl block calls out the
non-obvious API patterns: breakdown_by for split-by-host, reference_lines
as a separate layer, ES|QL column references in xy x/y, rows vs metrics in
ES|QL data tables, and pie/treemap with metrics + group_by.

The payload is gated at stack: preview 9.4+ / serverless: preview, matching
the rest of the Dashboards API documentation. An HTML maintenance comment
above the dropdown points editors at the verification script and notes the
spec source and date last verified, so the example can be re-checked when
the schema evolves.

Closes the second and third checklist items of #5822 (API payload on the
tutorial). Item one was completed by #5927.

Co-authored-by: Cursor <cursoragent@cursor.com>
Adds .github/scripts/verify-dashboards-api-example.py, a stdlib-only Python
script that extracts the curl POST payload from the data exploration
tutorial, strips docs-builder code callouts, and posts the result against
a live Kibana to assert it still creates a dashboard with the expected
panel count.

The Dashboards API is in technical preview, so its schema can change
between releases. The script gives editors a five-second feedback loop for
verifying the embedded example before merging payload changes:

    KIBANA_URL=... API_KEY=... \
      python3 .github/scripts/verify-dashboards-api-example.py

The script is parameterized on the markdown path and supports a --keep flag
for inspecting the resulting dashboard, otherwise the test dashboard is
deleted on success. Designed to be promoted to a workflow_dispatch GitHub
Action against a CI Kibana later if needed.

Co-authored-by: Cursor <cursoragent@cursor.com>

github-actions Bot commented May 6, 2026

Elastic Docs AI PR menu

Check the box to run an AI review for this pull request.

  • Review docs changes (docs-review). Status: not started.

Powered by GitHub Agentic Workflows and docs-actions. For more information, reach out to the docs team.


github-actions Bot commented May 6, 2026

✅ Vale Linting Results

No issues found on modified lines!


The Vale linter checks documentation changes against the Elastic Docs style guide.

To use Vale locally or report issues, refer to Elastic style guide for Vale.


github-actions Bot commented May 6, 2026

🔍 Preview links for changed docs

florent-leborgne and others added 2 commits May 6, 2026 21:06
The repo style guide forbids em dashes (use a comma, a colon, or rewrite).
The 11-item list mapping each panel to its tutorial sub-step used the
"**Title** — description" pattern, which is more naturally written as
"**Title**: description" anyway.

Co-authored-by: Cursor <cursoragent@cursor.com>
Each numbered explanation now links the bold sub-step name to the
auto-generated anchor on the matching {step} directive, so readers can
jump straight from a callout description back to the procedure that
introduced the panel. Items 2-4 (the optional metrics dropdown) point at
the parent metric step since {dropdown} directives don't have their own
anchors.

Co-authored-by: Cursor <cursoragent@cursor.com>
Bold + link is visual overkill; the link styling already makes the
sub-step name stand out.

Co-authored-by: Cursor <cursoragent@cursor.com>
Two changes to the verification script:

1. Each `raise SystemExit` site now has a short comment explaining what
   condition triggers it and what the error message means, with a
   numbered cross-reference to a new "Failure modes" section in the
   module docstring. So an editor reading the script (or hitting a
   failure) gets the full context without leaving the file.

2. When the API returns 201 but the stored dashboard has a different
   panel count from the request, the script now identifies which panels
   the server dropped by their grid position and prints `type` and
   `config.type` for each. Previously the message reported only the
   count mismatch, leaving the editor to find the dropped panel by
   eyeballing 11 visualization configs.
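The dropped-panel diagnosis amounts to a set difference keyed on grid position. A minimal sketch (the `grid` key and panel shape are assumptions about the API response, and the real script also prints `type` and `config.type` for each dropped panel):

```python
def dropped_panels(requested: list[dict], stored: list[dict]) -> list[dict]:
    """Return the request panels whose grid position is absent from the
    stored dashboard, i.e. the panels the server silently dropped."""
    def grid_key(panel: dict) -> tuple:
        grid = panel.get("grid", {})
        return (grid.get("x"), grid.get("y"))
    stored_keys = {grid_key(p) for p in stored}
    return [p for p in requested if grid_key(p) not in stored_keys]
```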

Co-authored-by: Cursor <cursoragent@cursor.com>