
docs: restructure spark-install.md for clearer onboarding flow#857

Merged

cv merged 3 commits into NVIDIA:main from Junior00619:docs/restructure-spark-install on Mar 25, 2026

Conversation

@Junior00619 (Contributor) commented Mar 25, 2026

Resolves #711

The current spark-install.md drops users into a Quick Start before
they've confirmed prerequisites, buries troubleshooting behind
architecture notes, and scatters related information across
disconnected sections. This makes first-run setup on DGX Spark harder
to follow than it needs to be.

This PR reorders and lightly restructures the doc without changing any
of the actual instructions or commands:

  • Overview → Prerequisites → Quick Start → Verify → Troubleshooting →
    Technical Reference: matches the order a user actually works through
    during setup.
  • Clarifies that OpenShell is pulled in automatically by the installer,
    removing the standalone curl step from Quick Start that implied it
    was a manual prerequisite.
  • Numbers the Ollama local-inference sub-steps so they read as a
    sequential walkthrough instead of a flat list of headings.
  • Consolidates Known Issues and Manual Setup under a single
    Troubleshooting section — the first place users look when something
    breaks.
  • Moves the architectural explanation and Spark-specific differences
    into a Technical Reference section at the bottom for anyone who wants
    the context without it blocking the happy path.

No functional changes: only section ordering and heading hierarchy were
adjusted, and a short introductory paragraph was added.

Summary by CodeRabbit

  • Documentation
    • Updated DGX/Spark install guide with DGX-specific intro and clearer prerequisites (Docker present, Node.js auto-installed by installer, OpenShell required only for local installs)
    • Revised setup and quick-start guidance to reference cgroup v2 and Docker permission troubleshooting
    • Simplified Uninstall and Local Inference flows into numbered steps with explicit reinstall guidance
    • Added Troubleshooting (Known Issues, Manual Setup) and moved platform details to Technical Reference; renamed Architecture and simplified diagram labels

@coderabbitai (bot) commented Mar 25, 2026

Note

Reviews paused

It looks like this branch is under active development. To avoid overwhelming you with review comments due to an influx of new commits, CodeRabbit has automatically paused this review. You can configure this behavior by changing the reviews.auto_review.auto_pause_after_reviewed_commits setting.

Use the following commands to manage reviews:

  • @coderabbitai resume to resume automatic reviews.
  • @coderabbitai review to trigger a single review.

📝 Walkthrough

spark-install.md reorganized: Overview and Prerequisites moved earlier; Quick Start updated to reflect installer-managed Node.js/OpenShell and a setup script for cgroup/Docker fixes; Troubleshooting consolidated with Known Issues and Manual Setup fallback; Local Inference steps renumbered; Architecture label simplified and Docker version removed.

Changes

Cohort / File(s): Documentation (single file), spark-install.md
Summary: Reordered and reworded content: added Overview, moved Prerequisites earlier, updated Quick Start to reference scripts/setup-spark.sh (cgroup v2 + Docker perms) and installer behavior for Node.js/OpenShell, clarified uninstall/reinstall flow, renumbered Local Inference (Ollama) steps, consolidated Troubleshooting with a Known Issues table and Manual Setup fallback, relocated “What’s Different on Spark” under Technical Reference, and renamed “Architecture Notes” to “Architecture” while removing the Docker version label from the diagram. Review attention: Quick Start wording about installer-managed components, Known Issues table accuracy, and Technical Reference relocation.
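The walkthrough notes that Quick Start now points users at scripts/setup-spark.sh for cgroup v2 and Docker permission fixes. The script's contents are not shown in this PR; the following is a hypothetical sketch of the kind of preflight checks such a script might perform (the function names and messages are invented, not quoted from the repository):

```shell
#!/usr/bin/env bash
# Hypothetical sketch only -- not the actual scripts/setup-spark.sh.

check_cgroup_v2() {
  # On cgroup v2, the unified hierarchy exposes cgroup.controllers at its root.
  local root="${1:-/sys/fs/cgroup}"
  [ -f "$root/cgroup.controllers" ]
}

check_docker_group() {
  # Membership in the docker group lets a user talk to the daemon without sudo.
  id -nG "$1" | tr ' ' '\n' | grep -qx docker
}

# Assumed usage:
#   check_cgroup_v2            || echo "cgroup v2 not enabled; see Troubleshooting" >&2
#   check_docker_group "$USER" || echo "run: sudo usermod -aG docker $USER" >&2
```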

Estimated code review effort

🎯 2 (Simple) | ⏱️ ~10 minutes

Poem

🐰 I hopped through headers, tidy and quick,

Rearranged steps like carrots in a pick.
Quick Start now skips the tumble and trip,
Troubleshoot table in a neat little strip.
Docs polished — a joyful rabbit skip!

🚥 Pre-merge checks | ✅ 5
✅ Passed checks (5 passed)
  • Description Check: ✅ Passed. Check skipped: CodeRabbit’s high-level summary is enabled.
  • Title check: ✅ Passed. The title accurately summarizes the main change: restructuring spark-install.md documentation for improved onboarding flow clarity.
  • Linked Issues check: ✅ Passed. All coding-related objectives from issue #711 are met: sections reordered to user-centric flow, prerequisites clarified, Quick Start streamlined, verification steps included, troubleshooting consolidated, and technical reference created.
  • Out of Scope Changes check: ✅ Passed. All changes are within scope: documentation restructuring and content reorganization directly address issue #711 requirements; no unrelated functional changes introduced.
  • Docstring Coverage: ✅ Passed. No functions found in the changed files to evaluate docstring coverage. Skipping docstring coverage check.


@coderabbitai (bot) left a comment

Actionable comments posted: 2

🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@spark-install.md`:
- Line 11: Replace the hardcoded Docker minor version string in the markdown
lines that read "**Docker** (pre-installed on DGX Spark, v28.x)" (and the
identical occurrences further down) with a generic, non-version-specific
phrasing such as "**Docker** (pre-installed on DGX Spark)" or "Docker
(pre-installed)" so the docs do not assert a specific minor stream; update all
instances of that exact phrase to remove the "v28.x" pin and leave a neutral
statement about Docker being pre-installed.
- Around line 12-13: Update the documentation text that currently states
"OpenShell CLI (installed automatically by the NemoClaw installer)" to clarify
that OpenShell is NOT installed by the NemoClaw installer and must be installed
separately before running NemoClaw; reference the installer script invocation
(./install.sh) and the runtime check in the code
(nemoclaw/src/blueprint/runner.ts which will throw "openshell CLI not found.
Install OpenShell first") so users know to pre-install OpenShell prior to
running the installer or NemoClaw.
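The review above cites a runtime guard in nemoclaw/src/blueprint/runner.ts that fails when the openshell CLI is missing. A user could mirror that guard with a shell preflight before running ./install.sh; this is an illustrative sketch, not the project's actual code (the helper name is invented, the error wording follows the review comment):

```shell
# Illustrative preflight, not actual NemoClaw code.
require_cli() {
  # Succeeds only if the named command is resolvable on PATH.
  if ! command -v "$1" >/dev/null 2>&1; then
    echo "$1 CLI not found. Install $1 first." >&2
    return 1
  fi
}

# Assumed usage: verify OpenShell is present before the installer runs.
#   require_cli openshell && ./install.sh
```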

ℹ️ Review info
⚙️ Run configuration

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

Run ID: 88971c2e-4ff6-4b23-a1a6-4f3a155de4e2

📥 Commits

Reviewing files that changed from the base of the PR and between 36fa334 and ed74697.

📒 Files selected for processing (1)
  • spark-install.md

@Junior00619 force-pushed the docs/restructure-spark-install branch from ed74697 to 48424e0 on March 25, 2026 at 03:23
@Junior00619 (Contributor, Author) commented Mar 25, 2026

@coderabbitai resume

@coderabbitai (bot) commented Mar 25, 2026

@Junior00619 Sure! Resuming automated reviews for this PR.

✅ Actions performed

Reviews resumed.


@Junior00619 force-pushed the docs/restructure-spark-install branch from f79c188 to 3b41d7d on March 25, 2026 at 03:54
@coderabbitai (bot) left a comment

Actionable comments posted: 1

🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@spark-install.md`:
- Around line 32-33: Remove the incorrect parenthetical claim "(this also
installs OpenShell if needed)" in the line that contains the hosted install
command curl -fsSL https://www.nvidia.com/nemoclaw.sh | bash so the sentence
only advertises the hosted install script (which installs Node.js, Ollama if GPU
present, and NemoClaw) and does not mention OpenShell.
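As context for the hosted one-liner discussed above (curl piped to bash), a download-then-inspect variant avoids executing an unreviewed script. This is a generic sketch: the helper name is invented, and only the URL is taken from the review text:

```shell
# Generic fetch helper; the function name is hypothetical.
fetch_script() {
  local url="$1" dest="$2"
  # -f: fail on HTTP errors, -sS: quiet but show errors, -L: follow redirects
  curl -fsSL "$url" -o "$dest"
}

# Assumed flow with the URL quoted in the review:
#   fetch_script https://www.nvidia.com/nemoclaw.sh nemoclaw.sh
#   less nemoclaw.sh   # inspect before executing
#   bash nemoclaw.sh
```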

ℹ️ Review info
⚙️ Run configuration

Configuration used: Path: .coderabbit.yaml

Review profile: CHILL

Plan: Pro

Run ID: df21a56f-e242-437f-af25-b77c0ff17cea

📥 Commits

Reviewing files that changed from the base of the PR and between 48424e0 and 69009fd.

📒 Files selected for processing (1)
  • spark-install.md

@Junior00619 force-pushed the docs/restructure-spark-install branch from 713d891 to a189c2f on March 25, 2026 at 05:45
@mholtmanns commented
Looking at it, it is still confusing. If I want to use local inference and go through it step by step, I now read it as: first install with an external API reference, then uninstall, then install again after installing Ollama locally.
Do I need an API key if I only use local inference?
I suggest adding the local inference option as a note in "Prerequisites" and linking to the relevant part of the doc.

@Junior00619 (Contributor, Author) commented

> Looking at it it is still confusing. If I want to use local inference and go through it step by step, I read it now as if I first need to install it with external API reference, then uninstall, then install it again after installing Ollama locally. Do I need an API key if I only use local inference? I suggest to add the local inference option as a comment in "Prerequisites" and linking to the relevant part in the doc.

Good catch, you're right: the flow reads as install → uninstall → install Ollama → reinstall, which is confusing and unnecessary for someone who knows upfront they want local inference. Fixed in the latest push.

@Junior00619 force-pushed the docs/restructure-spark-install branch 2 times, most recently from 66c641a to e1027f2 on March 25, 2026 at 15:17

Reorder sections from the current confusing layout into a logical progression:
Overview → Prerequisites → Quick Start → Verify → Troubleshooting → Technical
Reference.

- Add brief overview paragraph explaining what the guide covers
- Clarify that OpenShell is installed automatically (not a manual prerequisite)
- Remove redundant OpenShell install step from Quick Start
- Number Ollama setup steps for easier navigation
- Group Known Issues and Manual Setup under a Troubleshooting section
- Move What's Different on Spark and Architecture into Technical Reference

Fixes NVIDIA#711

Address reviewer feedback: make it clear that an API key is only
needed for cloud inference, link to the Local Inference section from
Prerequisites, and restructure step 5 so users who haven't installed
yet don't need to uninstall/reinstall.
@prekshivyas (Contributor) left a comment

Clean restructure — the new flow (Prerequisites → Quick Start → Verify → Troubleshooting → Technical Reference) matches how a user actually works through setup. Numbering the Ollama steps and consolidating troubleshooting under one section are nice improvements. LGTM.

cv merged commit 1e2c826 into NVIDIA:main on Mar 25, 2026
1 check passed
cv added a commit to cluster2600/NemoClaw that referenced this pull request Mar 26, 2026
Slot PR NVIDIA#885 content into NVIDIA#857 structure per reviewer:
known-issue rows to Troubleshooting, Web Dashboard and
NIM arm64 to Technical Reference, hardware details to
Prerequisites and Architecture diagram.

Development

Successfully merging this pull request may close these issues.

[NeMoClaw][SPark] spark-install.md doc content structure suggestion

4 participants