Description
When --json is active, the docs ask and docs chat commands stream AI-generated text paragraphs directly to process.stdout via streamAnswer() inside the handler, then return null. The factory sees --json and writes JSON.stringify(null) (i.e. the literal string null) to stdout after the streamed text. The combined output is rendered markdown paragraphs followed by null — not valid JSON.
This breaks machine consumers (LLMs, scripts, pipelines) that expect --json to produce exclusively well-formed JSON on stdout.
Both commands share the same root cause: they call streamAnswer() unconditionally regardless of --json, and both return null from their handler.
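The failure mode can be reproduced in isolation: concatenating streamed prose with the factory's serialized null yields a string no JSON parser accepts. A minimal sketch (the sample strings are illustrative, not captured CLI output):

```typescript
// Simulate what ends up on stdout today: streamed markdown paragraphs
// followed by the factory's JSON.stringify(null).
const streamedText = 'An Elasticsearch index is a collection of documents.\n'
const factorySuffix = JSON.stringify(null) + '\n' // the literal string "null"
const combined = streamedText + factorySuffix

function isValidJson(s: string): boolean {
  try {
    JSON.parse(s)
    return true
  } catch {
    return false
  }
}

console.log(isValidJson(combined))         // false — machine consumers break here
console.log(isValidJson('{"answer":"…"}')) // true — what --json should emit
```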
Steps to Reproduce
docs ask
- Configure a valid .elasticrc.yml (any context; docs commands do not require it, but the CLI loads config on startup).
- Run:
elastic docs ask "what is an Elasticsearch index?" --json
- Observe stdout. You will see rendered markdown paragraphs (flushed incrementally by streamAnswer) followed by a bare null line appended by the factory.
docs chat
- Run:
elastic docs chat "what is an Elasticsearch index?" --json
- Same problem: streamed markdown text followed by null on stdout.
With docs chat, the interactive follow-up loop is already suppressed when --json is active (the interactive guard handles that), but the initial question still streams text to stdout before the factory appends null.
Example output (abbreviated, same for both commands)
An Elasticsearch index is a collection of documents…
…optimized for search and analytics.
null
Expected behavior
--json should cause stdout to contain only valid JSON — for example:
{"answer":"An Elasticsearch index is a collection of documents…"}
No rendered markdown or bare null should appear.
Root Cause
In src/docs/ask.ts, the handler calls streamAnswer(gen, renderMarkdown, deps.stdout, spinner) which writes directly to deps.stdout (bound to process.stdout). After streamAnswer completes, the handler returns null.
In src/docs/chat.ts, the askQuestion helper calls streamAnswer() identically, writing to deps.stdout. The handler then returns null.
Back in src/factory.ts (line ~726), the factory action wrapper checks jsonFormat === true, sees the handler returned null, and writes JSON.stringify(null) + "\n" to process.stdout.
The two writes — streamAnswer flushing paragraphs and the factory serializing the return value — combine to produce invalid output.
The formatOutput: () => '' on both command configs only suppresses the non-JSON text path; it has no effect when --json is active.
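The interaction described above can be reduced to a few lines. This is a hypothetical miniature of the factory action wrapper, not the real src/factory.ts code — the names runWithJsonFlag and Handler are invented for illustration:

```typescript
// Hypothetical reduction of the factory action wrapper: the handler may write
// to stdout itself, and the wrapper then serializes whatever it returned.
type Handler = (out: string[]) => Promise<unknown>

async function runWithJsonFlag(handler: Handler, jsonFormat: boolean): Promise<string> {
  const stdout: string[] = [] // stand-in for process.stdout
  const result = await handler(stdout)
  if (jsonFormat) stdout.push(JSON.stringify(result) + '\n')
  return stdout.join('')
}

// A handler that behaves like docs ask: streams prose, then returns null.
const streamingHandler: Handler = async (out) => {
  out.push('An Elasticsearch index...\n')
  return null
}

runWithJsonFlag(streamingHandler, true).then((s) => console.log(JSON.stringify(s)))
// → "An Elasticsearch index...\nnull\n" — prose plus a bare null, never valid JSON
```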
Recommended Solution
When --json is active, buffer the full AI response text instead of streaming it, then return a structured JSON object from the handler.
docs ask
handler: async (parsed: ParsedResult): Promise<JsonValue> => {
  // ...existing question/conversationId/spinner setup...
  if (parsed.options['json'] === true) {
    // Buffer mode: collect all chunks, return structured JSON
    const chunks: string[] = []
    for await (const event of deps.docsAskStream(question, conversationId)) {
      if (event.kind === 'chunk') chunks.push(event.text)
    }
    return { answer: chunks.join('') }
  }
  // Interactive mode: stream paragraphs to stdout (existing behavior)
  const gen = deps.docsAskStream(question, conversationId)
  await streamAnswer(gen, renderMarkdown, deps.stdout, spinner)
  return null
}
docs chat
Same approach for the initial question. Since the interactive follow-up loop is already disabled under --json, only the first answer needs buffering:
handler: async (parsed: ParsedResult): Promise<JsonValue> => {
  // ...existing question/conversationId setup...
  if (parsed.options['json'] === true) {
    const chunks: string[] = []
    for await (const event of deps.docsAskStream(question, conversationId)) {
      if (event.kind === 'chunk') chunks.push(event.text)
    }
    return { answer: chunks.join('') }
  }
  // Interactive mode (existing behavior)
  await askQuestion(question, conversationId, deps, interactive ? startSpinner(deps.stderr, 'Thinking…') : undefined)
  // ...follow-up loop...
  return null
}
This keeps the streaming UX for interactive use and produces clean JSON for --json.
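The buffering path is easy to verify against a stubbed stream. The event shape below mirrors the handler excerpts above; stubDocsAskStream and bufferAnswer are hypothetical names for this sketch:

```typescript
// Event shape assumed from the handler excerpts above.
type StreamEvent = { kind: 'chunk'; text: string } | { kind: 'done' }

// Hypothetical stub standing in for deps.docsAskStream.
async function* stubDocsAskStream(): AsyncGenerator<StreamEvent> {
  yield { kind: 'chunk', text: 'An Elasticsearch index ' }
  yield { kind: 'chunk', text: 'is a collection of documents.' }
  yield { kind: 'done' }
}

// The buffering logic from the proposed fix, extracted for testing.
async function bufferAnswer(gen: AsyncGenerator<StreamEvent>): Promise<{ answer: string }> {
  const chunks: string[] = []
  for await (const event of gen) {
    if (event.kind === 'chunk') chunks.push(event.text)
  }
  return { answer: chunks.join('') }
}

bufferAnswer(stubDocsAskStream()).then((result) => {
  // The factory can now serialize this return value into clean JSON.
  console.log(JSON.stringify(result))
})
// prints {"answer":"An Elasticsearch index is a collection of documents."}
```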
Related
Parent issue: #71