bug: docs ask and docs chat with --json produce mixed text and JSON output #160

@JoshMock

Description

When --json is active, the docs ask and docs chat commands stream AI-generated text paragraphs directly to process.stdout via streamAnswer() inside the handler, then return null. The factory sees --json and writes JSON.stringify(null) (i.e. the literal string null) to stdout after the streamed text. The combined output is rendered markdown paragraphs followed by null — not valid JSON.

This breaks machine consumers (LLMs, scripts, pipelines) that expect --json to produce exclusively well-formed JSON on stdout.
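To make the breakage concrete, here is a minimal sketch of what a machine consumer sees (the captured stdout strings are hypothetical, mirroring the example output below):

```typescript
// Hypothetical stdout captures under --json: the current mixed output
// versus the expected pure-JSON output.
const mixedOutput = "An Elasticsearch index is a collection of documents…\nnull\n"
const cleanOutput = '{"answer":"An Elasticsearch index is a collection of documents…"}\n'

// Returns the parsed value, or undefined if stdout is not valid JSON.
function tryParse(stdout: string): unknown {
  try {
    return JSON.parse(stdout)
  } catch {
    return undefined
  }
}

console.log(tryParse(mixedOutput) === undefined)  // mixed markdown + "null" fails to parse
console.log(tryParse(cleanOutput) !== undefined)  // a single JSON object parses cleanly
```

Any consumer that does `JSON.parse` on the full stdout (or pipes it into `jq`) fails on the current output.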

Both commands share the same root cause: they call streamAnswer() unconditionally regardless of --json, and both return null from their handler.

Steps to Reproduce

docs ask

  1. Configure a valid .elasticrc.yml (any context; docs commands do not require it, but the CLI loads config on startup).
  2. Run:
    elastic docs ask "what is an Elasticsearch index?" --json
  3. Observe stdout. You will see rendered markdown paragraphs (flushed incrementally by streamAnswer) followed by a bare null line appended by the factory.

docs chat

  1. Run:

    elastic docs chat "what is an Elasticsearch index?" --json
  2. Same problem: streamed markdown text followed by null on stdout.

    With docs chat, the interactive follow-up loop is already suppressed when --json is active (the interactive guard handles that), but the initial question still streams text to stdout before the factory appends null.

Example output (abbreviated, same for both commands)

An Elasticsearch index is a collection of documents…

…optimized for search and analytics.

null

Expected behavior

--json should cause stdout to contain only valid JSON — for example:

{"answer":"An Elasticsearch index is a collection of documents…"}

No rendered markdown or bare null should appear.

Root Cause

In src/docs/ask.ts, the handler calls streamAnswer(gen, renderMarkdown, deps.stdout, spinner) which writes directly to deps.stdout (bound to process.stdout). After streamAnswer completes, the handler returns null.

In src/docs/chat.ts, the askQuestion helper calls streamAnswer() identically, writing to deps.stdout. The handler then returns null.

Back in src/factory.ts (line ~726), the factory action wrapper checks jsonFormat === true, sees the handler returned null, and writes JSON.stringify(null) + "\n" to process.stdout.

The two writes — streamAnswer flushing paragraphs and the factory serializing the return value — combine to produce invalid output.

The formatOutput: () => '' on both command configs only suppresses the non-JSON text path; it has no effect when --json is active.
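A simplified sketch of the factory path described above (the names and signature are assumptions, not the actual src/factory.ts code):

```typescript
// Minimal model of the factory's action wrapper. By the time this runs,
// the handler may already have streamed text to stdout via streamAnswer().
type JsonValue = null | boolean | number | string | JsonValue[] | { [k: string]: JsonValue }

async function runAction(
  handler: () => Promise<JsonValue>,
  jsonFormat: boolean,
  stdout: { write(s: string): void },
): Promise<void> {
  const result = await handler()
  if (jsonFormat) {
    // With the current docs handlers, result is null, so this appends the
    // literal string "null" after whatever was already streamed.
    stdout.write(JSON.stringify(result) + "\n")
  }
}
```

Neither write is wrong in isolation; the bug is that both paths run under --json.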

Recommended Solution

When --json is active, buffer the full AI response text instead of streaming it, then return a structured JSON object from the handler.

docs ask

handler: async (parsed: ParsedResult): Promise<JsonValue> => {
  // ...existing question/conversationId/spinner setup...

  if (parsed.options['json'] === true) {
    // Buffer mode: collect all chunks, return structured JSON
    const chunks: string[] = []
    for await (const event of deps.docsAskStream(question, conversationId)) {
      if (event.kind === 'chunk') chunks.push(event.text)
    }
    return { answer: chunks.join('') }
  }

  // Interactive mode: stream paragraphs to stdout (existing behavior)
  const gen = deps.docsAskStream(question, conversationId)
  await streamAnswer(gen, renderMarkdown, deps.stdout, spinner)
  return null
}

docs chat

Same approach for the initial question. Since the interactive follow-up loop is already disabled under --json, only the first answer needs buffering:

handler: async (parsed: ParsedResult): Promise<JsonValue> => {
  // ...existing question/conversationId setup...

  if (parsed.options['json'] === true) {
    const chunks: string[] = []
    for await (const event of deps.docsAskStream(question, conversationId)) {
      if (event.kind === 'chunk') chunks.push(event.text)
    }
    return { answer: chunks.join('') }
  }

  // Interactive mode (existing behavior)
  await askQuestion(question, conversationId, deps, interactive ? startSpinner(deps.stderr, 'Thinking…') : undefined)
  // ...follow-up loop...
  return null
}

This keeps the streaming UX for interactive use and produces clean JSON for --json.
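Since both handlers would buffer identically, the shared logic could be factored into a small helper. A sketch, assuming the event shape from the snippets above (a `done` variant is an assumption for illustration; any non-`chunk` events are ignored):

```typescript
// Hypothetical event union; only 'chunk' carries answer text.
type StreamEvent = { kind: 'chunk'; text: string } | { kind: 'done' }

// Drains the stream and returns the JSON shape both handlers would emit.
async function collectAnswer(stream: AsyncIterable<StreamEvent>): Promise<{ answer: string }> {
  const chunks: string[] = []
  for await (const event of stream) {
    if (event.kind === 'chunk') chunks.push(event.text)
  }
  return { answer: chunks.join('') }
}
```

Both handlers could then reduce their --json branch to `return collectAnswer(deps.docsAskStream(question, conversationId))`.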

Related

Parent issue: #71
