Ollama adapter ignores global systemPrompts #567

@wiizzl

Description

TanStack AI version

0.18.0

Framework/Library version

React 19.2.6 + TanStack Start 1.168.2

Describe the bug and the steps to reproduce it

Hi! I noticed that global systemPrompts are completely ignored when using the @tanstack/ai-ollama chat adapter.

The issue

In packages/typescript/ai-ollama/src/adapters/text.ts, the mapCommonOptionsToOllama method sets systemPrompts as a root-level field on the payload ({ system: "..." }).

While this is correct for Ollama's generation endpoint (/api/generate), the chat endpoint (/api/chat used by this.client.chat()) ignores it. It expects the system prompt to be the first object inside the messages array with role: "system".
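To illustrate the mismatch, here is a sketch of the two payload shapes (field names follow Ollama's API; the model and prompt values are just placeholders):

```typescript
// /api/generate accepts a root-level `system` field:
const generatePayload = {
  model: "deepseek-r1",
  system: "You are a helpful assistant.",
  prompt: "Hello!",
};

// /api/chat ignores a root-level `system` field; the system prompt
// must instead be the first entry of `messages` with role "system":
const chatPayload = {
  model: "deepseek-r1",
  messages: [
    { role: "system", content: "You are a helpful assistant." },
    { role: "user", content: "Hello!" },
  ],
};
```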

Proposed fix

We just need to prepend the system prompt to the formatted messages array instead:

private mapCommonOptionsToOllama(options: TextOptions): ChatRequest {
  const model = options.model
  const modelOptions = options.modelOptions as
    | OllamaTextProviderOptions
    | undefined

  // Map the common sampling options onto Ollama's option names.
  const ollamaOptions = {
    temperature: options.temperature,
    top_p: options.topP,
    num_predict: options.maxTokens,
    ...modelOptions,
  }

  const formattedMessages = this.formatMessages(options.messages)

  // /api/chat only honors system prompts delivered as the first
  // message with role "system", so prepend them here instead of
  // setting a root-level `system` field.
  if (options.systemPrompts?.length) {
    formattedMessages.unshift({
      role: 'system',
      content: options.systemPrompts.join('\n'),
    })
  }

  return {
    model,
    options: ollamaOptions,
    messages: formattedMessages,
    tools: this.convertToolsToOllamaFormat(options.tools),
  }
}

I've patched this locally and tested it with deepseek-r1. It completely fixes the behavior.
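For clarity, the prepend logic can be exercised in isolation like this (the ChatMessage type and prependSystemPrompt helper are simplified stand-ins for the adapter's internals, not part of the package):

```typescript
// Minimal stand-in type for a chat message.
type ChatMessage = { role: string; content: string };

// Join all system prompts with newlines and prepend them as a single
// "system" message, mirroring the proposed fix.
function prependSystemPrompt(
  messages: ChatMessage[],
  systemPrompts?: string[],
): ChatMessage[] {
  const result = [...messages];
  if (systemPrompts?.length) {
    result.unshift({ role: "system", content: systemPrompts.join("\n") });
  }
  return result;
}

const out = prependSystemPrompt(
  [{ role: "user", content: "Hi" }],
  ["Be concise.", "Answer in English."],
);
// out[0] → { role: "system", content: "Be concise.\nAnswer in English." }
```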

I'm opening a PR right away to fix this!

Your Minimal, Reproducible Example - (Sandbox Highly Recommended)

I didn't include a reproduction since this is a straightforward API implementation bug; the code snippet above points directly to the exact file and method causing the mismatch.

Screenshots or Videos (Optional)

No response

Do you intend to try to help solve this bug with your own PR?

Yes, I am also opening a PR that solves the problem alongside this issue.

Terms & Code of Conduct

  • I agree to follow this project's Code of Conduct
  • I understand that if my bug cannot be reliably reproduced in a debuggable environment, it will probably not be fixed and this issue may even be closed.
