Conversation

@henfiber (Contributor) commented on Jun 1, 2025

The content field in a user message may be an array instead of a plain string. This case is handled properly by llama.cpp and by all the OpenAI-compatible public services I tried. Specifically, I ran into this issue when using llamafile with the Smart Composer Obsidian extension, which formats messages as such an array:

    {
      "role": "user",
      "content": [
        {
          "type": "text",
          "text": "Some user prompt."
        }
      ]
    }

With these changes, both server versions (regular and --v2) work properly with the above format.
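For reference, here is a minimal sketch of how such a content array could be collapsed into a plain prompt string, assuming an nlohmann::json-style message representation as used in llama.cpp. The helper name flatten_content is hypothetical; this is an illustration of the idea, not the actual code from the patch.

    #include <string>
    #include <nlohmann/json.hpp>

    using json = nlohmann::json;

    // Collapse an OpenAI-style "content" value into a single string.
    // Accepts either a plain string or an array of
    // { "type": "text", "text": "..." } parts, as in the example above.
    static std::string flatten_content(const json & content) {
        if (content.is_string()) {
            return content.get<std::string>();
        }
        std::string out;
        if (content.is_array()) {
            for (const auto & part : content) {
                if (part.is_object() &&
                    part.value("type", "") == "text" &&
                    part.contains("text") &&
                    part.at("text").is_string()) {
                    out += part.at("text").get<std::string>();
                }
            }
        }
        return out;
    }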

@mofosyne (Collaborator) left a comment


Looks reasonable and the code changes appear to match the intent. Approving, but I'll wait for others to double-check.

@cjpais (Collaborator) commented on Jun 29, 2025

I'll try to give this a test this week.

@cjpais (Collaborator) commented on Jun 30, 2025

Tested this and it seems to work; it matches the OpenAI spec for the 'text' content type. Pulling it in.

@cjpais merged commit cfa861a into mozilla-ai:main on Jun 30, 2025. 1 check passed.
