
mcp-client JSON.parse error for local ollama #1502

@alexdconf

Description


I am trying to get a local ollama-backed tiny-agent working with Playwright. I have llama3.1:latest running with ollama, and a working tiny-agent which I execute via npx @huggingface/tiny-agents run ./my-agent. My agent.json looks like:


{
  "model": "llama3.1:latest",
  "endpointUrl": "http://localhost:11434",
  "servers": [
    {
      "type": "sse",
      "config": {
        "url": "http://localhost:8931/sse"
      }
    }
  ]
}

Before running the agent, I manually spin up Playwright via npx @playwright/mcp --port 8931.
The agent starts and I get a cursor, but when I enter a prompt, I always get some version of this JSON parse error:


Agent loaded with 25 tools:
- browser_close
- browser_resize
- browser_console_messages
- browser_handle_dialog
- browser_file_upload
- browser_install
- browser_press_key
- browser_navigate
- browser_navigate_back
- browser_navigate_forward
- browser_network_requests
- browser_pdf_save
- browser_take_screenshot
- browser_snapshot
- browser_click
- browser_drag
- browser_hover
- browser_type
- browser_select_option
- browser_tab_list
- browser_tab_new
- browser_tab_select
- browser_tab_close
- browser_generate_playwright_test
- browser_wait_for
> Close browser

browser_close {}
/home/.../playground/repos/huggingface.js/packages/tiny-agents/dist/cli.js:118
    throw err;
    ^

SyntaxError: Unexpected token { in JSON at position 2
    at JSON.parse ()
    at Agent.processSingleTurnWithTools (/home/.../playground/repos/huggingface.js/packages/mcp-client/dist/src/index.js:233:71)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async Agent.run (/home/.../playground/repos/huggingface.js/packages/mcp-client/dist/src/index.js:328:9)
    at async mainCliLoop (/home/.../playground/repos/huggingface.js/packages/tiny-agents/dist/cli.js:129:22)
    at async main (/home/.../playground/repos/huggingface.js/packages/tiny-agents/dist/cli.js:343:5)

Node.js v18.19.1

I think the issue has to do with the JSON.parse calls. I've tried stdio, http, and sse for connecting to Playwright, including letting the tiny-agents software issue the npx command for me. I suspect it's something to do with my using a local model, but I couldn't figure it out. Any and all assistance is appreciated.
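For reference, this is the stdio variant of the config I tried, which lets tiny-agents spawn the Playwright server itself instead of connecting to a manually started one (a sketch: the command/args shape mirrors the stdio configs shown elsewhere in this thread, and the exact keys should be checked against the tiny-agents docs):

```json
{
  "model": "llama3.1:latest",
  "endpointUrl": "http://localhost:11434",
  "servers": [
    {
      "type": "stdio",
      "config": {
        "command": "npx",
        "args": ["@playwright/mcp"]
      }
    }
  ]
}
```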
My versions of packages are:


dependencies:
@huggingface/tiny-agents link:../repos/huggingface.js/packages/tiny-agents
@modelcontextprotocol/sdk 1.12.0
@playwright/mcp 0.0.27
express 5.1.0
ollama 0.5.15

(I was originally on @huggingface/tiny-agents v0.2.3, but I pulled down the repo, built it, and installed it to see if that would help; it didn't.)
I'm running Ubuntu 24.04.

Activity

sdelahaies commented on May 30, 2025

I am facing a similar issue while trying to complete unit 2 of the HF mcp-course, Building a Tiny Agent with TypeScript.

Borrowing from your JSON config, I use

{
  "model": "llama3.2:3b",
  "endpointUrl": "http://localhost:11434",
  "servers": [
    {
      "type": "stdio",
      "config": {
        "command": "npx",
        "args": [
          "mcp-remote",
          "http://localhost:7860/gradio_api/mcp/sse"
        ]
      }
    }
  ]
}

and running npx @huggingface/tiny-agents run ./my-agent

$ npx @huggingface/tiny-agents run ./my-agent
[189887] Using automatically selected callback port: 19650
[189887] [189887] Connecting to remote server: http://localhost:7860/gradio_api/mcp/sse
[189887] Using transport strategy: http-first
[189887] Received error: Error POSTing to endpoint (HTTP 405): Method Not Allowed
[189887] Recursively reconnecting for reason: falling-back-to-alternate-transport
[189887] [189887] Connecting to remote server: http://localhost:7860/gradio_api/mcp/sse
[189887] Using transport strategy: sse-only
[189887] Connected to remote server using SSEClientTransport
[189887] Local STDIO server running
[189887] Proxy established successfully between local STDIO and remote SSEClientTransport
[189887] Press Ctrl+C to exit
[189887] [Local→Remote] initialize
[189887] {
  "jsonrpc": "2.0",
  "id": 0,
  "method": "initialize",
  "params": {
    "protocolVersion": "2025-03-26",
    "capabilities": {},
    "clientInfo": {
      "name": "@huggingface/mcp-client (via mcp-remote 0.1.9)",
      "version": "0.2.0"
    }
  }
}
[189887] [Remote→Local] 0
[189887] [Local→Remote] notifications/initialized
[189887] [Local→Remote] tools/list
[189887] [Remote→Local] 1
Agent loaded with 1 tools:
- predict
> analyse this sentence: I am feeling good today!
<Tool call_5v1bkkdp>
predict {"text":"I am feeling good today!"}
/home/sylvain/node_modules/@huggingface/tiny-agents/dist/cli.js:120
    throw err;
    ^

SyntaxError: Unexpected non-whitespace character after JSON at position 35
    at JSON.parse (<anonymous>)
    at Agent.processSingleTurnWithTools (/home/sylvain/node_modules/@huggingface/mcp-client/dist/src/index.js:233:71)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async Agent.run (/home/sylvain/node_modules/@huggingface/mcp-client/dist/src/index.js:328:9)
    at async mainCliLoop (/home/sylvain/node_modules/@huggingface/tiny-agents/dist/cli.js:131:22)
    at async main (/home/sylvain/node_modules/@huggingface/tiny-agents/dist/cli.js:301:5)

Node.js v20.12.2
[189887] 
Shutting down...

Leveraging the local model doesn't seem to be the issue here, as we see a tool call with what looks like a legitimate call to the MCP tool... but still a JSON.parse error in the end.

sdelahaies commented on Jun 1, 2025

@alexdconf my problem may be related to yours after all. Inspecting .../node_modules/@huggingface/mcp-client/dist/src/index.js and putting console.log calls everywhere to debug, it seems that toolCall.function.arguments on line 233 contains the arguments string twice because of lines 222-224. Commenting out lines 222-224

// if (toolCall.function.arguments) {
//   finalToolCalls[toolCall.index].function.arguments += toolCall.function.arguments;
// }

fixes the parsing issue, as we now have a well-formed JSON string.
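The effect can be reproduced in isolation with a few lines (a sketch with made-up data, not the library's actual code):

```javascript
// Hypothetical sketch of the doubling bug: if the backend sends the complete
// arguments string in a streamed delta *and* again in the final message,
// naive accumulation appends the payload twice.
const finalToolCalls = {
  0: { function: { name: "predict", arguments: "" } },
};

// Two deltas carrying the same full payload, as a local backend might emit.
const deltas = [
  { index: 0, function: { arguments: '{"text":"I am feeling good today!"}' } },
  { index: 0, function: { arguments: '{"text":"I am feeling good today!"}' } },
];

for (const toolCall of deltas) {
  if (toolCall.function.arguments) {
    finalToolCalls[toolCall.index].function.arguments += toolCall.function.arguments;
  }
}

// The accumulated string is the JSON payload concatenated with itself,
// so JSON.parse throws as soon as it hits the second copy.
try {
  JSON.parse(finalToolCalls[0].function.arguments);
} catch (err) {
  console.log("parse failed:", err.message);
}
```

This also matches the reported error: "position 35" in the stack trace above is exactly the length of one copy of the arguments string, i.e. parsing fails at the start of the duplicate.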

I can make it work with the following example

npx @huggingface/tiny-agents run ./get-started

with the following agent config

{
  "model": "llama3.2:3b",
  "endpointUrl": "http://localhost:11434",
  "servers": [
    {
      "type": "http",
      "config": {
        "url": "https://evalstate-hf-mcp-server.hf.space/mcp"
      }
    }
  ]
}

and the prompt find details about MNIST dataset. However, when the LLM passes an argument that should be a number as a string, I get an error. For instance:

> search about llama3 model
<Tool call_igzqggqb>
model_search {"author":"null","library":"null","limit":"100","query":"llama3","sort":"downloads","task":"null"}[
  {
    id: 'call_igzqggqb',
    index: 0,
    type: 'function',
    function: {
      name: 'model_search',
      arguments: '{"author":"null","library":"null","limit":"100","query":"llama3","sort":"downloads","task":"null"}'
    }
  }
]

ERROR: {"author":"null","library":"null","limit":"100","query":"llama3","sort":"downloads","task":"null"}
JSON PARSE string

/home/sylvain/node_modules/@huggingface/tiny-agents/dist/cli.js:120
    throw err;
    ^

McpError: MCP error -32602: MCP error -32602: Invalid arguments for tool model_search: [
  {
    "code": "invalid_type",
    "expected": "number",
    "received": "string",
    "path": [
      "limit"
    ],
    "message": "Expected number, received string"
  }
]
julien-c (Member) commented on Jun 2, 2025

Hi!

I think there are a couple of different issues here, but one way to debug is to feed errors back into the LLM. Let me try to open a PR to do this.
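The idea of feeding errors back can be sketched like this (all names here, such as runToolWithFeedback and callTool, are illustrative, not the actual tiny-agents/mcp-client API):

```javascript
// Hypothetical sketch: instead of letting a failed tool call crash the CLI,
// append the error as a tool message so the model can see it and retry.
async function runToolWithFeedback(callTool, toolCall, messages) {
  try {
    const args = JSON.parse(toolCall.function.arguments);
    const result = await callTool(toolCall.function.name, args);
    messages.push({ role: "tool", tool_call_id: toolCall.id, content: JSON.stringify(result) });
  } catch (err) {
    // Malformed JSON and MCP errors alike go back into the conversation
    // instead of being thrown up to the CLI loop.
    messages.push({ role: "tool", tool_call_id: toolCall.id, content: `Error: ${err.message}` });
  }
  return messages;
}
```

With this shape, both failure modes in this thread (unparseable arguments and the -32602 invalid-arguments error) would surface to the model as a tool response it can react to, rather than terminating the agent.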

julien-c (Member) commented on Jun 2, 2025

In the meantime, try using a larger model! It will generate valid JSON more reliably.

julien-c (Member) commented on Jun 2, 2025

If you are able, can you please give #1511 a try locally?

sdelahaies commented on Jun 2, 2025

Hello, thanks for following up on this. As far as my problem goes, the size of the model doesn't seem to be the issue: I tested with llama3.2:3b and qwen2.5-coder:7b, both models producing legitimate JSON output and both ending with the same error.

The problem came from the doubling of the tool_call arguments, which I worked around by commenting out lines 222-224 of mcp-client/..../index.js.

For the other issue, concerning a tool expecting an argument as a number: I guess that's a common problem when you expose functions as MCP/LLM tools. You'd better treat the inputs as strings no matter what and handle conversion to numbers (or whatever) in the tool's core logic; otherwise you need to enforce type checking before calling the tool.
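The pre-call coercion option could be sketched like this (a hypothetical helper, not part of mcp-client; it coerces string values toward the types declared in the tool's JSON schema, including the "null" strings the model emitted above):

```javascript
// Hypothetical sketch: coerce string-typed argument values to the types the
// tool's JSON schema expects, so '"100"' becomes 100 and '"null"' becomes null.
function coerceArguments(args, schema) {
  const out = {};
  for (const [key, value] of Object.entries(args)) {
    const expected = schema.properties?.[key]?.type;
    if (typeof value === "string") {
      if (value === "null") {
        out[key] = null;
        continue;
      }
      if (expected === "number" || expected === "integer") {
        const n = Number(value);
        if (!Number.isNaN(n)) {
          out[key] = n;
          continue;
        }
      }
      if (expected === "boolean" && (value === "true" || value === "false")) {
        out[key] = value === "true";
        continue;
      }
    }
    out[key] = value; // leave everything else untouched
  }
  return out;
}

// Example shaped like the failing model_search call above:
const raw = JSON.parse('{"author":"null","limit":"100","query":"llama3"}');
const schema = {
  properties: {
    author: { type: "string" },
    limit: { type: "number" },
    query: { type: "string" },
  },
};
console.log(coerceArguments(raw, schema)); // limit is now the number 100, author is null
```

This is only a best-effort cleanup of model output; a stricter alternative is to validate against the schema and feed the validation error back to the model.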

I can give #1511 a try, although I am not quite sure how, since as soon as the error strikes, the tiny-agent exits.

julien-c (Member) commented on Jun 3, 2025

We'll push a new release with #1511 inside, so it's easier to test locally (otherwise you have to build locally using pnpm).

julien-c (Member) commented on Jun 3, 2025

ok ok @sdelahaies i can now reproduce your error. Let me see how to best fix it

julien-c (Member) commented on Jun 3, 2025

Ok, how does this look @sdelahaies? #1512

sdelahaies commented on Jun 3, 2025

Hello, it does indeed solve the issue. Great, thanks!

added a commit (49d93f2) that references this issue on Jun 4, 2025
julien-c (Member) commented on Jun 4, 2025

thanks a lot for the debugging info + resolution proposal @sdelahaies, it's been merged

added a commit (76770d7) that references this issue on Jun 16, 2025