I got Home Assistant to connect to LocalAI via the Ollama integration, and that part is working. But when I try to chat, I get the error "Unexpected error during intent recognition".
Here are the logs from Home Assistant; the LocalAI logs are posted in the screenshot below:
Logger: homeassistant.components.assist_pipeline.pipeline
Source: components/assist_pipeline/pipeline.py:1286
Integration: Assist pipeline (documentation, issues)
First occurred: 4:32:55 AM (7 occurrences)
Last logged: 4:42:38 AM
Unexpected error during intent recognition
Traceback (most recent call last):
File "/usr/src/homeassistant/homeassistant/components/assist_pipeline/pipeline.py", line 1286, in recognize_intent
conversation_result = await conversation.async_converse(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
...<9 lines>...
)
^
File "/usr/src/homeassistant/homeassistant/components/conversation/agent_manager.py", line 129, in async_converse
result = await method(conversation_input)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/homeassistant/homeassistant/components/conversation/entity.py", line 55, in internal_async_process
return await self.async_process(user_input)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/homeassistant/homeassistant/components/conversation/entity.py", line 68, in async_process
return await self._async_handle_message(user_input, chat_log)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/homeassistant/homeassistant/components/ollama/conversation.py", line 84, in _async_handle_message
await self._async_handle_chat_log(chat_log)
File "/usr/src/homeassistant/homeassistant/components/ollama/entity.py", line 259, in _async_handle_chat_log
[
...<4 lines>...
]
File "/usr/src/homeassistant/homeassistant/components/conversation/chat_log.py", line 507, in async_add_delta_content_stream
async for delta in stream:
...<91 lines>...
)
File "/usr/src/homeassistant/homeassistant/components/ollama/entity.py", line 155, in _transform_stream
async for response in result:
...<20 lines>...
yield chunk
File "/usr/local/lib/python3.14/site-packages/ollama/_client.py", line 757, in inner
"""
ollama._types.ResponseError: {'code': 400, 'message': 'code=400, message=failed parsing request body: code=400, message=Unmarshal type error: expected=int, got=number 8192.0, field=options.num_ctx, offset=70, internal=json: cannot unmarshal number 8192.0 into Go struct field OllamaOptions.options.num_ctx of type int', 'type': ''} (status code: 400)
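The first traceback shows LocalAI's Ollama-compatible endpoint rejecting the request body: `options.num_ctx` arrives as the JSON number `8192.0`, and Go's `encoding/json` decoder will not unmarshal a fractional literal into an `int` struct field. A minimal sketch of the mismatch on the Python side (the dict shape mirrors the error message; coercing to `int` before serializing is my assumption about the fix, not a confirmed patch):

```python
import json

# If the context-window size is held as a float, json.dumps emits "8192.0",
# which a Go int field (OllamaOptions.options.num_ctx) cannot accept.
options_float = {"num_ctx": 8192.0}
options_int = {"num_ctx": int(8192.0)}  # coerce to int before serializing

print(json.dumps(options_float))  # {"num_ctx": 8192.0} -- rejected with the 400 above
print(json.dumps(options_int))    # {"num_ctx": 8192}   -- a plain integer literal
```

Real Ollama tolerates this payload, so the strictness here appears to be on LocalAI's compatibility layer; checking how the configured context size is stored (or whether LocalAI has a newer build that relaxes the decoder) would be where I'd look first.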
Logger: homeassistant.components.assist_pipeline.pipeline
Source: components/assist_pipeline/pipeline.py:1286
Integration: Assist pipeline (documentation, issues)
First occurred: 4:43:32 AM (1 occurrence)
Last logged: 4:43:32 AM
Unexpected error during intent recognition
Traceback (most recent call last):
File "/usr/src/homeassistant/homeassistant/components/assist_pipeline/pipeline.py", line 1286, in recognize_intent
conversation_result = await conversation.async_converse(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
...<9 lines>...
)
^
File "/usr/src/homeassistant/homeassistant/components/conversation/agent_manager.py", line 129, in async_converse
result = await method(conversation_input)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/homeassistant/homeassistant/components/conversation/entity.py", line 55, in internal_async_process
return await self.async_process(user_input)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/homeassistant/homeassistant/components/conversation/entity.py", line 68, in async_process
return await self._async_handle_message(user_input, chat_log)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/homeassistant/homeassistant/components/ollama/conversation.py", line 84, in _async_handle_message
await self._async_handle_chat_log(chat_log)
File "/usr/src/homeassistant/homeassistant/components/ollama/entity.py", line 215, in _async_handle_chat_log
_format_tool(tool, chat_log.llm_api.custom_serializer)
~~~~~~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/src/homeassistant/homeassistant/components/ollama/entity.py", line 47, in _format_tool
"parameters": convert(tool.parameters, custom_serializer=custom_serializer),
~~~~~~~^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.14/site-packages/voluptuous_openapi/__init__.py", line 102, in convert
pval = convert_with_args(value)
File "/usr/local/lib/python3.14/site-packages/voluptuous_openapi/__init__.py", line 53, in convert_with_args
return convert(
schema, custom_serializer=custom_serializer, openapi_version=openapi_version
)
File "/usr/local/lib/python3.14/site-packages/voluptuous_openapi/__init__.py", line 392, in convert
if schema in TYPES_MAP:
^^^^^^^^^^^^^^^^^^^
TypeError: cannot use 'homeassistant.helpers.selector.SelectSelector' as a dict key (unhashable type: 'SelectSelector')
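The second traceback is a different failure: `voluptuous_openapi.convert` performs a dict-membership test (`schema in TYPES_MAP`), which must hash the schema object first, and Home Assistant's `SelectSelector` is unhashable. A minimal sketch of why that check raises `TypeError` (the class below is a stand-in for illustration, not the real selector):

```python
# Simplified stand-in for voluptuous_openapi's type lookup table.
TYPES_MAP = {int: "integer", str: "string"}

class UnhashableSelector:
    """Stand-in: defining __eq__ while setting __hash__ = None makes instances unhashable."""
    def __eq__(self, other):
        return isinstance(other, UnhashableSelector)
    __hash__ = None

schema = UnhashableSelector()
try:
    schema in TYPES_MAP  # dict membership hashes the key first
except TypeError as exc:
    print(exc)  # unhashable type: 'UnhashableSelector'
```

Since this error occurs while formatting the exposed tools' parameter schemas, it likely depends on which entities/intents are exposed to the agent rather than on LocalAI itself; it looks like a library-side bug worth reporting against Home Assistant or voluptuous_openapi.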
LocalAI Logs: (attached as a screenshot below)