Add Ollama conversation agent #113962
Conversation
```python
    ): TemplateSelector(),
    vol.Optional(CONF_MODEL_OPTIONS): ObjectSelector(),
    vol.Optional(
        CONF_MAX_HISTORY, description={"suggested_value": MAX_HISTORY_NO_LIMIT}
```
Why do we offer this to users?
Also not ask on initial creation, maybe even never.
Unlike OpenAI, context window size is a factor you have to take into account locally.
Models like phi become almost useless with too much in the context, I've found.
But how will a user know?
Looks like they can't currently: ollama/ollama-python#84
The limit has been set to 20 by default.
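A default cap like this could be applied by trimming the conversation before sending it to the model. The sketch below is illustrative, not the integration's actual code; the constant name, the helper, and the message-dict shape are assumptions:

```python
MAX_HISTORY = 20  # default limit discussed above (illustrative constant)


def trim_history(messages: list[dict], max_history: int = MAX_HISTORY) -> list[dict]:
    """Keep the system prompt plus the most recent conversation messages."""
    if max_history <= 0 or len(messages) <= 1:
        return messages
    system, rest = messages[0], messages[1:]
    # Retain only the last `max_history` user/assistant messages so the
    # context window of a small local model is not exhausted.
    return [system] + rest[-max_history:]


history = [{"role": "system", "content": "prompt"}] + [
    {"role": "user", "content": f"msg {i}"} for i in range(30)
]
trimmed = trim_history(history)
```

This keeps the system prompt pinned while dropping the oldest turns first, which matches the concern above about small models like phi degrading with an overfull context.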
Using a fork of the Ollama Python library until the httpx constraint can be loosened in the official version.
(branch updated: 68ba35f to d7ca5a6)
```python
intent_response = intent.IntentResponse(language=user_input.language)
intent_response.async_set_error(
    intent.IntentResponseErrorCode.UNKNOWN,
    f"Sorry, I had a problem generating my prompt: {err}",
```
Not for this PR, but it would be interesting to explore whether we can have one error message meant for humans and a separate one with the technical details.
Do you want your voice assistant to say: "Cannot concatenate None + Str"?
My Google TV brilliantly tells me "Something's not right" and then continues to work just fine 🤖
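One possible shape of the idea above: log the technical details, speak a generic message. This is a hedged sketch, not Home Assistant's actual API; the `respond_error` helper and its wiring are hypothetical:

```python
import logging

_LOGGER = logging.getLogger(__name__)


def respond_error(err: Exception) -> str:
    """Log technical details, return a human-friendly message (hypothetical helper)."""
    # The full traceback goes to the log for debugging.
    _LOGGER.exception("Problem generating prompt: %s", err)
    # The voice assistant only speaks a generic, human-readable message.
    return "Sorry, I had a problem answering that. Check the logs for details."


try:
    raise TypeError("can only concatenate str (not 'NoneType') to str")
except TypeError as exc:
    spoken = respond_error(exc)
```

The spoken string never contains the exception text, so the assistant stays polite while the log keeps everything a developer needs.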
```python
if entity is not None:
    # Add aliases
    names.extend(entity.aliases)
    if entity.area_id and (
        area := area_registry.async_get_area(entity.area_id)
    ):
        # Entity is in area
        area_names.append(area.name)
        area_names.extend(area.aliases)
    elif entity.device_id and (
        device := device_registry.async_get(entity.device_id)
    ):
        # Check device area
        if device.area_id and (
            area := area_registry.async_get_area(device.area_id)
        ):
```
Couldn't this become one big if-statement?
How so? If the entity doesn't have an associated area or device, we still need the aliases. The area may be on the entity or the device too, but we don't know which upfront.
```python
    _LOGGER.exception("Unexpected exception")
    errors["base"] = "unknown"
else:
    return self.async_create_entry(title="Ollama", data=user_input)
```
Idea: if a user configures a second Ollama instance, maybe the title can be set to the URL?
I set the title using the model, which is now configured once during the initial setup (and not in the options flow).
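Deriving the entry title from the configured model, so a second instance is distinguishable, could look like the following. This is a hypothetical sketch; the key names and the helper are assumptions, not the PR's actual config-flow code:

```python
def create_entry_kwargs(user_input: dict) -> dict:
    """Build config-entry kwargs, titling the entry after the chosen model."""
    # Fall back to the integration name if no model was picked (illustrative).
    title = user_input.get("model") or "Ollama"
    return {"title": title, "data": user_input}


kwargs = create_entry_kwargs({"model": "llama2:7b", "url": "http://localhost:11434"})
```

With the model fixed at setup time, two entries for different models get distinct titles automatically.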
(branch updated: 02f8926 to d12db9f)
Proposed change
Adds a new conversation agent integration for Ollama, a local large language model server.
Type of change
Additional information
Checklist
- The code has been formatted using Ruff (`ruff format homeassistant tests`)

If user exposed functionality or configuration variables are added/changed:

If the code communicates with devices, web services, or third-party tools:
- Updated and included derived files by running: `python3 -m script.hassfest`.
- New dependencies have been added to `requirements_all.txt`, updated by running `python3 -m script.gen_requirements_all`.
- Untested files have been added to `.coveragerc`.

To help with the load of incoming pull requests: