
Add Ollama conversation agent #113962

Merged (10 commits into dev, Mar 26, 2024)
Conversation

synesthesiam
Contributor

Proposed change

Adds a new conversation agent integration for Ollama, a local large language model server.

Type of change

  • Dependency upgrade
  • Bugfix (non-breaking change which fixes an issue)
  • New integration (thank you!)
  • New feature (which adds functionality to an existing integration)
  • Deprecation (breaking change to happen in the future)
  • Breaking change (fix/feature causing existing functionality to break)
  • Code quality improvements to existing code or addition of tests

Additional information

Checklist

  • The code change is tested and works locally.
  • Local tests pass. Your PR cannot be merged unless tests pass.
  • There is no commented out code in this PR.
  • I have followed the development checklist
  • I have followed the perfect PR recommendations
  • The code has been formatted using Ruff (ruff format homeassistant tests)
  • Tests have been added to verify that the new code works.

If user exposed functionality or configuration variables are added/changed:

If the code communicates with devices, web services, or third-party tools:

  • The manifest file has all fields filled out correctly.
    Updated and included derived files by running: python3 -m script.hassfest.
  • New or updated dependencies have been added to requirements_all.txt.
    Updated by running python3 -m script.gen_requirements_all.
  • For the updated dependencies - a link to the changelog, or at minimum a diff between library versions is added to the PR description.
  • Untested files have been added to .coveragerc.

To help with the load of incoming pull requests:

): TemplateSelector(),
vol.Optional(CONF_MODEL_OPTIONS): ObjectSelector(),
vol.Optional(
    CONF_MAX_HISTORY, description={"suggested_value": MAX_HISTORY_NO_LIMIT}
Member

Why do we offer this to users?

Member

Also, don't ask for this on initial creation, maybe even never.

Contributor Author

Unlike OpenAI, context window size is a factor you have to take into account locally.
Models like phi become almost useless with too much in the context, I've found.

Member

But how will a user know?

Contributor Author

Looks like they can't currently: ollama/ollama-python#84

Contributor Author

The limit has been set to 20 by default.
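One plausible way to enforce such a history limit is to keep the system prompt and only the most recent messages. This is a sketch with a hypothetical trim_history helper, not the integration's exact logic:

```python
def trim_history(messages: list[dict], max_history: int) -> list[dict]:
    """Keep the system prompt plus at most the last max_history messages.

    A value of 0 or less means no limit (MAX_HISTORY_NO_LIMIT in this sketch).
    """
    if max_history <= 0:
        return messages
    system = [m for m in messages if m["role"] == "system"]
    rest = [m for m in messages if m["role"] != "system"]
    return system + rest[-max_history:]


history = [{"role": "system", "content": "You are a voice assistant."}]
history += [{"role": "user", "content": f"message {i}"} for i in range(30)]
trimmed = trim_history(history, 20)  # the default limit of 20
```

Trimming before each request keeps small local models like phi inside a context window they can actually handle.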

@synesthesiam
Contributor Author

Using a fork of the Ollama Python library until the httpx constraint can be loosened in the official version.

@synesthesiam force-pushed the synesthesiam-20240320-ollama-conversation branch from 68ba35f to d7ca5a6 (March 25, 2024 16:38)
intent_response = intent.IntentResponse(language=user_input.language)
intent_response.async_set_error(
    intent.IntentResponseErrorCode.UNKNOWN,
    f"Sorry, I had a problem generating my prompt: {err}",
)
Member

Not for this PR, but it would be interesting to explore whether we can have an error message that is for humans, and a separate error message with technical details.

Do you want your voice assistant to say: "Cannot concatenate None + Str" ?

Contributor Author

My Google TV brilliantly tells me "Something's not right" and then continues to work just fine 🤖
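The idea of pairing a human-facing message with technical details could be sketched with a custom exception type. The names here are hypothetical and this is not Home Assistant's actual error API:

```python
class ConversationError(Exception):
    """Carries a message to speak to the user plus a technical detail to log."""

    def __init__(self, user_message: str, technical_detail: str) -> None:
        super().__init__(technical_detail)
        self.user_message = user_message


def spoken_message(err: Exception) -> str:
    """Return what the voice assistant should say for this error."""
    if isinstance(err, ConversationError):
        return err.user_message
    # Never read raw tracebacks aloud; technical details belong in the log.
    return "Sorry, something went wrong."
```

The handler would log str(err) (the technical detail) while the assistant speaks only err.user_message, avoiding the "Cannot concatenate None + Str" scenario above.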

CODEOWNERS review comment (outdated, resolved)
Comment on lines 235 to 250
if entity is not None:
    # Add aliases
    names.extend(entity.aliases)
    if entity.area_id and (
        area := area_registry.async_get_area(entity.area_id)
    ):
        # Entity is in area
        area_names.append(area.name)
        area_names.extend(area.aliases)
    elif entity.device_id and (
        device := device_registry.async_get(entity.device_id)
    ):
        # Check device area
        if device.area_id and (
            area := area_registry.async_get_area(device.area_id)
        ):
Member

This can become 1 big if-statement no?

Contributor Author

How so? If the entity doesn't have an associated area or device, we still need the aliases. The area may be on the entity or the device too, but we don't know which upfront.

    _LOGGER.exception("Unexpected exception")
    errors["base"] = "unknown"
else:
    return self.async_create_entry(title="Ollama", data=user_input)
Member

Idea: if a user configures a second Ollama instance, maybe the title can be set to the URL?

Contributor Author

I set the title using the model, which is now configured once during the initial setup (and not in the options flow).

@synesthesiam force-pushed the synesthesiam-20240320-ollama-conversation branch from 02f8926 to d12db9f (March 26, 2024 20:09)
@synesthesiam synesthesiam merged commit 72fed87 into dev Mar 26, 2024
6 checks passed
@synesthesiam synesthesiam deleted the synesthesiam-20240320-ollama-conversation branch March 26, 2024 21:15
@synesthesiam synesthesiam mentioned this pull request Mar 26, 2024
13 tasks
@github-actions github-actions bot locked and limited conversation to collaborators Mar 27, 2024
2 participants