Publish Home Assistant events to your Elasticsearch cluster!
- Features
- Inspiration
- Compatibility
- Getting Started
- Installation
- Create Elasticsearch Credentials
- Setup
- Configuration options
- Using Home Assistant data in Kibana
- Defining your own Index Mappings, Settings, and Ingest Pipeline
- Create your own cluster health sensor
- Support
- Contributing
- Efficiently publishes Home-Assistant events to Elasticsearch using the Bulk API
- Automatically sets up Datastreams using Time Series Data Streams ("TSDS"), Datastream Lifecycle Management ("DLM"), or Index Lifecycle Management ("ILM") depending on your cluster's capabilities
- Supports Elastic's stack security features via optional username, password, and API keys
- Selectively publish events based on domains or entities
Graph your home's climate and HVAC Usage:
Visualize and alert on data from your weather station:
Some usage examples inspired by real users:
- Utilizing a Raspberry Pi in kiosk mode with a 15" display, the homeassistant-elasticsearch integration enables the creation of rotating fullscreen Elasticsearch Canvas dashboards. These canvases display metrics collected from various Home Assistant integrations, offering visually dynamic and informative dashboards for monitoring smart home data.
- To address temperature maintenance issues in refrigerators and freezers, temperature sensors in each appliance report data to Home Assistant, which is then published to Elasticsearch. Kibana's alerting framework is employed to set up rules that notify the user if temperatures deviate unfavorably for an extended period. The Elastic rule engine and aggregations simplify the monitoring process for this specific use case.
- Monitoring the humidity and temperature in a snake enclosure/habitat for a user's daughter, the integration facilitates the use of Elastic's Alerting framework. This choice is motivated by the framework's suitability for the monitoring requirements, providing a more intuitive solution compared to Home Assistant automations.
- The integration allows users to maintain a smaller subset of data, focusing on individual stats of interest, for an extended period. This capability contrasts with the limited retention achievable with Home Assistant and databases like MariaDB/MySQL. This extended data retention facilitates very long-term trend analysis, such as for weather data, enabling users to glean insights over an extended timeframe.
- Elasticsearch 8.0+, 7.11+ (Self or Cloud hosted). Version `0.4.0` includes support for older versions of Elasticsearch.
- Elastic Common Schema version 1.0.0
- Home Assistant Community Store
- Home Assistant 2024.1
The following table covers the Elasticsearch functionality used by the integration when configured against various versions of Elasticsearch:
Elasticsearch Version | Datastreams | Time Series Datastreams | Datastream Lifecycle Management | Index Lifecycle Management |
---|---|---|---|---|
7.11.0 - 7.12.0 | Supported | | | Partially Supported [See Note] |
7.13.0 - 7.17.0 | Supported | | | Supported |
8.0.0 - 8.6.0 | Supported | | | Supported |
8.7.0 - 8.10.0 | | Supported | | Supported |
8.11.0+ | | Supported | Supported | |
Note: Index Lifecycle Management is partially supported in versions 7.11.0 - 7.12.0. The integration will create an ILM policy that performs time-based rollover but does not support shard-size-based rollover.
The Elasticsearch component requires, well, Elasticsearch! This component will not host or configure Elasticsearch for you, but there are many ways to run your own cluster. Elasticsearch is open source and free to use: just bring your own hardware! Elastic has a great setup guide if you need help getting your first cluster up and running.
If you don't want to maintain your own cluster, then give the Elastic Cloud a try! There is a free trial available to get you started.
This component is available via the Home Assistant Community Store (HACS) in their default repository. Visit https://hacs.xyz/ for more information on HACS.
Alternatively, you can manually install this component by copying the contents of `custom_components` to your `$HASS_CONFIG/custom_components` directory, where `$HASS_CONFIG` is the location on your machine where Home Assistant lives. Example: `/home/pi/.homeassistant` and `/home/pi/.homeassistant/custom_components`. You may have to create the `custom_components` directory yourself.
The integration supports authenticating via API key, username and password, or unauthenticated access.

You must first create an API key with the appropriate privileges. Note that if you adjust the `index_format` or `alias` settings, the role definition must be updated accordingly:
```
POST /_security/api_key
{
  "name": "home_assistant_component",
  "role_descriptors": {
    "hass_writer": {
      "cluster": [
        "manage_index_templates",
        "manage_ilm",
        "monitor"
      ],
      "indices": [
        {
          "names": [
            "metrics-homeassistant*"
          ],
          "privileges": [
            "manage",
            "index",
            "create_index",
            "create"
          ]
        }
      ]
    }
  }
}
```
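The response to this request contains an `id` and an `api_key`. When a client needs to authenticate manually, these two values are combined into an `Authorization: ApiKey` header by base64-encoding `id:api_key`. A quick sketch (the key values below are placeholders, not real credentials):

```python
import base64

def api_key_header(key_id: str, api_key: str) -> str:
    """Build the Authorization header value from an Elasticsearch API key response."""
    token = base64.b64encode(f"{key_id}:{api_key}".encode("utf-8")).decode("ascii")
    return f"ApiKey {token}"

# Placeholder values -- substitute the id/api_key returned by the request above.
print(api_key_header("example-id", "example-key"))
```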
If you choose not to authenticate via an API Key, you need to create a user and role with appropriate privileges.
```
# Create role
POST /_security/role/hass_writer
{
  "cluster": [
    "manage_index_templates",
    "manage_ilm",
    "monitor"
  ],
  "indices": [
    {
      "names": [
        "metrics-homeassistant*"
      ],
      "privileges": [
        "manage",
        "index",
        "create_index",
        "create"
      ]
    }
  ]
}

# Create user
POST /_security/user/hass_writer
{
  "full_name": "Home Assistant Writer",
  "password": "changeme",
  "roles": ["hass_writer"]
}
```
This component is configured interactively via Home Assistant's integration configuration page.

- Restart Home Assistant once you've completed the installation instructions above.
- From the `Integrations` configuration menu, add a new `Elasticsearch` integration.
- Select the appropriate authentication method.
- Provide connection information and, optionally, credentials to begin setup.
- Once the integration is set up, you may tweak all settings via the "Options" button on the integrations page.
You can choose to include/exclude entities based on their domain or entity id. This allows you to publish only the entities you are interested in to Elasticsearch. By default, all entities and domains are included. You can combine inclusion and exclusion filters to fine-tune the entities you want to publish. The following flowchart describes the logic used to determine if an entity is published:
```mermaid
flowchart LR
    A[Entity] --> B{Is entity excluded?}
    B -->|Yes| Z{Do not publish}
    B -->|No| C{Is entity or domain included?}
    C -->|Yes| Y{Publish}
    C -->|No| D{Is domain excluded?}
    D -->|Yes| Z
    D -->|No| Y
```
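The flowchart above can be sketched in Python (a simplified model of the filtering logic, not the integration's actual implementation):

```python
def should_publish(entity_id: str, domain: str,
                   included_entities: set, excluded_entities: set,
                   included_domains: set, excluded_domains: set) -> bool:
    """Model of the include/exclude decision flowchart."""
    if entity_id in excluded_entities:
        return False  # excluded entities are never published
    if entity_id in included_entities or domain in included_domains:
        return True   # explicit inclusion wins over a domain exclusion
    if domain in excluded_domains:
        return False
    return True       # by default, all entities and domains are included
```

For example, excluding the `sensor` domain while including `sensor.kitchen` still publishes `sensor.kitchen`, because entity-level inclusion is checked before domain-level exclusion.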
There are three modes to publish data to Elasticsearch:

- `All` - Publish configured entities to Elasticsearch, including those which did not undergo a state or attribute change.
- `State changes` - Publish configured entities to Elasticsearch only when their state changes.
- `Any changes` - Publish configured entities to Elasticsearch when their state or attributes change.
Publish Mode | State Change | Attribute Change | No Change |
---|---|---|---|
All | ✅ Publishes | ✅ Publishes | ✅ Publishes |
Any Changes | ✅ Publishes | ✅ Publishes | 🚫 Does not publish |
State Changes | ✅ Publishes | 🚫 Does not publish | 🚫 Does not publish |
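The table above reduces to a small decision function; a sketch (mode names as shown in the table, not necessarily the integration's internal identifiers):

```python
def publishes(mode: str, state_changed: bool, attributes_changed: bool) -> bool:
    """Model of the publish-mode table above."""
    if mode == "All":
        return True  # publishes even when nothing changed
    if mode == "Any changes":
        return state_changed or attributes_changed
    if mode == "State changes":
        return state_changed  # attribute-only changes are not published
    raise ValueError(f"unknown publish mode: {mode}")
```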
The integration will put data into Elasticsearch under `metrics-homeassistant.*`. To explore your data or create visualizations and dashboards in Kibana, you first need to create a Data View. Following the instructions in the Kibana documentation, create a Data View called `Homeassistant Metrics`; for the index pattern, use `metrics-homeassistant.*`.

Once you have created a Data View, you can start exploring your Home Assistant data in Kibana using `Discover`:
- In Kibana select `Discover`
- Select the `Homeassistant Metrics` Data View at the top left
- You can now see all the Home Assistant data that has been published to Elasticsearch
- You can filter the data using the filter bar at the top
- You can pull specific fields into the document table at the bottom by clicking on the `+` icon next to a field
- You can change the time range of the data you are viewing using the time picker in the top right
When creating new visualizations you may find the following fields useful:

- `@timestamp` - The timestamp of the event (ex. `Apr 10, 2024 @ 16:23:25.878`)
- `hass.entity.attributes.friendly_name` - The name of the entity in Home Assistant (ex. `Living Room EcoBee Temperature`)
- `hass.entity.device.area.name` - The area of the device in Home Assistant (ex. `Living Room`)
- `hass.entity.id` - The entity id of the entity in Home Assistant (ex. `sensor.living_room_ecobee_temperature`)
- `hass.entity.value` - The state of the entity in Home Assistant (ex. `72.5`), as a string-typed value
- `hass.entity.valueas.integer` - The state of the entity in Home Assistant (ex. `72`), as an integer-typed value
- `hass.entity.valueas.float` - The state of the entity in Home Assistant (ex. `72.5`), as a float-typed value
- `hass.entity.valueas.boolean` - The state of the entity in Home Assistant (ex. `true`), as a boolean-typed value
- `hass.entity.valueas.date` - The state of the entity in Home Assistant (ex. `2024-04-10`), as a date-typed value
- `hass.entity.valueas.datetime` - The state of the entity in Home Assistant (ex. `2024-04-10T16:23:25.878`), as a datetime-typed value
- `hass.entity.valueas.time` - The state of the entity in Home Assistant (ex. `16:23:25.878`), as a time-typed value
To build a visualization that shows the temperature of a specific entity over time, you can use the following steps:

- In Kibana select `Visualizations` and create a new Lens visualization
- Select `Homeassistant Metrics`
- For the `Horizontal axis` select `@timestamp`
- For the `Vertical axis` select `hass.entity.valueas.float`
- In the filter bar at the top, add a filter for `hass.entity.id` and set the value to the entity id of the entity you want to visualize (ex. `sensor.living_room_ecobee_temperature`), or `hass.entity.attributes.friendly_name` and set the value to the friendly name of the entity you want to visualize (ex. `Living Room EcoBee Temperature`)
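Equivalently, you can type the filter directly into the search bar as a KQL query, using the example entity id above:

```
hass.entity.id : "sensor.living_room_ecobee_temperature"
```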
You can customize the mappings and settings, and define an ingest pipeline, by creating a component template called `metrics-homeassistant@custom`. The following example shows how to push your Home Assistant metrics into an ingest pipeline called `metrics-homeassistant-pipeline`:
```
PUT _ingest/pipeline/metrics-homeassistant-pipeline
{
  "description": "Pipeline for HomeAssistant dataset",
  "processors": [ ]
}

PUT _component_template/metrics-homeassistant@custom
{
  "template": {
    "mappings": {},
    "settings": {
      "index.default_pipeline": "metrics-homeassistant-pipeline"
    }
  }
}
```
Component template changes apply when the datastream performs a rollover, so the first time you modify the template you may need to manually initiate a rollover for the pipeline to take effect.
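For example, a rollover can be triggered from the Kibana Dev Tools console. The datastream name below is illustrative; use `GET _data_stream/metrics-homeassistant*` to list the actual datastreams on your cluster:

```
POST metrics-homeassistant.sensor-default/_rollover
```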
Versions prior to `0.6.0` included a cluster health sensor. This has been removed in favor of a more generic approach: you can create your own cluster health sensor using Home Assistant's built-in REST sensor.
```yaml
# Example configuration
sensor:
  - platform: rest
    name: "Cluster Health"
    unique_id: "cluster_health" # Replace with your own unique id. See https://www.home-assistant.io/integrations/sensor.rest#unique_id
    resource: "https://example.com/_cluster/health" # Replace with your Elasticsearch URL
    username: hass # Replace with your username
    password: changeme # Replace with your password
    value_template: "{{ value_json.status }}"
    json_attributes: # Optional attributes you may want to include from the /_cluster/health API response
      - "cluster_name"
      - "status"
      - "timed_out"
      - "number_of_nodes"
      - "number_of_data_nodes"
      - "active_primary_shards"
      - "active_shards"
      - "relocating_shards"
      - "initializing_shards"
      - "unassigned_shards"
      - "delayed_unassigned_shards"
      - "number_of_pending_tasks"
      - "number_of_in_flight_fetch"
      - "task_max_waiting_in_queue_millis"
      - "active_shards_percent_as_number"
```
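If you want to graph or alert on cluster health numerically, a template sensor can map the status string to a number. A sketch, assuming the REST sensor above produces `sensor.cluster_health` (the sensor name and scoring are illustrative):

```yaml
template:
  - sensor:
      - name: "Cluster Health Score"
        # green -> 2, yellow -> 1, red -> 0, anything else (e.g. unavailable) -> -1
        state: >-
          {{ {'green': 2, 'yellow': 1, 'red': 0}.get(states('sensor.cluster_health'), -1) }}
```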
This project is not endorsed or supported by either Elastic or Home-Assistant - please open a GitHub issue for any questions, bugs, or feature requests.
Contributions are welcome! Please see the Contributing Guide for more information.