Update notebooks with Hass.io details #8
@fabaff once the add-on by @frenck is ready, I will happily update the notebooks. I've also got a package which we might demo: https://github.com/robmarkcole/HASS-data-detective
HASS Data Detective has been updated; it can now:
This makes it significantly easier to share notebooks and get people started, so I want to make it a requirement that every notebook in the root of this repo uses HASS Data Detective. On how this fits into the bigger picture: the goal is to launch https://data.home-assistant.io (repo) around Dec 14. The new site will guide users to install Frenck's new Jupyter Lab Lite add-on and explore their own data using this repo. The add-on will have this repo pre-installed, so users should be able to open each notebook and run it against their own data. @robmarkcole is also working on a simplified Getting Started notebook that limits itself to showing information about the instance without any configuration by the user; just pressing Run All should work.
The Jupyter Lab Lite add-on is already available in the Community Hass.io add-ons edge repository: https://github.com/hassio-addons/repository-edge
The notebooks were written to be used directly beside a Home Assistant installation. With Hass.io around for quite some time now, I expect people will start to use the notebooks as well. To make their lives easier, additional details for Hass.io should be added, e.g. the path to the database and the like.
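For reference, a minimal sketch of where the recorder database typically lives and how to build a SQLAlchemy-style URL for it. The Hass.io add-on config directory `/config` (so `/config/home-assistant_v2.db`) is the usual default; the `hass_db_url` helper and the manual-install path are illustrative assumptions, not part of any package:

```python
from pathlib import Path

# Typical default recorder database locations (assumptions):
# - Hass.io / add-on:   /config/home-assistant_v2.db
# - Manual install:     ~/.homeassistant/home-assistant_v2.db

def hass_db_url(config_dir: str = "/config") -> str:
    """Build a SQLAlchemy URL for the default SQLite recorder database.

    SQLite URLs use three slashes plus the absolute path,
    which yields four slashes in total for a root-anchored path.
    """
    db_path = Path(config_dir) / "home-assistant_v2.db"
    return f"sqlite:///{db_path}"

print(hass_db_url())                      # Hass.io default
print(hass_db_url("/home/pi/.homeassistant"))  # manual install example
```

A URL like this could then be handed to HASS Data Detective or any SQLAlchemy consumer; documenting the Hass.io path in each notebook would spare users hunting for it.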
@frenck is also working on a Hass.io add-on for JupyterLab, which will boost usage even more.