Size of home assistant backup #24
How many entities does your BACnet integration create? If each entity is recording data, this increases the database size, and thus the backup size as well. If you prefer smaller backups, consider adding exclusions in the recorder integration.
Do you have a specific retention period set in the recorder integration, i.e. how long Home Assistant should keep recorded data?
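For reference, both suggestions can be combined in `configuration.yaml` via the recorder integration. This is only a sketch: the entity glob below is a placeholder, not taken from this thread, and should be adjusted to the actual BACnet entity names.

```yaml
recorder:
  # Keep only 7 days of history instead of the default 10
  purge_keep_days: 7
  exclude:
    entity_globs:
      # Placeholder pattern: replace with your own BACnet entity IDs
      - sensor.bacnet_*
```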
No, I haven't done any configuration in the recorder integration. I will update you next week about the backup size. Thanks!
How about its size now? Did it stabilize, or is it still growing?
Hello, the size of the backups is stable at 130 MB now. However, I've noticed a gradual increase in the amount of RAM being used: at system startup, the occupied RAM is about 950 MB, and it grows by approximately 150 MB per day. It reached a maximum of almost 2 GB before I had to restart for an update. With 8 GB available, I'm not sure whether this gradual increase is normal up to some limit, or whether it is directly related to this integration. Can I ask for your opinion?
Is this the total system RAM usage? I'd say that's normal, as all the RAM that's not used is wasted RAM.
I'll close this issue now since it seems solved. |
Hi, I've noticed a significant increase in the size of the Home Assistant backup images since I installed the Bepacom BACnet/IP Interface. On average, the backup file grows by 12 MB every day. Today it's almost 100 MB, with more than half of that coming from the file 97683af0_bacnetinterface_dev.tar.gz. Currently, I'm using version 1.4.1b6.
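One way to see which entries dominate a backup archive is to list its contents sorted by size. The path below is an example, not from this thread; point it at your actual backup file.

```shell
# List the contents of a Home Assistant backup tar, largest entries first.
# /backup/backup.tar is a placeholder path; adjust it to your backup file.
tar -tvf /backup/backup.tar | sort -k3 -rn | head
```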
Here's my configuration: