Update MerlinAU.sh (Better memory management)#31
Conversation
Tested this pretty much through and through; I'm confident this is a better way to handle memory.
Apologies for not responding earlier. I was busy this weekend; we spent Saturday & Sunday at my brother's house for our annual family gathering (yeah, we do it sometime before Christmas to avoid all the traveling/traffic nightmares, and then afterward everyone is free to spend the 24th & 31st wherever they prefer - sometimes with the in-laws :>). We also have 2 birthdays this month, so we celebrate them during the same family gathering. Anyway, I have not yet reviewed the code, but based on your description it looks like the right solution. I'll take a look sometime later. I'm a little beat right now from the ~2-hour drive back.
No problemo! :) I only pushed it to dev anyways, figured you'd see it when you were back. I'm pretty confident this is a better way to handle memory than the old method, but naturally, if you find any issues or bugs, feel free to point them out! :) I also have 2 birthdays this month in the family, lots of celebration this month haha!
I'll be taking a look sometime tomorrow or Friday evening. This week is very busy with final preparations for our major s/w release this coming Monday, Dec-18th. |
My releases are once a year. I also don't program anything for my releases. For me, most of this coding/scripting is a secondary part of my role, instead of my primary focus lol! |
I reviewed the code and it looks very good. I only made some code improvements to get an estimate of the available RAM that may be reclaimed to proceed with the F/W Update. Getting only the "Free" memory is not enough, especially after a long uptime. |
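For context on why "Free" memory alone undercounts: on Linux, buffers and page cache are largely reclaimable by the kernel, so a better estimate of available RAM sums those fields from /proc/meminfo. A minimal sketch of such an estimate (the function name is illustrative, not the actual MerlinAU.sh code):

```shell
# Rough estimate of reclaimable RAM (in KB) from /proc/meminfo.
# "MemFree" alone undercounts after a long uptime, because Buffers and
# Cached memory can mostly be reclaimed by the kernel when needed.
# (Newer kernels also expose "MemAvailable" as a ready-made estimate.)
estimate_available_ram_kb()
{
    awk '/^MemFree:|^Buffers:|^Cached:/ {sum += $2} END {print sum}' /proc/meminfo
}

estimate_available_ram_kb
```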
At the company I work for, our release schedules normally call for 2 major releases per year but also allow for minor releases or patches to address serious bugs or updates to 3rd-party libraries that include fixes for some security vulnerabilities that affect our code. We pretty much follow the Agile development process with specific adjustments to fit our company's own policies & processes, and do continuous integration & testing so we can issue a patch or a full release usually within 2 to 3 weeks. But with major releases, there are additional deliverables for documentation (e.g. User's Guide), screenshots of any new or modified features, translation strings, etc. so we have to "dot the I's and cross the T's" with some more components. In addition to s/w dev., I also deal with the technical writers & language translators to transfer back & forth the files they need & explain (if/when needed) any new features or behavior in the s/w so they can do their work, and then get their finalized files integrated into the full release package that goes into production. So yeah, this time is busier than usual. |
@Martinski4GitHub
How do we feel about this solution for better memory management?
It now dynamically computes the required memory based on the size of the ZIP file (including a 10MB overhead for the ZIP), and the check is performed 3 times instead of once.
Basically, it makes sure that at every step along the way the router maintains this required amount of memory, or it prompts for a reboot.
I added debug code for the "get_free_ram()" function so we can test without actually being low on memory. Just make the script think it is.
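The approach described above might look roughly like the sketch below. This is illustrative only: the function names (`get_free_ram`, `check_memory_for_update`) and the `DEBUG_FREE_RAM_KB` override variable are hypothetical stand-ins for whatever MerlinAU.sh actually uses.

```shell
# Hypothetical sketch of the dynamic memory check described above.
# Set DEBUG_FREE_RAM_KB to simulate a low-memory router without
# actually being low on memory.
get_free_ram()
{
    if [ -n "$DEBUG_FREE_RAM_KB" ]
    then echo "$DEBUG_FREE_RAM_KB" ; return
    fi
    # "MemFree" from /proc/meminfo, reported in kilobytes
    awk '/^MemFree:/ {print $2}' /proc/meminfo
}

# Required RAM = size of the F/W ZIP file + a fixed 10MB overhead.
# Returns 0 if enough RAM is free, 1 otherwise.
check_memory_for_update()
{
    zipFile="$1"
    zipSizeKB="$(( $(wc -c < "$zipFile") / 1024 ))"
    overheadKB=10240   # 10MB overhead for the ZIP file
    requiredKB="$((zipSizeKB + overheadKB))"
    freeKB="$(get_free_ram)"

    if [ "$freeKB" -lt "$requiredKB" ]
    then
        echo "Insufficient RAM: need ${requiredKB}KB, have ${freeKB}KB"
        return 1
    fi
    echo "OK: need ${requiredKB}KB, have ${freeKB}KB"
    return 0
}
```

Running this check before the download, before the unzip, and before the flash step would give the "checked 3 times" behavior.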
The only concern I have is that the notes in the README say:
-Might need to add some "overhead" to the file size comparison to account for the "ZIP + F/W" files being on the "$HOME" directory at the same time, even if just temporarily.
Not sure if I'm correctly accounting for this. I remember you mentioning this, but I'm not sure I'm understanding correctly.
Are we saying we need to up the RAM requirement for when we decompress the zip file?
(i.e., let's say the .zip file is 150MB, so right now it would require 150MB + 10MB overhead = 160MB of memory; are we saying we need to increase this overhead for the decompression step?)
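One way to read that README note: during extraction, the ZIP and the extracted firmware image both sit in "$HOME" at the same time, so the peak requirement for that step would be roughly zip size + uncompressed size + the fixed overhead, not just zip size + overhead. A minimal arithmetic sketch (the function name is hypothetical; the uncompressed size could come from something like `unzip -l`):

```shell
# Illustrative peak-RAM estimate for the extraction step, where the ZIP
# and the extracted F/W file coexist in "$HOME" temporarily.
# Arguments: zip size in KB, total uncompressed size in KB.
required_kb_for_extraction()
{
    zipSizeKB="$1"
    uncompressedKB="$2"
    overheadKB=10240   # same fixed 10MB cushion as before
    echo "$((zipSizeKB + uncompressedKB + overheadKB))"
}

# e.g. a 150MB ZIP that extracts to a 100MB firmware image:
required_kb_for_extraction 153600 102400
```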