
Update MerlinAU.sh (Better memory management) #31

Merged
ExtremeFiretop merged 8 commits into dev from ExtremeFiretop-patch-RAMmanagement on Dec 10, 2023

Conversation

@ExtremeFiretop (Owner) commented Dec 9, 2023

@Martinski4GitHub

How do we feel about this solution for better memory management?
It now dynamically calculates the required memory based on the size of the ZIP file (plus a 10 MB overhead), and the check is done 3 times instead of once:

  1. Once before the download.
  2. Once before the decompress.
  3. Once before the actual flash.

Basically, this makes sure that at every step along the way the router still has the required memory available, or it prompts for a reboot.
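
For reference, a minimal sketch of what such a check could look like in shell; the function and variable names here (_GetFreeRAM_KB_, _CheckFreeRAM_, etc.) are illustrative assumptions, not necessarily what MerlinAU.sh actually uses:

```sh
# Illustrative sketch only; names and the 10 MB figure mirror the description above.

# Free RAM in KB, taken from /proc/meminfo
_GetFreeRAM_KB_()
{ awk '/^MemFree:/ {print $2}' /proc/meminfo ; }

# Required RAM in KB = size of the firmware ZIP + 10 MB overhead
_GetRequiredRAM_KB_()
{
   zipSizeKB="$(( $(wc -c < "$1") / 1024 ))"
   echo "$(( zipSizeKB + 10240 ))"
}

# Called once before the download, once before the decompress,
# and once before the actual flash.
_CheckFreeRAM_()
{
   requiredKB="$(_GetRequiredRAM_KB_ "$1")"
   freeKB="$(_GetFreeRAM_KB_)"
   if [ "$freeKB" -lt "$requiredKB" ]
   then
       printf "Not enough free RAM: %s KB free, %s KB required.\n" "$freeKB" "$requiredKB"
       ## at this point the script would prompt the user to reboot ##
       return 1
   fi
   return 0
}
```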

I added debug code for the "get_free_ram()" function so we can test without actually being low on memory; we just make the script think it is.
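
As a hedged illustration of that debug idea (the variable name below is made up for this sketch, not the actual debug code), the override could look something like:

```sh
# Debug hook sketch: force a low free-RAM value so the low-memory path
# can be exercised without the router actually being short on memory.
DEBUG_LOW_RAM_KB=""   # e.g. set to 1024 to pretend only ~1 MB is free

get_free_ram()
{
   if [ -n "$DEBUG_LOW_RAM_KB" ]
   then echo "$DEBUG_LOW_RAM_KB"
   else awk '/^MemFree:/ {print $2}' /proc/meminfo
   fi
}
```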

The only concern I have is that the notes in the README say:

> Might need to add some "overhead" to the file size comparison to account for the "ZIP + F/W" files being on the "$HOME" directory at the same time, even if just temporarily.

Not sure if I'm correctly accounting for this. I remember you mentioning this, but I'm not sure I'm understanding it correctly.
Are we saying we need to up the RAM requirement for when we decompress the zip file?

(i.e., let's say the .zip file is 150 MB; right now it would require 150 MB + 10 MB overhead = 160 MB of memory. Are we saying we need to increase this overhead for the decompression step?)
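
For illustration only, one way to read that README note (all sizes below are hypothetical) is that the decompress step briefly needs room for both files at once:

```sh
# Hypothetical sizing, purely to illustrate the question above:
zipSizeMB=150      # downloaded ZIP
fwImageMB=150      # extracted firmware image (assumed roughly similar in size)
overheadMB=10
downloadReqMB="$(( zipSizeMB + overheadMB ))"                 # 160 MB
decompressReqMB="$(( zipSizeMB + fwImageMB + overheadMB ))"   # ~310 MB, ZIP + F/W coexist in "$HOME"
```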

@ExtremeFiretop (Owner, Author)

Tested this pretty much through and through; I'm confident this is a better way to handle memory.

ExtremeFiretop merged commit f58de6c into dev on Dec 10, 2023
ExtremeFiretop deleted the ExtremeFiretop-patch-RAMmanagement branch on December 10, 2023 at 19:45
@Martinski4GitHub (Collaborator) commented Dec 11, 2023

> @Martinski4GitHub
>
> How do we feel about this solution for better memory management? It now dynamically calculates the required memory based on the size of the ZIP file (plus a 10 MB overhead), and the check is done 3 times instead of once:
>
> 1. Once before the download.
> 2. Once before the decompress.
> 3. Once before the actual flash.
>
> Basically, this makes sure that at every step along the way the router still has the required memory available, or it prompts for a reboot.
>
> I added debug code for the "get_free_ram()" function so we can test without actually being low on memory; we just make the script think it is.
>
> The only concern I have is that the notes in the README say: "Might need to add some 'overhead' to the file size comparison to account for the 'ZIP + F/W' files being on the '$HOME' directory at the same time, even if just temporarily."
>
> Not sure if I'm correctly accounting for this. I remember you mentioning this, but I'm not sure I'm understanding it correctly. Are we saying we need to up the RAM requirement for when we decompress the zip file?
>
> (i.e., let's say the .zip file is 150 MB; right now it would require 150 MB + 10 MB overhead = 160 MB of memory. Are we saying we need to increase this overhead for the decompression step?)

Apologies for not responding earlier. I was busy this weekend; we spent Saturday & Sunday at my brother's house for our annual family gathering (yeah, we do it sometime before Christmas to avoid all the traveling/traffic nightmares, and then afterward everyone is free to spend the 24th & 31st wherever they prefer - sometimes with the in-laws :>), and we have 2 birthdays this month, so we celebrate them during the same family gatherings.

Anyway, I have not yet reviewed the code, but based on your description it looks like the right solution. I'll take a look sometime later. I'm a little beat right now from the ~2-hour drive back.

@ExtremeFiretop (Owner, Author)

>> @Martinski4GitHub
>>
>> How do we feel about this solution for better memory management? It now dynamically calculates the required memory based on the size of the ZIP file (plus a 10 MB overhead), and the check is done 3 times instead of once:
>>
>> 1. Once before the download.
>> 2. Once before the decompress.
>> 3. Once before the actual flash.
>>
>> Basically, this makes sure that at every step along the way the router still has the required memory available, or it prompts for a reboot.
>> I added debug code for the "get_free_ram()" function so we can test without actually being low on memory; we just make the script think it is.
>> The only concern I have is that the notes in the README say: "Might need to add some 'overhead' to the file size comparison to account for the 'ZIP + F/W' files being on the '$HOME' directory at the same time, even if just temporarily."
>> Not sure if I'm correctly accounting for this. I remember you mentioning this, but I'm not sure I'm understanding it correctly. Are we saying we need to up the RAM requirement for when we decompress the zip file?
>> (i.e., let's say the .zip file is 150 MB; right now it would require 150 MB + 10 MB overhead = 160 MB of memory. Are we saying we need to increase this overhead for the decompression step?)
>
> Apologies for not responding earlier. I was busy this weekend; we spent Saturday & Sunday at my brother's house for our annual family gathering (yeah, we do it sometime before Christmas to avoid all the traveling/traffic nightmares, and then afterward everyone is free to spend the 24th & 31st wherever they prefer - sometimes with the in-laws :>), and we have 2 birthdays this month, so we celebrate them during the same family gatherings.
>
> Anyway, I have not yet reviewed the code, but based on your description it looks like the right solution. I'll take a look sometime later. I'm a little beat right now from the ~2-hour drive back.

No problemo! :)

I only pushed it to dev anyway; I figured you'd see it when you were back. I'm pretty confident this is a better way to handle memory than the old method, so I pushed it to dev. Naturally, if you find any issues or bugs, feel free to point them out! :)

I also have 2 birthdays this month in the family, lots of celebration this month haha!

@Martinski4GitHub (Collaborator)

> I only pushed it to dev anyway; I figured you'd see it when you were back. I'm pretty confident this is a better way to handle memory than the old method, so I pushed it to dev. Naturally, if you find any issues or bugs, feel free to point them out! :)

I'll be taking a look sometime tomorrow or Friday evening. This week is very busy with final preparations for our major s/w release this coming Monday, Dec-18th.

@ExtremeFiretop (Owner, Author) commented Dec 14, 2023

> I'll be taking a look sometime tomorrow or Friday evening. This week is very busy with final preparations for our major s/w release this coming Monday, Dec-18th.

My releases are once a year.
So the first time I was away for a month or two is gonna be mostly it until sometime next year lol!

I also don't program anything for my releases.
I handle the push out to the department using SCCM and script the deployment.

For me, most of this coding/scripting is a secondary part of my role rather than my primary focus lol!
But I can tell that you take a much more up-front and personal approach to the coding aspect, based on your history and skill level haha

@Martinski4GitHub (Collaborator)

> I only pushed it to dev anyway; I figured you'd see it when you were back. I'm pretty confident this is a better way to handle memory than the old method, so I pushed it to dev. Naturally, if you find any issues or bugs, feel free to point them out! :)

I reviewed the code and it looks very good. I only made some code improvements to get an estimate of the available RAM that may be reclaimed to proceed with the F/W Update. Getting only the "Free" memory is not enough, especially after a long uptime.
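
A rough sketch of that idea, assuming the usual /proc/meminfo fields (this is not the exact code that was committed, and the function name is illustrative):

```sh
# Estimate RAM that is free or can be reclaimed (buffers/cache),
# since "MemFree" alone underestimates availability after long uptimes.
_GetAvailableRAM_KB_()
{
   awk '/^MemFree:|^Buffers:|^Cached:|^SReclaimable:/ {sum += $2}
        END {print sum}' /proc/meminfo
}
# On newer kernels, the "MemAvailable" field provides a similar estimate directly.
```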

@Martinski4GitHub (Collaborator)

> My releases are once a year. So the first time I was away for a month or two is gonna be mostly it until sometime next year lol!
>
> I also don't program anything for my releases. I handle the push out to the department using SCCM and script the deployment.
>
> For me, most of this coding/scripting is a secondary part of my role rather than my primary focus lol! But I can tell that you take a much more up-front and personal approach to the coding aspect, based on your history and skill level haha

At the company I work for, our release schedules normally call for 2 major releases per year, but also allow for minor releases or patches to address serious bugs or updates to 3rd-party libraries that include fixes for some security vulnerabilities that affect our code. We pretty much follow the Agile development process, with specific adjustments to fit our company's own policies & processes, and do continuous integration & testing, so we can issue a patch or a full release usually within 2 to 3 weeks.

But with major releases, there are additional deliverables for documentation (e.g. User's Guide), screenshots of any new or modified features, translation strings, etc., so we have to "dot the I's and cross the T's" on some more components.

In addition to s/w dev., I also deal with the technical writers & language translators to transfer back & forth the files they need, explain (if/when needed) any new features or behavior in the s/w so they can do their work, and then get their finalized files integrated into the full release package that goes into production. So yeah, this time is busier than usual.


Labels

enhancement (New feature or request)


2 participants