Redis / AOF - Magento 2.4.6-p3 #38148
Comments
Hi @Nuranto. Thank you for your report.
Join Magento Community Engineering Slack and ask your questions in the #github channel.
Update: we still have issues with RDB, so the issue could be the number of sessions increasing, or their size. We'll continue to investigate.
I'm not sure; this ticket seems related to cache. We only have issues with sessions.
Not many session-related changes at first sight: https://github.com/magento/magento2/compare/2.4.6-p2..2.4.6-p3 Maybe one of the recent changes in these 3 repositories causes it?
Yes, that was my thought too.
I have a similar issue. After upgrading Magento from 2.3.5 to 2.4.6-p2, Redis memory started to grow very fast and reached 11 GB. If I flush these caches, the memory size of Redis returns to around 300 MB.
Hi @engcom-November. Thank you for working on this issue.
Hello @Nuranto, thank you for the report and collaboration! We configured Redis for sessions with readonly, and upgraded from 2.4.6-p2 to the latest 2.4.7-beta2, and this is what we observed after upgrading to 2.4.7-beta2. Please let us know if we are missing anything. Thank you.
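For readers who want to reproduce the same setup, a minimal sketch of Redis-backed session storage in app/etc/env.php is shown below; the host, port, database and lifetime values are illustrative placeholders, not the configuration used in this thread.

```php
// app/etc/env.php (excerpt) -- minimal sketch of storing sessions in Redis.
// All values below are illustrative placeholders.
return [
    // ...
    'session' => [
        'save' => 'redis',
        'redis' => [
            'host' => '127.0.0.1',             // Redis instance dedicated to sessions
            'port' => '6379',
            'database' => '2',                 // keep sessions out of the cache databases
            'timeout' => '2.5',
            'compression_threshold' => '2048',
            'compression_library' => 'gzip',
            'log_level' => '1',
            'max_concurrency' => '6',
            'disable_locking' => '0',
            'min_lifetime' => '60',
            'max_lifetime' => '2592000',
        ],
    ],
];
```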
Same problem over here with multiple shops and Redis page cache.
@Nuranto Were you able to fix it? What version of Redis are you running, and was it upgraded along with Magento?
We're currently updating to 2.4.6-p3 in hopes of solving our Redis issues, so this does not sound promising. Has anyone determined the cause yet?
Hi @goivvy, no, we did not upgrade Redis along with Magento. We're running Redis 7.0.14, and we stabilized this by disabling AOF.
I just noticed this commit made in core Magento that limits 2 of the Redis packages to a certain version (according to the commit message, in broken English, it sounds like newer versions of these packages have performance issues): 9369884
@Nuranto, @hostep we were not able to reproduce this issue on a vanilla 2.4-develop instance with Redis AOF turned on. Thank you.
We're all experiencing the issue. It's not new to the most recent version; it's been a problem through all past versions. I am guessing you installed vanilla Magento, with no data and no traffic? You'll need a large catalog and simulated traffic to reproduce this issue.
Currently experiencing this issue on 3 different setups.
✅ Jira issue https://jira.corp.adobe.com/browse/AC-13309 is successfully created for this GitHub issue.
✅ Confirmed by @engcom-Hotel. Thank you for verifying the issue.
Hello, as mentioned here in #38148 (comment), the issue is not easy to reproduce, but I think we can confirm it since so many users are facing the same problem. We will investigate this further while working on a fix. We will keep you posted. Thanks
Hi everyone, I recently faced the same issue with a very large catalog, with a total of 2+ million products, meaning a lot of PDP layouts. We came to the conclusion that the ID- and SKU-specific PDP layouts were generating individual cache entries for each individual product, which led to a huge number of cache keys in Redis after people visited different product pages. After some time, we were looking at around 5+ million cache keys stored in Redis, which was causing the issues we saw previously.

We then implemented a solution to disable the feature that allows the individual layouts by ID and SKU, and we're now sitting at only 35k cache keys stored in Redis around 15h after the deployment and after the cache was cleared. It is a significant reduction in the number of cached keys, greatly reducing the amount of memory used. This check was done by going into the Redis CLI client and typing in MEMORY STATS. This returns a "keys.count" entry with the number of keys stored in Redis. As a comparison, I ran the check on a smaller website, which has around 90k products, and got a result of around 100k keys. Keep in mind that in this smaller website's case, the SKU/ID-specific layouts are still enabled.

The solution consists of implementing plugins on Magento\Framework\View\Model\Layout\Merge. These plugins prevent the addition of the 'CATALOG_PRODUCT_VIEW_ID' and 'CATALOG_PRODUCT_VIEW_SKU' handles to the layout, which in turn prevents the generation of these individual layout keys in Redis.

Of course, our solution implies that you cannot create ID/SKU-specific layout files by specifying the catalog_product_view_id/sku_... layout, but I'd say it's still recommended to create a custom product type and use it for the purpose of customizing certain PDPs, and it would also make the layout reusable in future products.
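A rough sketch of the kind of plugin described above is shown below. The vendor/class names are hypothetical, and the blocked handle prefixes (catalog_product_view_id_* / catalog_product_view_sku_*) are assumed from the handles Magento generates for ID/SKU-specific PDP layouts; the Vendic module linked later in this thread takes a similar approach. It uses an around plugin on Magento\Framework\View\Model\Layout\Merge::addHandle, which accepts either a single handle name or an array of them.

```php
<?php
declare(strict_types=1);

namespace Acme\OptimizeCacheSize\Plugin; // hypothetical vendor/module name

use Magento\Framework\View\Model\Layout\Merge;

/**
 * Sketch only: skip the per-product layout handles so Redis no longer stores
 * one layout cache entry per visited product.
 */
class RemoveProductSpecificHandles
{
    /** Handle prefixes generated for ID/SKU-specific PDP layouts (assumed). */
    private const BLOCKED_PREFIXES = [
        'catalog_product_view_id_',
        'catalog_product_view_sku_',
    ];

    /**
     * Merge::addHandle() accepts a string or an array of handle names,
     * so normalize to an array and filter out the blocked prefixes.
     */
    public function aroundAddHandle(Merge $subject, callable $proceed, $handleName)
    {
        $handles = array_filter(
            (array) $handleName,
            fn (string $handle): bool => !$this->isBlocked($handle)
        );

        // If every handle was filtered out, skip the original call entirely.
        return $handles ? $proceed(array_values($handles)) : $subject;
    }

    private function isBlocked(string $handle): bool
    {
        foreach (self::BLOCKED_PREFIXES as $prefix) {
            if (str_starts_with($handle, $prefix)) {
                return true;
            }
        }

        return false;
    }
}
```

The plugin would still need to be declared in a di.xml against Magento\Framework\View\Model\Layout\Merge, and, as noted above, it removes the ability to customize individual products through catalog_product_view_id_*/catalog_product_view_sku_* layout files.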
Even though my comment isn't session related but cache related, maybe others may find this info useful. For the people having this Redis issue, can you try to disable Lua mode in the Redis configuration for your default cache in your env.php file?
Lua was enabled by default in a recent version of the Redis cache backend. We've seen significant improvement in Redis usage by disabling Lua. See this related discussion about it btw: colinmollenhour/Cm_Cache_Backend_Redis#181 For the issue @TalesDuque mentions, there is this module that does the same: https://github.com/Vendic/module-optimize-cache-size
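For anyone who wants to try this, the flag lives in the cache backend options in app/etc/env.php. Below is a sketch, assuming the backend honours the Cm_Cache_Backend_Redis use_lua option; the server and database values are placeholders.

```php
// app/etc/env.php (excerpt) -- sketch of disabling Lua for the default cache.
// Only 'use_lua' matters here; the other values are placeholders.
return [
    // ...
    'cache' => [
        'frontend' => [
            'default' => [
                'backend' => \Magento\Framework\Cache\Backend\Redis::class,
                'backend_options' => [
                    'server' => '127.0.0.1',
                    'port' => '6379',
                    'database' => '0',
                    'use_lua' => '0',   // turn Lua scripting off in the backend
                ],
            ],
        ],
    ],
];
```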
Summary
With Redis used for sessions, configured with appendonly=yes (AOF), we noticed that after upgrading from 2.4.6-p2 to 2.4.6-p3, storage was increasing very fast (more than 5 GB used for 300 MB of actual data). Our Redis servers were crashing because of that (the volume was getting full within 24 hours...). We suspect the sessions are updated more frequently than before.
A hotfix was to switch to RDB.
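For context, that hotfix amounts to turning off the append-only file on the Redis server and relying on RDB snapshots instead; in redis.conf terms it looks roughly like the sketch below (the snapshot thresholds are generic defaults, not the values used in production here).

```conf
# redis.conf -- generic sketch of switching persistence from AOF to RDB
appendonly no          # stop writing the append-only file that filled the volume

# rely on periodic RDB snapshots instead
save 900 1             # snapshot if at least 1 key changed within 15 minutes
save 300 10
save 60 10000
```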
I'm not sure it is related to 2.4.6-p3 itself; it could be another Composer dependency upgrade. But I wanted to let you know, as maybe others have experienced this issue too?
Examples
Proposed solution
No response
Release note
No response
Triage and priority