Transient for Layered Nav Counts gets too big #17355
Comments
Hi,
This is not true, the counts have been there for quite some time.
This could happen, I agree.
The good news is we're working on new data stores for products and redoing the data structures for attributes, which should help make layered nav performant. In the meantime you'll need to adjust your config, disable layered nav, or disable the counts in layered nav. We'll post on the dev blog when we have an update regarding custom tables. Thanks!
Hi @mikejolley, I can confirm that since WC 3.2.1 I have had to purge my MariaDB binlog files every 5 days. With the current traffic on my site (which is still fairly low), this transient string grows so huge that roughly every 5 days it has generated up to 8 GB of log when it is saved to the table.
@opportus Consider memcache, or flushing it via cron if it grows too much?
Already doing a cron job, but this thing shouldn't be allowed anyway...
If you disable the transient in the code, you'll see a slowdown in page loads. I think it's a necessary evil until attributes/product data is moved to custom tables. FWIW, the wc_layered_nav_counts transient hangs around for 1 day.
This is a major issue. We have a site generating ~150 MB/sec of transaction logs. The transient is a huge bottleneck. We saw no noticeable slowdowns by removing the transient; in fact, we saw massive performance gains. You can see where we applied the patch. Using cron to clear out the transient is a poor solution.
This is not a "won't fix" scenario; it needs to be taken seriously. The ramification of an out-of-the-box feature like this is that it hurts any site it's installed on: it doesn't scale, and it will cost people money in hosting fees or get their sites shut down. Not looking to be dramatic, but I want to be clear on the impact here. Ignoring this problem would slate WooCommerce for blacklisting as a solution for any professional property.
cc @kloon and @rodrigoprimo for review. I assume we're not having issues on woo.com with this transient?
We are not experiencing the issue on woo.com. That said, from the New Relic chart and the discussion above, it seems like there is no object cache in place. Transients will always be faster than a normal query, as no joins or large queries are needed to retrieve the data. It speeds up even more when an object cache like Memcached or Redis is present, which stores these values in memory rather than in the DB. Saying this will cost people money in hosting fees is not correct: any decent managed WordPress host will have an object cache in place, even on small $29 plans. Removing the transients will definitely slow down your site, especially if a large number of users view a page where the layered navigation is present. I'd suggest you look into an object cache first, as using transients is currently the fastest way to retrieve the data on repeat visits to a page with the layered nav.
@kloon We transient cache all individual filters to avoid doing the same filter/query more than once. Do you anticipate issues with large transients? Should we disable it if object cache is disabled?
We are seeing noticeable performance gains by removing the layered nav counts transient altogether. Our site did not slow down and has, in fact, sped up. The single (very large) transient record was a bottleneck. As a quick fix to our immediate problem it is sufficient. We will look at enabling Redis and reverting our patch, of course. We have a lot of attributes and filters with a decent amount of traffic. I think the core issue is that this single transient can get too large to be useful under certain conditions. Maybe there is a more efficient way to handle the transients and how counts are cached? Perhaps, instead of a single transient record, use a unique transient record for each hashed query or other identifier?
To expand on @kloon's answer (and please correct me if I'm wrong), we are not having issues on WooCommerce.com with this transient but that is because we don't use the "Filter products by attribute" widget.
@roylindauer and everyone else who is having trouble with this transient, if possible, could you please share some numbers? How big is/was the transient? How slow are the queries to count products within attributes?
I think that this is a good approach and I prefer it over not using a transient if object cache is disabled. @mikejolley and @kloon, do you see any drawbacks?
I hear where you are coming from. I am here to voice a perspective based on the data I currently have, which is: without doing anything fancy, we encountered this problem.
Looking at that information, this became a larger concern about scale and whether or not this was a known problem. Then finding this ticket closed was concerning. Specifically, here's what was fixed: wp-content/plugins/woocommerce/includes/widgets/class-wc-widget-layered-nav.php:
The specific numbers and report we received from the hosting company were:
That being said, looking at the backups, I am not seeing the footprint I would expect if this was a long-standing issue. Once again, that's a theory.
@grok So your fix is basically removing the transient and leaving it to do a large query every time the page is requested?
@claudiosanches Depending on the site and its host, hundreds of large queries may be preferable to having such a transient regularly written to the DB... Mike's proposition seems an
@claudiosanches As a temporary fix it got our site back up and performing very well, and appeased the DBAs. The large query is not that large, not compared to the bottleneck that was the massive transient record. @opportus, exactly. The short-term fix was to disable the transient, and it worked. But there is obviously a benefit to using transients, so my recommendation now is to improve upon the current system: store a transient for each hashed query. This will perform better for larger sites and will still benefit from object caching and other systems. I am happy to submit a PR.
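The per-hashed-query idea proposed above can be sketched in a language-agnostic way. Here is a minimal Python illustration (the helper name and key format are hypothetical, not WooCommerce's actual API) of deriving a stable cache key per filter combination, so each combination gets its own small record instead of appending to one ever-growing blob:

```python
import hashlib
import json

def transient_key_for_query(taxonomy, query_args, prefix="wc_layered_nav_counts"):
    """Hypothetical helper: derive a stable, per-query cache key.

    Serializing the args with sorted keys ensures the same filter
    combination always hashes to the same key, regardless of the
    order the parameters arrive in.
    """
    canonical = json.dumps(query_args, sort_keys=True)
    query_hash = hashlib.md5(canonical.encode("utf-8")).hexdigest()
    return f"{prefix}_{taxonomy}_{query_hash}"

# The same filters in a different order map to the same key...
k1 = transient_key_for_query("pa_color", {"min_price": 10, "terms": [3, 7]})
k2 = transient_key_for_query("pa_color", {"terms": [3, 7], "min_price": 10})
assert k1 == k2
# ...while a different filter combination gets its own small record.
k3 = transient_key_for_query("pa_color", {"terms": [3, 8], "min_price": 10})
assert k1 != k3
```

The trade-off, discussed further down the thread, is the number of rows this creates in wp_options versus the size of a single monolithic row.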
@rodrigoprimo I will see if I can get some numbers.
@rodrigoprimo The SQL statement is not slow per se, but the payload is huge. The transient data is 1 MB: a serialized array with 15,625 entries. For more context, we have 9 filter nav widgets on the page and around 50 options collectively across those filters.
@roylindauer Would you be able to compare speeds:
#18771 worries me due to the volume of records it would create, and with the dynamic transient names those are harder to clear.
Optionally, might compressing the transient value be possible/desirable?
The biggest slowdown I am seeing comes from writing the transient when the transient data is too large. We are having some issues with New Relic, so I ran my tests locally (using Docker + Xdebug + PHP 7.1).
The patch does improve overall processing time, but at the expense of polluting the wp_options table. I attempted to use gzdeflate, but the resulting record only saved 2 KB (working with a 1 MB transient), so I did not feel it was worth exploring further. Having a unique transient is interesting. With our current filter nav setup (9 filters), WP wrote 18 records (9 transients plus 9 timeout entries) for a single page load. We have 25k hashed queries in the transient, so that would literally be 50k records added to the wp_options table. That may be an excessive number of entries in wp_options, though the index on option_name helps.
I'd like to see the limit option in place - that sounds like the best short-term fix to me, but we would need stats to back it up.
I tested all cases again, plus a new one where we have a transient per widget instance (in our case we have 9, so 9 unique transients).
The array_slice was configured with a length of 2000. I ran each URL 3 times; the results below are the averages. It's not a perfect test. We are having some SSO issues with New Relic, so I was unable to use that. Here are the results:
Array slice shows good results! I would like to consider another option, though: saving a transient for each instance of the widget. This is more manageable than 50k transients and would scale better than artificially restricting the size of the counts array.
In my perfect world there would be a way to disable transients for the widget. The performance boost we got from not dealing with transients at all cannot be ignored.
Split layered nav transient per issue #17355
I don't think this is fixed. My site has 60k+ products, two or three attributes with 50-100 terms each, and a few more with fewer terms. I didn't configure robots.txt, and Googlebot came to my site and started crawling all combinations of filters by clicking everything recursively. It filled a single transient row named
If I extended the memory limit and packet size limit, it would quickly fill up to any size. I have >10k combinations of filters per subcategory and around 300 subcategories, which gives >3M cached query results. I counted 66,434 cached query results in the 16 MB of data above. For 3M cached query results, that would be 720 MB fetched from a single DB row each time someone opens a filters page. I suggest using an LRU cache plus automatically purging items from the transient if they exceed a certain size. I believe this is a security vulnerability too, because anyone can run a crawler to fill up the transient and bring the site down (DoS), or trigger fatal/out-of-memory errors which could have unforeseen security consequences. See this relevant line from current trunk.
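The LRU-cache suggestion above could look roughly like this. A minimal Python sketch (the class name and cap value are illustrative assumptions, with counts cached per query hash) showing how bounding the cache evicts the least recently used results once a maximum is reached, so a crawler hitting every filter combination cannot grow the stored payload without bound:

```python
from collections import OrderedDict

class BoundedLRUCounts:
    """Illustrative LRU-bounded counts cache: at most max_entries query
    results are retained; the least recently used entry is evicted first."""

    def __init__(self, max_entries=2000):
        self.max_entries = max_entries
        self._data = OrderedDict()

    def get(self, query_hash):
        if query_hash not in self._data:
            return None
        self._data.move_to_end(query_hash)  # mark as recently used
        return self._data[query_hash]

    def put(self, query_hash, counts):
        self._data[query_hash] = counts
        self._data.move_to_end(query_hash)
        while len(self._data) > self.max_entries:
            self._data.popitem(last=False)  # evict least recently used

cache = BoundedLRUCounts(max_entries=2)
cache.put("hash_a", {"term_1": 5})
cache.put("hash_b", {"term_2": 7})
cache.get("hash_a")                 # "hash_a" is now most recently used
cache.put("hash_c", {"term_3": 9})  # cap reached: "hash_b" is evicted
assert cache.get("hash_b") is None
assert cache.get("hash_a") == {"term_1": 5}
```

In the WordPress context, the whole bounded structure would still be serialized into the transient, but its size would stay capped regardless of crawler behavior.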
We're experiencing the same issue as @jan-swiecki. In our case our transient is "only" 41 megabytes. On this relatively active site, we were seeing half a gigabit/s of MySQL queries going over the network 24/7; we then disabled the code generating the transient, and usage dropped to well below 30 Mbps :) Apparently it was also happily causing a ton of disk I/O (the drop at the end is when I disabled the transient in the code). Hopefully that's a bug and not "by design" 😎
Resurrecting this thread. Adding a column for timestamp/modified shows that many of the layered nav counts are getting recalculated constantly; within a few minutes of implementing this, the counts on the transient rows had been updated. The timeout appears to be the correct timestamp, which should invalidate it after a day, not after a minute. Currently I'm at nearly 700 updates for those very large serialized transients over perhaps an hour or so. I'm digging through WooCommerce to understand what might be triggering the constant updating, as I too run into issues with the binary log filling up (we have MySQL replication in place). Disabling WP_CRON does not stop these fields from getting updated, which leads me to believe there's some combination of settings or logic causing those to be recalculated constantly. If you're curious: looking at the plugin, it seems that the query hash is routinely not set in the cached counts area, which then forces the transient to be updated every time. I wonder if the process should check whether cached_counts has any entries, e.g. count($cached_counts) > 0. See woocommerce/plugins/woocommerce/src/Internal/ProductAttributesLookup/Filterer.php, lines 180 to 187 in 78f659b.
Or perhaps the query hash changes because the order of terms changes in queries, which results in a different MD5 hash? I'm not quite sure.
This still appears to be an issue, @mikejolley. A customer of ours had an options table of 22 GB; after deleting all transients it was only 30 MB (which is still not small). We noticed that the rows with wc_layered_nav in the name are growing rapidly. WP core and all plugins, including Woo, are at the latest version. Update: we noticed rapid growth of server disk usage in the past few hours. It took 20 GB on two sites within two hours.
Hi. We noticed a significant slowdown on our database server yesterday, and it seems this transient is the culprit. Within a period of 20 hours, the database received around 3,000 update queries for these transients, each taking between 20 and 40 seconds, totaling 20-30 hours for the database to run all the queries! It essentially took up over one core of the database the whole day, which caused all our websites to slow down a lot. We don't know what triggered all these queries. There had been no problem from when we moved the website to a new server on 17.11.2023 until 18.12.2023; the problems started on 18.12.2023 at 01:55. The queries looked like this:

UPDATE `wp_options` SET `option_value` = 'super_duper_long_serialized_string' WHERE `option_name` = '_transient_wc_layered_nav_counts_pa_{attribute_name}';

The super_duper_long_serialized_string is around 1.8 MB per query. I disabled this caching by changing /wp-content/plugins/woocommerce/src/Internal/ProductAttributesLookup.php slightly:

$cache = apply_filters( 'woocommerce_layered_nav_count_maybe_cache', true );
if ( false ) { // Was: true === $cache. Changed to always false because of transient database problems.
	$cached_counts = (array) get_transient( 'wc_layered_nav_counts_' . sanitize_title( $taxonomy ) );
} else {
	$cached_counts = array();
}

if ( ! isset( $cached_counts[ $query_hash ] ) ) {
	// phpcs:ignore WordPress.DB.PreparedSQL.NotPrepared
	$results                      = $wpdb->get_results( $query_sql, ARRAY_A );
	$counts                       = array_map( 'absint', wp_list_pluck( $results, 'term_count', 'term_count_id' ) );
	$cached_counts[ $query_hash ] = $counts;
	if ( true === $cache ) {
		// set_transient( 'wc_layered_nav_counts_' . sanitize_title( $taxonomy ), $cached_counts, DAY_IN_SECONDS ); // Commented out because the transient cache was causing database problems.
	}
}

return array_map( 'absint', (array) $cached_counts[ $query_hash ] );

I tested the response time of filtering before and after disabling.

EDIT:
Hi @Bjornar97,
This definitely surprises me. If you're happy to, and have an example to hand, could you share the string (serialized array) with us? Edit: you can ignore this question.
Circling back to this: I suspect most merchants are not experiencing this because of the existing cache invalidation that takes place (if new product attribute terms are defined and assigned to products, then the transient we are concerned with here will be deleted). Recent comments show that some merchants definitely are still being impacted, though. The current cache invalidation code seems to rely on third-party code doing things the Woo way. If plugins are instead working via the WP API (or, perhaps, directly via MySQL) then cache invalidation is not triggered and the transient will grow and grow until it expires (as already noted, it expires within 24 hrs by default). Here's a slightly contrived bit of code that can be set up as an mu-plugin and lets us replicate that problem: https://gist.github.com/barryhughes/301ebdd3ede8f3c0c4fcea3e7a003f65 Perhaps, though, there are also other reasons for early cache invalidation failing. Whatever the cause, one thought is that instead of having a single transient storing an array like this:
We could instead have a separate transient, one per query hash:
There could be problems with that in some cases, but it might be the generally better way to do things. A more complex alternative could be to stick with a single transient, but include metadata that lets us manage an eviction process of some kind (a maximum of n cached results, and if we reach the max then we delete the oldest first) ... but that seems more complicated than we really need, since we can lean on WP to do this sort of work on our behalf via its transients API.
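The "more complex alternative" described above (a single store with per-entry metadata and oldest-first eviction) might be sketched as follows; this is a minimal illustration where the function name, entry layout, and cap are all assumptions, not WooCommerce code:

```python
import time

MAX_CACHED_RESULTS = 1000  # assumed cap, the "n" from the comment above

def cache_counts(cached, query_hash, counts, max_entries=MAX_CACHED_RESULTS, now=None):
    """Illustrative sketch: store counts under query_hash alongside a
    timestamp; when the cap is exceeded, delete the oldest entries first."""
    now = time.time() if now is None else now
    cached[query_hash] = {"t": now, "counts": counts}
    while len(cached) > max_entries:
        oldest = min(cached, key=lambda h: cached[h]["t"])
        del cached[oldest]
    return cached

store = {}
cache_counts(store, "hash_a", {"term_1": 4}, max_entries=2, now=1)
cache_counts(store, "hash_b", {"term_2": 6}, max_entries=2, now=2)
cache_counts(store, "hash_c", {"term_3": 8}, max_entries=2, now=3)
assert set(store) == {"hash_b", "hash_c"}  # oldest entry evicted
```

As the comment above notes, per-hash transients would let WP's own transients API handle expiry instead, at the cost of more wp_options rows.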
Prerequisites
master branch of WooCommerce on GitHub (this is not the same version as on WordPress.org!)
Steps to reproduce the issue
Expected/actual behavior
Since the 3.2.1 update of WooCommerce, the counts of the layered_nav widgets are stored in a transient. The problem is that it appends to the transient each time it encounters a new key. Since we are running a webshop with a lot (I mean really a lot) of filter possibilities, this transient gets really big. So big that it throws an error (our LOG folder ran out of space) when trying to update the database with the new transient value:
Error while sending QUERY packet
Of course I could try to tweak my database server so it accepts bigger packets, but I think it is better to solve this in WooCommerce.
Isolating the problem
WordPress Environment
Home URL: [redacted]
Site URL: [redacted]
WC Version: 3.2.1
Log Directory Writable: ✔
WP Version: 4.8.2
WP Multisite: –
WP Memory Limit: 512 MB
WP Debug Mode: –
WP Cron: –
Language: nl_NL
Server Environment
Server Info: Apache/2.4.27 (CentOS) OpenSSL/1.0.1e-fips
PHP Version: 7.0.22
PHP Post Max Size: 50 MB
PHP Time Limit: 30
PHP Max Input Vars: 1000
cURL Version: 7.29.0
NSS/3.21 Basic ECC
SUHOSIN Installed: –
Max Upload Size: 50 MB
Default Timezone is UTC: ✔
fsockopen/cURL: ✔
SoapClient: ✔
DOMDocument: ✔
GZip: ✔
Multibyte String: ✔
Remote Post: ✔
Remote Get: ✔
Database
WC Database Version: 3.2.1
WC Database Prefix: O8pP3gM_
MaxMind GeoIP Database: ✔
Total database size: 710.64MB
Database data size: 468.11MB
Database index size: 242.53MB
O8pP3gM_woocommerce_sessions: Data: 12.53MB + Index: 0.50MB
O8pP3gM_woocommerce_api_keys: Data: 0.02MB + Index: 0.03MB
O8pP3gM_woocommerce_attribute_taxonomies: Data: 0.02MB + Index: 0.02MB
O8pP3gM_woocommerce_downloadable_product_permissions: Data: 0.02MB + Index: 0.05MB
O8pP3gM_woocommerce_order_items: Data: 0.06MB + Index: 0.02MB
O8pP3gM_woocommerce_order_itemmeta: Data: 0.33MB + Index: 0.34MB
O8pP3gM_woocommerce_tax_rates: Data: 0.02MB + Index: 0.06MB
O8pP3gM_woocommerce_tax_rate_locations: Data: 0.02MB + Index: 0.03MB
O8pP3gM_woocommerce_shipping_zones: Data: 0.02MB + Index: 0.00MB
O8pP3gM_woocommerce_shipping_zone_locations: Data: 0.02MB + Index: 0.03MB
O8pP3gM_woocommerce_shipping_zone_methods: Data: 0.02MB + Index: 0.00MB
O8pP3gM_woocommerce_payment_tokens: Data: 0.02MB + Index: 0.02MB
O8pP3gM_woocommerce_payment_tokenmeta: Data: 0.02MB + Index: 0.03MB
O8pP3gM_woocommerce_log: Data: 0.02MB + Index: 0.02MB
O8pP3gM_commentmeta: Data: 0.14MB + Index: 0.03MB
O8pP3gM_comments: Data: 0.09MB + Index: 0.09MB
O8pP3gM_kiyoh: Data: 0.02MB + Index: 0.00MB
O8pP3gM_links: Data: 0.02MB + Index: 0.02MB
O8pP3gM_nf3_actions: Data: 0.02MB + Index: 0.00MB
O8pP3gM_nf3_action_meta: Data: 0.05MB + Index: 0.00MB
O8pP3gM_nf3_fields: Data: 0.02MB + Index: 0.00MB
O8pP3gM_nf3_field_meta: Data: 0.11MB + Index: 0.00MB
O8pP3gM_nf3_forms: Data: 0.02MB + Index: 0.00MB
O8pP3gM_nf3_form_meta: Data: 0.02MB + Index: 0.00MB
O8pP3gM_nf3_objects: Data: 0.02MB + Index: 0.00MB
O8pP3gM_nf3_object_meta: Data: 0.02MB + Index: 0.00MB
O8pP3gM_nf3_relationships: Data: 0.02MB + Index: 0.00MB
O8pP3gM_options: Data: 12.41MB + Index: 4.50MB
O8pP3gM_postmeta: Data: 203.77MB + Index: 160.19MB
O8pP3gM_posts: Data: 40.52MB + Index: 14.75MB
O8pP3gM_prli_clicks: Data: 0.02MB + Index: 0.19MB
O8pP3gM_prli_groups: Data: 0.02MB + Index: 0.02MB
O8pP3gM_prli_links: Data: 0.02MB + Index: 0.16MB
O8pP3gM_prli_link_metas: Data: 0.02MB + Index: 0.03MB
O8pP3gM_redirects: Data: 1.52MB + Index: 0.00MB
O8pP3gM_revslider_css: Data: 0.13MB + Index: 0.00MB
O8pP3gM_revslider_layer_animations: Data: 0.02MB + Index: 0.00MB
O8pP3gM_revslider_navigations: Data: 0.02MB + Index: 0.00MB
O8pP3gM_revslider_sliders: Data: 0.05MB + Index: 0.00MB
O8pP3gM_revslider_slides: Data: 0.05MB + Index: 0.00MB
O8pP3gM_revslider_static_slides: Data: 0.02MB + Index: 0.00MB
O8pP3gM_termmeta: Data: 0.06MB + Index: 0.06MB
O8pP3gM_terms: Data: 0.06MB + Index: 0.03MB
O8pP3gM_term_relationships: Data: 6.48MB + Index: 4.30MB
O8pP3gM_term_taxonomy: Data: 0.11MB + Index: 0.06MB
O8pP3gM_usermeta: Data: 0.23MB + Index: 0.20MB
O8pP3gM_users: Data: 0.02MB + Index: 0.05MB
O8pP3gM_wfBadLeechers: Data: 0.02MB + Index: 0.00MB
O8pP3gM_wfBlockedIPLog: Data: 0.02MB + Index: 0.00MB
O8pP3gM_wfBlocks: Data: 0.02MB + Index: 0.02MB
O8pP3gM_wfBlocksAdv: Data: 0.02MB + Index: 0.00MB
O8pP3gM_wfConfig: Data: 0.27MB + Index: 0.00MB
O8pP3gM_wfCrawlers: Data: 0.02MB + Index: 0.00MB
O8pP3gM_wfFileMods: Data: 4.52MB + Index: 0.00MB
O8pP3gM_wfHits: Data: 164.67MB + Index: 54.64MB
O8pP3gM_wfHoover: Data: 0.02MB + Index: 0.02MB
O8pP3gM_wfIssues: Data: 0.02MB + Index: 0.00MB
O8pP3gM_wfKnownFileList: Data: 2.52MB + Index: 0.00MB
O8pP3gM_wfLeechers: Data: 5.52MB + Index: 0.00MB
O8pP3gM_wfLockedOut: Data: 0.02MB + Index: 0.00MB
O8pP3gM_wfLocs: Data: 0.02MB + Index: 0.00MB
O8pP3gM_wfLogins: Data: 0.16MB + Index: 0.06MB
O8pP3gM_wfNet404s: Data: 3.52MB + Index: 1.52MB
O8pP3gM_wfNotifications: Data: 0.02MB + Index: 0.00MB
O8pP3gM_wfPendingIssues: Data: 0.02MB + Index: 0.00MB
O8pP3gM_wfReverseCache: Data: 0.02MB + Index: 0.00MB
O8pP3gM_wfScanners: Data: 5.52MB + Index: 0.00MB
O8pP3gM_wfSNIPCache: Data: 0.06MB + Index: 0.05MB
O8pP3gM_wfStatus: Data: 1.52MB + Index: 0.27MB
O8pP3gM_wfThrottleLog: Data: 0.02MB + Index: 0.02MB
O8pP3gM_wfVulnScanners: Data: 0.02MB + Index: 0.00MB
O8pP3gM_woocommerce_buckaroo_transactions: Data: 0.02MB + Index: 0.00MB
O8pP3gM_woof_query_cache: Data: 0.02MB + Index: 0.02MB
O8pP3gM_yoast_seo_links: Data: 0.23MB + Index: 0.08MB
O8pP3gM_yoast_seo_meta: Data: 0.02MB + Index: 0.00MB
Post Type Counts
attachment: 26796
customize_changeset: 35
custom_css: 1
dgd_scrollbox: 1
hm_custom_css: 13
hm_custom_javascript: 13
mc4wp-form: 2
nav_menu_item: 19
nf_sub: 36
page: 46
post: 101
product: 5274
product_variation: 48998
revision: 716
shop_coupon: 1
shop_order: 247
shop_webhook: 1
vgpc: 4
vgwc: 26
warmcache: 4
Security
Secure connection (HTTPS): ✔
Hide errors from visitors: ✔
Active Plugins (28)
AJAX AutoSuggest: by Farhad Ahmadi – 1.9.8
Akismet Anti-Spam: by Automattic – 4.0
AMP: by Automattic – 0.5.1
Application Passwords: by George Stephanis – 0.1-dev
Scroll Triggered Box: by Dreamgrow Digital – 2.3
Enhanced E-commerce for Woocommerce store: by Tatvic – 1.2.0 – Not tested with the active version of WooCommerce
Eggplant 301 Redirects: by Shawn Wernig http://www.eggplantstudios.ca – 2.3.0
Gigagaaf Kiyoh: by F&J webcreation – 1.0.0
Gigagaaf: by F&J webcreation – 1.0.0
ImageMagick Engine: by Brand Manual – 1.5.2
WPBakery Page Builder: by Michael M - WPBakery.com – 5.3
Loco Translate: by Tim Whitlock – 2.0.16
MailChimp for WordPress: by ibericode – 4.1.10
Maintenance: by fruitfulcode – 3.5.1
Mega Main Menu: by MegaMain.com – 2.1.2
Ninja Forms: by The WP Ninjas – 3.2.2
Slider Revolution: by ThemePunch – 5.4.3.1
Transients Manager: by Pippin Williamson – 1.7.3
VG PostCarousel: by VinaWebSolutions – 1.1
VG WooCarousel: by VinaWebSolutions – 1.3 – Not tested with the active version of WooCommerce
VinaGecko Helper: by VinaGecko – 1.0.0
WC Buckaroo BPE Gateway: by Buckaroo – 2.4.1 – Not tested with the active version of WooCommerce
WooCommerce Checkout Field Editor: by WooCommerce – 1.5.5 – 1.5.7 is available – Not tested with the active version of WooCommerce
WooCommerce - Store Exporter: by Visser Labs – 1.9.3 – Not tested with the active version of WooCommerce
WooCommerce: by Automattic – 3.2.1
Yoast SEO: by Team Yoast – 5.6.1
WP-Mail-SMTP: by WPForms – 0.10.1
Zendesk Chat Widget: by Zopim – 1.4.11
Settings
API Enabled: ✔
Force SSL: ✔
Currency: EUR (€)
Currency Position: left_space
Thousand Separator: .
Decimal Separator: ,
Number of Decimals: 2
Taxonomies: Product Types: external (external)
grouped (grouped)
simple (simple)
variable (variable)
Taxonomies: Product Visibility: exclude-from-catalog (exclude-from-catalog)
exclude-from-search (exclude-from-search)
featured (featured)
outofstock (outofstock)
rated-1 (rated-1)
rated-2 (rated-2)
rated-3 (rated-3)
rated-4 (rated-4)
rated-5 (rated-5)
WC Pages
Shop base: #4577 - /overzicht/
Cart: #2607 - /winkelmand/
Checkout: #2608 - /bestellen/
My account: #2609 - /mijn-account/
Terms and conditions: ❌ Page not set
Theme
Name: [redacted]
Version: 1.6
Author URL: [redacted]
Child Theme: ❌ – If you are making changes to WooCommerce or to a parent theme that you did not build yourself,
we recommend using a child theme. See: How to create a child theme
WooCommerce Support: ❌ Not declared
Templates
Overrides:
[redacted]/woocommerce/archive-product.php
[redacted]/woocommerce/cart/cart-shipping.php version 2.5.0 is out of date. The core version is 3.2.0
[redacted]/woocommerce/cart/cart.php version 3.0.3 is out of date. The core version is 3.1.0
[redacted]/woocommerce/cart/cross-sells.php
[redacted]/woocommerce/cart/mini-cart.php version 2.5.0 is out of date. The core version is 3.2.0
[redacted]/woocommerce/checkout/form-checkout.php
[redacted]/woocommerce/checkout/form-login.php
[redacted]/woocommerce/checkout/payment.php
[redacted]/woocommerce/checkout/review-order.php
[redacted]/woocommerce/checkout/thankyou.php version 3.0.0 is out of date. The core version is 3.2.0
[redacted]/woocommerce/content-product.php
[redacted]/woocommerce/content-product_cat.php
[redacted]/woocommerce/content-single-product.php
[redacted]/woocommerce/global/quantity-input.php version 2.5.0 is out of date. The core version is 3.2.0
[redacted]/woocommerce/loop/loop-end.php
[redacted]/woocommerce/loop/loop-start.php
[redacted]/woocommerce/single-product/add-to-cart/variable.php
[redacted]/woocommerce/single-product/price.php
[redacted]/woocommerce/single-product/related.php
[redacted]/woocommerce/single-product/review.php
[redacted]/woocommerce/single-product/sale-flash.php
[redacted]/woocommerce/single-product/share.php
[redacted]/woocommerce/single-product/short-description.php
[redacted]/woocommerce/single-product/tabs/additional-information.php
[redacted]/woocommerce/single-product/tabs/description.php
[redacted]/woocommerce/single-product/tabs/tabs.php
[redacted]/woocommerce/single-product/up-sells.php
[redacted]/woocommerce/single-product-reviews.php version 2.3.2 is out of date. The core version is 3.2.0
[redacted]/woocommerce/single-product.php
Outdated Templates: ❌ Learn how to update