memory_limit doesn't correspond to the one in /etc/php.ini #1791
Comments
Update. Regarding this issue, I discovered the existence of the WP constant WP_MAX_MEMORY_LIMIT, which defaults to 256M if not defined in wp-config.php, and it was capping every setting I tried at that value. To prove this I put define('WP_MAX_MEMORY_LIMIT', '512M'); in wp-config.php, and now the crash occurs at exactly 536870912 bytes (512M), yay:
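For reference, a minimal wp-config.php sketch of the two related constants (WP_MEMORY_LIMIT governs normal requests, while WP_MAX_MEMORY_LIMIT governs admin and long-running tasks such as imports); the 512M values are just the ones used in this thread:

```php
// wp-config.php
define( 'WP_MEMORY_LIMIT', '512M' );     // limit for normal front-end requests
define( 'WP_MAX_MEMORY_LIMIT', '512M' ); // raised limit for admin/import tasks
```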
With this established, now I have to understand what the hell is going on in the importer to consume that many resources, hitting 512M for an image resize of 812K. Does wp-cli's import flush the cache from time to time, do you know? And does it free/unset unused variables?
Are you referring to a memory leak in wp-cli?
There are different SAPIs: one for your webserver and one for PHP CLI. They should have separate config files.
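A quick diagnostic sketch for checking which php.ini the CLI SAPI actually loads (`php --ini` only reports the CLI's files; the webserver SAPI's loaded file shows up in phpinfo() output instead):

```shell
# Which ini files does the CLI SAPI load?
php --ini
# Confirm which SAPI you are talking to
php -r 'echo php_sapi_name(), PHP_EOL;'
```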
I'll have to ask the sysadmin; I'm a developer with low-to-medium knowledge of Unix systems. As for the memory leak, it seems to me it should maybe free/unset unused variables that accumulate and occupy RAM, but I haven't dug into the code, so I'm not sure. By the way, with these lines in wp-config.php the import ran up to 50% of the 24413 posts/attachments: set_time_limit( 86400 ); But just now it crashed again, with a new error at the bottom:
zend_mm_heap corrupted
Try it with
Tried, but it crashed again at about 60% of the total XML, with the same max_execution_time of 60 seconds error, even though the script had been running for 3 hours straight because the server's MySQL was a bit busy with the other hosted sites. This makes no sense! :(
At this point I don't know; maybe I'll try to split/export in chunks and import smaller XMLs.
Well, I finally got it working; the import reached 100%. But, as I wrote earlier, I had to make a small edit to a WordPress core file to remove a set_time_limit call, in /wp-includes/functions.php around line 573:

```php
function wp_get_http( $url, $file_path = false, $red = 1 ) {
	// @set_time_limit( 60 ); <-- commented out this
	if ( $red > 5 )
		return false;
```

So this hack, plus both of the wp-config.php settings above, removed the fatal errors. The only small problem now is that I didn't see the final "Import complete, have fun" message; I closed it manually. Maybe it took a long time to do the final calculations, but with no output feedback it was hard to understand what was going on. That's a problem with the default importer, though, not wp-cli.
I discovered this the hard way too: https://core.trac.wordpress.org/ticket/32075
Why is the download taking longer than 60 seconds?
The big memory suck is WordPress's internal object cache. The WordPress importer, which WP-CLI wraps, actually disables cache flushing during import. I thought we had some utility to periodically flush the cache, but it appears we don't. Related: #1704 (comment). Ultimately, the problem is that the WordPress importer is long overdue for a rewrite.
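A hypothetical sketch of what such a periodic-flush utility could look like (this is not the actual patch from this thread; wp_cache_flush() is the core WordPress function, but the helper name and interval here are made up for illustration):

```php
// Hypothetical helper: flush the WordPress object cache every
// $interval imported posts so the cache cannot grow unbounded.
function my_periodic_cache_flush( $imported_count, $interval = 500 ) {
	if ( $imported_count > 0 && 0 === $imported_count % $interval ) {
		wp_cache_flush();
	}
}
```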
I've added the utility if you'd like to try it out: #1794
Sorry, bad question. I'll continue on #1792
Thanks for the utility/patch for the memory. I don't know if I'll manage to test it soon; the "official" import after the test is already running. I'm new to the phar thing: if I update wp-cli with the auto-updater or re-download it, will the patch already be available?
No worries. I've landed it — I think it was just an oversight it wasn't included before.
Not until 0.19.0 is released, which I hope will be soon ;)
Hello, I'm using wp-cli to import an XML consisting of posts and their attachments.
```
wp import guides.xml --allow-root --user=1 --authors=create
```
I got this on big images:
Here it refers to a limit of 256M, but that value is nowhere to be found: /etc/php.ini is the config loaded for PHP, and I set the limit to 512M there. Then I also tried defining WP_MEMORY_LIMIT as 512M in wp-config.php, but nothing.
To get past this error I had to add --skip=image_resize, and it worked, but I'd still like to know why that was happening; that 256M seemed to be unchangeable.
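For anyone landing here, the full workaround command from this thread (the original invocation plus the flag that avoided the resize crash):

```shell
wp import guides.xml --allow-root --user=1 --authors=create --skip=image_resize
```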
Thank you in advance for any help and suggestion.