Memory problems running finfo::buffer with PHP_CLI on AWS, large files #522
Comments
It turns out that truncating at 1024 bytes may cause pptx files to be detected as zip. We found at least one such case. Experimentally, truncating at 10000 bytes caused the file to be correctly detected as "application/vnd.openxmlformats-officedocument.presentationml.presentation" again.
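A minimal sketch of the kind of check this describes, assuming a local sample file (the name presentation.pptx is hypothetical); only the length of the prefix passed to finfo::buffer varies:

```php
<?php
// Rough check of the prefix-length sensitivity described above. With too
// short a prefix, finfo tends to report the generic zip type instead of the
// OOXML presentation type.

$content  = file_get_contents('presentation.pptx'); // hypothetical sample file
$fileInfo = new finfo(FILEINFO_MIME_TYPE);

foreach ([1024, 10000] as $length) {
    echo $length, ' => ', $fileInfo->buffer(substr($content, 0, $length)), PHP_EOL;
}
```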
Have you dealt with the problem of large files in Gaufrette?
Turns out this is indeed an AWS-specific issue: increasing the PHP memory limit increases the file size you can handle. However, increasing the memory limit is something I usually avoid, so we found a workaround. If you open a stream instead of reading the content, and then use $fileInfo->file(stream_get_meta_data($content)['uri']) instead, it works and yields the same results. All the Gaufrette adapters that we used can handle the stream and will proxy it to $fileInfo->file(stream_get_meta_data($content)['uri']) instead of $fileInfo->buffer($content), which fixes this issue entirely for our use case. I am still not a fan of code breaking because of server quirks, so I will follow this up further with AWS support. The AWS support agent acknowledged the problem and redirected the issue to people who are more capable of resolving it. I will post an update as soon as I get a response.
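A minimal sketch of the stream-based workaround described in this comment, assuming the storage adapter hands back a read stream (a plain fopen() stands in for that here):

```php
<?php
// Open a stream to the file instead of loading its content into a string.
$stream = fopen('filename.m4v', 'rb');

$fileInfo = new finfo(FILEINFO_MIME_TYPE);

// stream_get_meta_data() exposes the URI behind the stream, so finfo can
// inspect the file by path rather than holding the whole body in memory.
$mimeType = $fileInfo->file(stream_get_meta_data($stream)['uri']);
fclose($stream);

echo $mimeType, PHP_EOL;
```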
@boraneksen have you received any further info from AWS support regarding this issue?
I also ran into this issue today. Have you heard anything from AWS, @boraneksen?
On further investigation, the issue does not seem to be restricted to CLI or AWS: thephpleague/flysystem#1172
While moving large files from Amazon AWS to S3 using a CakePHP shell, the burzum/cakephp-file-storage plugin, and knplabs/Gaufrette, we ran into memory problems. The problems appear to be specific to AWS, to working with the command-line PHP interpreter, and to using finfo::buffer on large files.
We get the following messages:
We were able to reproduce the first warning, which we think is at the core of this issue, with the following PHP script:
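The script itself is not included in this copy of the issue; a minimal sketch of the kind of reproduction described (read the whole file into a string and pass it to finfo::buffer) would look roughly like this:

```php
<?php
// Load the entire file into a PHP string, then ask finfo for its MIME type.
$content = file_get_contents('filename.m4v'); // the ~311M test file

$fileInfo = new finfo(FILEINFO_MIME_TYPE);
var_dump($fileInfo->buffer($content));
```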
Serving this script through Apache/PHP-FPM doesn't cause any problems, and neither does running it with PHP-CLI on other systems. But running it with PHP-CLI on AWS yields the same warning. The filename.m4v file is 311M. We (temporarily) configured the PHP-CLI memory_limit to 3072M. On another (non-AWS) server with a memory_limit of 1024M we do not see this issue. We suspect it has something to do with the way the AWS filesystem or memory management is set up. Note that, according to the above warning, PHP tried to allocate 2GB over the allowed 3GB to parse a 311MB file.
We managed to resolve this issue by cutting the input string short to 1024 characters. It appears that finfo::buffer continues to work for most, if not all, files even when they are truncated this way?
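A sketch of that truncation workaround, assuming $content already holds the file body (file_get_contents stands in for the storage adapter's read); only the first 1024 bytes are handed to finfo::buffer. See the comment above about pptx files needing a larger prefix.

```php
<?php
$content = file_get_contents('filename.m4v'); // stands in for the adapter's read()

$fileInfo = new finfo(FILEINFO_MIME_TYPE);

// Only the leading bytes are passed to finfo, avoiding the huge allocation.
var_dump($fileInfo->buffer(substr($content, 0, 1024)));
```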