
PHP Fatal error: Uncaught ErrorException: stream_set_chunk_size() #1592

Closed
otzy opened this issue Sep 16, 2021 · 4 comments

Comments


otzy commented Sep 16, 2021

Monolog version 2.3.4

Somehow StreamHandler->streamChunkSize becomes too big:

PHP Fatal error:  Uncaught ErrorException: stream_set_chunk_size(): The chunk size cannot be larger than 2147483647 in vendor/monolog/monolog/src/Monolog/Handler/StreamHandler.php:144
Stack trace:
#0 [internal function]: Laravel\Lumen\Application->Laravel\Lumen\Concerns\{closure}(2, 'stream_set_chun...', '/var/releases/v...', 144, Array)
#1 vendor/monolog/monolog/src/Monolog/Handler/StreamHandler.php(144): stream_set_chunk_size(Resource id #641, 5583457484)
#2 vendor/monolog/monolog/src/Monolog/Handler/RotatingFileHandler.php(125): Monolog\Handler\StreamHandler->write(Array)
#3 vendor/monolog/monolog/src/Monolog/Handler/AbstractProcessingHandler.php(48): Monolog\Handler\RotatingFileHandler->write(Array)
#4 vendor/monolog/monolog/src/Monolog/Logger.php(327): Monolog\Handler\AbstractProce in vendor/monolog/monolog/src/Monolog/Handler/StreamHandler.php on line 144

Probably something is wrong with the memory-limit calculation, and the constant MAX_CHUNK_SIZE is not actually used to limit the chunk size.
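For reference, a rough back-of-the-envelope check (assuming the handler derives the chunk size as one tenth of memory_limit; this is not Monolog's code, just arithmetic consistent with the trace above):

```php
<?php
// Back-of-the-envelope check, assuming streamChunkSize = memory_limit / 10.
// 5583457484 from the trace matches one tenth of roughly 52 GiB, far above
// the 2147483647-byte maximum that stream_set_chunk_size() accepts.
$memoryLimitBytes = 52 * 1024 ** 3;        // hypothetical memory_limit of 52G
var_dump((int) ($memoryLimitBytes / 10));  // int(5583457484), as in the trace
```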

otzy added the Bug label Sep 16, 2021

robvankeilegom commented Sep 17, 2021

We're having the same problem. Our hosting provider sets memory_limit to an unreasonably high value instead of -1, which causes this error.

Edit: Can I send a PR that caps the calculated chunk size at self::MAX_CHUNK_SIZE?
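A minimal sketch of what such a cap could look like (illustrative only, not the actual PR; the local MAX_CHUNK_SIZE constant just mirrors StreamHandler's 2147483647-byte value):

```php
<?php
// Illustrative sketch of the proposed cap, not the actual patch.
const MAX_CHUNK_SIZE = 2147483647; // mirrors StreamHandler::MAX_CHUNK_SIZE

$computed  = 5583457484;                     // the value from the trace above
$chunkSize = min($computed, MAX_CHUNK_SIZE); // 2147483647, safe to pass to stream_set_chunk_size()
```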

robvankeilegom added a commit to robvankeilegom/monolog that referenced this issue Sep 17, 2021

degecko commented Sep 29, 2021

self::MAX_CHUNK_SIZE is actually defined correctly, but it is not used in that function. Instead, the value from this line is used: 70fe092#diff-a1ddc5c4ead6773b8670f9be5007cbe0239638aedbcf12166acf075a9b8742a2R59

That line was added a few days ago and calculates 10% of the value from memory_limit.

On my local setup the memory limit is 30 GB (half my RAM), and it was trying to pass 3 GB of that to stream_set_chunk_size(), which apparently has a 2 GB limit (the limit matches self::MAX_CHUNK_SIZE).

I think @Seldaek wanted to use min() on that line instead of max(). :)

For those needing a quick fix, you should set memory_limit to a maximum of 20 GB until this is resolved.

P.S. I know I'm a bit late to the party, but I think the 791f547 commit makes the matter worse instead of solving the actual problem, by simply using min instead of max.
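To put numbers on the calculation described above, a standalone sketch (iniShorthandToBytes() is a hypothetical helper for "30G"-style values, not Monolog's; the real code reads memory_limit via ini_get()):

```php
<?php
// Standalone illustration of taking 10% of a shorthand memory_limit value.
// iniShorthandToBytes() is a hypothetical helper, only for this example.
function iniShorthandToBytes(string $limit): int
{
    $units  = ['K' => 1024, 'M' => 1024 ** 2, 'G' => 1024 ** 3];
    $suffix = strtoupper(substr($limit, -1));

    return isset($units[$suffix]) ? (int) $limit * $units[$suffix] : (int) $limit;
}

$tenPercent = (int) (iniShorthandToBytes('30G') / 10); // 3221225472, above the 2147483647 cap
```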

PandyLegend commented

I also encountered this issue. Setting memory_limit to 20 GB didn't work, though; I had to set it to 18 GB. Hopefully the issue will be resolved shortly.
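That matches the arithmetic, assuming the chunk size is one tenth of memory_limit: 10% of 20 GiB lands one byte over the cap, while 10% of 18 GiB stays under it (a sketch, on 64-bit PHP):

```php
<?php
// Why 20G still trips the limit but 18G does not, assuming chunk size = memory_limit / 10.
var_dump((int) (20 * 1024 ** 3 / 10)); // int(2147483648) — one byte over 2147483647
var_dump((int) (18 * 1024 ** 3 / 10)); // int(1932735283) — under the cap
```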

Seldaek closed this as completed in 9d1fed4 on Oct 1, 2021
Seldaek added this to the 2.x milestone on Oct 1, 2021
Seldaek (Owner) commented Oct 1, 2021

Sorry about the trouble here... I didn't realize the previous max constant was a hard limit, and I definitely did not assume people would have memory_limit set to 30 GB :) Will get a release out shortly.
