Solve performance issues with very large chunks #42

Closed · wants to merge 1 commit

Conversation

kelunik (Member) commented May 23, 2018

Very large chunks currently have to be copied on every partial write, so a single huge chunk can be re-copied many times before it is fully written. Instead of copying almost the entire chunk on each partial write, this patch automatically splits very large chunks into multiple smaller chunks.

Fixes #41.

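For context, a minimal sketch of the partial-write pattern the description refers to ($resource and $data are illustrative placeholders, not the library's actual code): whenever fwrite() accepts only part of the buffer, the unwritten remainder has to be re-copied.

$written = @\fwrite($resource, $data); // may accept only part of $data

if ($written !== false && $written < \strlen($data)) {
    // \substr() allocates a fresh string for the remainder, so a huge chunk
    // written out in small pieces is re-copied on every partial write.
    $data = \substr($data, $written);
}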
kelunik requested a review from trowski (May 23, 2018 17:40)
The inline review comment below refers to this hunk from the patch:

// Split the oversized chunk; the final piece stays in $data for the normal write path.
$chunks = \str_split($data, self::LARGE_CHUNK_SIZE);
$data = \array_pop($chunks);

foreach ($chunks as $chunk) {
    $this->writes->push([$chunk, $written, null]);
}
I would instantiate dummy Deferred objects here instead of cluttering the code with !== null checks.
If we're splitting anyway, allocating a few more objects won't add much overhead, and it reduces code complexity.
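
A sketch of what that suggestion could look like, assuming Amp v2's Amp\Deferred API (this is the reviewer's proposal, not necessarily the code that was merged):

use Amp\Deferred;

$chunks = \str_split($data, self::LARGE_CHUNK_SIZE);
$data = \array_pop($chunks);

foreach ($chunks as $chunk) {
    // Every chunk gets a real Deferred, so downstream code can always resolve
    // the third tuple element without first checking it against null.
    $this->writes->push([$chunk, $written, new Deferred]);
}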

trowski (Member) commented Oct 22, 2018

Modified and merged.

trowski closed this Oct 22, 2018