Streaming _json #413

Closed · Gunni opened this issue Feb 25, 2020 · 11 comments

Gunni commented Feb 25, 2020

Has anyone considered streaming json to the client?

In the _json function you're calling $json = ($encode) ? json_encode($data, $option) : $data; and then sending a response with the result as part of the payload.

That uses a lot of memory: $data itself might be huge, and $json is then roughly the same size again on top of it, plus JSON encoding overhead.

What about doing this instead: send the response headers immediately without a Content-Length, then loop over the data to be encoded, converting it to JSON until a buffer fills up, send the buffer to the client, and repeat until the data is exhausted.
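
Roughly, a minimal sketch of that idea in plain PHP (not Flight's actual API; the fetchRows() generator below is a hypothetical data source):

// Sketch only: stream a JSON array row by row so that only one row plus a
// small output buffer is ever held in memory.
header('Content-Type: application/json');
// No Content-Length header; the server falls back to chunked transfer encoding.

echo '[';
$first = true;

foreach (fetchRows() as $row) { // fetchRows() = hypothetical generator yielding arrays
    if (!$first) {
        echo ',';
    }
    $first = false;

    echo json_encode($row);

    // Push whatever has accumulated so far to the client.
    if (ob_get_level() > 0) {
        ob_flush();
    }
    flush();
}

echo ']';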

mfrederico commented

I have tried this: using header('Connection: close') in plain PHP works great. However, for some reason Flight doesn't allow the Connection header to be modified. I'm inclined to blame my webserver, but it cooperates fine outside of Flight.
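
For reference, the plain-PHP pattern being described is presumably something like the following sketch (an assumption on my part; under PHP-FPM, fastcgi_finish_request() is the more reliable way to hand the response back early):

ignore_user_abort(true);   // keep running even if the client disconnects
ob_start();

echo json_encode(['status' => 'accepted']);

header('Connection: close');
header('Content-Length: ' . ob_get_length());

ob_end_flush();            // flush PHP's output buffer
flush();                   // flush the SAPI / webserver buffer

// ... long-running work can continue here after the client has its response ...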

mfrederico commented Jun 12, 2020

Putting this in a route works:

// Tell nginx (and similar proxies) not to buffer this response
header('X-Accel-Buffering: no');
header('Content-Type: text/html; charset=utf-8');

echo 'Begin ...<br />';

for ($i = 0; $i < 10; $i++) {
    echo $i . '<br />';

    // Flush PHP's output buffer first, then the SAPI buffer,
    // so each line reaches the client immediately.
    if (ob_get_level() > 0) {
        ob_flush();
    }
    flush();

    sleep(1);
}

echo 'End ...<br />';

Gunni commented Jul 29, 2020

Hey,

I just fought this issue myself and got it working at last, so here's some pseudo-code:

// Ensure headers are sent before streaming data to the client
$response = Flight::response();
$response->header('Content-Type', 'application/json');
// Ensures that $sent is set to true, so that _stop does not mess with our buffer
$response->send();

// Start building the output JSON
echo '{"something":[';

$cnt = 0;

while (($port = $ports_db->fetch_assoc())) {
    if ($cnt > 0) {
        echo ',';
    }

    $cnt++;
    echo json_encode($port);

    // Push each encoded row to the client as it is produced
    ob_flush();
    flush();
}

echo ']}';

I had a "fun time" debugging why the last two or so objects kept going missing from my output, or why my buffer got replaced by the last echo.

It turns out _stop does some shenanigans with the buffer, so to prevent that, get the response object immediately, call send(), and then do your streaming code.

Good luck to anyone trying this!

n0nag0n commented Jan 3, 2024

Thanks for your help with this! I'll close the issue for now.

n0nag0n closed this as completed Jan 3, 2024

Gunni commented Jan 4, 2024

I'd argue that my hack is a workaround, not a proper solution. This issue was requesting that a proper way be added to the framework.

n0nag0n commented Jan 4, 2024

I agree with you, but I see this as the minority use case, not the majority. In time we'll have other Flight packages, and we'll probably end up with something called JsonStream (or similar) that is not part of the core but allows for functionality like this. I have to do something similar for work, since json_encode()ing thousands of objects runs out of memory in our case.
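
For what it's worth, a sketch of what such a non-core helper could look like (the JsonStream name and API here are purely hypothetical, borrowed from the comment above):

// Hypothetical helper, not part of Flight core: streams an iterable of
// JSON-serializable items as a single JSON array without building the whole
// string in memory.
class JsonStream
{
    public static function send(iterable $items, int $options = 0): void
    {
        header('Content-Type: application/json');

        echo '[';
        $first = true;

        foreach ($items as $item) {
            echo ($first ? '' : ',') . json_encode($item, $options);
            $first = false;

            if (ob_get_level() > 0) {
                ob_flush();
            }
            flush();
        }

        echo ']';
    }
}

// Usage with a generator so rows are fetched lazily, e.g.:
// JsonStream::send($db->lazyRows());   // lazyRows() is a hypothetical generator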

Gunni commented Jan 4, 2024

So then just add it as a feature. Many people are probably hitting this issue, just like you and me; only a tiny percentage of people check the issues, and far fewer report them.

Gunni commented Jan 5, 2024

I'd even go as far as suggesting it be made the default: stream the object given to the client as the default behavior.
AFAIK all HTTP clients handle this fine; some just wait until the response is complete, others process it as a stream, but both handle either method transparently.

n0nag0n commented Jan 5, 2024

I'll reopen this so we can put a pin in it for later.

n0nag0n reopened this Jan 5, 2024
n0nag0n self-assigned this Jan 13, 2024

n0nag0n commented Jan 20, 2024

In doing some research, it looks like we'd have to make some modifications to the following:

Engine.php
We'd have to modify the filter for start so that it either doesn't run or handles things differently.

Then in the _stop() method, we'll likely need to do an ob_get_flush() instead, if some flag is set to stream rather than capture the output buffer.

I think the other comments on this thread will really help get this pointed in the right direction.

Also, I have no idea why it dislikes the Connection header.
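
To illustrate the _stop() branch described above (a sketch only, assuming a hypothetical $streamed flag; this is not the actual Engine.php code):

// Hypothetical fragment of Engine.php, for illustration only.
protected function _stop(?int $code = null): void
{
    $response = $this->response();

    if ($code !== null) {
        $response->status($code);
    }

    if ($this->streamed) {
        // Streaming mode: push whatever is buffered straight to the client
        // instead of capturing it into the response body.
        if (ob_get_length() > 0) {
            ob_get_flush();
        }
    } else {
        // Default mode: capture the buffer so Response::send() can emit it
        // along with the headers.
        $response->write(ob_get_clean());
    }

    $response->send();
}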

n0nag0n commented Feb 20, 2024

Finally got this in place: #547. Feedback welcome!

n0nag0n closed this as completed Feb 20, 2024