
Symfony console table helper extremely slow when fed a big data set #21568

Closed
gggeek opened this issue Feb 8, 2017 · 8 comments
Comments

@gggeek

gggeek commented Feb 8, 2017

| Q | A |
| --- | --- |
| Bug report? | no |
| Feature request? | yes |
| BC Break report? | no |
| RFC? | no |
| Symfony version | 2.7.11 |

I am using the table helper to display a list of about 350K rows.
It takes more than a couple of minutes for the following code to execute and the display to appear on screen:

```php
// $data holds the ~350K rows mentioned above
$table = $this->getHelperSet()->get('table');
$table
    ->setHeaders(array('#', 'Migration', 'Status', 'Executed on', 'Notes'))
    ->setRows($data);
$table->render($output);
```

Is there a way to optimize the output speed?
Is the case of that many rows considered out of scope?

@stof
Member

stof commented Feb 8, 2017

So many rows is probably out of scope (in practice, most shells will not let you scroll up enough to see them all, btw).

@stof stof added the Console label Feb 8, 2017
@gggeek
Author

gggeek commented Feb 8, 2017

but, but... I have my screen buffer set to 500K in PuTTY! :-D

Anyway, I'm adding a warning for the user in this specific console application when the row count goes above 10K or so; something like the sketch below.
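
A minimal sketch of such a guard, assuming a Symfony Command context; the threshold, the `$data` variable and the message wording are illustrative:

```php
// Hypothetical guard before rendering; $data holds the rows
// and the 10K threshold is arbitrary.
$maxTableRows = 10000;

if (count($data) > $maxTableRows) {
    $output->writeln(sprintf(
        '<comment>About to render %d rows; this can take several minutes. Consider writing the report to a file instead.</comment>',
        count($data)
    ));
}
```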

@javiereguiluz
Member

I'd say this is an edge case. If you generate 250K rows, it's better to write them to a file than to the command console.

@gggeek
Author

gggeek commented Feb 8, 2017

@javiereguiluz true, but the way I output it to a file is to call the console command and pipe it to a file :P

@javiereguiluz
Member

@gggeek sure ... but the idea would be to replace the console table stuff with a simple fputcsv() call. Then the user can work on the 250K rows using Excel, G Suite or any other related tool.
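
A minimal sketch of that approach, reusing the headers and `$data` rows from the snippet in the issue:

```php
// Stream the rows as CSV to stdout instead of rendering a table;
// the output can then be redirected to a file and opened in Excel etc.
$handle = fopen('php://stdout', 'w');

fputcsv($handle, array('#', 'Migration', 'Status', 'Executed on', 'Notes'));
foreach ($data as $row) {
    fputcsv($handle, $row);
}

fclose($handle);
```

This stays fast at any row count because each row is written independently, whereas the table helper has to scan every cell up front to compute column widths before it can print anything.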

@robfrawley
Contributor

If you are working with 100k+ rows, you might want to invest some time in a more appropriate language for the task. That said, "a couple of minutes" sounds awfully long.

@gggeek
Author

gggeek commented Feb 8, 2017

Just to clarify: the standard use case for this tool is 10-100 lines of output, and it is designed to present its output to the user, not to produce a report for further processing.
But the tool has been abused for a very precise and specific need, resulting in a lot of data.
Tbh I first suspected that getting the data out of the app was causing most of the lag. But when I started measuring, I found that the business logic was actually quite fast (thank god for proper indexes on db cols, php7 for computing speed and the linux buffer cache for crazy fast disk access), and that the table output was in fact adding a big chunk of the waiting.

@javiereguiluz
Member

Let's close this as "won't fix" because this is a very rare edge case. You could output the report to a file (without using Console's tables), or you could trim the results to 100 or 200 lines and add a message saying that there are 250K more rows, etc.
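
A minimal sketch of the trimming approach, reusing the snippet from the issue; the 200-row cut-off and the message wording are illustrative:

```php
// Render only the first rows and report how many were omitted.
$limit = 200;

$table = $this->getHelperSet()->get('table');
$table
    ->setHeaders(array('#', 'Migration', 'Status', 'Executed on', 'Notes'))
    ->setRows(array_slice($data, 0, $limit));
$table->render($output);

if (count($data) > $limit) {
    $output->writeln(sprintf(
        '... and %d more rows. Write the report to a file to see the full list.',
        count($data) - $limit
    ));
}
```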
