
[BUG] Uploading big excel file #1394

Closed
Silwady opened this issue Oct 14, 2017 · 9 comments

Comments

@Silwady

Silwady commented Oct 14, 2017

Package version, Laravel version

Laravel 5.5
Maatwebsite/Laravel-Excel 2.1

Expected behaviour

The project works fine on small files (less than 1.5MB). It should also handle big files of 20-30MB.

Actual behaviour

It shows this error message when uploading Excel files bigger than 1.5MB. My sheet has about 22 columns and at least 10K rows.

Error as shown in Whoops:

[Screenshot: Maatwebsite Excel error log]

PHP info:

[Screenshot: PHP 7 info page]

I used this code (which works with small files):

    $data = Excel::load($path, function($reader) {})->get();

What should I do?

@mradham

mradham commented Oct 14, 2017

Greetings Silwady,
Can you paste the output of either the PHP error log or laravel.log?

@mradham

mradham commented Oct 14, 2017

Sorry, I just saw the error message in your screenshot. Try increasing the memory limit to either 512MB or even 1GB; Laravel is known to consume a good amount of memory.
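
For example, something like this before loading the workbook (a sketch; the right value depends on your server, and note that upload_max_filesize and post_max_size can only be raised in php.ini or the server config, not at runtime):

    // Raise the memory limit for the current request only (value is an example)
    ini_set('memory_limit', '1G');

    // Or set it globally in php.ini:
    //   memory_limit = 1G
    // For 20-30MB uploads you may also need to raise (php.ini only):
    //   upload_max_filesize = 64M
    //   post_max_size = 64M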

@Silwady
Author

Silwady commented Oct 14, 2017

I tried to use the chunk filter, but $data was returned as an empty array before the foreach($results as $row) loop inside the closure ever ran.

    $data = [];
    Excel::filter('chunk')->load($path)->chunk(1000, function($results) use(&$data)
    {
        foreach($results as $row)
        {
            $data[] = $row;
        }
    });
    return $data;

What I need now is to return the $data list. How can I return $data after the loop inside the closure has finished?

@stephanecoinon

stephanecoinon commented Oct 14, 2017

Pass false as the third parameter to chunk() to disable queueing:

    $data = [];
    Excel::filter('chunk')->load($path)->chunk(1000, function ($results) use (&$data) {
        foreach ($results as $row) {
            $data[] = $row;
        }
    }, $shouldQueue = false);
    return $data;

See Import > Queued Chunks in the docs for details.

@Silwady
Author

Silwady commented Oct 15, 2017

Hi @stephanecoinon

It works :) It reads the first 1000 rows, but unfortunately it stops right after that, so it only reads 1000 rows. What do you suggest?

EDIT:

The file analysis is called from within a custom queue I created. If I use the $shouldQueue = false parameter, it stops all the queues in my application, which breaks it. Can I disable only the package's queue (not mine)?

@stephanecoinon

OK, so using a local variable to store the file contents won't work in that case. Maybe store a collection in the container, like so:

    // Setup a collection in the container
    app()->instance('analysisResults', collect());

    // Read the workbook in chunks
    Excel::filter('chunk')->load($path)->chunk(1000, function ($rows) {
        // Merge the current chunk of rows into the collection
        app()->instance('analysisResults', app('analysisResults')->merge($rows));
    });

    // Dump all the rows
    dd(app('analysisResults'));

@FBnil

FBnil commented Nov 15, 2017

@stephanecoinon I suspect it has to do with the cell data; calling $row->toArray() worked for me:

    $data = [];
    Excel::filter('chunk')->load($path)->chunk(1000, function ($results) use (&$data) {
        foreach ($results as $row) {
            $data[] = $row->toArray();
        }
    }, $shouldQueue = false);
    return $data;

I also found a sweet spot with a chunk size of 400 instead of 1000, but YMMV.

@manoelsouzaunicef

Hello guys,
I'm having the same problem with Excel. I have an .xls file with more than 65,000 records. Can I use the chunk filter? Is that the correct approach? Is there another way to do this?

@vipin-chand

I am facing the same issue: I have 30 records and the file size is 20 MB, and I'm using chunk as well.
