Using chunk in collection and export to Excel #65
Thank you! Not sure it will answer your question, but you can chunk the data and export multiple Excel files:

```php
$users = User::all();
$chunks = $users->chunk(1000);
foreach ($chunks as $id => $chunk) {
    (new FastExcel($chunk))->export("file-$id.xlsx");
}
```

(it will create one file per chunk: file-0.xlsx, file-1.xlsx, …)
I think he's asking if there's a way to chunk data while exporting a single Excel file?
Hi, thanks for the reply.
If all else fails I will have to split the chunks into different files. I like the speed of your package; I guess I have to compromise on something.
@rap2hpoutre I think maybe it's time to revisit chunking for the package? People will use this package because it's fast, and their needs will gravitate towards bigger data sets.
@markdieselcore Yes, you are right! Not sure about the implementation, though. Maybe we could follow the same pattern as for
@rap2hpoutre Hello, can big data be exported into one Excel file by chunks? I need this method.
@mengdodo this is not developed yet, but it seems important, so I will try to do something about that! |
I can contribute! Just hit me up!
I just created a draft PR for this issue: https://github.com/rap2hpoutre/fast-excel/pull/112/files The idea is to use generators, which seem to have been built for exactly this kind of need. In my (quick) tests, for about 100k lines, it only uses ~5 MB of RAM versus ~50 MB without chunks. 🎉 To make it work, you have to create a generator function using `yield`:

```php
// Generator function
function getUsersOneByOne() {
    // Build your chunks as you want (200 chunks of 10 in this example)
    for ($i = 0; $i < 200; $i++) {
        $users = DB::table('users')->skip($i * 10)->take(10)->get();
        // Yield users one by one
        foreach ($users as $user) {
            yield $user;
        }
    }
}

// Export consumes only a few MB
(new FastExcel(getUsersOneByOne()))->export('test.xlsx');
```

Minimal (shorter) example:

```php
function myGenerator() {
    $users = get_users_as_you_want();
    foreach ($users as $user) {
        yield $user;
    }
}

(new FastExcel(myGenerator()))->export('test.xlsx');
```

What do you think about this implementation @mengdodo @ArturoGasca @Elshaden @markdieselcore? Could it solve your problem?
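As a self-contained sketch of the generator pattern above (no Laravel or FastExcel required; `fetchChunk()` below is a hypothetical stand-in for a paginated DB query such as `DB::table('users')->skip(...)->take(...)->get()`), rows can be produced one chunk at a time so that only one chunk is ever held in memory:

```php
<?php
// Hypothetical stand-in for a paginated DB query: returns one page of rows.
function fetchChunk(int $offset, int $size): array {
    $rows = [];
    for ($i = $offset; $i < $offset + $size; $i++) {
        $rows[] = ['id' => $i, 'name' => "user-$i"];
    }
    return $rows;
}

// Generator that yields rows one by one; only one chunk is in memory at a time.
function usersOneByOne(int $total, int $chunkSize): Generator {
    for ($offset = 0; $offset < $total; $offset += $chunkSize) {
        foreach (fetchChunk($offset, $chunkSize) as $row) {
            yield $row;
        }
    }
}

// Consume the generator the way an exporter would: row by row.
$count = 0;
foreach (usersOneByOne(2000, 10) as $row) {
    $count++;
}
echo $count, "\n"; // 2000
```

Because the consumer pulls one row at a time, peak memory is bounded by the chunk size rather than the total row count, which matches the ~5 MB vs ~50 MB observation above.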
Another (more readable, with "real" chunks) example:

```php
function usersGenerator() {
    yield from User::chunk(200, function($users) {
        foreach($users as $user) {
            yield $user;
        }
    });
}

// Export consumes only a few MB
(new FastExcel(usersGenerator()))->export('test.xlsx');
```
Fixed, available in v1.3.0 🎉
I'm getting the error: `Can use "yield from" only with arrays and Traversables`. Can you help?
I have the same problem.
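A likely cause of that error, as a hedged aside: Laravel's `User::chunk()` returns a boolean, not a Traversable, so `yield from` cannot delegate to it, and any `yield` inside its callback builds a separate generator that is never iterated. A self-contained sketch of a pattern that does work, driving the chunking yourself (`fetchPage()` is a hypothetical stand-in for a paginated query, with a pretend total of 25 rows):

```php
<?php
// Hypothetical stand-in for a paginated query, e.g.
// DB::table('users')->skip($offset)->take($size)->get(), over 25 total rows.
function fetchPage(int $offset, int $size): array {
    $page = [];
    for ($i = $offset; $i < min($offset + $size, 25); $i++) {
        $page[] = ['id' => $i];
    }
    return $page;
}

// Working pattern: fetch pages yourself and yield each row, instead of
// delegating to chunk()'s boolean return value via `yield from`.
function usersGenerator(int $size = 10): Generator {
    $offset = 0;
    while (true) {
        $page = fetchPage($offset, $size);
        if ($page === []) {
            break; // no more rows
        }
        foreach ($page as $row) {
            yield $row;
        }
        $offset += $size;
    }
}

$ids = [];
foreach (usersGenerator() as $row) {
    $ids[] = $row['id'];
}
echo count($ids), "\n"; // 25
```

With real Laravel models, `User::cursor()` returns an iterable and can be passed to the exporter directly, which sidesteps the callback issue entirely.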
For what it's worth, the following also solves the issue. Using the
Hi,
Thanks for this package, it is extremely fast.
I have problems with memory being exhausted when exporting over 300K rows.
I need to chunk the exported data to save memory.
How can this be done?