Is the bug applicable and reproducible with the latest version of the package, and hasn't it been reported before?
Yes, it's still reproducible.
What version of Laravel Excel are you using?
3.1.55
What version of Laravel are you using?
10.45.1
What version of PHP are you using?
8.2
Describe your issue
When exporting 35k models, the jobs fail with a memory-exceeded exception. I am using a queued export with a chunk size of 3000. It fails on page 8, so it has processed around 24k rows at that point.
The export has only about 5 columns, so the amount of data in the file is not overly large.
It is perhaps worth noting that the execution time of the job grows with each successive job when running locally, and eventually the chain stops because it uses too much memory. I have a feeling this is caused by reopening the cached Excel file on each job (depending on the writer type, of course).
How can the issue be reproduced?
Export a file with a large number of models (35k in my case). Furthermore, I am using the following concerns:
FromQuery, ShouldQueue, WithCustomChunkSize
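For reference, a minimal sketch of an export class using these concerns might look like the following (the OrdersExport class, Order model, and column names are hypothetical, not from the original report):

```php
<?php

namespace App\Exports;

use App\Models\Order;
use Illuminate\Contracts\Queue\ShouldQueue;
use Maatwebsite\Excel\Concerns\FromQuery;
use Maatwebsite\Excel\Concerns\WithCustomChunkSize;

// Hypothetical export reproducing the reported setup:
// queued, query-based, chunk size 3000, roughly 5 columns.
class OrdersExport implements FromQuery, ShouldQueue, WithCustomChunkSize
{
    public function query()
    {
        // Around 35k rows in the reported case.
        return Order::query()->select('id', 'number', 'status', 'total', 'created_at');
    }

    public function chunkSize(): int
    {
        return 3000;
    }
}
```

Queueing it with Excel::queue(new OrdersExport, 'orders.xlsx') dispatches one chained job per chunk, which is where the growing execution time and memory use show up.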
What should be the expected behaviour?
The whole file should be exported and all chained jobs should complete.
Just wrap the export in your own queue job and put it on a long-running queue. Queue chunking is unfortunately not possible without memory increasing on each job when using xlsx.
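A minimal sketch of that workaround, assuming a dedicated long-running queue named 'exports' (the RunOrdersExport job class, queue name, and timeout are hypothetical):

```php
<?php

namespace App\Jobs;

use App\Exports\OrdersExport;
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Maatwebsite\Excel\Facades\Excel;

// Wrapper job: writes the whole export inside one long-running job
// instead of a chain of chunked append jobs.
class RunOrdersExport implements ShouldQueue
{
    use Dispatchable, Queueable;

    // Give the single job enough time to write all rows.
    public $timeout = 3600;

    public function handle(): void
    {
        // Note: for this to run synchronously inside the job, the
        // export class must not implement ShouldQueue; otherwise
        // store() would queue it in chunks again.
        Excel::store(new OrdersExport(), 'orders.xlsx');
    }
}
```

Dispatch it with RunOrdersExport::dispatch()->onQueue('exports') so it lands on the long-running queue.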