[BUG] Export to s3 - CouldNotCreateChecksumException #2156
This issue doesn't appear to happen in version 3.1.7, but does occur in 3.1.8. I see there are quite a few changes between these versions, but I can't seem to narrow down what specifically is causing the issue.
Could you also try the latest available version? Currently, that's 3.1.13. Thanks!
I just tried it with version 3.1.13 and received the same error. I put the stack trace below as well in case it helps in some way.
This looks similar:
Yes, I saw that when I was looking for solutions, but that appears to apply only to imports; in this case it's happening for exports. I tried uploading an empty file to s3 first:

```php
\Storage::disk('s3')->put($path, '');
```

Then I ran the export, but I received the same error. I also tried storing the file locally and then calling the export, but it didn't get uploaded to s3; it just added data to the local file:

```php
$unsubscribes->forAccount($this->account)->store(storage_path("app/{$path}"), 's3', \Maatwebsite\Excel\Excel::CSV);
```

Queuing the export and uploading it to s3 afterwards might work, but it would need to work across multiple servers, which would defeat the purpose. Maybe I'm missing something, though, and not handling this properly?
I did some more digging and found what I believe to be the issue, but I can't figure out exactly why it's causing problems. If I change

```php
$success = $this->put($destination, $readStream);
```

to

```php
$success = $this->put($destination, $source->contents());
```

it uploads to s3 without an issue. It also worked when queuing the export. I'm sure it was built this way for a reason, so I'm not sure how exactly to fix this.
@jgodish according to the docblock of the Filesystem contract, a resource is accepted. In `FilesystemAdapter` it checks whether it's a stream and, if so, calls `putStream` on the underlying driver.
By the way, are you sure this is correct? You want s3 but pass the full local path? Shouldn't you just pass the relative path?
Sorry for the confusion on that; I think I was trying some things and pasted the wrong snippet. I can confirm it's using the path in s3:

```php
public function exportUnsubscribes()
{
    $path = 'exports/'.app()->environment()."/test-export.csv";
    $unsubscribes = new Unsubscribes($this->start_date, $this->end_date);
    $unsubscribes->forAccount($this->account)->store($path, 's3', \Maatwebsite\Excel\Excel::CSV);
}
```

Then in s3 we have the `exports/local/` folder where it should be stored. I ran it again to be sure of what I listed previously: when I change `put` to use `$source->contents()` instead of `$readStream`, it stores in s3; otherwise it won't work and gives the checksum error.
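The workaround described here can be sketched in plain PHP (illustrative names; this is not the package's actual code). The bytes are identical either way; only the type handed to `put()` differs: an open stream versus a plain string.

```php
<?php
// Sketch of the workaround: read the temporary file's full contents into a
// string before uploading, instead of handing the adapter an open stream.
$tmp = tempnam(sys_get_temp_dir(), 'excel');
file_put_contents($tmp, "id,email\n1,test@example.com\n");

// What the package passes to put() -- an open stream in "rb+" mode:
$readStream = fopen($tmp, 'rb+');

// The change that avoided the checksum error -- the contents as a string:
$contents = file_get_contents($tmp);

// Both carry the same bytes; only the argument type differs.
var_dump($contents === stream_get_contents($readStream)); // bool(true)

fclose($readStream);
unlink($tmp);
```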
Ah okay, the path is fine then. I'm not sure what's causing it; for me, s3 works fine using a remote temporary file. Perhaps it's an idea to consider our commercial support so we can prioritize looking into this for you? You can see the options here: https://laravel-excel.com/commercial-support Feel free to send us an email so we can discuss it further.
I did a bit more digging and believe I have narrowed it down further. I don't have enough experience with reading/writing files to suggest why this might be happening. In the `readStream` method of `Maatwebsite\Excel\Files\LocalTemporaryFile`, the issue seems to be related to the `+` in the `"rb+"` mode flag. If I remove the `+` so the mode is `"rb"`, it works as expected. If I change it to `"r+b"`, it also works as expected. I have read that the `b` specifies a binary file, but I have not found anything on putting the `b` before the `+`. I'm not sure if changing it to `"r+b"` would be a possible solution.
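The three modes in question behave the same for plain reads, which can be checked with a small standalone script (a sketch, not package code):

```php
<?php
// Compare the fopen() modes discussed above on a throwaway file.
// 'rb'  -> read-only, binary
// 'r+b' -> read/write, binary (the ordering shown in the PHP manual)
// 'rb+' -> accepted by PHP with the same semantics as 'r+b', but the literal
//          mode string differs, which matters to consumers that match modes
//          by string comparison rather than by semantics.
$path = tempnam(sys_get_temp_dir(), 'excel');
file_put_contents($path, "id,email\n");

foreach (['rb', 'r+b', 'rb+'] as $mode) {
    $handle = fopen($path, $mode);
    echo $mode, ' => ', fread($handle, 2), PHP_EOL; // each prints "id"
    fclose($handle);
}
unlink($path);
```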
Sorry for the late reply. There's not really any reason why the `+` would be needed there.
I finally found some time to submit a PR. I looked through things and I don't believe the `+` is needed either, because the stream should only be used for reading the file, not writing to it (I could be wrong though).
Hi @jgodish, for reference. Having said that, I'm not sure why you experience different behaviour on your end, but we are unable to reproduce it here and must therefore conclude the issue is not within the package. Feel free to contact us for commercial support and we can look into your exact situation and code to try to solve the issue there. Or, if you have more information that might help pinpoint this, feel free to open a new ticket with that information.
The same error occurs for me while uploading. Changing the order of the characters solves the problem and is in accordance with the PHP documentation; see PR #2605.
@jochemfuchs Could you check this issue again with the new PR #2605, or is it better to create a new one?
@patrickbrouwers Can you help me?
The problem here is with older versions of guzzlehttp/psr7.
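For context, older guzzlehttp/psr7 releases decided readability by looking the literal mode string up in a whitelist, and `"rb+"` was not in it, so the wrapped stream reported `readable: false` even though the underlying resource was readable. A standalone sketch of that check (the whitelist below is an approximation of those releases, not copied from the library's source):

```php
<?php
// Approximation of the readable-mode whitelist used by older guzzlehttp/psr7
// releases (assumption: reconstructed for illustration, not the real code).
$readableModes = [
    'r', 'w+', 'r+', 'x+', 'c+',
    'rb', 'w+b', 'r+b', 'x+b', 'c+b',
    'rt', 'w+t', 'r+t', 'x+t', 'c+t', 'a+',
];

foreach (['rb', 'r+b', 'rb+'] as $mode) {
    echo $mode, ' readable? ',
        in_array($mode, $readableModes, true) ? 'yes' : 'no', PHP_EOL;
}
// 'rb+' is missing from the list, so a stream opened with it was reported
// as non-readable -- matching the "readable: false" in the reporter's dump.
```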
Prerequisites
Versions
Description
Upgrading from Laravel 5.7 to 5.8, I went through the normal upgrade steps and updated all code in the framework based on the Laravel docs. When I started testing, I realized that exporting data and storing it in s3 was failing.
Steps to Reproduce
Expected behavior:
The file should be exported and stored in s3, as it was in the previous version before upgrading the framework and Laravel Excel.
Actual behavior:
The exception occurs with both the queued and the regular "store" methods.
I get the following error:
Additional Information
I have two instances of our application, so I ran this on the 5.7 version to see if something was misconfigured in my local environment, but the same export worked as expected there, so it appears to be an issue with the upgrade. The previous version of Laravel Excel was 3.1.6 (I don't believe I missed anything when upgrading to 3.1.11).
I tried changing the excel config file's exports.csv line-ending setting to "\r\n", but that didn't change the behavior.
Then I tried viewing the contents of the file by doing a dd right before the exception is thrown in SignatureV4, and I was able to read the contents fine from the dump, so it seems to be related to the Psr7\hash function. The first of the two exceptions shows "Cannot read from non-readable stream", and when I dumped $request->getBody() it shows readable as false. I'm not sure how or why this happens, but it seems related to the overall issue.
```
Stream {#25561
  -stream: stream resource @59
    timed_out: false
    blocked: true
    eof: false
    wrapper_type: "plainfile"
    stream_type: "STDIO"
    mode: "rb+"
    unread_bytes: 0
    seekable: true
    uri: "/tmp/tmp-laravel-excel/laravel-excel-J374kbEboqVHytyJLIH4MaxiIgp66cri"
    options: []
  }
  -size: 172
  -seekable: true
  -readable: false
  -writable: false
  -uri: "/tmp/tmp-laravel-excel/laravel-excel-J374kbEboqVHytyJLIH4MaxiIgp66cri"
  -customMetadata: []
}
```
I am using a custom tmp directory as well.
The aws sdk hasn't changed during this upgrade (version 3.64.5, if it helps). I then tried putting a file into s3 with the standard Laravel function and it worked fine, so I don't think it's an s3 config issue.
The export class implements `FromQuery`, `WithHeadings`, and `WithMapping`.
The class is called by doing:
I also tried setting the "visibility" option but still got the error.
I tried creating the file using the storage facade thinking it was some weird permission/access thing but the same error occurred.