Question: stream from an S3 obj, transform the stream, then stream to another S3 obj #343
Thanks for answering so promptly, Alex. Yes, I gathered as much from the docs, but it seems that in order to create a `ChunkedBody` for the `RqBody` from a `Source`, I need to know the stream length ahead of time (in order to populate the `_chunkSize` and `_chunkedLength` fields in `ChunkedBody`). I would like to avoid streaming twice: once just to count the bytes, and a second time to actually stream (and potentially transform the stream along the way) into another S3 object.

On the other hand, I just realized that I can do a simple HEAD request (https://hackage.haskell.org/package/amazonka-s3-1.4.5/docs/Network-AWS-S3-HeadObject.html#v:horsContentLength) to find out the size of an S3 object, so that may solve the problem of figuring out `_chunkSize` and `_chunkedLength`.

I was hoping there is an example of what I want to do somewhere. It seems like a pretty useful thing to be able to stream from one S3 object to another in constant space (say, in AWS Lambda) while transforming the stream along the way.
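For what it's worth, here is roughly what I have in mind, as an untested sketch against the amazonka-s3 1.4.x API linked above. The bucket/key names are placeholders, and the exact spellings of the lenses and constructors (`horsContentLength`, `gorsBody`, `ChunkedBody`, `defaultChunkSize`) are assumptions from that era of the library, so treat this as pseudocode rather than a working program:

```haskell
{-# LANGUAGE OverloadedStrings #-}

import           Control.Lens            ((^.))
import           Control.Monad.Trans.AWS (runAWST, runResourceT, send)
import           Data.Conduit            (unwrapResumable)
import           Network.AWS             (Credentials (Discover), Region (..),
                                          newEnv)
import           Network.AWS.Data.Body   (ChunkedBody (..), RqBody (Chunked),
                                          RsBody (..), defaultChunkSize)
import           Network.AWS.S3

-- Copy src to dst in constant space: HEAD for the length, GET for the
-- stream, PUT with a ChunkedBody built from the two.
streamCopy :: BucketName -> ObjectKey -> ObjectKey -> IO ()
streamCopy bkt src dst = do
  env <- newEnv Oregon Discover
  runResourceT . runAWST env $ do
    -- 1. HeadObject tells us the length up front, so we never buffer.
    hrs <- send (headObject bkt src)
    len <- maybe (fail "object has no Content-Length") pure
                 (hrs ^. horsContentLength)
    -- 2. GetObject yields a streaming response body (a ResumableSource);
    --    unwrap it to a plain Source so it can feed a ChunkedBody. A
    --    transforming conduit could be fused in here with $= / .|.
    grs <- send (getObject bkt src)
    let RsBody rsrc = grs ^. gorsBody
    (source, _finalizer) <- unwrapResumable rsrc
    -- 3. PutObject consumes the chunked body in constant space.
    let body = Chunked (ChunkedBody defaultChunkSize (fromIntegral len) source)
    _ <- send (putObject bkt dst body)
    pure ()
```

One caveat I can already see: the HEAD trick only works if the transform is length-preserving, since `_chunkedLength` has to match the number of bytes the source actually produces; a transform that changes the byte count would still need some other way of knowing the output length in advance.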
On Dec 17, 2016, at 9:10 PM, Alex Mason ***@***.***> wrote:

> What problems are you running into? RsBody provides a ResumableBody and RqBody can take a Producer, and it should be possible to compose those together.