Is there any plan to add compression support when putting data into Firehose?
Since Firehose charges are incurred based on the volume of data ingested, it would behoove a rational economic actor to minimize the amount of data sent to Firehose by compressing it beforehand, rather than relying on Firehose's built-in compression option, which only reduces the amount of data written to S3.
Hello. No, there isn’t. The reason is that Firehose can’t automatically decompress data streams before egress to S3, but it will automatically perform Base64 decoding, so compressed data would end up as gibberish in the target. Happy to refer you to someone at AWS who can help with overall cost optimisation if that helps. Email meyersi@amazon.com if you’d like to connect.
For the use case I was thinking about, I wouldn't actually want to decompress the data stream before egress to S3, but rather take the compressed data received in Firehose and output it to S3 in its compressed state...
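The approach described above can already be done entirely client-side today: compress each record before calling the Firehose API, leave Firehose's own compression option disabled, and the bytes land in S3 still gzipped. A minimal sketch using boto3 and gzip follows; the stream name is a placeholder, and the assumption is an S3 destination with no Firehose-side transformation or compression configured.

```python
import gzip
import json

# Hypothetical delivery stream name -- substitute your own.
STREAM_NAME = "my-delivery-stream"


def compress_record(record: dict) -> bytes:
    """Serialize a record to JSON and gzip it client-side,
    so Firehose ingress is billed on the compressed size."""
    raw = json.dumps(record).encode("utf-8")
    return gzip.compress(raw)


def put_compressed(record: dict) -> dict:
    """Send one pre-compressed record to Firehose.

    With compression disabled on the stream, Firehose writes these
    same bytes to S3, so the objects arrive gzip-compressed.
    """
    import boto3  # AWS SDK for Python; assumed installed/configured

    firehose = boto3.client("firehose")
    return firehose.put_record(
        DeliveryStreamName=STREAM_NAME,
        Record={"Data": compress_record(record)},
    )
```

Note that consumers of the S3 objects must then handle decompression themselves, and that batching many small records into one `Record` before compressing will improve the compression ratio considerably.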