Unrecognized codec: gzip when using AvroFileReaderWriterFactory #482
It seems to me that you specified the gzip codec in your secor config.
Hi. So I don't know what to put here, because I'm trying to get BigQuery to read the files, and I don't think it supports anything other than gzip or uncompressed. And you can't set either!
Richard,
If you need to set those two params separately, you can modify the secor
code to introduce two separate params.
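A sketch of what such a split might look like in secor's properties file. Note that the second key and the split itself are hypothetical, proposed here for illustration, not existing secor options:

```properties
# Existing key, consumed by the Hadoop-based writers.
secor.compression.codec=org.apache.hadoop.io.compress.GzipCodec
# Hypothetical new key read only by the Avro writer. Avro's own codec
# names are null, deflate, snappy, bzip2, and xz -- gzip is not among them.
secor.avro.compression.codec=deflate
```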
On Mon, Nov 30, 2020 at 4:12 AM Richard Grossman wrote:
Hi,
I would like to add more on this, because I've checked it now.
The problem is that Avro doesn't support gzip compression by default, and that the Avro writer and the message writer both use the same configuration param: secor.compression.codec.
So if you put org.apache.hadoop.io.compress.GzipCodec, you get an exception because the Avro writer tries to use it and fails.
If you put null (no compression), the MessageWriter fails with an exception because it tries to instantiate a class with that value.
If you leave it empty, it fails because a value is required there.
So I don't know what to put here, because I'm trying to get BigQuery to read the files, and I don't think it supports anything other than gzip or uncompressed. And you can't do both!
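As background on why the two codecs are incompatible even though both use the DEFLATE algorithm: a gzip file is a deflate stream wrapped in a fixed magic-number header and a CRC trailer, while Avro's deflate codec stores roughly raw deflate blocks inside Avro's own container, so a gzip-expecting reader won't recognize it. A minimal JDK-only sketch (class and method names are just for illustration):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.charset.StandardCharsets;
import java.util.zip.Deflater;
import java.util.zip.GZIPOutputStream;

public class GzipVsDeflate {

    // gzip container (RFC 1952): a deflate payload plus a magic-number
    // header (0x1f 0x8b) and a CRC32/length trailer.
    static byte[] gzip(byte[] input) {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (GZIPOutputStream gz = new GZIPOutputStream(bos)) {
            gz.write(input);
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
        return bos.toByteArray();
    }

    // Raw deflate (RFC 1951, nowrap=true), which is roughly what Avro's
    // "deflate" codec stores inside the Avro file's own block framing.
    static byte[] deflate(byte[] input) {
        Deflater d = new Deflater(Deflater.DEFAULT_COMPRESSION, true);
        d.setInput(input);
        d.finish();
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        byte[] buf = new byte[4096];
        while (!d.finished()) {
            bos.write(buf, 0, d.deflate(buf));
        }
        d.end();
        return bos.toByteArray();
    }

    public static void main(String[] args) {
        byte[] data = "hello secor".getBytes(StandardCharsets.UTF_8);
        byte[] g = gzip(data);
        byte[] z = deflate(data);
        // Only the gzip output carries the 0x1f 0x8b magic bytes that a
        // gzip-expecting reader looks for; a raw deflate stream never
        // starts with that byte pair.
        System.out.println("gzip has magic:    "
                + ((g[0] & 0xff) == 0x1f && (g[1] & 0xff) == 0x8b));
        System.out.println("deflate has magic: "
                + ((z[0] & 0xff) == 0x1f && (z[1] & 0xff) == 0x8b));
    }
}
```

This is why pointing both writers at one codec name can't satisfy both sides: the Hadoop GzipCodec produces the gzip framing, while Avro only knows its own codec set.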
I get the following error.
Is it possible to turn off compression, or to set it to something that works with Avro?
We didn't specify gzip anywhere.