IP address fields in debug output are packed and therefore are packed when retrieved by the Kafka consumer #8
It is technically debug output and not meant to be pushed directly to Logstash for analysis.
@lspgn how would you decode it? I serialized the protobuf generated by goflow to JSON in a PHP script, and when I take the src IP it is not a readable address.
The protobuf JSON serializer encodes bytes fields directly as Base64. When you convert it back, decode the Base64 string and interpret the resulting raw bytes as an IP address.
Yup, got it. Thanks!
Updated the debug message. It now converts the bytes into a readable string.
I'm having a problem where the IP address values added to Elasticsearch via Logstash are packed. I spent quite a while trying to figure out why, and then realized that even the debug output of goflow has them packed. I think Logstash is doing exactly what it has been told to do, and the issue might be with goflow.
Debug output from `./goflow-v2.0.4-linux-x86_64 -loglevel debug -kafka=false`
Incidentally, the issue is the same when using v1.1.0 and compiling flow.proto from its source, but that version's debug output does not contain enough information to show the same problem.