The nf9/ipfix protocols have many fields of type 'string'. The code today just takes the string as it appears in the packet and puts it in the JSON output.
There are various special characters that cannot be written to JSON as-is, for example double-quote. These need to be properly escaped.
One possible solution is to keep the existing manual marshaling code, but change this
case string:
b.WriteByte('"')
b.WriteString(m.DataSets[i][j].Value.(string))
b.WriteByte('"')
to use Go's built-in json.Marshal function, like so:
case string:
    asJson, _ := json.Marshal(m.DataSets[i][j].Value.(string))
b.Write(asJson)
My measurements show that this has minimal impact on performance.
Another option would be to properly encode special characters manually.
As an example of an IPFIX record with special characters, see the attached capture. The ApplicationName and ApplicationDesc elements have a fixed size, so the exporter padded the end of each string with nulls, which must not be placed directly into the JSON.
This particular case could be solved by trimming the string, but other cases, where the string contains other special characters, obviously won't be handled by trimming.
ipfix_null_padding_of_string.zip