Description
I am using v0.12.1 and am seeing very large strings cause the tokener code to hang. I'm not sure exactly what the size cutoff is that triggers the issue, because I don't control the code sending the messages; I have seen packets 300-500 bytes/characters long, and the string I'm working on now is 1298 bytes/characters.
Is there a max length that is supported by json_c?
Here is a snippet of my code, showing where things hang:
json_tokener* tok;
tok = json_tokener_new();
jsonObj = json_tokener_parse_ex(tok, i_strToParse, i_length); /* hangs here on the 1298-byte string */
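For reference, here is a self-contained sketch of the same pattern with error checking added; the hard-coded string is only a placeholder for the real network data, and the error handling is just what I would expect based on the json_tokener API:

#include <stdio.h>
#include <string.h>
#include <json-c/json.h>

int main(void)
{
    /* Placeholder input; in my application this comes from the network. */
    const char *i_strToParse = "{\"name\": \"value\", \"count\": 3}";
    int i_length = (int)strlen(i_strToParse);

    struct json_tokener *tok = json_tokener_new();
    struct json_object *jsonObj = json_tokener_parse_ex(tok, i_strToParse, i_length);

    if (jsonObj == NULL)
    {
        enum json_tokener_error jerr = json_tokener_get_error(tok);
        if (jerr == json_tokener_continue)
            fprintf(stderr, "input looks like an incomplete JSON document\n");
        else
            fprintf(stderr, "parse error: %s\n", json_tokener_error_desc(jerr));
    }
    else
    {
        printf("parsed OK: %s\n", json_object_to_json_string(jsonObj));
        json_object_put(jsonObj); /* release our reference */
    }

    json_tokener_free(tok);
    return 0;
}

With the smaller strings this behaves as expected; with the 1298-byte string the call to json_tokener_parse_ex never returns.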
Initially I had this, which worked for the smaller strings.
jsonObj = json_tokener_parse( i_strToParse );
I even tried calling json_tokener_new_ex() and passing a specific depth, thinking that it might be allocating too much space, but that made no difference.
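Roughly what I tried (the depth value below is just an example, not necessarily the exact number I used):

struct json_tokener *tok = json_tokener_new_ex(64); /* explicit nesting-depth limit instead of the default */
jsonObj = json_tokener_parse_ex(tok, i_strToParse, i_length);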
I have to be doing something wrong, but I'm not sure what. The above code works for all of the other strings that I am trying to parse. Any help/guidance is appreciated.