Report consumed amount of bytes in on_parse_complete (or limit the JSON max size) #47
Yajl doesn't actually buffer anything while parsing; it just fires callbacks with pointers into the actual stream being parsed. Once the entire length of the input buffer has been parsed, it waits for the next round of parsing (keeping track of its parsing state), but none of the actual input buffer is stored anywhere inside Yajl itself. That being said, someone could continually send data forever and parsing would never complete, though this shouldn't eat up any more RAM than normal on the server. I'm not completely opposed to the idea, but I think being a streaming parser sort of gets around these types of things by design. I've had cases of people parsing massive JSON documents from disk, so if an option like this were added, I'm not sure it should be enabled by default.
Thanks for your fast response, but I cannot understand: in my above example the attacker first sends a TCP packet containing a fragment of valid but incomplete JSON. Obviously this data must be stored somewhere within yajl-ruby or the yajl C library. Then the attacker sends a new TCP packet (or UDP, or a Unix socket...) containing more incomplete data. Please correct me if I'm wrong, but surely such a situation would leak memory on the server, am I right? On the other side, would an option to limit the accumulated size be feasible? Thanks a lot.
Oh right, I get what you're saying now, sorry :P (been a long week). I think I like the `:maxsize` solution the best, and it should be fairly easy to implement. I may not have time to work on it until I finish up another project I'm on, but will try to soon.
Thanks a lot for your very good work ;)
I'm coding a JSON-RPC server in EventMachine using yajl-ruby. It already works (similar to the one provided in the README documentation). However, there is something I cannot figure out how to achieve: avoiding DoS attacks.

That is, imagine an attacker initially sends a fragment of valid but incomplete JSON. This is processed in EM's `receive_data(data)`, in which we run `@parser << data`. After a while (using the same connection) the attacker sends more incomplete data. It's again processed and appended to the parser. The attacker sends `lalala` continually (in different TCP packets), which would cause a leak in the server.

A half-solution could be counting `data.bytesize` and closing the connection if it reaches some limit. But this is a wrong solution: in case the same TCP data contains valid JSON followed by valid but incomplete JSON, we have no way to know when to reset the data size counter.

Two possible solutions I suggest:

1. Option `:maxsize` in `Yajl::Parser`. In case the accumulated buffer in the Parser object exceeds such a limit, the parser would raise some exception such as `Yajl::MaxSizeExceeded`.
2. Report consumed bytes when calling the `on_parse_complete` method. Getting this value, we can subtract it from the above data size counter, so it would contain just the size of the incomplete data the parser holds.

Of course, the first solution seems much easier.
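The proposed `:maxsize` guard can be approximated today in user code by wrapping the parser. A minimal sketch, with hypothetical names (`MaxSizeExceeded`, `BoundedFeeder`, and the `LineParser` stand-in are illustrative, not part of yajl-ruby):

```ruby
class MaxSizeExceeded < StandardError; end

# Stand-in for Yajl::Parser so the sketch runs without the yajl gem:
# it "completes a document" at every newline. In real code you would
# wrap Yajl::Parser instead.
class LineParser
  attr_accessor :on_parse_complete
  def initialize
    @buf = +""
  end

  def <<(data)
    @buf << data
    while (line = @buf.slice!(/\A[^\n]*\n/))
      @on_parse_complete&.call(line.chomp)
    end
  end
end

class BoundedFeeder
  def initialize(parser, maxsize)
    @parser  = parser
    @maxsize = maxsize
    @pending = 0 # bytes fed since the last completed document
    # Reset the pending-byte counter whenever a document completes.
    # This is approximate: a chunk may hold a complete document plus
    # the start of the next, which is exactly why the issue's option 2
    # (reporting the exact consumed byte count) would be more precise.
    parser.on_parse_complete = lambda do |doc|
      @pending = 0
      @on_document&.call(doc)
    end
  end

  def on_document(&blk)
    @on_document = blk
  end

  def <<(data)
    @pending += data.bytesize
    raise MaxSizeExceeded, "#{@pending} bytes pending" if @pending > @maxsize
    @parser << data
  end
end
```

In an EventMachine handler, `receive_data` would feed the `BoundedFeeder` instead of the parser directly, and close the connection on `MaxSizeExceeded`.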