Fixing bytes contd. #7

Merged
merged 2 commits into 3rd-Eden:master

2 participants

@crash2burn

Found a few more issues, but with these fixes I am finally seeing none under full load testing.

@3rd-Eden 3rd-Eden merged commit dc659f9 into 3rd-Eden:master

1 check passed: The Travis build passed.
@3rd-Eden
Owner

Merging this into my generation branch and I'll release a new version. Thanks for the patches!

@3rd-Eden 3rd-Eden added a commit that referenced this pull request
@3rd-Eden [minor] Add in the changes from #7 45df606
@crash2burn

Anytime! We are seeing some great results with this over node-memcached.

Commits on Mar 13, 2013
  1. reversing one other issue with offset

    crash2burn committed
  2. adding \r\n to bytes needed

    crash2burn committed
Showing with 2 additions and 2 deletions.
  1. +2 −2 index.js
@@ -165,7 +165,7 @@ Parser.prototype.parse = function parse(bytes) {
       , length; // Stores the total message length which is added to the
                 // cursor and subtracted from the bytesRemaining

-  for (var i = 0, l = data.length; i <= l;) {
+  for (var i = 0, l = data.length; i < l;) {
     charCode = data.charCodeAt(i);
     rn = data.indexOf('\r\n', i);

@@ -402,7 +402,7 @@ Parser.prototype.parse = function parse(bytes) {
       // Now that we know how much bytes we should expect to have all the
       // content or if we need to wait and buffer more.
       bytes = +bytes;
-      if (bytes >= bytesRemaining) {
+      if (bytes > (bytesRemaining-2)) {
         // Reset the cursor to the start of the command so the parsed data is
         // removed from the queue when we leave the loop.
         this.expecting = bytes + Buffer.byteLength(data.slice(start, i)) + 2;
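The first hunk is a classic off-by-one: with `i <= l`, the last iteration reads one position past the end of the string, where `charCodeAt` returns `NaN`. A minimal sketch of the behavior (the `scan` helper is hypothetical, not the parser itself):

```javascript
// Hypothetical stand-in for the parser's scanning loop. The buggy
// bound `i <= l` would make the final charCodeAt call read index
// data.length, which is out of range and yields NaN.
function scan(data) {
  var codes = [];
  for (var i = 0, l = data.length; i < l; i++) { // correct bound: i < l
    codes.push(data.charCodeAt(i));
  }
  return codes;
}

console.log(scan('ab'));                 // → [97, 98]
console.log(isNaN('ab'.charCodeAt(2)));  // → true (the read the fix avoids)
```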
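The second hunk accounts for the line terminator that follows a value's data block. In the memcached text protocol, a `VALUE <key> <flags> <bytes>` reply is followed by a data block of exactly `<bytes>` bytes plus a trailing `\r\n`, so the buffer must hold `bytes + 2` bytes before the value can be parsed. A sketch of the check, with a hypothetical `hasFullValue` helper (the real parser does this inline):

```javascript
// Returns whether the buffer already holds a complete data block.
// A memcached data block of <bytes> bytes is terminated by "\r\n",
// so bytes + 2 bytes must have arrived before parsing can proceed.
function hasFullValue(bytes, bytesRemaining) {
  // old check: bytes >= bytesRemaining      (forgets the trailing "\r\n")
  // new check: wait while bytes > bytesRemaining - 2
  return !(bytes > (bytesRemaining - 2));
}

// "hello" is 5 bytes; "hello\r\n" is 7 bytes on the wire.
console.log(hasFullValue(5, 7)); // → true:  data plus "\r\n" buffered
console.log(hasFullValue(5, 5)); // → false: "\r\n" not yet received
console.log(hasFullValue(5, 6)); // → false: only "\r" received
```

Without the `- 2`, the parser could consider a value complete before its terminator arrived and mis-split the stream on the next chunk.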