Fix for subscribing to large messages #284

Closed
wants to merge 2 commits

Conversation

macdiesel

When using pub/sub, if you are subscribed to a channel and a large message is delivered in multiple packets, the message is handled incorrectly: only the first packet in the stream is parsed. This patch demonstrates a fix for the issue.

… swallow exceptions raised from unit tests.

Updated Unit test to test large message pub/sub.
@macdiesel
Author

I modified the subscribe test to test the handling of large messages.

@DTrejo
Contributor

DTrejo commented Sep 22, 2012

Would you mind rebasing this on latest master and submitting a test that fails, so I can prevent regressions after I take your pull request?

Thank you,
David

@DTrejo
Contributor

DTrejo commented Feb 18, 2013

@brycebaril this one looks abandoned; I suppose we should look into whether this issue is still happening. He did include a test, though GitHub chokes on it; the large message is quite large. It might be good to put it in a separate file and load it in synchronously so that it doesn't interfere with people's editors.
D

@macdiesel
Author

Unfortunately, at the time I didn't have the time to re-pull from master and write a failing test. I've handed the ball on this one to @azafty468, who will help coordinate with you on this issue.

@brycebaril
Contributor

Thanks @macdiesel -- @azafty468, will keep an eye out. There are some parser changes I'm working on, and in #361, that will at least partially conflict with this changeset.

In terms of generating large messages, I've used SETBIT/GET before to have Redis do the work for you: e.g. client.setbit("foo", Math.pow(2, 22) - 1, 1) followed by client.get("foo", ...) will generate a ~500k value.

@azafty468

Does anyone know if a specific version of the Redis server is required here? I have a small correction written by macdiesel and a modification to test.SUBSCRIBE that passes successfully, but only on newer versions of Redis (2.6.9 succeeds, while 2.4.17 and 2.4.18 both fail). The problem I'm having is that other test cases fail on this newer version (test.MULTI_1, tests.MULTI_2, tests.KEYS, tests.ENABLE_OFFLINE_QUEUE_TRUE, tests.ENABLE_OFFLINE_QUEUE_FALSE, and tests.SLOWLOG).

@brycebaril
Contributor

@azafty468 The MULTI_1, MULTI_2, and KEYS tests have a pending PR #378 that fixes them. The SLOWLOG test currently seems to only fail the first time you run it, then succeed on subsequent runs.

I have not heard of the OFFLINE_QUEUE tests failing before.

I don't know of any specific Redis version differences for this particular test. Most 2.4.x vs 2.6.x differences actually diverged at 2.5.0 -- the 2.6 dev version. You may be able to find something in the Redis changelog around then.

@DTrejo
Contributor

DTrejo commented Feb 24, 2013

The SLOWLOG test will now succeed every time you run it; tests should succeed on both 2.4 and 2.6.

Care to rebase this on latest master so I can run your code/tests? (It no longer merges cleanly.)

Whoops: I didn't notice that @azafty468 is working on it. Thanks, everyone.

Thank you,
D

@azafty468

I've forked today's version of node_redis and incorporated macdiesel's tests and mine here: https://github.com/azafty468/node_redis

@BridgeAR
Contributor

@azafty468 @macdiesel is this still an issue as of today? Would you be so kind as to check, and otherwise try to write a failing test?

@azafty468

Sorry, I rebased this thing once and have long since moved past it. No clue whether this is still an issue, but I'm well past offering any support here.

@BridgeAR
Contributor

@azafty468 That's sad, but I totally understand your frustration :/
