Support for HTTP streaming #57

Closed
Rajan opened this issue Jan 6, 2016 · 4 comments

Comments

@Rajan

Rajan commented Jan 6, 2016

Does Deepstream support HTTP streaming beyond socket.io?

I came across an online service, appbase.io, which supports it, but response times are ~1 sec.

@WolframHempel
Member

Deepstream doesn't support HTTP streaming. In browsers that can't establish a WebSocket connection it falls back to XHR long-polling; for everything else it uses a low-level TCP connection, which is orders of magnitude faster.

We had hoped that between the two we'd have every scenario covered. Do you have a particular use case in mind where HTTP streaming is preferable?

@Rajan
Author

Rajan commented Jan 6, 2016

I'd have liked it to try HTTP streaming before falling back to XHR polling, since HTTP streaming is supported on most mobile devices and is faster than polling.

I have no problem with it falling back to a low-level TCP connection, as long as it works without any extra coding/handling.

@WolframHempel
Member

I completely understand your point, but as deepstream sends many very small messages, keeping the overhead associated with each message as small as possible was one of the crucial aspects of its design. With any kind of HTTP (even chunk-encoded streaming approaches), the message's metadata tends to be several times the size of the actual message.

This is why we use TCP as the preferred transport. It works beautifully and is used by most deployments that run the Node.js client. The only consideration to keep in mind is that a single TCP packet can contain multiple messages and a message can be split across multiple TCP packets (something that quickly shows up under load). Hence it's crucial for the client to buffer incoming data and only process a message once the message-end character (ASCII 30) is received. (See https://github.com/hoxton-one/deepstream.io-client-js/blob/master/src/tcp/tcp-connection.js#L151 for the implementation.)
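As a rough illustration of that buffering approach, here is a minimal sketch in Node.js. It is not the actual client code; the class name, callback, host, and port are illustrative assumptions. It simply accumulates chunks from the socket and splits complete messages off at the ASCII 30 separator, keeping any trailing partial message for the next chunk.

```js
const net = require('net');

// ASCII 30 (record separator) is used as the message delimiter, as
// described above.
const MESSAGE_SEPARATOR = String.fromCharCode(30);

class MessageBuffer {
  constructor(onMessage) {
    this._buffer = '';
    this._onMessage = onMessage;
  }

  // Called for every chunk received on the socket. A chunk may contain
  // several complete messages, or only part of one.
  add(chunk) {
    this._buffer += chunk;
    const parts = this._buffer.split(MESSAGE_SEPARATOR);
    // The last element is either '' (the buffer ended exactly on a
    // separator) or an incomplete message that waits for the next chunk.
    this._buffer = parts.pop();
    parts.forEach((msg) => this._onMessage(msg));
  }
}

// Usage sketch: host and port here are placeholders, not documented defaults.
const socket = net.connect({ host: 'localhost', port: 6021 });
socket.setEncoding('utf8');

const buffer = new MessageBuffer((msg) => {
  console.log('complete message:', msg);
});

socket.on('data', (chunk) => buffer.add(chunk));
```

The same idea applies to any stream-oriented transport: TCP guarantees byte order, not message boundaries, so framing has to be handled by the application.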

@WolframHempel
Member

@Rajan, could you let me know if you're happy for me to close this issue?

Rajan closed this as completed Jan 7, 2016