
How does the server part of haste compare to warp and wai #413

Open
doofin opened this issue Oct 6, 2017 · 1 comment
doofin commented Oct 6, 2017

I am new to Haste and am totally fascinated by its approach, but there don't seem to be any performance tests, and the haste package does not depend on existing server implementations like wai and warp. Does it contain its own server-side implementation? And can I use warp as a backend?

valderman (Owner) commented:

Right now, the performance is most likely pretty bad. The transport protocol is JSON, and the server uses the websockets package's built-in WS server. Even worse, a new WS connection is opened and closed for each server call, which is incredibly wasteful.
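
For illustration, that one-shot pattern looks roughly like this with the websockets package directly (a sketch with made-up port and payloads, not Haste's actual code):

```haskell
{-# LANGUAGE OverloadedStrings #-}
-- Sketch only: one JSON-encoded call per WebSocket connection,
-- mirroring the "open, call, close" pattern described above.
import qualified Network.WebSockets as WS
import qualified Data.Text as T

handler :: WS.ServerApp
handler pending = do
  conn <- WS.acceptRequest pending
  -- Read a single JSON-encoded request...
  _req <- WS.receiveData conn :: IO T.Text
  -- ...send back a JSON-encoded reply, then let the connection drop.
  WS.sendTextData conn ("{\"result\":null}" :: T.Text)

main :: IO ()
main = WS.runServer "127.0.0.1" 24601 handler
```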

Previously, the server used warp's WS implementation, used a binary encoding and reused connections whenever possible, but that was all taken out to simplify the implementation for the paper. The plan is to put all those things back in again, but I haven't gotten around to it yet.
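
Hooking the WS handler back into warp would go through wai-websockets' `websocketsOr`, roughly like this (a sketch with my own names and port, not the old Haste code):

```haskell
{-# LANGUAGE OverloadedStrings #-}
-- Sketch: warp serves normal HTTP traffic, WebSocket upgrades are routed
-- to the websockets handler, and one connection is reused for many calls.
import Control.Monad (forever)
import qualified Data.Text as T
import Network.HTTP.Types (status200)
import qualified Network.Wai as Wai
import qualified Network.Wai.Handler.Warp as Warp
import qualified Network.Wai.Handler.WebSockets as WaiWS
import qualified Network.WebSockets as WS

wsApp :: WS.ServerApp
wsApp pending = do
  conn <- WS.acceptRequest pending
  -- Keep the connection open and serve many calls over it.
  forever $ do
    msg <- WS.receiveData conn :: IO T.Text
    WS.sendTextData conn msg

httpApp :: Wai.Application
httpApp _req respond =
  respond (Wai.responseLBS status200 [("Content-Type", "text/plain")] "not a websocket request")

main :: IO ()
main = Warp.run 8080 (WaiWS.websocketsOr WS.defaultConnectionOptions wsApp httpApp)
```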

Of course, you can still use warp (or any other web server) to serve up the client parts, since Haste doesn't deal with that part at all unless you're using something like haste-standalone, which also hasn't been ported to the latest version yet.
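
For that part, warp plus wai-app-static is enough on its own, e.g. something like this (the directory name and port are placeholders):

```haskell
-- Sketch: serve the compiled client files (the JavaScript Haste produces)
-- as plain static assets with warp.
import Network.Wai.Application.Static (defaultFileServerSettings, staticApp)
import Network.Wai.Handler.Warp (run)

main :: IO ()
main = run 8000 (staticApp (defaultFileServerSettings "client-output"))
```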
