Adding to core via development outside of core #10
To argue against myself briefly: one thing we have to be careful about is scope creep. For example, I think it's totally valid to say that any project that includes support for HTTP/1 should also include support for HTTP/2 and web sockets. But I don't think we should include support for XML parsing (or worse, HTML parsing). Where do we draw the line? Maybe it's just on a case-by-case basis, with the TC examining proposed future additions and saying "HTTP/2: yes. XML: no." Maybe we set up some guidelines that the TC applies consistently?
I think the question is "what are the primitives necessary to define compatibility for a successful platform?" Having an HTTP primitive in core allowed framework and module authors to standardize on an API they could build on top of. We may need the same for HTTP/2. I don't think XML is on the list, because we don't see a diverse ecosystem being built on top of competing XML primitives. If you need to write something that deals with XML, you pick a parser and export a good API; the XML library you build on top of, and its API, don't persist through the API that you write. The case with HTTP (and, we assume, HTTP/2) is that frameworks mark up and extend the primitives but pass them along, so that the entire ecosystem built for HTTP can remain compatible.
My take on this has been that core should provide whatever modules the entire userland ecosystem must agree on; that core is a (the?) pressure-release valve for transitive dependencies. With XML, it's usually a pretty quick step between source data and POJOs, but with HTTP/2, multiple middleware packages pass request objects through and amongst themselves. Agreeing globally on what a request object is has high value for that sort of package, and it's something that is hard to impossible to achieve in userland today.
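To make the "shared request object" point concrete, here is a minimal sketch (the middleware functions are hypothetical, not real packages) of how independent pieces of code interoperate only because they all agree on core's http request and response objects:

```js
// A minimal sketch (hypothetical middleware, not real packages) of why a
// shared core request object matters: independent pieces of code can
// annotate the request and pass it along only because they all agree on
// core's http.IncomingMessage / http.ServerResponse shape.
var http = require('http');

function parseCookies(req, res, next) {
  // Hypothetical annotation that downstream code can rely on.
  req.cookies = (req.headers.cookie || '').split('; ');
  next();
}

function logRequest(req, res, next) {
  // A different "package" reading the same core request object.
  console.log(req.method, req.url, req.cookies.length + ' cookie(s)');
  next();
}

http.createServer(function (req, res) {
  parseCookies(req, res, function () {
    logRequest(req, res, function () {
      res.end('ok\n');
    });
  });
}).listen(8000);
```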
I see this "minimal core" sentiment a lot too, and agree that it isn't productive. To pile onto this: it's an alluring idea, especially coming from the userland ethos. However, the reality of core is that all of the modules are deeply interdependent on one another. Ripping everything out is nearly impossible while maintaining backwards compatibility; and even if we somehow got to that promised land, it would be paralyzing to lock-step version the resulting "independent" modules. This is cause to be careful when adding new modules to core, but not to prevent growth of core in general: sometimes peer dependencies are necessary.

That said, with regard to web sockets, I'm not entirely convinced they need to go into core at present. My gut feeling is that because they're not likely to be passed between packages, there's less pressure to provide a core API for them; we just need to make sure that the abstraction can be built on top of what exists in a clean, fast way. That's a general sentiment I hold, too: we don't need a core API for every subfeature of an API we support in core, just to make sure that it's possible for userland to build support for the subfeature.
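As a sketch of what "build it on top of what exists" looks like for web sockets: a userland library can already hook the HTTP upgrade handshake using nothing beyond today's core APIs. This shows only the RFC 6455 handshake; framing, validation, and close handling would be the library's job.

```js
// A sketch of the "build it on top of what exists" point: a userland
// web socket library can hook the HTTP upgrade handshake with nothing
// beyond today's core APIs. This is only the RFC 6455 handshake; framing,
// validation, and close handling are the userland library's job.
var http = require('http');
var crypto = require('crypto');

var server = http.createServer(function (req, res) {
  res.end('hello\n');
});

server.on('upgrade', function (req, socket) {
  // Derive the Sec-WebSocket-Accept value from the client's key (RFC 6455).
  var key = req.headers['sec-websocket-key'];
  var accept = crypto.createHash('sha1')
    .update(key + '258EAFA5-E914-47DA-95CA-C5AB0DC85B11')
    .digest('base64');

  socket.write('HTTP/1.1 101 Switching Protocols\r\n' +
               'Upgrade: websocket\r\n' +
               'Connection: Upgrade\r\n' +
               'Sec-WebSocket-Accept: ' + accept + '\r\n\r\n');

  // From here on, the raw socket carries web socket frames, and a pure-JS
  // library frames and unframes messages on top of it.
});

server.listen(8080);
```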
In the TC meeting today, during the discussion of HTTP/2 and web sockets, @piscisaureus brought up how Node/io.js, when it came out, was very modern. But these days, it's fallen behind, and doesn't ship support for the latest stuff like web sockets and HTTP/2. I definitely agree with that; it feels sad that, as a web server runtime, we don't support the latest server technologies.
At the same time, however, we've learned a lot over the last few years about development workflows for this kind of thing. We've learned that often the community is going to be able to iterate and produce functionality faster in user-land, on npm, than with PRs to io.js core. We've also continually tried to enable such modules to be built purely in JavaScript, by adding to core the necessary low-level support (e.g. today we were discussing additions to Buffer that would allow web socket code to be purely in JS). This is a definite virtue.
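The exact Buffer additions discussed in that meeting aren't recorded here, but as an illustration of why byte-level primitives matter: this is the kind of hot-path work a pure-JS web socket implementation has to do with Buffers today, since RFC 6455 client frames are XOR-masked with a 4-byte key.

```js
// Illustrative only: the exact Buffer additions discussed at the meeting
// aren't recorded here. This shows the kind of hot-path byte work a
// pure-JS web socket implementation does today: RFC 6455 client frames
// are XOR-masked with a 4-byte key, and unmasking touches every byte.
function unmask(payload, maskKey) {
  var out = new Buffer(payload.length);
  for (var i = 0; i < payload.length; i++) {
    out[i] = payload[i] ^ maskKey[i % 4];
  }
  return out;
}

// XOR masking is its own inverse, so applying it twice round-trips.
var key = new Buffer([0x12, 0x34, 0x56, 0x78]);
var masked = unmask(new Buffer('hello'), key);
console.log(unmask(masked, key).toString()); // 'hello'
```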
This latter point, however, often drives people in the direction of "core should be completely minimal! Give me libuv.js and I'll build the rest! Death to core modules!" I think this isn't very productive. Instead, we need to arrive at a synthesis. Core needs to keep its focus on exposing low-level primitives to enable user-land experimentation. But at the same time, core should also be able to provide a coherent and modern set of functionality for a server runtime.
I think the conclusion of this line of thought leads us to a process wherein we say "yes, core is interested in supporting feature X." Then someone (maybe an io.js collaborator, maybe not!) goes off and builds a prototype of feature X as a user-land module. Along the way, they might need to ask for more low-level APIs from core, and those will get rolled in. But over time, this external implementation of feature X matures, io.js collaborators start commenting on it, and the ecosystem starts using it, until we decide: yes, we should roll this into core and start shipping with it.
I think that's the kind of process we'd like to see happen with HTTP/2, and maybe web sockets; see #8. In particular, I feel like web sockets are pretty far along in this regard, and we should start considering what it would take to roll them into core, whereas HTTP/2 needs a lot more experimentation. But the end state for both should be shipping with io.js out of the box.
What do people think of this perspective?