
Versioned core modules #9

domenic opened this issue Mar 4, 2015 · 10 comments

@domenic

domenic commented Mar 4, 2015

Part of the point of the NG brainstorming is to see how we can make backward-incompatible changes to core.

One early proposal was to use ES6 modules as a "switch": const fs = require("fs") gives you the existing fs, and import fs from "fs" gives you "fs v2" with back-compat breakages. However, this suffers from the "this time we'll do it right!" fallacy, and assumes we won't ever want to break back-compat again. (Or at least, it doesn't have a strategy for doing so.)

I think a better approach would be to start versioning the core modules. The version numbers that are used in code would be a single number, i.e. 1 or 2, not 1.3.4; we only need to signal back-compat breakages, not additions or fixes. There are a variety of ways you could envision this working then:

  • require and import do the exact same thing, and "fs" is an alias for "fs@1"
  • require stays unchanged and can only ever require "fs"; import cannot import "fs", but can import "fs@1" or "fs@2" or...

(I think I like the second better; however, it does mean we block progress on this idea until ES6 modules ship in V8, and that could be a long time.)
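
For concreteness, here is roughly how the two options would read in code. This is purely illustrative: the "fs@1" and "fs@2" specifiers are hypothetical, and nothing resolves them today.

// Option 1: require and import behave the same; "fs" is an alias for "fs@1"
const fs = require("fs");         // same module as require("fs@1")
import fs2 from "fs@2"            // opts in to the back-compat breakages

// Option 2: require only ever resolves "fs"; import only resolves versioned names
const fsLegacy = require("fs");   // unchanged, forever
import fsV1 from "fs@1"           // explicit version required with import
import fsV2 from "fs@2"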

Relatedly there comes the question of how we distribute these modules. Here is one scheme that I like:

  • io.js ships with a set of core modules, as it does now. We just also make them accessible using versions like "fs@1" in addition to the normal way.
  • If we want to make a back-compat-breaking change to fs, we keep shipping fs@1 with no changes, but we also ship fs@2 with the change. When doing so, we do not increment the major io.js version, as no old code breaks! We can just increment the minor instead.
  • If at some point we feel that we are shipping too many versions of core modules (maybe we are shipping 10 versions of fs now, or maybe we are shipping 2-3 versions of every core module), and we want to trim the fat, we can bump the io.js major version and remove old core module versions.
  • (More tricky to get right) At all times, every version is distributed on npm (probably via an automated publish process from core, assuming we like the single-repo structure). This means that people who want to use old versions that are no longer included in io.js can just add them to package.json. And sometimes it will allow people on older versions of io.js to use newer versions of core modules, but this is not guaranteed, since those newer versions might depend on e.g. new C++ APIs or other things.
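
If that last point panned out, code pinned to an old core module could in principle express that through package.json. This is a speculative sketch; the exact npm package names and how a single core version number would map onto a semver range are left open in this thread:

{
  "dependencies": {
    "fs": "^1.0.0"
  }
}
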
@mikeal
Contributor

mikeal commented Mar 4, 2015

One early proposal was to use ES6 modules as a "switch": const fs = require("fs") gives you the existing fs, and import fs from "fs" gives you "fs v2" with back-compat breakages. However, this suffers from the "this time we'll do it right!" fallacy, and assumes we won't ever want to break back-compat again. (Or at least, it doesn't have a strategy for doing so.)

This isn't entirely accurate. import fs from "fs" or import fs from "core" could give you the version of fs, or the version of stdlib, noted in your package.json.

There are some technical limitations we should acknowledge. I don't think it's practical to ship with every version of every stdlib module, especially if we want to continue bundling them in the way we do now. If we decide to ship with a single version then we also need to acknowledge that many core modules depend on each other.
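
In other words, the selection could live in package.json rather than in the specifier itself. A hypothetical shape; the "stdlib" field name here is invented purely for illustration:

{
  "stdlib": {
    "fs": "2"
  }
}

With something like that in place, import fs from "fs" would resolve to whichever version the package declared.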

@domenic
Author

domenic commented Mar 4, 2015

This isn't entirely accurate. import fs from "fs" or import fs from "core" could give you the version of fs, or the version of stdlib, noted in your package.json.

Ahh, that makes much more sense, now I understand.

If we decide to ship with a single version then we also need to acknowledge that many core modules depend on each other.

Yeah, that's the tricky part. The import { fs } from "core" idea (or maybe import fs from "core/fs") sidesteps it quite nicely.

@chrisdickinson

This isn't entirely accurate. import fs from "fs" or import fs from "core" could give you the version of fs, or the version of stdlib, noted in your package.json.

Treating core modules like npm modules feels like an overapplication of an abstraction that works in one case to another case where it fits uncomfortably (sort of a "when you have a hammer, screws are just difficult nails" situation.)

Core modules are the only modules in io.js that can declare a global abstraction that is invariant across packages. That means that those abstractions are safe to return or accept across package boundaries. For example, http requests, event emitters, streams, crypto hashes, etc., are all okay to accept or return from your module.

Userland is not able to make these guarantees; the classic example is voxel.js's issues with vectors, which was solved by downcasting parameters and returns to a shared type (array) between packages, before upcasting back to "ndarray" internally. It's also seen in http middleware: adding attributes to an existing type creates a new type, of sorts, which at best silos a package's use to a given framework ecosystem, and at worst ends up in peerDependency hell.

Allowing userland to "pick" core library versions means giving up the ability to make global, cross-package abstractions in core. That means that global, cross-package abstractions will be difficult to enforce anywhere in the ecosystem, even when they start to become necessary.

I worry that versioning "core" runs into the same problem as well: when you require('url@2'), you're using the second version of the url parser. Can you pass the result of url.parse to http.request? What if http.request is using url@1? What if you don't control the version of http in use, because you're passing the url object to a different package?
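
To make that hazard concrete, here is a hypothetical sketch; the "url@2" specifier, and the idea that it changes the shape of the parsed object, are both invented for the example:

const http = require("http");
const url = require("url@2");   // hypothetical newer parser

// My code parses a URL with the new shape...
const target = url.parse("http://example.com/path");

// ...but http.request (or any third-party package I hand this object to)
// may have been written against url@1's shape and silently misread it.
http.request(target, (res) => { /* ... */ });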

@timoxley

Minor point, but the @ sign will make imports look like a syntax shitshow when combined with npm's scoped-package syntax:

import express from "@strongloop/express@2"

@timoxley

Being able to specify aliases/mappings in package.json could solve this, along with a bunch of other issues related to being tied directly to the filesystem for dependency lookups:

{
  "aliases": {
    "url": "url@2",
    "express": "@strongloop/express@2"
  }
}
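
With a mapping like that in place, application code could keep importing the bare names and let resolution go through the aliases. Nothing resolves such aliases today; this is only the intended effect:

import url from "url"          // resolves to url@2 via the alias
import express from "express"  // resolves to @strongloop/express@2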

Another alternative could be to attach versions to the core namespace:

import { http } from "core" // current version
import { fs } from "core@3" // previous
import { url } from "core@2" // legacy io.js

@trevnorris

+1 on @timoxley's suggestion. I like the '<module>@<version>' syntax; it seems the most intuitive.

@iamstarkov

One negative outcome of defining a dependency as fs@2 is that you need to rewrite all code that uses it once you decide to upgrade Node.js in your environment. From this perspective, it is useful to have aliases in a centralized package.json.

@iamstarkov

Another thing bothers me: even if you can control your own code, you can't do the same for third-party node modules. For example, suppose I use some lib that decided to use fs@2, and fs@2 becomes the regular fs in the next major Node.js version. All of that lib's dependents will be unable to update to the new major version until the lib itself is fixed.

In this case consumers could benefit a lot from package.json aliases, if aliases in the app root can override a lib's aliases. That seems like a natural approach, because an app can only be hosted on one Node.js version at a time.

If an app can be run on different Node.js versions, then the aliases idea can be extended by binding aliases to Node.js versions:

{
  "aliases": {
    "v5": { "url": "url@2" },
    "v6": { "url": "url" }
   }
}

@mikeal
Contributor

mikeal commented Feb 4, 2016

For example, suppose I use some lib that decided to use fs@2, and fs@2 becomes the regular fs in the next major Node.js version. All of that lib's dependents will be unable to update to the new major version until the lib itself is fixed.

This isn't an issue because require('fs') won't be changing. About a hundred thousand modules already rely directly or indirectly on this and we won't be breaking them all any time in the future. That's why we need something like require('fs/2') in order to evolve a new API without breaking compatibility.
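
Side by side, the intent is roughly the following; the "fs/2" specifier is the proposed shape, not something Node.js currently resolves:

const fs = require("fs");      // stable API, stays compatible
const fs2 = require("fs/2");   // hypothetical opt-in to the evolved API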

@iamstarkov

That's why we need something like require('fs/2') in order to evolve a new API without breaking compatibility.

It will become a problem for the npm ecosystem when the evolved fs/2 becomes the normal fs again and the previous fs becomes obsolete in a following major Node.js update. That's why I prefer the aliases approach: it avoids this problem.
