jsnext:main – should we use it, and what for? #5

Open
Rich-Harris opened this Issue Oct 23, 2015 · 73 comments


Rich-Harris commented Oct 23, 2015

note: jsnext:main has been superseded by pkg.module, which indicates the location of a file with import/export declarations.

Typically, the pkg.module file will not have other ES2015+ features unless the package explicitly states that it doesn't support older environments — in other words, best practice is still to transpile ES2015+ features other than import/export. More info here.
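By way of illustration, a package adopting the newer convention typically declares both fields side by side (the file paths here are hypothetical):

```json
{
  "name": "some-package",
  "main": "dist/some-package.cjs.js",
  "module": "dist/some-package.esm.js"
}
```

Bundlers that understand pkg.module pick up the "module" file; everything else falls back to "main".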

Original issue follows.


This is a follow-up to this Twitter conversation:

Exposition

In CommonJS packages, it's common to have a main field in your package.json that tells Node.js or Browserify (etc) where to find the code:

{
  "name": "some-package",
  "main": "lib/some-package.js"
}

Nowadays, thanks to Babel, it's increasingly common to author in ES6/7 and use a prepublish hook to generate distributable files:

{
  "name": "some-package",
  "main": "dist/some-package.js",
  "scripts": {
    "build": "babel src --out-dir dist",
    "prepublish": "npm run build"
  }
}

So far so good. But the dist file must be CommonJS or UMD, otherwise nothing can use it. That means we're missing an opportunity, because ES6 modules are better in lots of important ways, particularly when it comes to making efficient bundles for use in the browser. (See The Importance of import and export by @benjamn if you need persuading.)

Unfortunately, we can't just create a better UMD block that incorporates ES6 imports and exports, because in engines that don't support ES6 modules (all of them), that's a syntax error. So package authors can't distribute ES6 modules without making their packages incompatible with everything.

Enter jsnext:main

There is a proposed solution to this problem – jsnext:main – which, like main, indicates the package's entry point, except that it uses export instead of module.exports. This is great, because it means that CommonJS/UMD distributable files can co-exist with ES6 distributable files. In the future, once everyone supports ES6 imports and exports, we can simply ditch the CommonJS/UMD stuff.

But does jsnext:main mean 'entry point written with potentially unsupported ES6/7 features', or 'entry point that runs in existing engines, except for the import/export statements'? It's unclear. It sounds as though you can use ES6/7 features, the assumption being that it points to your source code, and that a consumer (such as a bundler) takes responsibility for transpiling it (e.g. with Babel):

{
  "main": "dist/some-package.js",
  "jsnext:main": "src/index.js"
}

But that's problematic. What if src/index.js uses stage 0 features? some-package might have a build process that transpiles those features, but does that mean that all consumers of some-package have to be similarly equipped? (Yes, .babelrc makes that possible, but it's still a weird and brittle process, and it may become rather more complex with Babel 6.)

A better solution: jsnext:main could instead refer to an ES6-module entry point that is otherwise ready to use:

{
  "main": "dist/some-package.cjs.js",
  "jsnext:main": "dist/some-package.es6.js"
}

Prepublish hooks make this stuff very straightforward to set up, and it means that everyone can use your package, and no-one needs to worry about your source code – they're only ever dealing with code that is ready to distribute. It's a very simple solution to the problem of serving CommonJS/UMD and ES6 builds simultaneously.
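As a rough sketch of that setup (directory names and paths are illustrative; with Babel 5, the ES6 build could leave import/export untranspiled via the blacklist option):

```json
{
  "name": "some-package",
  "main": "dist-cjs/index.js",
  "jsnext:main": "dist-es6/index.js",
  "scripts": {
    "build:cjs": "babel src --out-dir dist-cjs",
    "build:es6": "babel src --out-dir dist-es6 --blacklist es6.modules",
    "prepublish": "npm run build:cjs && npm run build:es6"
  }
}
```

Both builds are generated from the same source at publish time, so consumers never see untranspiled stage-0 features – only the module syntax differs.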

But the 'jsnext' part of jsnext:main is confusing. I'd suggested changing it to something less ambiguous, but would that just confuse people even more?

Does it go far enough?

@RReverser thinks not, and argues that engines/bundlers/whatever should be able to select different files based on feature detection. I have no idea exactly how that would work but I'm eager to hear more.

In summary

  • We badly need a way for package authors to make their packages available as both CommonJS/UMD and ES6. Unless there's a way to transition from one to the other, we'll never be able to move past the legacy module formats, even when most widely-used engines support ES6 modules
  • jsnext:main is one possible solution, though it's ambiguous. Personally I think that if we use it, it should be used to point to ready-to-use (other than import/export) distributable code, not source code that still needs to be run through Babel
  • We could go even further and have multiple entry points based on feature detection

All thoughts welcome. Thanks

Rich-Harris commented Oct 23, 2015

Also, to save @RReverser reiterating himself, I'll link to some of his earlier comments on the subject: acornjs/acorn#305. There's also some previous discussion here: rollup/rollup#106

rtsao commented Oct 24, 2015

What about the case of non-main entries?

For example, given the following module:

foo
├── package.json
├── main.js
└── bar.js

It seems the following case

var foo = require('foo');

is solved in a straightforward manner with a jsnext:main entry, what about below?

var bar = require('foo/bar')
// or
import bar from 'foo/bar';

Is there a way to specify both ES5 and ES6 versions of all the non-main modules?
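To make the question concrete: a package trying to offer both formats for every entry point might end up with a layout like this (hypothetical – note that no field tells a bundler to map foo/bar to the ES6 file):

```
foo
├── package.json      ("main": "main.js", "jsnext:main": "main.es6.js")
├── main.js           (CommonJS)
├── main.es6.js       (ES6 modules)
├── bar.js            (CommonJS)
└── bar.es6.js        (ES6 modules – nothing maps foo/bar here)
```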

RReverser commented Oct 24, 2015

@Rich-Harris Thanks for a thoroughly written summary!

Rich-Harris commented Oct 24, 2015

@rtsao Good question. Since the foo in foo/bar is treated (by Node.js and Browserify, at least) as the directory name (i.e. it looks for node_modules/foo in parent directories), the bar part can only really have one meaning – a file called bar.js at the top of that directory. I don't think the gods of ES6 have yet decided what the spec should say about such things, but as far as Rollup (an ES6 module bundler) goes, it means the exact same thing (in fact Rollup itself imports a file from Acorn in exactly this fashion, per a suggestion from @RReverser :-)

In the past I've argued that requiring individual files in this manner should be considered an anti-pattern – it's brittle (package authors are less free to restructure their code), rarely documented or tested, and makes code less portable. To that we can add a new concern: the difficulty of selecting the right file, if a package author wishes to provide different versions for different feature sets (whether that's a straightforward ES5/ES6 dichotomy or something more granular), is multiplied.

So in summary...

Is there a way to specify both ES5 and ES6 versions of all the non-main modules?

...I don't think there is – but I don't think that's a bad thing, either. Requiring individual files is a hack that we've tolerated up till now because it's the only way to achieve granular imports – something that ES6 modules support by their very nature (or at least will if we can figure out how to distribute them!).

RReverser commented Oct 24, 2015

rarely documented or tested, and makes code less portable

It depends. In the case of Acorn, for example, we provide separate, well-documented entry points for the generic and loose parsers.

Lodash provides documented separate entry points for each particular function so that you can import only needed ones.

There are many cases, in fact, where devs make good use of structured import paths.

Rich-Harris commented Oct 24, 2015

I said 'rarely', not 'never'! I should explain the 'less portable' comment – if I wrote a module that imported acorn/dist/acorn_loose (or acorn/src/loose/index), it would only run or bundle if the runtime/bundler had access to the filesystem. I couldn't (for example) use the same module in the browser, loading Acorn from a CDN. Whereas import { parse } or import { loose as parse } means the same thing everywhere.

I think we should focus on the issue of entry points though – on Twitter you mentioned the possibility of different entry points being selected based on feature detection. Can you outline how that might work? Are there any existing proposals?

RReverser commented Oct 24, 2015

I was referring to experiments like @getify's https://featuretests.io/, which let us easily determine which features the current runtime supports. We can then either explicitly define the set of features our code needs, or analyze the code and build that set automatically, and compare the two sets – just check whether the set of features our code uses is a subset of the supported set. That tells us whether the code can be executed natively or whether we should provide a transpiled version (assume that modules are transpiled in any case for now).

Moreover, this way we could extend to various entry points by simply providing a list of { features: string[], entry: string } items in package.json, where each entry is pre-built for one of the most popular feature sets, so that we don't send unneeded polyfills/helpers/etc. to the browser.
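A minimal sketch of that selection logic (the feature names, entry paths, and ordering are hypothetical – entries are listed most-capable first, and a consumer picks the first one whose required features are all supported):

```javascript
// Hypothetical package.json "entries" list: each pre-built file is
// annotated with the features it requires.
var entries = [
  { features: ['modules', 'arrowFunctions', 'destructuring'],
    entry: 'dist/pkg.es6.js' },
  { features: ['modules'], entry: 'dist/pkg.es5-modules.js' },
  { features: [], entry: 'dist/pkg.legacy.js' }
];

// Return the first entry whose required-features set is a subset of the
// runtime's supported-features set.
function selectEntry(entries, supported) {
  for (var i = 0; i < entries.length; i++) {
    var ok = entries[i].features.every(function (f) {
      return supported.indexOf(f) !== -1;
    });
    if (ok) return entries[i].entry;
  }
  return null;
}
```

A fully capable runtime would get dist/pkg.es6.js; one that only supports modules would fall back to dist/pkg.es5-modules.js, and so on.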

getify commented Oct 24, 2015

btw, the library that powers https://featuretests.io is: https://github.com/getify/es-feature-tests

That lib (and npm package of the same name) ships not only the library to do the feature testing, but also the testify CLI tool to scan files for what ES6 features are needed. You should be able to scan for the ES6 features needed, then compare that list to what the runtime test results show, and thus decide which file needs to be loaded.

FWIW, the way I kind of do this is:

  1. have a sub-directory as an entry point, like bar being a directory instead of bar.js
  2. then in the directory, a dummy package.json with nothing but a main: "index.js".
  3. At build time, you could use testify to produce the list of features needed, annotate that into package.json.
  4. Then index.js performs the runtime tests, and compares to what's in package.json, and decides which of the various files in the sub-directory is appropriate to include and run.
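A minimal sketch of steps 1–4, with hypothetical file names, and a bare try/catch syntax probe standing in for the es-feature-tests library:

```javascript
// Step 4 (sketched): a dummy index.js feature-tests the runtime and picks
// the appropriate pre-built file. Parsing the snippet with new Function is
// enough to detect syntax support – the snippet is never executed.
function supportsSyntax(src) {
  try { new Function(src); return true; } catch (e) { return false; }
}

// The feature list a tool like testify might have annotated into the
// dummy package.json at build time (step 3).
var neededFeatures = {
  arrowFunctions: '(x) => x',
  templateLiterals: 'var x = `a`;',
  defaultParams: 'function f(a = 1) {}'
};

function canRunNative(features) {
  return Object.keys(features).every(function (name) {
    return supportsSyntax(features[name]);
  });
}

var entry = canRunNative(neededFeatures)
  ? './dist/pkg.es6.js'      // untranspiled build
  : './dist/pkg.legacy.js';  // transpiled ES5 fallback
```

In a real setup the result would be cached, and index.js would then require/load the selected file.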
Rich-Harris commented Oct 24, 2015

Thanks @getify, that's awesome – I wasn't aware of featuretests.io, and I love the idea of being able to easily load untranspiled code in e.g. modern browsers. Do you have any example projects where you're using this? Would love to take a look at the setup.

For the bundling case we're considering here, runtime feature detection is obviously a no-go – have you had any thoughts on what this process might look like from an npm run build script's perspective?

I do worry that we're potentially adding overwhelming complexity for package authors, who already have an extraordinarily difficult job. For example, say I'm writing a package a which depends on your package b. a's build script uses an ES6 module bundler to generate a browser-ready distributable, so it looks in b's package.json for an entry point that flags moduleImport and moduleExport features. But because that entry point of b also uses parameterDestructuring, a now only works in environments that support parameterDestructuring. So at the end of the build process, I probably have to run testify over the result and produce a separate entry point, and the complexity is paid forward to anyone who uses a in their own code. (Or do I pass configuration to my build script that says 'disregard any entry points that require the following features, which we're not going to bother with'?)

Whereas jsnext:main is comparatively easy to grok and concerns itself only with distribution (assuming it doesn't mean jsnext features other than modules), leaving it to package authors to decide which environments to support. Pragmatically, I think an imperfect-but-good-enough solution that has a chance of being adopted (and already has been, e.g. lodash, d3) is preferable to a technically excellent solution that would place a greater burden on package authors – especially since we're talking about an intermediate step on the road to our glorious ES6+ future.

getify commented Oct 24, 2015

For the bundling case we're considering here, runtime feature detection is obviously a no-go

I don't have all the context, but at a glance I don't understand why not?

Why couldn't the bundle you ship be built like I suggest, with tests on the end system to determine which file version to load? You could even put some caching in there, where it caches the results in the package.json file or whatever (similar to how the results are cached in-browser).

Rich-Harris commented Oct 24, 2015

Because if you're creating a browser bundle, all the code has to be present, unless you can assume the presence of a particular module loader that conditionally loads another file asynchronously, with all the extra configuration and complexity that implies. And that code would contain syntax errors for non-modern browsers.

Having multiple entry points (i.e. multiple bundles, for different feature sets) gives developers the option to say 'I'm going to make my life easier and just lob <script src='bundle.legacy.js'> on the page', or to rig up a module loader that is appropriately configured to load different bundles depending on available features.
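The second option can be sketched as follows (bundle names are hypothetical, and the single try/catch probe stands in for a fuller feature-test suite):

```javascript
// Detect ES6 support by attempting to parse a snippet that uses default
// parameters and arrow functions; the snippet is never executed.
function supportsES6() {
  try { new Function('(x = 0) => x'); return true; } catch (e) { return false; }
}

// Choose which pre-built bundle to load for this browser.
function chooseBundle() {
  return supportsES6() ? 'bundle.es6.js' : 'bundle.legacy.js';
}

// In the browser, the chosen bundle would then be injected:
// var script = document.createElement('script');
// script.src = chooseBundle();
// document.head.appendChild(script);
```

The legacy bundle stays loadable as a plain script tag for anyone who doesn't want the loader machinery.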

getify commented Oct 24, 2015

Sorry, I assumed incorrectly this was about use in node, since we were talking about package.json.

FWIW, I do the same sort of stuff (feature test in the browser, conditionally load diff files based on results) and yes I just dynamically load the script files. I use my LABjs loader for that. It doesn't need module loading, since I just load UMD style "module" files as normal scripts.

mattdesl commented Oct 26, 2015

Good discussion.

Here are my issues with jsnext:main

  • The name itself is confusing. Is it constantly describing the next version of JavaScript? Or is it just referring to ES2015?
  • It isn't very future-proof. Eventually, we will be able to just point "main" to ES6, at which point all the modules using jsnext will seem a little dated.
  • Historically, adding tooling-specific config to package.json has not turned out to be a good thing. See "browserify" field, which doesn't play nicely with other bundlers, for example.

I really would love to find a solution that doesn't involve much overhead for module authors. There are some big problems with transpiling, and that's why I still author modules in ES5 instead of using the ES6/jsnext approach.

  • Testing and interactive development is hard. Things like npm link, nodemon and browser dev servers break down when you are working on several ES6 modules at once.
  • It creates a bad dev experience for consumers. e.g. Very often I end up tweaking and/or reading some node_modules/foo/index.js file from Sublime; but this is not fun with transpiled code.
  • Trying to support ES5 and ES6 in tandem adds some new challenges for module authors. These issues don't exist with the current ES5 workflows. A good example is how to cleanly handle foo/bar.js exports in both environments, see here.
  • Then there is the topic of polyfills. If you write some code that relies on Promise, Math/Array/RegExp/String methods, etc, then your ES5 distribution will implicitly peer-depend on their polyfills. You can either include the polyfills in your ES5 distribution (more bloat), or force the consumer to add them (more complexity in using the module).
  • Babel is going with exports.default moving forward, which can (in my experience) create different API usage depending on whether the consumer is ES5 or ES6
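To illustrate that last point, here is roughly what the interop mismatch looks like – a hand-written stand-in for Babel's transpiled output, not its exact code:

```javascript
// Roughly what Babel emits for: export default function add(a, b) { ... }
var fooModule = {};
Object.defineProperty(fooModule, '__esModule', { value: true });
fooModule.default = function add(a, b) { return a + b; };

// An ES6 consumer (after transpilation) gets the default export unwrapped:
var es6Style = fooModule.default;  // import add from 'foo'

// A plain CommonJS consumer sees the module object itself:
var cjsStyle = fooModule;          // var add = require('foo')
// cjsStyle(1, 2) would throw – the CJS consumer has to write
// cjsStyle.default(1, 2) instead, so the two audiences see different APIs.
```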
Rich-Harris commented Oct 26, 2015

Thanks for weighing in @mattdesl:

The name itself is confusing. Is it constantly describing the next version of JavaScript? Or is it just referring to ES2015?

Ask @caridy – I learned about it from him 😀 (some background here). My suggestion above is that it should not refer to futuristic language features (i.e. ES6/7 source code), but only to the method of distribution, i.e. export default myLib instead of module.exports = myLib or the UMD equivalent.

It isn't very future-proof. Eventually, we will be able to just point "main" to ES6, at which point all the modules using jsnext will seem a little dated.

That's exactly the plan – this is just a transitional measure that allows package authors to distribute ES6-tool-friendly code without breaking support for RequireJS/Browserify/etc.

Historically, adding tooling-specific config to package.json has not turned out to be a good thing. See "browserify" field, which doesn't play nicely with other bundlers, for example.

A thousand times yes. Tool-specific config is a bad idea that has caused endless problems. The point of jsnext:main is that it's not tool-specific – the goal is to reach a consensus among tools so that (for example) Webpack 2 and Rollup can both do automatic tree-shaking.

getify commented Oct 26, 2015

My suggestion above is that it should not refer to futuristic language features (i.e. ES6/7 source code), but only to the method of distribution

It should be renamed to "esmodule" then.

RReverser commented Oct 26, 2015

but only to the method of distribution, i.e. export default myLib

This is not really a method of distribution, as it still always needs to be transpiled to CommonJS/AMD/UMD/SystemJS until further spec additions and changes. Such a field would just mean that the file still needs to be transpiled to one of these targets, and even for ES6 <-> CommonJS there are too many ways to represent such a transpilation to be sure that tools using this field will be compatible with each other.



RReverser Oct 26, 2015

The spec is unambiguous on syntax, but we don't have any Loader API defined yet. Syntax alone is not enough to define how modules will load and interact with each other (currently we transpile to filesystem actions on Node.js and to URLs in the browser, while we don't know what the spec will allow, or in what form), and the only way is to use one of the existing module systems which do have loaders, which IMO decreases the value of using separate syntax just for modules.



RReverser Oct 26, 2015

I believe a common misconception about modules is that, once they're defined, there will be only one way for string sources to work - through the concepts we're used to - but people forget that 1) this is not defined yet and 2) even now we have different concepts, and filesystem / URL paths don't necessarily match.



Rich-Harris Oct 27, 2015

I hear what you're saying, but given that this is an opt-in enhancement (for both package authors and consumers), it seems unnecessarily cautious. Do we really need a standards body to tell us that resolve('../foo.js', 'bar/baz.js') === 'foo.js'? The web works best when spec, implementation and tooling evolve together – it seems a shame to forego the benefits of ES6 modules (and in the process weaken the incentive for browser vendors and standards people to finish up the spec) because of hypothetical concerns.



RReverser Oct 27, 2015

The web works best when spec, implementation and tooling evolve together

But that's the problem - there is no spec apart from the syntax, and all we can do is guess and build tooling around our own assumptions.

Anyway, let's go back to the initial point. What I'm saying is that I don't see any reason for separating transpilation of ES6 modules from transpilation of all the other features, because usually if someone already uses one feature, they use the others as well, and transpile everything together. If you want to force authors to add a custom flag, it shouldn't be "transpile everything for browsers, and transpile everything except modules for our tooling"; it should be a set of flags / configs which would allow us to take care of the entire process without additional maintenance pain for library developers.



Rich-Harris Oct 27, 2015

But that's the problem - there is no spec apart from the syntax, and all we can do is guess and build tooling around our own assumptions.

Well, there's a polyfill for System.import and an (albeit very early) draft. My understanding is that an application will typically need to configure the loader anyway, which ought to make any transient inconsistencies non-catastrophic.

I don't see any reason for separating transpilation of ES6 modules from transpilation of all the other features

There is a natural divide between module syntax and other language features – import and export are about how code gets into your application, everything else is about what it does when it gets there. Put another way, there's only one correct way to transpile let [a,b] = foo(), but there are several ways to transpile import and export, depending on which environment you're targeting.
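To make "several ways" concrete, here is a rough, hand-written approximation of just one lowering: the CommonJS interop dance (helper names only loosely mirror what Babel emits):

```javascript
// Source being "transpiled":
//   greet.js:  export default function greet(name) { ... }
//   main.js:   import greet from './greet.js';

// --- greet.js lowered to CommonJS ---
const greetExports = {}; // stands in for module.exports
Object.defineProperty(greetExports, "__esModule", { value: true });
greetExports.default = function greet(name) { return "hello " + name; };

// --- main.js lowered to CommonJS ---
// The interop helper exists because a hand-written CommonJS module
// (module.exports = fn) carries no __esModule marker:
function _interopRequireDefault(mod) {
  return mod && mod.__esModule ? mod : { default: mod };
}

const greet = _interopRequireDefault(greetExports).default;
console.log(greet("world")); // "hello world"
```

An AMD or SystemJS target wraps the same bindings in a completely different runtime shape, which is the ambiguity being discussed.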

If you want to force authors to add custom flag

Please. No-one is talking here about 'forcing' anyone to do anything – this is about trying to reach a community consensus so that we can benefit from ES6 modules without introducing yet more tooling-specific config.



RReverser Oct 27, 2015

Ok, I used the wrong word; it should have been "spread" / "popularize" / whatever. The point is that when it's just modules for usage in specific bundlers, it's still tooling-specific config (yes, they're specific, because as you pointed out modules can be transpiled in many ways, and there are already other bundlers/transpilers/plugins for almost any of those ways).

This separation is not a natural divide from the perspective of the developer - it's just another inconvenience that stands in the way between writing code and executing it on any platform. Right now it's done through a single transpilation step into CommonJS/AMD/SystemJS depending on the target or, well, if using optimized browser builds, also dropping in a webpack which takes care of all the hard work. What you propose is adding yet another step in the middle: customizing the transpilation step so that it doesn't transpile modules, then adding tools to the project that understand ES6 modules and this field, and finally bundling again. This might look like a win from the perspective of sharing some info between those tools, but not from the perspective of the end user (developer), as it's still extra steps for no apparent reason (transpilers already generate CommonJS/AMD and bundlers already understand those formats).



Rich-Harris Oct 27, 2015

but not from the perspective of the end user (developer), as it's still extra steps...

A voluntary step, which likely comprises a single extra line in your package.json if you've already got a build step.

...for no apparent reasons

You personally might not see the benefit, but others do!



RReverser Oct 27, 2015

Size is something that can easily be decreased with custom tooling for CommonJS as well; it's just not really the main concern for major bundlers.

Anyway, let's go back to initial point.

I feel I'm being ignored or you're not willing to discuss generic feature flags anymore :/



Rich-Harris Oct 27, 2015

I feel I'm being ignored or you're not willing to discuss generic feature flags anymore :/

Far from it – I'm very eager to hear more detail of what this would look like, in terms of how we identify features, how package authors should generate the feature flags, whether they should be encouraged to generate multiple entry points for different feature sets, etc. I did raise some concerns above about practicality, since we share the goal of minimising the additional burden on package authors, but if they're misplaced then I'd be glad to be proven wrong.



RReverser Oct 27, 2015

Well, I guess @getify and I provided enough implementation details above. As for which feature sets to choose, that can be done by authors, or auto-generated from analytics, or on the fly with caching, or just as ES6/ES5 - that's up to the consumer. As for which features are used in our code - as was said, testify can detect them and build a map of booleans describing what we expect from the engine, so that code will know which features still need to be transpiled, if any, and which can be used as-is.



Rich-Harris Oct 27, 2015

Well, we could do both. They're not mutually exclusive, right? What do you propose as the field name in package.json? Does existing tooling support it?



dchambers Nov 16, 2015

Hi @Rich-Harris,

I just brought myself up to speed on this topic, and am going to be lazy and ask you some questions if you don't mind, rather than spending hours researching:

  1. If jsnext:main points to source code that is transpiled down to ES5 (except for import and export) that means we can't have sourcemaps for any of the libraries we import right?
  2. If jsnext:main points to a bundle that includes an inline sourcemap then:
    1. Is it possible for tools (e.g. rollup) to consume this sourcemap as input for the final bundle sourcemap?
    2. Would we lose the ability to do tree-shaking?
  3. Couldn't jsnext:main point to the source code if the tool (e.g. rollup) used the local install of Babel for each library being imported, and might this still allow sourcemaps and tree-shaking to function?


Rich-Harris Nov 16, 2015

@dchambers sure:

that means we can't have sourcemaps for any of the libraries we import right?

Not necessarily, though it definitely does complicate things a bit. I wrote a tool called Sorcery which combines sourcemaps, allowing you to have inputs with existing sourcemaps or multiple sourcemap-generating transformation steps.

Is it possible for tools (e.g. rollup) to consume this sourcemap as input for the final bundle sourcemap?

Rollup doesn't currently handle input sourcemaps, but it would work with Sorcery. I'm of the view that tools generally shouldn't attempt to handle input sourcemaps.

Would we lose the ability to do tree-shaking?

No. As far as a tool like Rollup is concerned, the result of statically analyzing an ES6 bundle is exactly the same as analyzing a bunch of separate modules that compose said bundle.

Couldn't jsnext:main point to the source code if the tool (e.g. rollup) used the local install of Babel for each library being imported, and might this still allow sourcemaps and tree-shaking to function?

In theory yes – in fact Rollup itself transpiles Acorn's source code for the sake of a smaller build. But in the general case, asking packages to worry about transpiling the source code of packages they consume just adds hellacious complexity – it adds a build-time performance penalty, and there's no standard way to approach the problem. Pragmatically speaking, it makes more sense for packages to distribute code that's ready to run, import and export being the exception (because they don't have an ES5 equivalent unless you make assumptions about the target environment).



dchambers Nov 16, 2015

Once again, thanks for taking the time to provide more excellent answers. I can see you favour simplified ES6 bundles for performance and simplicity reasons, but it's not actually clear to me which tools produce these (I'm presuming an ES6 bundle exports only the main library exports, and all internal importing and exporting is hidden)?

But in the general case, asking packages to worry about transpiling the source code of packages they consume just adds hellacious complexity – it adds a build-time performance penalty, and there's no standard way to approach the problem. Pragmatically speaking, it makes more sense for packages to distribute code that's ready to run, import and export being the exception (because they don't have an ES5 equivalent unless you make assumptions about the target environment).

Although you yourself don't think this is the best way to go, might creating full ES6 bundles be an alternate solution since:

  1. You won't need to worry about having the same version of Babel.
  2. Transpilation performance will be much better because the node_modules dance will be avoided.
  3. You'll get working sourcemaps for all ES2015 features, and will only see transpiled code for newer or experimental features.
  4. End-developers have the opportunity not to transpile everything down if their runtime targets support some or all of the ES2015 feature-set.


GGAlanSmithee Nov 16, 2015

@Rich-Harris I find your point of view interesting regarding serving "transpiled-except-modules" js.
I've always thought of it as either you serve bundled, transpiled code (by package.json's main), or you serve the "raw" code (by package.json's jsnext:main). I was a bit surprised by your answer to rollup/rollup-plugin-babel#4 for this reason (I'm yet to test your suggestion).

I think the "general" view is kind of like mine. There was an answer saying something like this in the babel issue tracker that I can't find atm (since the babel issue tracker has gone missing? (will link later))

Another interesting thing is polyfills. Should these be included in npm distributions? Currently I don't include polyfills, but rather tell users they will need to include them themselves - not optimal! There is some discussion on this here though (as far as Babel is concerned): babel/babel#2717.



GGAlanSmithee Nov 17, 2015

Here is the link to the issue; it turns out Babel moved their issue tracker: https://phabricator.babeljs.io/T3070. Look at the first answer. And here is the link to the discussion about polyfills: https://phabricator.babeljs.io/T2717



screendriver Mar 1, 2017

I believe that module is not flexible enough. module only describes the module format, not the language features: modules are written in ES2015 module syntax instead of CommonJS, while the rest of the code is still transpiled to ES5.

But what if I target ES2015, or ES2017 in the near future, in my bundle? Then my dependencies are still transpiled all the way down to ES5, with the whole overhead of transpiled code and source maps. But that's not necessary when I target ES2015. You could argue "ok, then let's publish the raw source code as it is", but then my project has to transpile all dependencies (and the dependencies of the dependencies). Beyond the huge build time, there are a lot more issues when the consuming project has to transpile all dependencies: I have to know what language features are used in every single one of my dependencies and what transpiler they use (if they publish their .babelrc or tsconfig.json). So my build system gets really complicated, huge and error-prone.

So what about a new entry in the package.json? For example:

{
  "name": "foo",
  "version": "1.0.0",
  "main": "dist/index.js",
  "module": "dist-es/index.js",
  "module:es2015": "dist-es2015/index.js",
  "module:es2016": "dist-es2016/index.js",
  "module:umd": "dist-umd/index.js"
}

With this approach we could tell the bundler (webpack, rollup, jspm, fuse-box...) in the consuming project of this library which target it should pick, and nothing needs to be transpiled. For example, if I want to target ES2015 in my project, I could tell Babel or TypeScript to transpile my code to ES2015 and my bundler to pick module:es2015 from my dependencies if it exists. If not, it can fall back to module, and if that one does not exist either, it can fall back to main.

The only drawback here is that the build / transpilation of this library requires a little more setup because you have multiple targets. But that's not a big deal in my eyes.

🤔
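If tooling adopted this, the selection logic would be small. A sketch (hypothetical: none of the module:* fields are standardized, and resolveEntry is an illustrative name):

```javascript
// Fall from the most specific proposed field down to plain "main".
function resolveEntry(pkg, target) {
  return pkg["module:" + target] || pkg.module || pkg.main;
}

const pkg = {
  main: "dist/index.js",
  module: "dist-es/index.js",
  "module:es2015": "dist-es2015/index.js",
};

console.log(resolveEntry(pkg, "es2015")); // "dist-es2015/index.js"
console.log(resolveEntry(pkg, "es2016")); // "dist-es/index.js" (fallback)
```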



rauschma Mar 1, 2017

This is how I’d do it:

"engines": [
    {
        "node": ">=0.10.3 <0.12",
        "main": "./es5/index.js",
        "bin": { "foo": "./es5/bin/foo.js" }
    },
    {
        "ecmascript": ">=2015",
        "main": "./es2015/index.js",
        "bin": { "foo": "./es2015/bin/foo.js" }
    }
],

Or maybe:

"engines": {
    "node >= 0.10.3, node < 0.12": {
        "main": "./es5/index.js",
        "bin": { "foo": "./es5/bin/foo.js" }
    },
    "ecmascript >= 2015": {
        "main": "./es2015/index.js",
        "bin": { "foo": "./es2015/bin/foo.js" }
    }
},

Details: http://www.2ality.com/2016/03/multi-platform-npm-packages.html
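A consumer of the array form would walk it and take the first matching entry. A sketch, where `satisfies` stands in for a real semver/range matcher supplied by the environment:

```javascript
// Pick the first engines entry whose constraints the current environment
// satisfies. `satisfies(constraints)` is injected (e.g. backed by a semver
// library) so the walk itself stays simple and testable.
function selectVariant(engines, satisfies) {
  for (const entry of engines) {
    const { main, bin, ...constraints } = entry;
    if (satisfies(constraints)) return { main, bin };
  }
  return null;
}

const engines = [
  { node: ">=0.10.3 <0.12", main: "./es5/index.js" },
  { ecmascript: ">=2015", main: "./es2015/index.js" },
];

// An ES2015-capable environment that isn't legacy Node:
const pick = selectVariant(engines, (c) => "ecmascript" in c);
console.log(pick.main); // "./es2015/index.js"
```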



rtsao Mar 1, 2017

What is the "browser field" equivalent for module?

main and browser are pretty much universally supported, and module is becoming well supported, but what about module for browsers?



theefer Mar 2, 2017

@rauschma That looks nicer, but I'm still not sure it would account for describing all the transitional features people are using (e.g. ::bind, async/await, etc)?



rtsao Mar 14, 2017

To answer my own question, I found this workaround (reproduced below): rollup/rollup-plugin-node-resolve#8 (comment)

"main": "dist/node/index.js",
"jsnext:main": "dist/node/index.es2015.js",
"browser": {
  "dist/node/index.js": "dist/browser/index.js",
  "dist/node/index.es2015.js": "dist/browser/index.es2015.js"
}
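The way a bundler consumes that combination can be sketched as follows (a hypothetical helper; the real resolution in webpack / rollup-plugin-node-resolve handles many more cases):

```javascript
// Pick an entry point, preferring the ES-module build, then apply the
// browser-field remapping when bundling for the browser.
function pickEntry(pkg, { browser = false, esModules = false } = {}) {
  let entry = (esModules && pkg["jsnext:main"]) || pkg.main;
  if (browser && typeof pkg.browser === "object" && pkg.browser[entry]) {
    entry = pkg.browser[entry];
  }
  return entry;
}

const pkg = {
  main: "dist/node/index.js",
  "jsnext:main": "dist/node/index.es2015.js",
  browser: {
    "dist/node/index.js": "dist/browser/index.js",
    "dist/node/index.es2015.js": "dist/browser/index.es2015.js",
  },
};

console.log(pickEntry(pkg, { browser: true, esModules: true }));
// "dist/browser/index.es2015.js"
```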

rtsao commented Mar 14, 2017

To answer my own question, I found this workaround (reproduced below): rollup/rollup-plugin-node-resolve#8 (comment)

"main": "dist/node/index.js",
"jsnext:main": "dist/node/index.es2015.js",
"browser": {
  "dist/node/index.js": "dist/browser/index.js",
  "dist/node/index.es2015.js": "dist/browser/index.es2015.js"
}
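The remapping this workaround relies on can be sketched as follows. This is an illustrative model only, not the actual resolution code of any bundler: real tools implement considerably more elaborate lookup, and the standalone module field has since largely replaced jsnext:main.

```javascript
// Model of how a browser-field-aware bundler interprets the
// workaround above. File paths are copied from the comment.
const pkg = {
  main: 'dist/node/index.js',
  'jsnext:main': 'dist/node/index.es2015.js',
  browser: {
    'dist/node/index.js': 'dist/browser/index.js',
    'dist/node/index.es2015.js': 'dist/browser/index.es2015.js'
  }
};

// Prefer the ES-module entry when the tool understands it, else fall
// back to "main"; then remap the chosen path through the "browser"
// object when targeting browsers.
function resolveEntry(pkg, { esModules, browser }) {
  let entry = (esModules && pkg['jsnext:main']) || pkg.main;
  if (browser && pkg.browser && pkg.browser[entry]) {
    entry = pkg.browser[entry];
  }
  return entry;
}

console.log(resolveEntry(pkg, { esModules: true, browser: true }));
// 'dist/browser/index.es2015.js'
```

The key design point is that the browser field's object form maps file paths rather than naming a single entry, so one mapping covers both the CommonJS and the ES-module builds.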
@dandv

dandv commented Jun 8, 2017

Any chance to update the OP and mention that pkg.module now supersedes jsnext:main?

Asking because I landed here from a Google search for jsnext:main, and there was no hint at the current state of the art.

@Rich-Harris

Rich-Harris commented Jun 8, 2017

Good idea — have updated it, thanks

@DanielSWolf

DanielSWolf commented Jun 26, 2017

Does npm support the pkg.module syntax? If so, since what version?

I'm using npm 4.2. If I publish a package that only defines pkg.module, not pkg.main, npm will publish an empty package, that is, a package containing only package.json, not the file pointed to by pkg.module.
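As a side note on this symptom: npm's packing rules always include the file named by main (along with package.json, README, and LICENSE) regardless of any files whitelist or .npmignore, but they give no such treatment to module. So when a whitelist is in play, the module file must be listed explicitly. A sketch with illustrative paths:

```json
{
  "main": "dist/index.js",
  "module": "dist/index.es.js",
  "files": [
    "dist"
  ]
}
```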

@Rich-Harris

Rich-Harris commented Jun 26, 2017

@DanielSWolf npm only supports pkg.main. pkg.module (and pkg.browser and its various analogs) are only understood by tools.

aubergene added a commit to aubergene/d3-line-chunked that referenced this issue Oct 26, 2017

iainbeeston added a commit to iainbeeston/foundation-sites that referenced this issue Jan 10, 2018

@toverux toverux referenced this issue Mar 7, 2018

Closed

Release v1.0.0 #12

@notruth notruth referenced this issue Apr 11, 2018

Closed

ES modules #107
