Meta: Getting the most out of preset-env #6328

Open
probablyup opened this Issue Sep 28, 2017 · 8 comments

probablyup commented Sep 28, 2017

I had a great convo with @hzoo about this topic over the weekend and wanted to distill it down into some concrete action steps if there’s interest:

The concept

The “env” preset is an awesome evolution of the Babel ecosystem. It allows for finely-targeted transpilation and more efficient inclusion of necessary helpers, which leads to smaller overall bundle sizes. However, due to the way most complementary Babel libraries (babel-loader, etc.) work and their established best practices, the contents of node_modules cannot receive the same optimization.

I propose a series of changes to Babel that would enable more sophisticated transpilation (and in some cases, detranspilation), not just in your own files but in consumed node_modules as well.

Suggested changes

  1. All transform results should be tagged with a proprietary JS comment describing the sequence of transforms applied (see the sketch after this list).

  2. Define a first-party, opt-in mechanism for reversing transforms.

  3. Encourage Babel bundling libraries (babel-loader) to allow traversal of node_modules.

  4. When encountering a prior transform that is unnecessary under the current env target(s), reverse it if possible.
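
For change 1, the tag might look something like the comment below. To be clear, this is a made-up sketch for illustration only - no such convention exists in Babel today, and the banner name, fields, and plugin identifiers are all hypothetical:

```js
// Hypothetical shape of a published library file that carries a transform tag.
// A consumer's preset-env could read the banner, compare the listed transforms
// against its own targets, and (per change 4) reverse the ones that are unnecessary.

/* @babel-transformed
   babel-core: 6.26.0
   plugins: transform-es2015-arrow-functions,
            transform-es2015-block-scoping,
            transform-es2015-template-literals
*/
"use strict";

var greet = function (name) {
  return "Hello, " + name + "!";
};
```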

Happy to hear thoughts and constructive criticism!

Collaborator

babel-bot commented Sep 28, 2017

Hey @probablyup! We really appreciate you taking the time to report an issue. The collaborators
on this project attempt to help as many people as possible, but we're a limited number of volunteers,
so it's possible this won't be addressed swiftly.

If you need any help, or just have general Babel or JavaScript questions, we have a vibrant Slack
community that almost always has someone willing to help. You can sign up here
for an invite.

Member

Andarist commented Sep 28, 2017

Great starter! I've felt for quite some time that the 'best practices' should be revisited. It has always seemed strange that:

  • babel-helpers are duplicated in every dependency transpiled by Babel (and quite often duplicated even across files of the same project)
  • as you mention, we can't efficiently target newer environments, because libraries are transpiled to ES5 and we end up with sub-optimal code for those parts

The problem stands out especially after reading https://philipwalton.com/articles/deploying-es2015-code-in-production-today/

I feel the idea of reversing transforms is not an ideal solution to the problem, though. The cost of maintaining both transforms and reversed transforms is just too big.

It seems much easier to just consume untranspiled (or partially transpiled, i.e. to the latest syntax) libraries. That could of course be done by publishing ES6 code to npm and omitting ignore: ['node_modules'] from the loaders. But there are two problems with that (see the loader sketch after this list):

  • performance, obviously - some kind of caching for transpiled node_modules would have to be introduced to keep build times down
  • not everyone uses bundlers, so we would have to ship yet another version of the same libraries
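
A rough sketch of that loader setup with webpack + babel-loader - the file pattern and targets here are purely illustrative, not a recommendation from this thread:

```js
// webpack.config.js - sketch only, not an endorsed configuration.
// node_modules is intentionally NOT excluded, so dependencies published as ES6
// get compiled for the app's own targets; cacheDirectory offsets some of the
// build-time cost of retranspiling dependencies on every build.
module.exports = {
  module: {
    rules: [
      {
        test: /\.js$/,
        use: {
          loader: 'babel-loader',
          options: {
            cacheDirectory: true,
            presets: [
              ['env', { targets: { browsers: ['last 2 versions'] } }],
            ],
          },
        },
      },
    ],
  },
};
```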

The second problem is present even today - many libraries ship two versions of their code:

  • transpiled, as CommonJS modules ("main" entry in package.json)
  • transpiled, but keeping untranspiled ES6 modules ("module" entry in package.json)

Shipping yet another version (untranspiled ES6+) of the library seems unreasonable; in the end, users only need the files designated for their target environment.

I don't know what the ideal solution to this problem is. One idea I had (though it would need great cooperation between various tools :/) was adding a "target" capability to the packagers. Library publishers would publish their code for different targets, as independent copies, and consumers (apps, other libraries) would download only the version of the library for their specified "target" env (a new field in package.json, with a reasonable default of course).
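
To make the "targets" idea a bit more concrete, a library's package.json might look something like this - the "es2017" field is invented purely for illustration; only "main" and "module" are recognized by tools today:

```js
// package.json (written as a JS object here only so it can carry comments;
// real JSON does not allow them)
{
  "name": "some-library",
  "main": "lib/index.js",          // transpiled, CommonJS modules (the default entry)
  "module": "es/index.js",         // transpiled, but with ES6 import/export preserved
  "es2017": "dist/es2017/index.js" // hypothetical: a latest-syntax build that a
                                   // target-aware package manager/bundler could pick
}
```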

probablyup commented Sep 28, 2017

> The cost of maintaining both transforms and reversed transforms is just too big.

Definitely nontrivial, but it could be incrementally adopted.

> It seems much easier to just consume untranspiled (or partially transpiled, i.e. to the latest syntax) libraries.

This would definitely be less effort, but I could see a lot of confusion on the part of library maintainers about exactly how much to transpile. If you don't transpile at all, you can't really use bleeding-edge syntax because there's no guarantee the consumer will have the necessary transform plugin enabled (either via "env" or separately).

IMO it'd be a lot safer to ship fully-transpiled code as we do today, but simply undo the transforms where we can get away with it. It would require fewer ecosystem changes overall.

Member

Andarist commented Sep 28, 2017

> Definitely nontrivial, but it could be incrementally adopted.

Sure, but it doubles the work and might introduce new bugs, regressions, etc. The project is extremely popular, and IMHO the team has enough work on its plate as it is.

> This would definitely be less effort, but I could see a lot of confusion on the part of library maintainers about exactly how much to transpile.

This is just a matter of establishing good practices and documenting them. Helping the most popular projects set this up properly would also help the community; the rest would follow in their own time - incremental adoption across the community.

> If you don't transpile at all

Well, I wasn't talking about not transpiling at all. The safest way is to ship all three builds (see the .babelrc sketch after this list):

  • transpiled to CommonJS (default - "main" in package.json)
  • transpiled, but with ES6 modules (opt-in - "module" in package.json)
  • transpiled to the latest syntax (opt-in - "latest", or any other name, in package.json - probably needs to be more than a single keyword)
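
A sketch of what the library's .babelrc for that third flavor could look like, assuming the library uses JSX and object rest/spread - only non-standard syntax gets compiled, and everything that is already standard ships as written (the preset/plugin names are the Babel 6-era ones, used here only to illustrate the idea):

```json
{
  "presets": ["react"],
  "plugins": ["transform-object-rest-spread"]
}
```

With no env preset in the library's own build, classes, arrow functions, async/await, etc. stay untouched, and the consuming app's single preset-env pass decides what actually needs to be compiled for its targets.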

The downside of this is shipping "the same" code three times. Hence my proposal of the targets - this would need a major community change, though.

> you can't really use bleeding-edge syntax because there's no guarantee the consumer will have the necessary transform plugin enabled (either via "env" or separately).

That's why I proposed transpiling to the latest syntax as the recommended best practice.

> IMO it'd be a lot safer to ship fully-transpiled code as we do today, but simply undo the transforms where we can get away with it.

I don't agree about it being safer. As mentioned, the transforms are already well tested, while reversed transforms don't even exist yet - they would be risky right from release for quite some time, until all the corner cases got worked out.

> It would require fewer ecosystem changes overall.

Possibly, but both approaches would need ecosystem changes, so IMHO there's not much value in restraining the scope of the change. I admit I can be quite radical in my views sometimes, so others' input would be highly appreciated!

Member

Kovensky commented Sep 29, 2017

Yes, the only way to get anything approaching sane behavior with this "latest" target idea is to have packages transpile all non-standard syntax (JSX, type annotations, anything not stage 4) down to the latest standard at the time (e.g. ES2017 as of this writing). This ensures that a single application of preset-env with no options (or only the targets option) will produce code that is usable by the targets.

The difficulty remains with polyfills -- this is frankly chaos in package land. Everyone picks a different one, people keep using old polyfills, or keep using (possibly inlined) polyfills that are unnecessary because you already provide e.g. Object.assign, etc.

I've always been in favor of not loading polyfills in your package and instead requiring that your package's user provide them in the environment via their own polyfill, but if you have too many dependencies that kind of devolves into having to load core-js/shim 🤔
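
For what it's worth, that "the app provides the environment" approach usually boils down to something like the following at the application entry point - a sketch, with the exact polyfill package being the app's choice:

```js
// src/index.js - application entry point.
// The app, not each dependency, supplies the runtime environment: polyfills
// are loaded once, up front, before any other module runs.

// Blunt option - everything core-js offers (the "having to load core-js/shim" case):
import 'core-js/shim';

// A more targeted app could instead pull in only what its browser targets lack, e.g.:
// import 'core-js/fn/object/assign';
// import 'core-js/fn/promise';
```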

Member

jridgewell commented Oct 2, 2017

Undoing a transform is impossibly hard; there are just way too many dynamic aspects of JS to take into account.


Sorry for the close, I fatfingered it.

jridgewell reopened this Oct 2, 2017

probablyup commented Oct 2, 2017

> Undoing a transform is impossibly hard

Some are definitely harder than others, but I don't think that's an axiom. The incremental portion of this concept is the most important aspect in this regard... that way it's an opt-in process, limited to transforms that are linear enough to be undone.
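
As a purely illustrative example of a transform that is "linear" in the simple case (this is not something any plugin does today), the arrow-function output is mechanically recognizable and could in principle be restored once the env targets all support arrow functions natively:

```js
// Roughly what the ES2015 arrow-functions transform emits in the simple case
// (no `this` or `arguments` capture involved):
var double = function (x) {
  return x * 2;
};

// What a hypothetical reverse pass could restore for targets that
// support arrow functions:
var double = x => x * 2;
```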

Member

gaearon commented Jan 17, 2018

I wrote a long-winded answer about how we plan to approach this in Create React App. It's not about "reversing" but captures my current thinking on how we could approach compiling node_modules. It might be obvious to everybody except me, but I hope maybe somebody reading this thread in the future could find it helpful: parcel-bundler/parcel#559 (comment).
