This repository has been archived by the owner on May 11, 2018. It is now read-only.

Feature request: transform a require('babel-polyfill') call to specific requires based on the env config #20

Closed
keyz opened this issue Oct 8, 2016 · 30 comments

Comments

@keyz

keyz commented Oct 8, 2016

First of all, babel-preset-env is awesome. Thanks for making this happen!

I'm thinking about how to make polyfills better. If someone uses babel-preset-env, it means they have a few specific platforms to target, so it's probably not necessary to require the entire babel-polyfill library.

Since we now know what the targets are, can we transform a require('babel-polyfill'); call to specific requires? For example, if someone targets IE 11 then

require('babel-polyfill');

gets transformed to

require('babel-polyfill/array-from');
require('babel-polyfill/array-of');
require('babel-polyfill/object-assign');
require('babel-polyfill/regenerator-runtime');
/* ... other polyfills that IE 11 needs */

The example above is not comprehensive and there should be a better way to require individual polyfill libraries, but you get the idea. If someone targets Chrome 55, then only a few polyfills need to be applied.

It's probably gonna be a new transform plugin (transform-polyfill-require? :D) and won't live in this repo, but it should use the compat-table data here. Feedback is welcome!
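The target-based filtering described above can be sketched with plain data: given a map of when each built-in landed in each environment, computing the polyfill subset is just a filter. To be clear, the support numbers and module names below are made-up illustrations, not the real compat-table data.

```javascript
// Illustrative sketch only: the support data is made up, not the real
// compat-table numbers, and the module names are placeholders.
const builtIns = {
  "es6.array.from": { chrome: 51, ie: Infinity }, // Infinity = never shipped
  "es6.object.assign": { chrome: 49, ie: Infinity },
  "es7.string.pad-start": { chrome: 57, ie: Infinity },
};

// A polyfill is needed if any target environment predates native support.
function neededPolyfills(targets) {
  return Object.keys(builtIns).filter((name) =>
    Object.entries(targets).some(([env, version]) => {
      const supportedSince = builtIns[name][env];
      return supportedSince === undefined || version < supportedSince;
    })
  );
}

console.log(neededPolyfills({ chrome: 55 })); // [ 'es7.string.pad-start' ]
console.log(neededPolyfills({ ie: 11 }));     // all three entries
```

A transform would then emit one require per entry in the filtered list instead of the single require('babel-polyfill').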

Related tweet: https://twitter.com/left_pad/status/784611480947810304

@hzoo
Member

hzoo commented Oct 8, 2016

Yeah, I guess this would be a very specific plugin, since I would think you only have a require('babel-polyfill'); or import once in your app (at the entry point). Crazy, but we should try it out and see!

@keyz keyz changed the title Transform a require('babel-polyfill') call to specific requires based on the env config Feature request: transform a require('babel-polyfill') call to specific requires based on the env config Oct 8, 2016
@ljharb
Member

ljharb commented Nov 1, 2016

The reality is that every browser needs every polyfill, because they all have quirks in their implementations that need correcting.

@tizmagik

tizmagik commented Nov 1, 2016

The reality is that every browser needs every polyfill, because they all have quirks in their implementations that need correcting.

I'm not sure that's true, but regardless, a polyfill wouldn't solve that anyway, since if I'm not mistaken most polyfills are defined like so (i.e. they only polyfill if there is no native implementation):

Array.prototype.includes = Array.prototype.includes || function includesPolyfill() { ... }

@ljharb
Member

ljharb commented Nov 1, 2016

Imperfect ones, sure. Polyfills/shims of sufficient quality actually runtime-test their implementation (when a buggy one exists) to ensure it behaves properly: https://github.com/ljharb/object.assign/blob/master/polyfill.js#L41-L50
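The object.assign repo linked above follows this pattern; here is a stripped-down sketch of the idea, where the fallback and the enumeration-order check are simplified stand-ins for the real polyfill's checks, not ljharb's actual code:

```javascript
// Naive fallback implementation, a stand-in for a real shim body.
function assignFallback(target, ...sources) {
  for (const source of sources) {
    for (const key of Object.keys(source)) target[key] = source[key];
  }
  return target;
}

// A robust shim feature-tests *behavior*, not just presence: even when
// Object.assign exists, verify it doesn't exhibit a known bug (here, wrong
// key enumeration order, a real bug in some older engines).
function getAssign() {
  if (typeof Object.assign !== "function") return assignFallback;
  const letters = "abcdefghijklmnopqrst";
  const obj = {};
  for (const l of letters) obj[l] = l;
  const order = Object.keys(Object.assign({}, obj)).join("");
  return order === letters ? Object.assign : assignFallback;
}

const assign = getAssign();
console.log(assign({}, { a: 1 }, { b: 2 })); // { a: 1, b: 2 }
```

The point of contention in this thread is that only the `typeof` presence check can be decided from compat-table data ahead of time; the behavior test has to run in the user's browser.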

@tizmagik

tizmagik commented Nov 2, 2016

Gotcha, thanks for the clarification @ljharb -- it looks like corejs (which babel-polyfill uses) does do some runtime implementation checking in some cases, so that's good to know! 👍

@Munter

Munter commented Nov 30, 2016

Would such a feature have the potential to also reduce the Babel configuration considerably? I'm thinking that once we can couple the knowledge of the target environments, those environments' capabilities, and which capabilities require which plugins or polyfills, this project could just "do the right thing" and spare developers the cognitive load of mapping ES features to Babel plugins and configuring Babel.

@hzoo
Member

hzoo commented Dec 2, 2016

I have some progress (working on this again)! I'll need help on the actual data part, but I'm able to get the data from the compat-table. Will need to determine how to split everything up.

[screenshot: screen shot 2016-12-01 at 8 34 51 pm]

EDIT: moving tasks to PR #56

@hzoo
Member

hzoo commented Dec 2, 2016

The reality is that every browser needs every polyfill, because they all have quirks in their implementations that need correcting.

Also interesting is #54 @ljharb

@hzoo hzoo mentioned this issue Dec 2, 2016
@hzoo
Member

hzoo commented Dec 2, 2016

Might as well post what I have #56

Will need help with the first bullet ^ (adding the rest of the built-ins). If anyone would like to help, please do (either copy/paste in a comment or PR my PR).

Working on writing the babel plugin to transform the import/require statement for import 'babel-polyfill'.
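A hedged sketch of what such a visitor could look like follows; this is not the actual code from the PR, the module names are illustrative, and `t` below is a minimal stand-in for @babel/types, just enough to exercise the visitor:

```javascript
// Sketch: replace a bare `import "babel-polyfill"` with imports of only the
// modules the configured targets need. `neededModules` would come from the
// compat-table-derived data; the names here are placeholders.
function transformPolyfillRequires({ types: t }, neededModules) {
  return {
    visitor: {
      ImportDeclaration(path) {
        if (path.node.source.value !== "babel-polyfill") return;
        path.replaceWithMultiple(
          neededModules.map((mod) =>
            t.importDeclaration([], t.stringLiteral(mod))
          )
        );
      },
    },
  };
}

// Minimal stand-in for @babel/types, not the real library:
const t = {
  importDeclaration: (specifiers, source) => ({
    type: "ImportDeclaration",
    specifiers,
    source,
  }),
  stringLiteral: (value) => ({ type: "StringLiteral", value }),
};

const plugin = transformPolyfillRequires({ types: t }, [
  "core-js/modules/es6.object.assign",
  "core-js/modules/es6.array.from",
]);

let replacement = null;
plugin.visitor.ImportDeclaration({
  node: { source: { value: "babel-polyfill" } },
  replaceWithMultiple: (nodes) => { replacement = nodes; },
});
console.log(replacement.map((n) => n.source.value));
// → [ 'core-js/modules/es6.object.assign', 'core-js/modules/es6.array.from' ]
```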

[screenshot: screen shot 2016-12-02 at 3 00 37 pm]

@hzoo
Member

hzoo commented Dec 2, 2016

Ok, a lot of this is done. The last part is making sure the data is correct and makes sense; please review data/builtInFeatures.js in the PR if you can.

@stevewillard

This is awesome! Thank you!

@hzoo
Member

hzoo commented Dec 2, 2016

@Kovensky mentioned in slack that this is probably better done as a webpack plugin, so I can work on that as well after this.

once you get the needed APIs for that last item, it could be nice to expose it so I could try to write a webpack plugin to do it 🙂
generate an entry point that requires the appropriate babel-polyfill subset and then requires the rest of the app

@jakwuh

jakwuh commented Dec 7, 2016

@hzoo this is really awesome, thank you for your work.

I've spent some time working on this problem as well and have concluded that loading shims should be async in a browser environment. Such an approach will save hundreds of KBs if I target older browsers that need quite a lot of shimming. I've ended up writing a lightweight automatic shimmer, but for now the list of possible shims to load has to be filled in manually.

The idea I'm thinking about now is that it would be great to write a webpack plugin (as @Kovensky mentioned) which would generate an entry point that checks at runtime whether each polyfill is needed and loads only the needed shims asynchronously (e.g. with require.ensure). The list of checks could be generated automatically based on babel-preset-env.

@hzoo, @Kovensky what do you think about that? If someone is already working on that it would be a pleasure for me to join.
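A runnable simulation of that generated entry point, with Promise-returning loaders standing in for webpack's async chunk loading (require.ensure); the feature tests and shim names are illustrative, not a real shim list:

```javascript
// Each entry pairs a cheap feature test (shipped to everyone) with an
// on-demand loader for the implementation (fetched only when the test fails).
const shimTable = [
  {
    test: () => typeof Array.prototype.includes === "function",
    load: () => Promise.resolve("array-includes-shim"),
  },
  {
    test: () => typeof String.prototype.padStart === "function",
    load: () => Promise.resolve("string-padstart-shim"),
  },
];

// Ship only the tests by default; fetch implementations on demand.
function loadMissingShims() {
  const missing = shimTable.filter((entry) => !entry.test());
  return Promise.all(missing.map((entry) => entry.load()));
}

loadMissingShims().then((loaded) => {
  console.log("loaded shims:", loaded); // empty on a modern engine
  // require(/* real entry point */) would run here
});
```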

@Jessidhia
Member

Jessidhia commented Dec 7, 2016

The problem with loading the shims asynchronously is that you also need to call the entry point asynchronously. This might break code that relies on doing (even just a little) sync initialization. You also run into problems with how to optimize the chunks (applying it naively might generate around 30 different chunks, probably with duplicate code in each of them).

It could be an option, though.

@jakwuh

jakwuh commented Dec 7, 2016

@Kovensky this is how I see the generated entry point:

var shim = function (cb) { /* load the needed shims, then invoke cb */ };

shim(function () {
  // all shims are prepared
  require(/* entry point here */);
});

This is possible because webpack does not execute modules unless asked to. Thus we could initially load all the shims and only afterwards execute the real entry point, be it sync or async.

@Jessidhia
Member

Jessidhia commented Dec 7, 2016

Delaying the execution until the shims are available is executing it async :)

To be more specific, it does not block the parser, allowing it to continue reading the document and potentially execute other script tags. If, for example, other script tags depend on globals set up by the webpack bundle, they might crash or otherwise misbehave.

To make things more complicated, webpack 2's require.ensure is Promise-based, so the Promise shim would always have to be loaded synchronously regardless of anything.

This also means that it will, always, execute asynchronously, even if the Promise resolves immediately because all shims are cached.

@jakwuh

jakwuh commented Dec 7, 2016

@Kovensky thanks for detailed explanations, I got it.

I think we could still do better shimming by not simply inlining shims in each module like Babel does, but by requiring the needed shims and then compiling them with webpack. I also agree that async shimming should be an option.

@hzoo
Member

hzoo commented Dec 7, 2016

@jakwuh Ok cool we can iterate on the solutions for sure. Just wanted to try out one idea first

Maybe we should make a new issue for additional possibilities/options?

@ljharb
Member

ljharb commented Dec 7, 2016

Shims/polyfills can only work reliably if they're synchronously loaded first - before any other code runs - so they have time to patch or provide all the builtins, before other code has time to cache a reference to them.

@jakwuh

jakwuh commented Dec 7, 2016

@ljharb but why should Chrome 54 users (35% of the market) download hundreds of KBs of shims that are needed only for <IE9 users (just as an example)?

@ljharb
Member

ljharb commented Dec 7, 2016

@jakwuh because even the latest Chrome has spec bugs - every browser might need every shim, and the only way you can ensure that they all behave the same is if you send all of them, to every user, every time, and do all the feature detection at runtime. That download cost is only paid once anyway, since it's cached.

@jakwuh

jakwuh commented Dec 7, 2016

@ljharb or you can split each polyfill into two parts, feature detection and implementation (this is how it is usually done), and by default ship only the feature-detection code, then load implementations on demand.

Sure, all polyfills are cached, but one still can't rely completely on the cache. When your codebase is 1 MB and the shims cost 1 MB too, it makes a difference to users. Though this is just my opinion and it could be wrong, so thank you for sharing yours.

@ljharb
Member

ljharb commented Dec 7, 2016

Except that "on demand" is async, and simply doesn't work with shims/polyfills. This approach has been tried (and usually abandoned) multiple times before (I don't have links off hand).

@zloirock
Member

zloirock commented Dec 7, 2016

@ljharb if you know that the latest Chrome has a spec bug (and compat-table should know about it) and your target is the latest Chrome, then with this transform you will load a polyfill which fixes this bug; I don't see any problem here.

@ljharb
Member

ljharb commented Dec 7, 2016

@zloirock oh sure, in the case where you have previously feature-detected a runtime that's on your "supported" list, and you thus include that transform in your preset, that's somewhat reasonable. However, that still doesn't take into account that a new browser version could appear - untested by the compat table - which has new bugs. This is a reasonable middle ground, but it's still not quite as robust and safe as sending everything.

@chicoxyzzy
Member

We probably should add a property to compat-table tests which indicates whether a test result was carried over from a previous version or has actually been verified.

@ljharb
Member

ljharb commented Dec 7, 2016

I agree - I've suggested that before. I think that gives the best balance between not having to verify everything on every engine and being able to systematically do so over time.

@zloirock
Member

zloirock commented Dec 7, 2016

@ljharb

However, that still doesn't take into account that a new browser version could appear - untested by the compat table - which has new bugs.

I agree, but feature detection in polyfills also cannot take into account bugs that are unknown at the moment, so in this case even loading the full version of a polyfill will not completely help; it only helps with obvious bugs or with a recurrence of old bugs.

@ljharb
Member

ljharb commented Dec 7, 2016

@zloirock you're correct that it can't take into account unknown kinds of bugs, but there are many kinds of known bugs that are common enough (i.e., easy enough to create in implementations) that they could still pop up, and that shims would still cover. I'll admit this isn't as strong of an argument though :-)

@hzoo
Member

hzoo commented Dec 12, 2016

Oh yeah, this is done now in #56 via the 1.0 release. We can open other issues to refine it or track bugs.
