
Pre dead-code elimination by the native JavaScript parser, before the module graph and chunk graph are built, for smaller bundle size #16672

Closed
icy0307 opened this issue Jan 26, 2023 · 4 comments


icy0307 commented Jan 26, 2023

Feature request

I know that in the current webpack build process, dead code removal happens last: it relies on a minifier like terser to drop dead code at the per-chunk level (thanks to @alexander-akait for the kind explanation).
However, this can lead to significantly bigger chunk sizes due to the lack of cross-module information.

Take the code below as an example. Say we have two different builds, one for PC users and one for mobile. Each build refers to different ESM modules from common node_modules packages like lodash-es and MUI.
This common code is split into a separate chunk by SplitChunksPlugin for better caching.

export const treeShakingTestAsync = async () => {
    console.log('treeShakingTestAsync');
    // Replaced with 'false' by DefinePlugin
    if (__MOBILE__) {
        const { Baz } = await import(/* webpackChunkName: "bundle_mobile" */'./mobile');
        console.log(Baz);
    } else {
        const { foo } = await import(/* webpackChunkName: "bundle_pc" */ './pc');
        console.log(foo);
    }

};

pc.ts and mobile.ts both re-export some of the code from a common module.

// pc.ts
export { bar, foo } from './lib-modules';
// mobile.ts
export { Baz } from './lib-modules';

lib-modules.ts mimics a large esm package index file.

// lib-modules
export function foo() { console.log('FuncFoo'); }
export function bar() {
    console.log('FuncBar');
}
export class Baz {
    private c = 'ClassBaz'
}

For the PC build, we do not want Baz to appear in the final bundle.
But if we perform dead code removal purely at the chunk level, after the module graph is built, Baz is already marked as exported and used. Terser or any other minifier cannot know that it is never used, because that information spans files; only webpack's own JavaScript compiler sees the whole picture.
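To make the minifier's view concrete, here is a minimal runnable sketch (plain Node; the names foo, Baz, and chunkExports mirror the example above, and this is not real webpack runtime code) of why a per-chunk pass keeps Baz: once the build has registered a binding on the chunk's export object, it is referenced, and reference-based dead code elimination must keep it.

```javascript
// The per-chunk view a minifier gets (hypothetical sketch).
function foo() { return 'FuncFoo'; }
class Baz {
    constructor() { this.c = 'ClassBaz'; }
}

// webpack's runtime registers every export the module graph marked as used.
// From inside this chunk, Baz is now referenced, so a minifier that only
// sees this file has no way to prove it is dead.
const chunkExports = { foo, Baz };

console.log(Object.keys(chunkExports)); // both bindings survive
```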

And in the situation above, webpack does perform this pre-optimizing dead code removal:
it prevents further dependency building in dead branches.
However, this case is just a simple example of what pre dead-code elimination can achieve.
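For completeness, the pre-optimization above is driven by DefinePlugin. A minimal config sketch follows; the flag name `__MOBILE__` comes from the example, everything else is an assumed setup.

```javascript
// webpack.config.js (sketch)
const webpack = require('webpack');

module.exports = {
  plugins: [
    new webpack.DefinePlugin({
      // Inlined at parse time; webpack then treats the `if (__MOBILE__)`
      // branch as dead and never adds './mobile' to the module graph.
      __MOBILE__: JSON.stringify(false),
    }),
  ],
};
```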

There are still a lot of scenarios not covered.

  1. Per-file level optimizations, like if-return.
export const treeShakingTestAsync = async () => {
    console.log('treeShakingTestAsync');


    const { foo } = await import(/* webpackChunkName: "bundle_pc" */ './pc');
    console.log(foo);
    // Code after an always-taken return (imagine the condition was folded to
    // a constant by DefinePlugin) is unreachable, but it still gets bundled,
    // and Baz gets bundled as well, due to SplitChunksPlugin.
    if (true) {
        return;
    }
    const { Baz } = await import(/* webpackChunkName: "bundle_mobile" */'./mobile');
    console.log(Baz);
};
  2. Unused exports.
export const treeShakingTestAsync = async () => {
    console.log('treeShakingTestAsync');
    
    const { foo } = await import(/* webpackChunkName: "bundle_pc" */ './pc');
    console.log(foo);
};

export const unusedExport = async () => {
    console.log('unusedExport');
    // This is never referenced from the entry.
    const { Baz } = await import(/* webpackChunkName: "bundle_mobile" */ './mobile');
    console.log(Baz);
};

What is the expected behavior?
Dead code won't appear in the module graph.

What is motivation or use case for adding/changing the behavior?
A much smaller bundle size for giant projects.
I work on a large private company SPA codebase whose bundle size is over 70 MB before gzipping, with many reasonable and unreasonable split-chunk rules. I found that most of the unused code could be avoided by simply not letting PC users download the mobile version of the code.
Furthermore, the number of modules involved affects not only download size but also execution time. With 5K ESM modules, the __webpack_require__ process alone takes nearly one whole second.
Even concatenated modules won't help, because the file is so large, way above the 170 KB suggested best practice, that we have to split the initial chunk into several chunks.
But more split-chunk rules lead to more unused code, which ends up as huge bundle-size bloat.

How should this be implemented in your opinion?
Dead code removal at the level of webpack's JavaScript parser, to avoid unnecessary modules in the module graph and chunks in the chunk graph.
If for some reason this cannot be achieved, would running terser as a loader after ts-loader, in order to drop dead code in advance, be a dumb idea?
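As a rough illustration of the "terser as a loader" idea, a custom loader could run terser's minify over each module before webpack parses it, with mangling off so only dead code is dropped. This is a sketch under assumptions, not a vetted recommendation; the file name drop-dead-code.loader.js is made up, and `terser` is assumed to be installed.

```javascript
// drop-dead-code.loader.js (sketch)
const { minify } = require('terser');

module.exports = function (source) {
  const callback = this.async();
  minify(source, {
    compress: { dead_code: true, conditionals: true },
    mangle: false,      // keep names so later tooling still works
    module: true,       // treat input as an ES module
  }).then(
    (result) => callback(null, result.code),
    (err) => callback(err)
  );
};
```

The trade-off is doing this per file: the loader can fold `if (false)` branches inside one module, but it still cannot see cross-module usage the way webpack's own graph can.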
@sokra @alexander-akait
Are you willing to work on this yourself?
yes

@timocov

timocov commented Jan 28, 2023

Per File level optimization, like if-return.

Looks like #14347

@ScriptedAlchemy
Member

This is a problem I've encountered too.

For 1), it can be difficult to know what's shakeable when the condition is a runtime value rather than if (false): something like if (document), for instance. I don't have a perfect example right now.

However, statically inferable values should be possible.

How I typically have solved this is with a pitching loader, where I call this.loadModule and break the same file up into multiple module ids.

  2. On unused exports: if I understand correctly, you mean that with dynamic imports, webpack doesn't tree-shake the unused bindings you destructure off the dynamic import the same way it would for a static import?

I know you can manually tree-shake import() with magic comments. I think it's "usedExports", if I remember correctly.
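(Editorial note: the magic comment in webpack 5 appears to be `webpackExports`; `usedExports` is the name of the related `optimization.usedExports` flag. Applied to the example from the issue, it would look like this fragment, which only has meaning inside a webpack build.)

```javascript
// `webpackExports` tells webpack which exports of the dynamically imported
// module are actually used, so the rest can be tree-shaken:
const { foo } = await import(
  /* webpackChunkName: "bundle_pc" */
  /* webpackExports: ["foo"] */
  './pc'
);
```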

@webpack-bot
Contributor

This issue had no activity for at least three months.

It's subject to automatic issue closing if there is no activity in the next 15 days.

@alexander-akait
Member

Let's close this in favor of #14347; I added more scenarios there. I agree we need to improve this. Feel free to send a PR if you want to help, and sorry for the delay in answering.
