
Phoenix digest does not work with automatically generated webpack chunks #2996

Closed
chengyin opened this issue Aug 7, 2018 · 14 comments

Comments


chengyin commented Aug 7, 2018

Webpack supports code splitting with dynamic import out of the box, which is very helpful for reducing an SPA's initial load size.

However, those automatically generated chunks are not digested properly. The original files are still used instead of the hash-named and gzipped ones, since the references to the chunks live in the JS files rather than in the eex files.

I'm not 100% sure whether this should be Phoenix's concern. However, I think to an extent this is similar to #1936. I also haven't found a workaround.
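
To make the mismatch concrete (the chunk file name below is just an illustration):

<%# This reference is rewritten by mix phx.digest / static_path to the hashed copy: %>
<script src="<%= static_path(@conn, "/js/app.js") %>"></script>
<%# But app.js itself requests e.g. /js/0.chunk.js by its original,
    un-digested name at runtime, so the hashed and gzipped chunk is never served. %>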

I generated a sample Phoenix app to demonstrate this behavior: http://github.com/chengyin/webpack-code-splitting-phoenix

Environment

  • Elixir version (elixir -v): Erlang/OTP 21 [erts-10.0.3] [source] [64-bit] [smp:8:8] [ds:8:8:10] [async-threads:1] [hipe] [dtrace]; Elixir 1.6.6 (compiled with OTP 20)

  • Phoenix version (mix deps): (master at fdcd4ef)
  • NodeJS version (node -v): v8.6.0
  • NPM version (npm -v): 6.2.0
  • Operating system: macOS 10.13.6 (17G65)
@AlexRiedler

After many similar experiences on the front-end side with the Rails framework, I am wondering, to be honest, whether we should even be generating digests inside Phoenix. We could instead leave it up to webpack/brunch/... to generate the digest file. I think it would avoid us re-inventing parts of the wheel.

e.g. being able to read something like https://www.npmjs.com/package/webpack-manifest-plugin; I am also curious whether it would work for this case. @chengyin do you have any experience on the front-end side of this for dynamic imports?

@chengyin
Author

Hi @AlexRiedler,

To your point, Webpack certainly prefers to manage all static assets for production usage. For dynamic imports specifically, Webpack requires knowledge about the chunk being imported, so it must manage chunk generation for the imported piece.

I think your idea works:

  • All static assets are managed and generated by Webpack (through different loaders).
  • In production mode, Webpack generates the hashed versions and the gzipped versions, and the manifest plugin generates a manifest.
  • Phoenix reads the manifest to replace references in .eex files (a rough sketch of this step is below).
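
A minimal sketch of what that last step could look like. The module name, manifest path, and manifest shape are assumptions based on webpack-manifest-plugin's default flat name-to-hashed-name JSON output; this is not an existing Phoenix API:

# Minimal sketch, not an existing Phoenix API. Assumes the Jason dependency
# (Phoenix's default JSON library) and that webpack-manifest-plugin writes a
# flat map such as {"js/app.js": "js/app.3b2e8c1f.js"} to priv/static/manifest.json.
defmodule MyAppWeb.WebpackManifest do
  @manifest_path "priv/static/manifest.json"

  # Read and decode the manifest; a real implementation would cache the result
  # (for example at endpoint startup) instead of reading the file on every call.
  def load do
    @manifest_path
    |> File.read!()
    |> Jason.decode!()
  end

  # Resolve a logical path to its hashed name, falling back to the original
  # path when the manifest has no entry for it.
  def static_path(manifest, logical_path) do
    key = String.trim_leading(logical_path, "/")
    "/" <> Map.get(manifest, key, key)
  end
end

A template would then reference assets through this helper instead of static_path/2.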

The lack of responses here makes me wonder whether this is the right place for this discussion.

@AlexRiedler

@chengyin I think it's the right place for discussion, and I would be willing to put in the effort to move this forward if others think it makes sense to pull it out. I am just not sure whether there is any argument for keeping it in (maybe there is something I am not thinking about). I am also on the #elixir-lang channel on freenode and the phoenix channel on https://elixir-lang.slack.com/ if you want to discuss in a more chat-based manner.

@happysalada

I've been experiencing exactly that problem recently, love your idea!

@AlexRiedler

Okay, looking into this more: if we take the npm plugin I mentioned and work around the paths a bit, I can actually get it to work with the existing setup, weirdly enough (and make it not collide), basically supporting both systems depending on which one you want (which means we can defer the decision on whether to remove this from Phoenix). Let me try to build an npm package :D

@chrismccord
Member

Code splitting is a complex topic and entirely coupled to the build tool, so I don't think it's something we can or should handle on our side digest-wise. I also don't feel we should drop our digester in favor of a webpack setup. While we ship with webpack by default, our digester is build-tool agnostic, which makes things like <script src="<%= static_path(@conn, "/js/app.js") %>"> just work, regardless of JS tooling. Making static_path work with a webpack-based digester would couple us directly to webpack, where today's approach is entirely decoupled, right down to the watcher config in dev. Thanks for the issue and example app, but unless this can be accomplished in a way that doesn't couple us directly to webpack, I don't think there's a way to move forward.

@happysalada

Unfortunate. I guess it makes for a good blog post then!

@AlexRiedler

@chengyin @happysalada (sorry chris)

On my example project https://github.com/AlexRiedler/my_project I have something that MOSTLY works; see these commits:

AlexRiedler/my_project@4e69c31
AlexRiedler/my_project@e1610b8

Basically it uses Webpack's entrypoints to make it work. I am not sure whether this covers the chunking you are referring to, but maybe? Otherwise we would need to write some better kind of manifesting in a node module (a rough sketch of the idea follows). Thoughts?
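
For illustration only (this is not the code from those commits), the entrypoint idea might look roughly like this on the Phoenix side, assuming a manifest that maps each Webpack entrypoint to the ordered list of files it needs:

# Rough sketch only; the module name and manifest shape are assumptions,
# e.g. %{"app" => ["js/runtime.abc123.js", "js/app.def456.js"]}.
defmodule MyAppWeb.WebpackEntrypoints do
  # Uses the phoenix_html tag helpers to build the script tags.
  import Phoenix.HTML.Tag

  # Render one <script> tag per file the given entrypoint needs, so the
  # content-hashed names produced by Webpack are the ones actually served.
  def entry_scripts(entrypoints, entry_name) do
    entrypoints
    |> Map.get(entry_name, [])
    |> Enum.map(fn file -> content_tag(:script, "", src: "/" <> file) end)
  end
end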


hisapy commented Dec 20, 2018

Hi @AlexRiedler

I think your approach of reading the Webpack manifest could totally work. Perhaps webpack_static_path would be a better name for your helper.

Moreover, since code splitting is handled entirely by Webpack, the chunk files should keep the names generated by the Webpack compilation. In this sense, running mix phx.digest currently creates unneeded files for the dynamic chunks.

For example, for the dynamic chunk MyProjects.b450b5a3.chunk.js, mix phx.digest generates MyProjects.b450b5a3.chunk-f8b81f951517ec62f736e94fd1bfa1ac.js, which is never used because the Webpack runtime doesn't know about it. I believe that if we can generate the gzipped files and a corresponding manifest from Webpack, then mix phx.digest might not be necessary anymore.

In my case, with CRA apps, I'm using a fork of react-scripts that provides predictable bundle names so we can use them with static_path.


hisapy commented Dec 21, 2018

I just noticed (after seeing this a thousand times) that there is a config:

cache_static_manifest: "priv/static/cache_manifest.json",

Maybe there's no need for another helper(?)

@ssorallen

@hisapy The cache_static_manifest is generated after Phoenix fingerprints the files, but the hashes of those files need to be generated during Webpack compilation so that Webpack can dynamically load the correct filenames. To go from Webpack -> Phoenix -> Webpack there's a chicken-and-egg problem: Webpack would need to recompile if the filenames changed, which would then require Phoenix to rehash, and so on to infinity.

It looks like Webpack needs to handle this compilation entirely.

I realize this thread is 1.5 years old. Have either of you evolved your approach to this? I have yet to find a comprehensive approach to Webpack + Phoenix with dynamic imports.


hisapy commented Jun 30, 2020

Yeah, old thread ... I've been using a fork of react-scripts without any inconveniences as far as I can tell ... maybe the same approach can be used in non-CRA apps.


alizain commented Mar 3, 2021

@chrismccord would you be open to considering a Behaviour for making <script src="<%= static_path(@conn, "/js/app.js") %>"> just work? We could ship a default implementation that uses Phoenix's cache_manifest.json, and provide a Webpack implementation as a separate package that uses Webpack's manifest.json instead.

I'm having the same issue, and I'd love to be able to solve this with something as simple as:

config/prod.exs

config :my_app, MyApp.Endpoint,
  cache_loader: Webpack.CacheLoader,
  cache_manifest: "priv/static/manifest.json"

With this, Phoenix.Endpoint.Supervisor stops caring about the translation logic.
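
For illustration, a sketch of what such a pluggable loader could look like; the behaviour module, callback names, and manifest shape are assumptions (not an existing Phoenix API), building only on the Webpack.CacheLoader name from the config above:

# Hypothetical behaviour the endpoint would delegate manifest handling to;
# neither module below exists in Phoenix today.
defmodule Phoenix.Endpoint.CacheLoader do
  @doc "Parse the configured manifest file into a lookup structure."
  @callback load(manifest_path :: String.t()) :: map()

  @doc "Resolve a logical request path to its digested file name."
  @callback static_lookup(cache :: map(), path :: String.t()) :: String.t()
end

# A Webpack-backed implementation reading Webpack's manifest.json, assumed to
# be a flat map of logical names to content-hashed file names.
defmodule Webpack.CacheLoader do
  @behaviour Phoenix.Endpoint.CacheLoader

  @impl true
  def load(manifest_path) do
    manifest_path |> File.read!() |> Jason.decode!()
  end

  @impl true
  def static_lookup(cache, path) do
    key = String.trim_leading(path, "/")
    "/" <> Map.get(cache, key, key)
  end
end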

P.S. In this scenario, we're not changing anything about the Digester; Webpack does the cache-busting contenthash in the filename. People who use this method would lose out on automatic gzipping, but I think that's a load-balancer/reverse-proxy concern anyway. If we wanted to solve both problems, I'd love to be able to pass options to the Digester telling it to gzip only and not hash.


alizain commented Mar 8, 2021

^ Moved this conversation over to the phoenix-core mailing list. I also have a better understanding of how the static pipeline works now, so the idea has evolved.

https://groups.google.com/g/phoenix-core/c/XURqwE5tZE8/m/rjTC1aKqBAAJ
