
Fix filesystem cache not working when package individually is set #1037

Merged
merged 3 commits into serverless-heaven:master on Mar 29, 2022

Conversation

@hieuunguyeen (Contributor) commented Jan 13, 2022

What did you implement:

Fix a bug that occurs when using the webpack cache type "filesystem" with package.individually === true.
The error displayed by webpack:

[webpack.cache.PackFileCacheStrategy] Caching failed for pack: Error: ENOENT: no such file or directory, rename '[...].cache/webpack/default-production/0.pack.gz_' -> '[...].cache/webpack/default-production/0.pack.gz'

Root cause: the webpack filesystem cache writes its cache files to the same location, under the same cache name, no matter how many times it runs. When packaging individually and concurrently, the multiple concurrent webpack runs share one cache location across all functions and overwrite each other's caches.

How did you implement it:

Webpack creates a separate cache folder only when webpackConfig.cache.name differs between runs (link).
Each time webpackCompile() is called, it therefore needs a config with a distinct cache.name.
The simplest way to achieve this is, when packaging individually, to set cache.name on each function's config to the function name, which is preformatted as {serviceName}-{stage}-{functionName} (see the sketch below).
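A minimal sketch of the idea (the helper name and variables here are illustrative, not the exact code added in this PR):

```js
// Illustrative sketch only — not the exact code in this PR.
// When packaging individually, give each per-function compile its own
// cache.name so concurrent webpack runs do not overwrite each other's
// pack files.
const withFunctionCacheName = (baseConfig, functionName) => ({
  ...baseConfig,
  cache: baseConfig.cache && baseConfig.cache.type === 'filesystem'
    ? { ...baseConfig.cache, name: functionName } // e.g. "myservice-production-hello"
    : baseConfig.cache
});
```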

How can we verify it:

Enable cache.type = "filesystem"
Package a service with package.individually = true
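An illustrative setup for the verification steps above (names and options other than the cache settings are placeholders):

```js
// webpack.config.js — hypothetical verification setup
module.exports = {
  mode: 'production',
  target: 'node',
  // Enable webpack's persistent cache; before this fix, concurrent
  // individual packaging runs collided on the same pack files.
  cache: { type: 'filesystem' }
};

// serverless.yml (shown as a comment to keep one language):
//   package:
//     individually: true
```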

Todos:

  • Write tests
  • Write documentation
  • Fix linting errors
  • Make sure code coverage hasn't dropped
  • Provide verification config / commands / resources
  • Enable "Allow edits from maintainers" for this PR
  • Update the messages below

Is this ready for review?: YES
Is it a breaking change?: NO

@vicary (Member) left a comment

Thanks for contributing!

I tried doing this and my 300 lambdas (8 MB each zipped) pumped out >50 GB of cache; using unique cache keys for each entry essentially multiplies your tree-shaken node_modules by the number of entries. I never thought people would find this useful.

Despite the caveat, this does fix a broken feature.

@j0k3r Would you like to comment on the way it passes in the error class? I remember that you made this change for v3 compat.

@vicary vicary requested a review from j0k3r January 13, 2022 19:34
@hieuunguyeen (Contributor, Author) commented

@vicary have you tried gzip compression? Without specifying a compression method, the default cache is written as plain .pack files, which are huge.
The reason I find this useful is that our project has grown so large that build times are horrendous. A shared cache would improve our CI/CD build speed quite a lot.
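For reference, the webpack 5 option being referred to looks roughly like this:

```js
// Sketch: compress the persistent cache pack files (webpack 5).
module.exports = {
  cache: {
    type: 'filesystem',
    compression: 'gzip' // without this, the cache is written as plain .pack files (per the comment above)
  }
};
```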

@vicary (Member) commented Jan 13, 2022

@hieuunguyeen I believe we are talking about the same thing here; the current implementation in serverless-webpack uses MultiCompiler, which does not appear to share cached chunks between build configs.

I think the only way it would work as expected is to rewire the array of configs into a single entry object (roughly the contrast sketched below). This fix only helps from the 2nd build onwards, where each function picks up the cache from its own previous build.
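Roughly, the two shapes being contrasted (entries and options are illustrative only):

```js
// Current shape: one config per function → MultiCompiler, and each
// compiler keeps its own filesystem cache.
module.exports = [
  { entry: { hello: './src/hello.js' } /* ...shared options */ },
  { entry: { world: './src/world.js' } /* ...shared options */ }
];

// The rewiring described above: a single config whose entry object
// lists every function, so one compiler (and one cache) covers them all.
// module.exports = {
//   entry: { hello: './src/hello.js', world: './src/world.js' }
//   /* ...shared options */
// };
```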

@j0k3r (Member) commented Jan 24, 2022

> @j0k3r Would you like to comment on the way it passes in the error class? I remember that you made this change for v3 compat.

I didn't.
@medikoo did it in his PR (not yet merged): https://github.com/serverless-heaven/serverless-webpack/pull/1013/files#diff-54595458990a4e2e3a6b29f7ff36d7f60214fec64e21bdc0a195290552f030e8

@j0k3r (Member) commented Jan 24, 2022

Looks OK to me too, but I think we should wait for the PR adding the modern logs from Serverless v3 before merging this one.

@j0k3r (Member) commented Mar 24, 2022

@hieuunguyeen sorry for the delay, the PR I mentioned got merged today.
Can you rebase against master to fix the conflicts? Thanks 🙏

@j0k3r j0k3r added the awaiting reply Awaiting for a reply from the OP label Mar 24, 2022
@j0k3r j0k3r added this to the 5.7.0 milestone Mar 24, 2022
@j0k3r j0k3r removed the awaiting reply Awaiting for a reply from the OP label Mar 24, 2022
@j0k3r (Member) commented Mar 29, 2022

@hieuunguyeen Could you rebase and squash one more time? And we'll be good to go! Thanks 🙏

@j0k3r (Member) commented Mar 29, 2022

Could you update your tests as we switched from mocha to jest?

@j0k3r j0k3r merged commit 1457a28 into serverless-heaven:master Mar 29, 2022