
fix: incorrect usage of Hash.digest #347

Merged

merged 1 commit into zeit:master on Apr 5, 2019

Conversation

4 participants
@fkhadra
Contributor

fkhadra commented Apr 5, 2019

Hey,

Hash.digest expects an encoding value as a parameter (latin1 | hex | base64).

The absolute path of the file was passed to Hash.digest, which is incorrect. To generate a hash based on the pathname, we have to call Hash.update first.

The desired encoding format is then passed to Hash.digest. This removes the need to call the toString method.


@fkhadra fkhadra requested review from guybedford and styfle as code owners Apr 5, 2019

@codecov-io

codecov-io commented Apr 5, 2019

Codecov Report

Merging #347 into master will not change coverage.
The diff coverage is n/a.

@@           Coverage Diff           @@
##           master     #347   +/-   ##
=======================================
  Coverage   69.37%   69.37%           
=======================================
  Files          12       12           
  Lines         382      382           
=======================================
  Hits          265      265           
  Misses        117      117
Impacted Files Coverage Δ
src/cli.js 62.35% <ø> (ø) ⬆️

Continue to review full report at Codecov.

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update d0494e7...b19baa8.

@styfle

styfle approved these changes Apr 5, 2019

Member

styfle left a comment

This looks correct

@styfle

Member

styfle commented Apr 5, 2019

@guybedford It looks like hashing/caching was added in #162, but does that mean it was never working properly?

@guybedford

Collaborator

guybedford commented Apr 5, 2019

So yes, this never worked... Thanks so much for the fix!

Effectively we have always been outputting the ncc run build to /tmpdir/d41d8cd98f00b204e9800998ecf8427e.

The important thing is that the directory name is unique enough that multiple runs of ncc run can happen in parallel without collision (although even that would only cause issues when deferred asset loads are attempted), but I guess we haven't really hit that specific case for this to be a problem.

We could entirely use a random generator here too, which would probably be even less likely to cause the conflict above.

The other concern with fully random directories was that if the process is interrupted before it can clean up, we leave behind the temporary folder, so there is a risk of file-system bloat, which the predictable hash based on the name was trying to avoid.

Hope that explains somewhat the constraints in play here, and why the issue hasn't been critical.

@fkhadra

Contributor Author

fkhadra commented Apr 5, 2019

I'm glad it helps. Keep up the great work!

@styfle styfle merged commit 1d7bba6 into zeit:master Apr 5, 2019

2 checks passed

ci/circleci: test-linux Your tests passed on CircleCI!
Details
ci/circleci: test-macos Your tests passed on CircleCI!
Details