Atomically write transformed JS files back to the filesystem #156
Conversation
Instead of directly overwriting each JS file with its new contents, the worker code now writes the new contents to a temporary file and then does a filesystem move operation to overwrite the old file. This avoids the previous behavior where the file had invalid contents while it was being written (because `writeFile` is not atomic).

This fixes a race condition I was seeing in a script I was running on a large codebase: https://github.com/alangpierce/bulk-decaffeinate/blob/master/jscodeshift-scripts/fix-imports.js

The script fixes import statements across the codebase by reading the exports of the referenced files (using `readFileSync` and jscodeshift to parse the other file). Because the other files were also being transformed by jscodeshift, sometimes (nondeterministically) some imports would fail to convert because the referenced file wouldn't be valid at the moment it was read. After applying this fix and re-running the script on my codebase, the problem appears to be completely fixed.
Thank you for your pull request and welcome to our community. We require contributors to sign our Contributor License Agreement, and we don't seem to have you on file. In order for us to review and merge your code, please sign up at https://code.facebook.com/cla - and if you have received this in error or have any questions, please drop us a line at cla@fb.com. Thanks!
love this!
```diff
@@ -146,13 +146,21 @@ function run(data) {
     console.log(out); // eslint-disable-line no-console
   }
   if (!options.dry) {
-    fs.writeFile(file, out, function(err) {
+    const tmpFile = `${file}.tmp.${Math.floor(Math.random() * 1000000)}`;
```
What do you think about using `os.tmpdir` with stricter permissions (0600), or even just the `tmp` module's `file` method, so we don't clutter the pwd in case of failure?
Currently, if the write succeeds but the rename does not, the randomly named transformed file will be left behind.
Thank you for signing our Contributor License Agreement. We can now accept your code for this (and any) Facebook open source project. Thanks!
Sorry for the delay in getting to this. One issue I ran into with this change and other temp-file approaches is that I want to preserve permissions on the original file. In particular, if the original file is an executable script, I want it to stay that way. It doesn't look like there's any way to preserve the destination file's permissions on […]. The write-file-atomic package seems to handle temp file creation (although not using […]).