
add a check that same file is not copied twice #3712

Merged
merged 1 commit on Jun 26, 2017

Conversation

bestander
Member

Summary

If the same file is copied twice during a bulk copy, then there is a problem in our recursive logic.
Added an invariant to make sure we don't perform extra IO operations.

Test plan

all tests pass

@@ -163,6 +163,7 @@ async function buildActionsForCopy(
   const {src, dest, type} = data;
   const onFresh = data.onFresh || noop;
   const onDone = data.onDone || noop;
+  invariant(!files.has(dest), `The same file ${dest} can't be copied twice in one bulk copy`);
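The idea behind the added line can be sketched as follows. This is a minimal, self-contained illustration, not Yarn's actual implementation: the `invariant` helper and the shape of the copy queue are assumptions for the example.

```javascript
// Hypothetical invariant helper: throws when the condition is false.
function invariant(condition, message) {
  if (!condition) {
    throw new Error(message);
  }
}

// Sketch of the duplicate-destination check: each dest seen during one
// bulk copy is recorded in a Set, and a repeat triggers the invariant
// before any extra IO would happen.
function buildActionsForCopy(queue) {
  const files = new Set(); // destinations seen so far
  for (const {src, dest} of queue) {
    invariant(
      !files.has(dest),
      `The same file ${dest} can't be copied twice in one bulk copy`,
    );
    files.add(dest);
    // ...schedule the actual copy of src -> dest here
  }
}
```

With this sketch, a queue like `[{src: 'a', dest: 'x'}, {src: 'b', dest: 'x'}]` would throw, while distinct destinations pass through untouched.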
Member


Is it efficient? It's O(N²), right?

Member Author


Why would it be?
Set should be implemented as a hash set, and its has operation should be sublinear, O(1) in the ideal case.
https://jsperf.com/array-indexof-vs-set-has
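The difference the benchmark link points at can be sketched directly (an illustrative micro-benchmark, not the linked jsperf page itself): `Array.prototype.indexOf` scans the array on every lookup, O(N) per call, while `Set.prototype.has` is a hash lookup, amortized O(1), so the duplicate check keeps the whole bulk copy near O(N).

```javascript
// Build an array and a Set with the same N string entries, then time
// repeated membership lookups of the worst-case (last) element.
const n = 100000;
const arr = [];
const set = new Set();
for (let i = 0; i < n; i++) {
  arr.push(`file-${i}`);
  set.add(`file-${i}`);
}

console.time('array indexOf'); // O(N) scan per lookup
for (let i = 0; i < 1000; i++) arr.indexOf(`file-${n - 1}`);
console.timeEnd('array indexOf');

console.time('set has'); // amortized O(1) hash lookup
for (let i = 0; i < 1000; i++) set.has(`file-${n - 1}`);
console.timeEnd('set has');
```

On any reasonable engine the Set timing should be orders of magnitude smaller, which is why the array-based version of this check would have been quadratic overall but the Set-based one is not.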

Member


Yeah, I was wrong. I thought it was an array, in which case the check would iterate over the array for each element to copy, hence quadratic complexity, but with a set it's all fine 👍
