Duplicate dataset, duplicate files deposited into different dataverses, same MD5, same UNF for tabular, different DOIs should not be allowed to complete upload #2621
Comments
Says who??
@landreev
I have updated the issue to capture the entire case |
See "4100+ of these failures ARE ACTUALLY THE SAME FILE" at #3675 (comment) ... copies of the same file across many, many datasets. I guess we allow this. |
@sbarbosadataverse can we close this issue? The business logic is by design. |
https://dataverse.harvard.edu/dataset.xhtml?persistentId=doi:10.7910/DVN/GYWNQG
https://dataverse.harvard.edu/dataset.xhtml?persistentId=doi:10.7910/DVN/ZTPW0Y
These are the same datasets, deposited into different dataverses, and they were allowed to be "uploaded and published" in the UI. We need a filter to prevent this from happening.
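The filter requested above could be sketched roughly as a checksum lookup at upload time. This is only an illustration, not Dataverse's actual code: it assumes the installation can query the MD5 checksums of files in already-published datasets, and all names (`find_duplicate_datasets`, the `published` mapping) are hypothetical.

```python
import hashlib


def md5_hex(data: bytes) -> str:
    """Return the MD5 checksum of a byte stream as a hex string."""
    return hashlib.md5(data).hexdigest()


def find_duplicate_datasets(new_md5: str, published: dict) -> list:
    """Return the DOIs of published datasets that already contain a file
    with the same MD5 as the file being uploaded.

    `published` maps a dataset DOI to the set of MD5 checksums of its files
    (a hypothetical index; a real installation would query its database).
    """
    return sorted(doi for doi, checksums in published.items()
                  if new_md5 in checksums)
```

An upload handler could call `find_duplicate_datasets` before finalizing a deposit and either warn the depositor or block publication when the returned list is non-empty; whether that should be a hard block is exactly the policy question debated in this issue.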