
5.14 Infinite loop in DDI export with tabular tag #9756

Conversation

@qqmyers (Member) commented Aug 4, 2023

What this PR does / why we need it: As discovered immediately after release, there is an infinite loop in the DDI export when a file has a tabular tag. The fix is self-explanatory.

Which issue(s) this PR closes:

Closes #

Special notes for your reviewer:

Suggestions on how to test this: Upload a tabular file and wait for ingest to finish. Select the file in the dataset file table, use the table-level Edit/Metadata option to add a tabular tag, then publish. With 5.14, this results in a rapidly growing tmp file that will crash the machine once the DDI export starts being generated. With the fix, that should not happen and you should get a DDI export. (An API-based sketch of the same steps follows below.)
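
For reference, a rough native-API equivalent of the UI steps above (a sketch only: the server URL, API token, persistent ID, and file ID are placeholders, and the endpoints are written from memory, so double-check against the API guide):

    export SERVER_URL=http://localhost:8080                # placeholder
    export API_TOKEN=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx  # placeholder
    export PERSISTENT_ID=doi:10.5072/FK2/XXXXXX            # placeholder dataset PID
    export FILE_ID=42                                      # placeholder database ID of the ingested tabular file

    # Add a tabular tag to the already-ingested file (mirrors the Edit/Metadata UI step)
    curl -H "X-Dataverse-key:$API_TOKEN" -X POST \
      -F 'jsonData={"dataFileTags":["Genomics"]}' \
      "$SERVER_URL/api/files/$FILE_ID/metadata"

    # Publish the dataset; on 5.14 the DDI export triggered by publication loops forever
    curl -H "X-Dataverse-key:$API_TOKEN" -X POST \
      "$SERVER_URL/api/datasets/:persistentId/actions/:publish?persistentId=$PERSISTENT_ID&type=major"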

Does this PR introduce a user interface change? If mockups are available, please link/include them here:

Is there a release notes update needed for this change?:

Additional documentation:

@landreev (Contributor) left a comment

AI bot says it's ok!

@coveralls (Coverage Status) commented

Coverage remained the same at 20.371% when pulling 7c9f062 on GlobalDataverseCommunityConsortium:5.14_Infinite_loop_fix into b215f56 on IQSS:develop.

@pdurbin added this to Ready for Review ⏩ in IQSS/dataverse (TO BE RETIRED / DELETED in favor of project 34) via automation, Aug 4, 2023
@pdurbin (Member) left a comment

Looks good, thanks.

IQSS/dataverse (TO BE RETIRED / DELETED in favor of project 34) automation moved this from Ready for Review ⏩ to Ready for QA ⏩, Aug 4, 2023
@kcondon self-assigned this, Aug 4, 2023
@kcondon merged commit 509671c into IQSS:develop, Aug 4, 2023 (8 of 9 checks passed)
IQSS/dataverse (TO BE RETIRED / DELETED in favor of project 34) automation moved this from Ready for QA ⏩ to Done 🚀, Aug 4, 2023
@pdurbin (Member) commented Aug 4, 2023

I was able to reproduce and test this bug with this change to DatasetsIT:

diff --git a/src/test/java/edu/harvard/iq/dataverse/api/DatasetsIT.java b/src/test/java/edu/harvard/iq/dataverse/api/DatasetsIT.java
index 866524a260..a2c3a19c39 100644
--- a/src/test/java/edu/harvard/iq/dataverse/api/DatasetsIT.java
+++ b/src/test/java/edu/harvard/iq/dataverse/api/DatasetsIT.java
@@ -2391,6 +2391,13 @@ createDataset = UtilIT.createRandomDatasetViaNativeApi(dataverse1Alias, apiToken
 
         assertTrue("Failed test if Ingest Lock exceeds max duration " + pathToFile, UtilIT.sleepForLock(datasetId.longValue(), "Ingest", authorApiToken, UtilIT.MAXIMUM_INGEST_LOCK_DURATION));
 
+        JsonObjectBuilder json = Json.createObjectBuilder()
+                .add("dataFileTags", Json.createArrayBuilder().add("Genomics"));
+
+        Response addTabularTag = UtilIT.updateFileMetadata(fileId, json.build().toString(), authorApiToken);
+        addTabularTag.prettyPrint();
+        addTabularTag.then().assertThat().statusCode(OK.getStatusCode());
+
         Response restrictFile = UtilIT.restrictFile(fileId, true, authorApiToken);
         restrictFile.prettyPrint();
         restrictFile.then().assertThat().statusCode(OK.getStatusCode());

Running du -sh docker-dev-volumes showed the volume growing continuously.
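
A convenient way to watch that growth while the export runs (just a generic one-liner, nothing Dataverse-specific):

    # Re-run du every 2 seconds to watch the dev volumes grow during the buggy export
    watch -n 2 du -sh docker-dev-volumes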

I stopped the containers, switched to this PR, and the bug is gone.
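
For anyone re-verifying, one way to confirm the export now completes is to request it directly once the dataset is published (a sketch; the server URL and persistent ID are placeholders):

    # Should return the DDI codeBook XML promptly instead of looping
    curl "$SERVER_URL/api/datasets/export?exporter=ddi&persistentId=doi:10.5072/FK2/XXXXXX"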

@pdurbin added this to the 5.14 milestone, Aug 4, 2023