From 5b379195c6a4658032340fe9c0c882d38c15ca76 Mon Sep 17 00:00:00 2001
From: Alexis Guanche <74431162+Saixel@users.noreply.github.com>
Date: Thu, 6 Mar 2025 12:09:32 +0000
Subject: [PATCH 01/21] Merge develop without revert
---
.../11217-source-name-harvesting-client.md | 13 ++
.../source/_static/api/harvesting-client.json | 11 ++
doc/sphinx-guides/source/api/native-api.rst | 90 ++++++-----
.../source/installation/config.rst | 3 +
docker-compose-dev.yml | 1 +
.../iq/dataverse/HarvestingClientsPage.java | 140 ++++++++----------
.../iq/dataverse/api/HarvestingClients.java | 5 +-
.../harvest/client/HarvestingClient.java | 19 ++-
.../iq/dataverse/search/IndexServiceBean.java | 4 +-
.../iq/dataverse/util/json/JsonParser.java | 1 +
.../iq/dataverse/util/json/JsonPrinter.java | 1 +
src/main/java/propertyFiles/Bundle.properties | 2 +
src/main/resources/db/migration/V6.5.0.7.sql | 2 +
src/main/webapp/harvestclients.xhtml | 14 ++
.../iq/dataverse/api/HarvestingClientsIT.java | 17 ++-
15 files changed, 198 insertions(+), 125 deletions(-)
create mode 100644 doc/release-notes/11217-source-name-harvesting-client.md
create mode 100644 doc/sphinx-guides/source/_static/api/harvesting-client.json
create mode 100644 src/main/resources/db/migration/V6.5.0.7.sql
diff --git a/doc/release-notes/11217-source-name-harvesting-client.md b/doc/release-notes/11217-source-name-harvesting-client.md
new file mode 100644
index 00000000000..53347de694c
--- /dev/null
+++ b/doc/release-notes/11217-source-name-harvesting-client.md
@@ -0,0 +1,13 @@
+### Metadata Source Facet Can Now Differentiate Between Harvested Sources
+
+The behavior of the feature flag `index-harvested-metadata-source` and the "Metadata Source" facet, which were added and updated, respectively, in [Dataverse 6.3](https://github.com/IQSS/dataverse/releases/tag/v6.3) (through pull requests #10464 and #10651), has been updated. A new field called "Source Name" has been added to harvesting clients.
+
+Before Dataverse 6.3, all harvested content (datasets and files) appeared together under "Harvested" under the "Metadata Source" facet. This is still the behavior of Dataverse out of the box. Since Dataverse 6.3, enabling the `index-harvested-metadata-source` feature flag (and reindexing) resulted in harvested content appearing under the nickname for whatever harvesting client was used to bring in the content. This meant that instead of having all harvested content lumped together under "Harvested", content would appear under "client1", "client2", etc.
+
+Now, as of this release, enabling the `index-harvested-metadata-source` feature flag, populating a new field for harvesting clients called "Source Name" ("sourceName" in the [API](https://dataverse-guide--11217.org.readthedocs.build/en/11217/api/native-api.html#create-a-harvesting-client)), and reindexing (see upgrade instructions below) results in the source name appearing under the "Metadata Source" facet rather than the harvesting client nickname. This gives you more control over the name that appears under the "Metadata Source" facet and allows you to group harvested content from various harvesting clients under the same name if you wish (by reusing the same source name).
+
+Previously, `index-harvested-metadata-source` was not documented in the guides, but now you can find information about it under [Feature Flags](https://dataverse-guide--11217.org.readthedocs.build/en/11217/installation/config.html#feature-flags). See also #10217 and #11217.
+
+### Upgrade instructions
+
+If you have enabled the `dataverse.feature.index-harvested-metadata-source` feature flag and given some of your harvesting clients a source name, you should reindex to have those source names appear under the "Metadata Source" facet.
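As a sketch of that upgrade step, assuming a local installation at `http://localhost:8080` (the standard in-place reindex endpoint is `/api/admin/index`):

```shell
# Reindex in place so harvested content picks up the new source names.
# SERVER_URL is a placeholder for your installation's base URL.
export SERVER_URL=http://localhost:8080

# Ignore the failure if no server is running where this is executed.
curl "$SERVER_URL/api/admin/index" || true
```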
diff --git a/doc/sphinx-guides/source/_static/api/harvesting-client.json b/doc/sphinx-guides/source/_static/api/harvesting-client.json
new file mode 100644
index 00000000000..82a817fc38f
--- /dev/null
+++ b/doc/sphinx-guides/source/_static/api/harvesting-client.json
@@ -0,0 +1,11 @@
+{
+ "nickName": "zenodo",
+ "dataverseAlias": "zenodoHarvested",
+ "harvestUrl": "https://zenodo.org/oai2d",
+ "archiveUrl": "https://zenodo.org",
+ "archiveDescription": "Moissonné depuis la collection LMOPS de l'entrepôt Zenodo. En cliquant sur ce jeu de données, vous serez redirigé vers Zenodo.",
+ "metadataFormat": "oai_dc",
+ "customHeaders": "x-oai-api-key: xxxyyyzzz",
+ "set": "user-lmops",
+ "allowHarvestingMissingCVV": true
+}
diff --git a/doc/sphinx-guides/source/api/native-api.rst b/doc/sphinx-guides/source/api/native-api.rst
index d8eddc34a01..0be446ec2ce 100644
--- a/doc/sphinx-guides/source/api/native-api.rst
+++ b/doc/sphinx-guides/source/api/native-api.rst
@@ -5556,7 +5556,7 @@ Create a Harvesting Set
To create a harvesting set you must supply a JSON file that contains the following fields:
-- Name: Alpha-numeric may also contain -, _, or %, but no spaces. Must also be unique in the installation.
+- Name: Alpha-numeric may also contain -, _, or %, but no spaces. It must also be unique in the installation.
- Definition: A search query to select the datasets to be harvested. For example, a query containing authorName:YYY would include all datasets where ‘YYY’ is the authorName.
- Description: Text that describes the harvesting set. The description appears in the Manage Harvesting Sets dashboard and in API responses. This field is optional.
@@ -5652,20 +5652,43 @@ The following API can be used to create and manage "Harvesting Clients". A Harve
List All Configured Harvesting Clients
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
-Shows all the Harvesting Clients configured::
+Shows all the harvesting clients configured.
- GET http://$SERVER/api/harvest/clients/
+.. note:: See :ref:`curl-examples-and-environment-variables` if you are unfamiliar with the use of export below.
+
+.. code-block:: bash
+
+ export SERVER_URL=https://demo.dataverse.org
+
+ curl "$SERVER_URL/api/harvest/clients"
+
+The fully expanded example above (without the environment variables) looks like this:
+
+.. code-block:: bash
+
+ curl "https://demo.dataverse.org/api/harvest/clients"
Show a Specific Harvesting Client
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
-Shows a Harvesting Client with a defined nickname::
+Shows a harvesting client by nickname.
- GET http://$SERVER/api/harvest/clients/$nickname
+.. code-block:: bash
+
+ export SERVER_URL=https://demo.dataverse.org
+ export NICKNAME=myclient
+
+ curl "$SERVER_URL/api/harvest/clients/$NICKNAME"
+
+The fully expanded example above (without the environment variables) looks like this:
.. code-block:: bash
- curl "http://localhost:8080/api/harvest/clients/myclient"
+ curl "https://demo.dataverse.org/api/harvest/clients/myclient"
+
+The output will look something like the following.
+
+.. code-block:: bash
{
"status":"OK",
@@ -5681,6 +5704,7 @@ Shows a Harvesting Client with a defined nickname::
"type": "oai",
"dataverseAlias": "fooData",
"nickName": "myClient",
+ "sourceName": "",
"set": "fooSet",
   "useOaiIdentifiersAsPids": false,
"schedule": "none",
@@ -5694,16 +5718,12 @@ Shows a Harvesting Client with a defined nickname::
}
+.. _create-a-harvesting-client:
+
Create a Harvesting Client
~~~~~~~~~~~~~~~~~~~~~~~~~~
-
-To create a new harvesting client::
-
- POST http://$SERVER/api/harvest/clients/$nickname
-
-``nickName`` is the name identifying the new client. It should be alpha-numeric and may also contain -, _, or %, but no spaces. Must also be unique in the installation.
-You must supply a JSON file that describes the configuration, similarly to the output of the GET API above. The following fields are mandatory:
+To create a harvesting client you must supply a JSON file that describes the configuration, similarly to the output of the GET API above. The following fields are mandatory:
- dataverseAlias: The alias of an existing collection where harvested datasets will be deposited
- harvestUrl: The URL of the remote OAI archive
@@ -5712,6 +5732,7 @@ You must supply a JSON file that describes the configuration, similarly to the o
The following optional fields are supported:
+- sourceName: When ``index-harvested-metadata-source`` is enabled (see :ref:`feature-flags`), sourceName will override the nickname in the Metadata Source facet. It can be used to group the content from many harvesting clients under the same name.
- archiveDescription: What the name suggests. If not supplied, will default to "This Dataset is harvested from our partners. Clicking the link will take you directly to the archival source of the data."
- set: The OAI set on the remote server. If not supplied, will default to none, i.e., "harvest everything".
- style: Defaults to "default" - a generic OAI archive. (Make sure to use "dataverse" when configuring harvesting from another Dataverse installation).
@@ -5720,38 +5741,35 @@ The following optional fields are supported:
- useOaiIdentifiersAsPids: Defaults to false; if set to true, the harvester will attempt to use the identifier from the OAI-PMH record header as the **first choice** for the persistent id of the harvested dataset. When set to false, Dataverse will still attempt to use this identifier, but only if none of the ``<dc:identifier>`` entries in the OAI_DC record contain a valid persistent id (this is new as of v6.5).
Generally, the API will accept the output of the GET version of the API for an existing client as valid input, but some fields will be ignored. For example, as of this writing there is no way to configure a harvesting schedule via this API.
-
-An example JSON file would look like this::
- {
- "nickName": "zenodo",
- "dataverseAlias": "zenodoHarvested",
- "harvestUrl": "https://zenodo.org/oai2d",
- "archiveUrl": "https://zenodo.org",
- "archiveDescription": "Moissonné depuis la collection LMOPS de l'entrepôt Zenodo. En cliquant sur ce jeu de données, vous serez redirigé vers Zenodo.",
- "metadataFormat": "oai_dc",
- "customHeaders": "x-oai-api-key: xxxyyyzzz",
- "set": "user-lmops",
- "allowHarvestingMissingCVV":true
- }
+You can download this :download:`harvesting-client.json <../_static/api/harvesting-client.json>` file to use as a starting point.
-Something important to keep in mind about this API is that, unlike the harvesting clients GUI, it will create a client with the values supplied without making any attempts to validate them in real time. In other words, for the `harvestUrl` it will accept anything that looks like a well-formed url, without making any OAI calls to verify that the name of the set and/or the metadata format entered are supported by it. This is by design, to give an admin an option to still be able to create a client, in a rare case when it cannot be done via the GUI because of some real time failures in an exchange with an otherwise valid OAI server. This however puts the responsibility on the admin to supply the values already confirmed to be valid.
+.. literalinclude:: ../_static/api/harvesting-client.json
+Something important to keep in mind about this API is that, unlike the harvesting clients GUI, it creates a client with the supplied values without attempting to validate them in real time. In other words, for the `harvestUrl` it will accept anything that looks like a well-formed URL, without making any OAI calls to verify that the set name and/or metadata format entered are supported by the server. This is by design, so that an admin can still create a client in the rare case when it cannot be done via the GUI because of some real-time failure in an exchange with an otherwise valid OAI server. This, however, puts the responsibility on the admin to supply values already confirmed to be valid.
.. note:: See :ref:`curl-examples-and-environment-variables` if you are unfamiliar with the use of export below.
+
+``nickName`` in the JSON file and ``$NICKNAME`` in the URL path below both identify the new client. The nickname should be alpha-numeric and may also contain -, _, or %, but no spaces. It must be unique in the installation.
+
.. code-block:: bash
export API_TOKEN=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
export SERVER_URL=http://localhost:8080
+ export NICKNAME=zenodo
- curl -H "X-Dataverse-key:$API_TOKEN" -X POST -H "Content-Type: application/json" "$SERVER_URL/api/harvest/clients/zenodo" --upload-file client.json
+ curl -H "X-Dataverse-key:$API_TOKEN" -X POST -H "Content-Type: application/json" "$SERVER_URL/api/harvest/clients/$NICKNAME" --upload-file harvesting-client.json
The fully expanded example above (without the environment variables) looks like this:
.. code-block:: bash
- curl -H "X-Dataverse-key:xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx" -X POST -H "Content-Type: application/json" "http://localhost:8080/api/harvest/clients/zenodo" --upload-file "client.json"
+ curl -H "X-Dataverse-key:xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx" -X POST -H "Content-Type: application/json" "http://localhost:8080/api/harvest/clients/zenodo" --upload-file "harvesting-client.json"
+
+The output will look something like the following.
+
+.. code-block:: bash
{
"status": "OK",
@@ -5785,15 +5803,21 @@ Similar to the API above, using the same JSON format, but run on an existing cli
Delete a Harvesting Client
~~~~~~~~~~~~~~~~~~~~~~~~~~
-Self-explanatory:
-
.. code-block:: bash
- curl -H "X-Dataverse-key:xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx" -X DELETE "http://localhost:8080/api/harvest/clients/$nickName"
+ export API_TOKEN=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx
+ export SERVER_URL=http://localhost:8080
+ export NICKNAME=zenodo
-Only users with superuser permissions may delete harvesting clients.
+ curl -H "X-Dataverse-key:$API_TOKEN" -X DELETE "$SERVER_URL/api/harvest/clients/$NICKNAME"
+The fully expanded example above (without the environment variables) looks like this:
+
+.. code-block:: bash
+ curl -H "X-Dataverse-key:xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx" -X DELETE "http://localhost:8080/api/harvest/clients/zenodo"
+
+Only users with superuser permissions may delete harvesting clients.
.. _pids-api:
diff --git a/doc/sphinx-guides/source/installation/config.rst b/doc/sphinx-guides/source/installation/config.rst
index 0f5aec8c942..6c683a82ba0 100644
--- a/doc/sphinx-guides/source/installation/config.rst
+++ b/doc/sphinx-guides/source/installation/config.rst
@@ -3493,6 +3493,9 @@ please find all known feature flags below. Any of these flags can be activated u
* - globus-use-experimental-async-framework
- Activates a new experimental implementation of Globus polling of ongoing remote data transfers that does not rely on the instance staying up continuously for the duration of the transfers and saves the state information about Globus upload requests in the database. Added in v6.4. Affects :ref:`:GlobusPollingInterval`. Note that the JVM option :ref:`dataverse.files.globus-monitoring-server` described above must also be enabled on one (and only one, in a multi-node installation) Dataverse instance.
- ``Off``
+ * - index-harvested-metadata-source
+ - Index the nickname or the source name (see the optional ``sourceName`` field in :ref:`create-a-harvesting-client`) of the harvesting client as the "metadata source" of harvested datasets and files. If enabled, the Metadata Source facet will show separate groupings of the content harvested from different sources (by harvesting client nickname or source name) instead of the default behavior where there is one "Harvested" grouping for all harvested content.
+ - ``Off``
**Note:** Feature flags can be set via any `supported MicroProfile Config API source`_, e.g. the environment variable
``DATAVERSE_FEATURE_XXX`` (e.g. ``DATAVERSE_FEATURE_API_SESSION_AUTH=1``). These environment variables can be set in your shell before starting Payara. If you are using :doc:`Docker for development `, you can set them in the `docker compose `_ file.
diff --git a/docker-compose-dev.yml b/docker-compose-dev.yml
index fdde14cdee5..0de90f7ec2a 100644
--- a/docker-compose-dev.yml
+++ b/docker-compose-dev.yml
@@ -17,6 +17,7 @@ services:
SKIP_DEPLOY: "${SKIP_DEPLOY}"
DATAVERSE_JSF_REFRESH_PERIOD: "1"
DATAVERSE_FEATURE_API_BEARER_AUTH: "1"
+ DATAVERSE_FEATURE_INDEX_HARVESTED_METADATA_SOURCE: "1"
DATAVERSE_FEATURE_API_BEARER_AUTH_PROVIDE_MISSING_CLAIMS: "1"
DATAVERSE_MAIL_SYSTEM_EMAIL: "dataverse@localhost"
DATAVERSE_MAIL_MTA_HOST: "smtp"
diff --git a/src/main/java/edu/harvard/iq/dataverse/HarvestingClientsPage.java b/src/main/java/edu/harvard/iq/dataverse/HarvestingClientsPage.java
index f008db1403f..1effd137e0e 100644
--- a/src/main/java/edu/harvard/iq/dataverse/HarvestingClientsPage.java
+++ b/src/main/java/edu/harvard/iq/dataverse/HarvestingClientsPage.java
@@ -78,7 +78,7 @@ public class HarvestingClientsPage implements java.io.Serializable {
private Long dataverseId = null;
private HarvestingClient selectedClient;
private boolean setListTruncated = false;
-
+
//private static final String solrDocIdentifierDataset = "dataset_";
public enum PageMode {
@@ -242,6 +242,7 @@ public void editClient(HarvestingClient harvestingClient) {
setSelectedClient(harvestingClient);
this.newNickname = harvestingClient.getName();
+ this.sourceName = harvestingClient.getSourceName();
this.newHarvestingUrl = harvestingClient.getHarvestingUrl();
this.customHeader = harvestingClient.getCustomHttpHeaders();
this.initialSettingsValidated = false;
@@ -323,10 +324,9 @@ public void deleteClient() {
}
public void createClient(ActionEvent ae) {
-
- HarvestingClient newHarvestingClient = new HarvestingClient(); // will be set as type OAI by default
-
- newHarvestingClient.setName(newNickname);
+
+ // will be set as type OAI by default
+ HarvestingClient newHarvestingClient = fillHarvestingClient(new HarvestingClient());
if (getSelectedDestinationDataverse() == null) {
JsfHelper.JH.addMessage(FacesMessage.SEVERITY_ERROR,BundleUtil.getStringFromBundle("harvest.create.error"));
@@ -338,35 +338,6 @@ public void createClient(ActionEvent ae) {
}
getSelectedDestinationDataverse().getHarvestingClientConfigs().add(newHarvestingClient);
- newHarvestingClient.setHarvestingUrl(newHarvestingUrl);
- newHarvestingClient.setCustomHttpHeaders(customHeader);
- if (!StringUtils.isEmpty(newOaiSet)) {
- newHarvestingClient.setHarvestingSet(newOaiSet);
- }
- newHarvestingClient.setMetadataPrefix(newMetadataFormat);
- newHarvestingClient.setHarvestStyle(newHarvestingStyle);
-
- if (isNewHarvestingScheduled()) {
- newHarvestingClient.setScheduled(true);
-
- if (isNewHarvestingScheduledWeekly()) {
- newHarvestingClient.setSchedulePeriod(HarvestingClient.SCHEDULE_PERIOD_WEEKLY);
- if (getWeekDayNumber() == null) {
- // create a "week day is required..." error message, etc.
- // but we may be better off not even giving them an opportunity
- // to leave the field blank - ?
- }
- newHarvestingClient.setScheduleDayOfWeek(getWeekDayNumber());
- } else {
- newHarvestingClient.setSchedulePeriod(HarvestingClient.SCHEDULE_PERIOD_DAILY);
- }
-
- if (getHourOfDay() == null) {
- // see the comment above, about the day of week. same here.
- }
- newHarvestingClient.setScheduleHourOfDay(getHourOfDay());
- }
-
// make default archive url (used to generate links pointing back to the
// archival sources, when harvested datasets are displayed in search results),
// from the harvesting url:
@@ -412,51 +383,9 @@ public void createClient(ActionEvent ae) {
// this saves an existing client that the user has edited:
public void saveClient(ActionEvent ae) {
-
- HarvestingClient harvestingClient = getSelectedClient();
-
- if (harvestingClient == null) {
- // TODO:
- // tell the user somehow that the client cannot be saved, and advise
- // them to save the settings they have entered.
- // as of now - we will show an error message, but only after the
- // edit form has been closed.
- }
-
- // nickname is not editable for existing clients:
- //harvestingClient.setName(newNickname);
- harvestingClient.setHarvestingUrl(newHarvestingUrl);
- harvestingClient.setCustomHttpHeaders(customHeader);
- harvestingClient.setHarvestingSet(newOaiSet);
- harvestingClient.setMetadataPrefix(newMetadataFormat);
- harvestingClient.setHarvestStyle(newHarvestingStyle);
-
- if (isNewHarvestingScheduled()) {
- harvestingClient.setScheduled(true);
-
- if (isNewHarvestingScheduledWeekly()) {
- harvestingClient.setSchedulePeriod(HarvestingClient.SCHEDULE_PERIOD_WEEKLY);
- if (getWeekDayNumber() == null) {
- // create a "week day is required..." error message, etc.
- // but we may be better off not even giving them an opportunity
- // to leave the field blank - ?
- }
- harvestingClient.setScheduleDayOfWeek(getWeekDayNumber());
- } else {
- harvestingClient.setSchedulePeriod(HarvestingClient.SCHEDULE_PERIOD_DAILY);
- }
-
- if (getHourOfDay() == null) {
- // see the comment above, about the day of week. same here.
- }
- harvestingClient.setScheduleHourOfDay(getHourOfDay());
- } else {
- harvestingClient.setScheduled(false);
- }
-
- // will try to save it now:
-
try {
+ HarvestingClient harvestingClient = fillHarvestingClient(getSelectedClient());
+
harvestingClient = engineService.submit( new UpdateHarvestingClientCommand(dvRequestService.getDataverseRequest(), harvestingClient));
configuredHarvestingClients = harvestingClientService.getAllHarvestingClients();
@@ -477,9 +406,50 @@ public void saveClient(ActionEvent ae) {
}
setPageMode(PageMode.VIEW);
-
+
}
-
+
+    /**
+     * Updates the basic fields of a new or existing HarvestingClient with the values entered in the UI.
+     * @param harvestingClient new or existing harvesting client to update
+     * @return the harvesting client with updated values
+     */
+ private HarvestingClient fillHarvestingClient(HarvestingClient harvestingClient) {
+        // update the nickname only if this is a new client; the nickname is not editable for existing clients
+ if(harvestingClient.getId() == null) {
+ harvestingClient.setName(newNickname);
+ }
+ harvestingClient.setSourceName(sourceName);
+ harvestingClient.setHarvestingUrl(newHarvestingUrl);
+ harvestingClient.setCustomHttpHeaders(customHeader);
+ if (!StringUtils.isEmpty(newOaiSet)) {
+ harvestingClient.setHarvestingSet(newOaiSet);
+ }
+ harvestingClient.setMetadataPrefix(newMetadataFormat);
+ harvestingClient.setHarvestStyle(newHarvestingStyle);
+
+ harvestingClient.setScheduled(isNewHarvestingScheduled());
+ if (isNewHarvestingScheduled()) {
+ if (isNewHarvestingScheduledWeekly()) {
+ harvestingClient.setSchedulePeriod(HarvestingClient.SCHEDULE_PERIOD_WEEKLY);
+ if (getWeekDayNumber() == null) {
+ // create a "week day is required..." error message, etc.
+ // but we may be better off not even giving them an opportunity
+ // to leave the field blank - ?
+ }
+ harvestingClient.setScheduleDayOfWeek(getWeekDayNumber());
+ } else {
+ harvestingClient.setSchedulePeriod(HarvestingClient.SCHEDULE_PERIOD_DAILY);
+ }
+
+ if (getHourOfDay() == null) {
+ // see the comment above, about the day of week. same here.
+ }
+ harvestingClient.setScheduleHourOfDay(getHourOfDay());
+ }
+ return harvestingClient;
+ }
+
public void validateMetadataFormat(FacesContext context, UIComponent toValidate, Object rawValue) {
String value = (String) rawValue;
UIInput input = (UIInput) toValidate;
@@ -717,6 +687,7 @@ public void backToStepThree() {
UIInput selectedDataverseMenu;
private String newNickname = "";
+ private String sourceName = "";
private String newHarvestingUrl = "";
private String customHeader = null;
private boolean initialSettingsValidated = false;
@@ -741,6 +712,7 @@ public void backToStepThree() {
public void initNewClient(ActionEvent ae) {
//this.selectedClient = new HarvestingClient();
this.newNickname = "";
+ this.sourceName = "";
this.newHarvestingUrl = "";
this.customHeader = null;
this.initialSettingsValidated = false;
@@ -842,6 +814,14 @@ public int getHarvestingScheduleRadio() {
public void setHarvestingScheduleRadio(int harvestingScheduleRadio) {
this.harvestingScheduleRadio = harvestingScheduleRadio;
}
+
+ public String getSourceName() {
+ return sourceName;
+ }
+
+ public void setSourceName(String sourceName) {
+ this.sourceName = sourceName;
+ }
public boolean isNewHarvestingScheduled() {
return this.harvestingScheduleRadio != harvestingScheduleRadioNone;
diff --git a/src/main/java/edu/harvard/iq/dataverse/api/HarvestingClients.java b/src/main/java/edu/harvard/iq/dataverse/api/HarvestingClients.java
index dfc9f48dd1a..e4300099244 100644
--- a/src/main/java/edu/harvard/iq/dataverse/api/HarvestingClients.java
+++ b/src/main/java/edu/harvard/iq/dataverse/api/HarvestingClients.java
@@ -278,7 +278,10 @@ public Response modifyHarvestingClient(@Context ContainerRequestContext crc, Str
// Go through the supported editable fields and update the client accordingly:
// TODO: We may want to reevaluate whether we really want/need *all*
// of these fields to be editable.
-
+
+ if (newHarvestingClient.getSourceName() != null) {
+ harvestingClient.setSourceName(newHarvestingClient.getSourceName());
+ }
if (newHarvestingClient.getHarvestingUrl() != null) {
harvestingClient.setHarvestingUrl(newHarvestingClient.getHarvestingUrl());
}
diff --git a/src/main/java/edu/harvard/iq/dataverse/harvest/client/HarvestingClient.java b/src/main/java/edu/harvard/iq/dataverse/harvest/client/HarvestingClient.java
index e73310650b4..9d949b6a0dd 100644
--- a/src/main/java/edu/harvard/iq/dataverse/harvest/client/HarvestingClient.java
+++ b/src/main/java/edu/harvard/iq/dataverse/harvest/client/HarvestingClient.java
@@ -29,13 +29,11 @@
import jakarta.persistence.NamedQueries;
import jakarta.persistence.NamedQuery;
import jakarta.persistence.OneToMany;
-import jakarta.persistence.OneToOne;
import jakarta.persistence.OrderBy;
import jakarta.persistence.Table;
-import jakarta.persistence.Temporal;
-import jakarta.persistence.TemporalType;
import jakarta.validation.constraints.Pattern;
import jakarta.validation.constraints.Size;
+import org.apache.commons.lang3.StringUtils;
import org.hibernate.validator.constraints.NotBlank;
/**
@@ -192,6 +190,20 @@ public void setHarvestingUrl(String harvestingUrl) {
this.harvestingUrl = harvestingUrl.trim();
}
}
+
+ private String sourceName;
+
+ public String getSourceName() {
+ return sourceName;
+ }
+
+ public void setSourceName(String sourceName) {
+ this.sourceName = sourceName;
+ }
+
+ public String getMetadataSource() {
+ return StringUtils.isNotBlank(this.sourceName) ? this.sourceName : this.name;
+ }
private String archiveUrl;
@@ -476,5 +488,4 @@ public boolean equals(Object object) {
public String toString() {
return "edu.harvard.iq.dataverse.harvest.client.HarvestingClient[ id=" + id + " ]";
}
-
}
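The `getMetadataSource()` fallback added above (a non-blank source name wins, otherwise the client nickname is used) can be sketched outside Java. This is an illustrative re-implementation, not code from the patch:

```shell
# Mirrors HarvestingClient.getMetadataSource(): a non-blank sourceName wins,
# otherwise the client nickname is used as the metadata source.
metadata_source() {
  local source_name="$1" nickname="$2"
  # Treat whitespace-only values as blank, like StringUtils.isNotBlank
  if [ -n "${source_name// /}" ]; then
    echo "$source_name"
  else
    echo "$nickname"
  fi
}

metadata_source ""       "client1"   # prints: client1
metadata_source "Zenodo" "client1"   # prints: Zenodo
```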
diff --git a/src/main/java/edu/harvard/iq/dataverse/search/IndexServiceBean.java b/src/main/java/edu/harvard/iq/dataverse/search/IndexServiceBean.java
index a2149b44c41..5fbcd6ea520 100644
--- a/src/main/java/edu/harvard/iq/dataverse/search/IndexServiceBean.java
+++ b/src/main/java/edu/harvard/iq/dataverse/search/IndexServiceBean.java
@@ -1005,7 +1005,7 @@ public SolrInputDocuments toSolrDocs(IndexableDataset indexableDataset, Set
+#{bundle['harvestclients.newClientDialog.nickname.helptext']}
+
+
diff --git a/src/test/java/edu/harvard/iq/dataverse/api/HarvestingClientsIT.java b/src/test/java/edu/harvard/iq/dataverse/api/HarvestingClientsIT.java
index f84c5ad1a20..7e06d6a9181 100644
--- a/src/test/java/edu/harvard/iq/dataverse/api/HarvestingClientsIT.java
+++ b/src/test/java/edu/harvard/iq/dataverse/api/HarvestingClientsIT.java
@@ -184,14 +184,19 @@ public void testCreateEditDeleteClient() throws InterruptedException {
@Test
public void testHarvestingClientRun_AllowHarvestingMissingCVV_False() throws InterruptedException {
- harvestingClientRun(false);
+ harvestingClientRun(false, false);
}
@Test
public void testHarvestingClientRun_AllowHarvestingMissingCVV_True() throws InterruptedException {
- harvestingClientRun(true);
+ harvestingClientRun(true, false);
}
- private void harvestingClientRun(boolean allowHarvestingMissingCVV) throws InterruptedException {
+ @Test
+ public void testHarvestingClientRun_AllowHarvestingMissingCVV_True_WithSourceName() throws InterruptedException {
+ harvestingClientRun(true, true);
+ }
+
+ private void harvestingClientRun(boolean allowHarvestingMissingCVV, boolean testingSourceName) throws InterruptedException {
int expectedNumberOfSetsHarvested = allowHarvestingMissingCVV ? DATASETS_IN_CONTROL_SET : DATASETS_IN_CONTROL_SET - 1;
// This test will create a client and attempt to perform an actual
@@ -203,16 +208,18 @@ private void harvestingClientRun(boolean allowHarvestingMissingCVV) throws Inte
// from confirming the expected HTTP status code.
String nickName = "h" + UtilIT.getRandomString(6);
+ String sourceName = testingSourceName ? "AnotherSourceName" : "";
clientApiPath = String.format(HARVEST_CLIENTS_API+"%s", nickName);
String clientJson = String.format("{\"dataverseAlias\":\"%s\","
+ "\"type\":\"oai\","
+ + "\"sourceName\":\"%s\","
+ "\"harvestUrl\":\"%s\","
+ "\"archiveUrl\":\"%s\","
+ "\"set\":\"%s\","
+ "\"allowHarvestingMissingCVV\":%s,"
+ "\"metadataFormat\":\"%s\"}",
- harvestCollectionAlias, HARVEST_URL, ARCHIVE_URL, CONTROL_OAI_SET, allowHarvestingMissingCVV, HARVEST_METADATA_FORMAT);
+ harvestCollectionAlias, sourceName, HARVEST_URL, ARCHIVE_URL, CONTROL_OAI_SET, allowHarvestingMissingCVV, HARVEST_METADATA_FORMAT);
Response createResponse = given()
.header(UtilIT.API_TOKEN_HTTP_HEADER, adminUserAPIKey)
@@ -290,7 +297,7 @@ private void harvestingClientRun(boolean allowHarvestingMissingCVV) throws Inte
Thread.sleep(1000L);
        // Requires the index-harvested-metadata-source feature flag to be enabled to search on the nickName
// Otherwise, the search must be performed with metadataSource:Harvested
- Response searchHarvestedDatasets = UtilIT.search("metadataSource:" + nickName, normalUserAPIKey);
+ Response searchHarvestedDatasets = UtilIT.search("metadataSource:" + (testingSourceName ? sourceName : nickName), normalUserAPIKey);
searchHarvestedDatasets.then().assertThat().statusCode(OK.getStatusCode());
searchHarvestedDatasets.prettyPrint();
// Get all global ids for cleanup
From 404eb0b06275765c029a4bdf827ab09b76b51e6a Mon Sep 17 00:00:00 2001
From: Alexis Guanche <74431162+Saixel@users.noreply.github.com>
Date: Thu, 6 Mar 2025 12:19:28 +0000
Subject: [PATCH 02/21] Restore displayOnCreate field option changes
---
.../10476-display-on-create-field-option.md | 6 +
doc/sphinx-guides/source/api/native-api.rst | 7 +-
.../iq/dataverse/DatasetFieldServiceBean.java | 17 ++-
.../edu/harvard/iq/dataverse/DatasetPage.java | 1 +
.../edu/harvard/iq/dataverse/Dataverse.java | 6 +
.../DataverseFieldTypeInputLevel.java | 12 +-
...taverseFieldTypeInputLevelServiceBean.java | 9 ++
.../harvard/iq/dataverse/DataversePage.java | 132 +++++++++++-------
.../iq/dataverse/DataverseServiceBean.java | 4 +
.../dataverse/MetadataBlockServiceBean.java | 47 +++++--
.../harvard/iq/dataverse/TemplatePage.java | 9 +-
.../harvard/iq/dataverse/api/Dataverses.java | 3 +-
.../iq/dataverse/util/json/JsonPrinter.java | 6 +-
.../iq/dataverse/api/DataversesIT.java | 33 +++++
.../edu/harvard/iq/dataverse/api/UtilIT.java | 19 ++-
15 files changed, 237 insertions(+), 74 deletions(-)
create mode 100644 doc/release-notes/10476-display-on-create-field-option.md
diff --git a/doc/release-notes/10476-display-on-create-field-option.md b/doc/release-notes/10476-display-on-create-field-option.md
new file mode 100644
index 00000000000..e4a38e181a2
--- /dev/null
+++ b/doc/release-notes/10476-display-on-create-field-option.md
@@ -0,0 +1,6 @@
+New feature: Collection administrators can now configure which metadata fields appear during dataset creation through the `displayOnCreate` property, even when fields are not required. This provides greater control over metadata visibility and can help improve metadata completeness.
+
+- The feature is currently available through the API endpoint `/api/dataverses/{alias}/inputLevels`
+- UI implementation will be available in a future release [#11221](https://github.com/IQSS/dataverse/issues/11221)
+
+For more information, see the [API Guide](https://guides.dataverse.org/en/latest/api/native-api.html#update-collection-input-levels) and issues [#10476](https://github.com/IQSS/dataverse/issues/10476) and [#11224](https://github.com/IQSS/dataverse/pull/11224).
\ No newline at end of file
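The release note above points at the `/api/dataverses/{alias}/inputLevels` endpoint, which (per the API Guide change in this same patch) expects a JSON array of input levels. The sketch below is illustrative only — the `InputLevelsPayload` class is hypothetical and not part of Dataverse — and shows how such a payload, including the new `displayOnCreate` flag, could be assembled before sending it to the endpoint:

```java
// Illustrative only: a hypothetical helper (not part of Dataverse) that
// assembles the JSON array documented for the inputLevels endpoint.
import java.util.List;
import java.util.stream.Collectors;

public class InputLevelsPayload {

    // One entry of the documented JSON format, including the new
    // displayOnCreate flag introduced by this patch series.
    record InputLevel(String datasetFieldTypeName, boolean required,
                      boolean include, boolean displayOnCreate) {
        String toJson() {
            return String.format(
                "{\"datasetFieldTypeName\":\"%s\",\"required\":%b,\"include\":%b,\"displayOnCreate\":%b}",
                datasetFieldTypeName, required, include, displayOnCreate);
        }
    }

    static String toJsonArray(List<InputLevel> levels) {
        return levels.stream()
                     .map(InputLevel::toJson)
                     .collect(Collectors.joining(",", "[", "]"));
    }

    public static void main(String[] args) {
        // Note: "required": true with "include": false would be rejected by
        // the endpoint, so both examples keep include=true.
        String body = toJsonArray(List.of(
                new InputLevel("title", true, true, true),
                new InputLevel("subtitle", false, true, false)));
        System.out.println(body);
        // The body would then be sent to the endpoint named in the release
        // note, e.g. (shell sketch, assuming a valid API token):
        //   curl -X PUT -H "X-Dataverse-key:$API_TOKEN" \
        //        -d "$BODY" "$SERVER_URL/api/dataverses/$ALIAS/inputLevels"
    }
}
```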
diff --git a/doc/sphinx-guides/source/api/native-api.rst b/doc/sphinx-guides/source/api/native-api.rst
index 0be446ec2ce..350cd2020ca 100644
--- a/doc/sphinx-guides/source/api/native-api.rst
+++ b/doc/sphinx-guides/source/api/native-api.rst
@@ -1116,12 +1116,14 @@ This endpoint expects a JSON with the following format::
{
"datasetFieldTypeName": "datasetFieldTypeName1",
"required": true,
- "include": true
+ "include": true,
+ "displayOnCreate": false
},
{
"datasetFieldTypeName": "datasetFieldTypeName2",
"required": true,
- "include": true
+ "include": true,
+ "displayOnCreate": true
}
]
@@ -1130,6 +1132,7 @@ Parameters:
- ``datasetFieldTypeName``: Name of the metadata field
- ``required``: Whether the field is required (boolean)
- ``include``: Whether the field is included (boolean)
+- ``displayOnCreate`` (optional): Whether the field is displayed during dataset creation, even when not required (boolean)
.. code-block:: bash
diff --git a/src/main/java/edu/harvard/iq/dataverse/DatasetFieldServiceBean.java b/src/main/java/edu/harvard/iq/dataverse/DatasetFieldServiceBean.java
index 129f590ca75..32ce570ddaa 100644
--- a/src/main/java/edu/harvard/iq/dataverse/DatasetFieldServiceBean.java
+++ b/src/main/java/edu/harvard/iq/dataverse/DatasetFieldServiceBean.java
@@ -941,6 +941,12 @@ private Predicate buildFieldPresentInDataversePredicate(Dataverse dataverse, boo
criteriaBuilder.isTrue(datasetFieldTypeInputLevelJoin.get("required"))
);
+ // Predicate for displayOnCreate in input level
+ Predicate displayOnCreateInputLevelPredicate = criteriaBuilder.and(
+ criteriaBuilder.equal(datasetFieldTypeRoot, datasetFieldTypeInputLevelJoin.get("datasetFieldType")),
+ criteriaBuilder.isTrue(datasetFieldTypeInputLevelJoin.get("displayOnCreate"))
+ );
+
// Create a subquery to check for the absence of a specific DataverseFieldTypeInputLevel.
Subquery<Long> subquery = criteriaQuery.subquery(Long.class);
Root<DataverseFieldTypeInputLevel> subqueryRoot = subquery.from(DataverseFieldTypeInputLevel.class);
@@ -963,10 +969,19 @@ private Predicate buildFieldPresentInDataversePredicate(Dataverse dataverse, boo
// Otherwise, use an always-true predicate (conjunction).
Predicate displayedOnCreatePredicate = onlyDisplayedOnCreate
? criteriaBuilder.or(
- criteriaBuilder.or(
+ // 1. Field marked as displayOnCreate in input level
+ displayOnCreateInputLevelPredicate,
+
+ // 2. Field without input level that is marked as displayOnCreate or required
+ criteriaBuilder.and(
+ hasNoInputLevelPredicate,
+ criteriaBuilder.or(
criteriaBuilder.isTrue(datasetFieldTypeRoot.get("displayOnCreate")),
fieldRequiredInTheInstallation
+ )
),
+
+ // 3. Field required by input level
requiredAsInputLevelPredicate
)
: criteriaBuilder.conjunction();
diff --git a/src/main/java/edu/harvard/iq/dataverse/DatasetPage.java b/src/main/java/edu/harvard/iq/dataverse/DatasetPage.java
index 57afdec7752..411b55bf64b 100644
--- a/src/main/java/edu/harvard/iq/dataverse/DatasetPage.java
+++ b/src/main/java/edu/harvard/iq/dataverse/DatasetPage.java
@@ -1855,6 +1855,7 @@ private void updateDatasetFieldInputLevels() {
if (dsf != null){
// Yes, call "setInclude"
dsf.setInclude(oneDSFieldTypeInputLevel.isInclude());
+ dsf.getDatasetFieldType().setDisplayOnCreate(oneDSFieldTypeInputLevel.isDisplayOnCreate());
// remove from hash
mapDatasetFields.remove(oneDSFieldTypeInputLevel.getDatasetFieldType().getId());
}
diff --git a/src/main/java/edu/harvard/iq/dataverse/Dataverse.java b/src/main/java/edu/harvard/iq/dataverse/Dataverse.java
index 9bb8992e789..d2cb51d0072 100644
--- a/src/main/java/edu/harvard/iq/dataverse/Dataverse.java
+++ b/src/main/java/edu/harvard/iq/dataverse/Dataverse.java
@@ -438,6 +438,12 @@ public boolean isDatasetFieldTypeInInputLevels(Long datasetFieldTypeId) {
.anyMatch(inputLevel -> inputLevel.getDatasetFieldType().getId().equals(datasetFieldTypeId));
}
+ public boolean isDatasetFieldTypeDisplayOnCreateAsInputLevel(Long datasetFieldTypeId) {
+ return dataverseFieldTypeInputLevels.stream()
+ .anyMatch(inputLevel -> inputLevel.getDatasetFieldType().getId().equals(datasetFieldTypeId)
+ && inputLevel.isDisplayOnCreate());
+ }
+
public Template getDefaultTemplate() {
return defaultTemplate;
}
diff --git a/src/main/java/edu/harvard/iq/dataverse/DataverseFieldTypeInputLevel.java b/src/main/java/edu/harvard/iq/dataverse/DataverseFieldTypeInputLevel.java
index a3425987bf8..27cb1e00cad 100644
--- a/src/main/java/edu/harvard/iq/dataverse/DataverseFieldTypeInputLevel.java
+++ b/src/main/java/edu/harvard/iq/dataverse/DataverseFieldTypeInputLevel.java
@@ -58,14 +58,16 @@ public class DataverseFieldTypeInputLevel implements Serializable {
private DatasetFieldType datasetFieldType;
private boolean include;
private boolean required;
+ private boolean displayOnCreate;
public DataverseFieldTypeInputLevel () {}
- public DataverseFieldTypeInputLevel (DatasetFieldType fieldType, Dataverse dataverse, boolean required, boolean include) {
+ public DataverseFieldTypeInputLevel (DatasetFieldType fieldType, Dataverse dataverse, boolean required, boolean include, boolean displayOnCreate) {
this.datasetFieldType = fieldType;
this.dataverse = dataverse;
this.required = required;
this.include = include;
+ this.displayOnCreate = displayOnCreate;
}
public Long getId() {
@@ -115,6 +117,14 @@ public void setRequired(boolean required) {
this.required = required;
}
+ public boolean isDisplayOnCreate() {
+ return displayOnCreate;
+ }
+
+ public void setDisplayOnCreate(boolean displayOnCreate) {
+ this.displayOnCreate = displayOnCreate;
+ }
+
@Override
public boolean equals(Object object) {
// TODO: Warning - this method won't work in the case the id fields are not set
diff --git a/src/main/java/edu/harvard/iq/dataverse/DataverseFieldTypeInputLevelServiceBean.java b/src/main/java/edu/harvard/iq/dataverse/DataverseFieldTypeInputLevelServiceBean.java
index 1bd290ecc4d..64d51a19ba1 100644
--- a/src/main/java/edu/harvard/iq/dataverse/DataverseFieldTypeInputLevelServiceBean.java
+++ b/src/main/java/edu/harvard/iq/dataverse/DataverseFieldTypeInputLevelServiceBean.java
@@ -117,4 +117,13 @@ public void create(DataverseFieldTypeInputLevel dataverseFieldTypeInputLevel) {
em.persist(dataverseFieldTypeInputLevel);
}
+ public DataverseFieldTypeInputLevel save(DataverseFieldTypeInputLevel inputLevel) {
+ if (inputLevel.getId() == null) {
+ em.persist(inputLevel);
+ return inputLevel;
+ } else {
+ return em.merge(inputLevel);
+ }
+ }
+
}
diff --git a/src/main/java/edu/harvard/iq/dataverse/DataversePage.java b/src/main/java/edu/harvard/iq/dataverse/DataversePage.java
index 351d304bad3..1f8c3defa7e 100644
--- a/src/main/java/edu/harvard/iq/dataverse/DataversePage.java
+++ b/src/main/java/edu/harvard/iq/dataverse/DataversePage.java
@@ -627,44 +627,17 @@ public String save() {
if (dataverse.isMetadataBlockRoot() && (mdb.isSelected() || mdb.isRequired())) {
selectedBlocks.add(mdb);
for (DatasetFieldType dsft : mdb.getDatasetFieldTypes()) {
- // currently we don't allow input levels for setting an optional field as conditionally required
- // so we skip looking at parents (which get set automatically with their children)
- if (!dsft.isHasChildren() && dsft.isRequiredDV()) {
- boolean addRequiredInputLevels = false;
- boolean parentAlreadyAdded = false;
+ if (!dsft.isChild()) {
+ // Save input level for parent field
+ saveInputLevels(listDFTIL, dsft, dataverse);
- if (!dsft.isHasParent() && dsft.isInclude()) {
- addRequiredInputLevels = !dsft.isRequired();
- }
- if (dsft.isHasParent() && dsft.getParentDatasetFieldType().isInclude()) {
- addRequiredInputLevels = !dsft.isRequired() || !dsft.getParentDatasetFieldType().isRequired();
- }
-
- if (addRequiredInputLevels) {
- listDFTIL.add(new DataverseFieldTypeInputLevel(dsft, dataverse,true, true));
-
- //also add the parent as required (if it hasn't been added already)
- // todo: review needed .equals() methods, then change this to use a Set, in order to simplify code
- if (dsft.isHasParent()) {
- DataverseFieldTypeInputLevel parentToAdd = new DataverseFieldTypeInputLevel(dsft.getParentDatasetFieldType(), dataverse, true, true);
- for (DataverseFieldTypeInputLevel dataverseFieldTypeInputLevel : listDFTIL) {
- if (dataverseFieldTypeInputLevel.getDatasetFieldType().getId() == parentToAdd.getDatasetFieldType().getId()) {
- parentAlreadyAdded = true;
- break;
- }
- }
- if (!parentAlreadyAdded) {
- // Only add the parent once. There's a UNIQUE (dataverse_id, datasetfieldtype_id)
- // constraint on the dataversefieldtypeinputlevel table we need to avoid.
- listDFTIL.add(parentToAdd);
- }
- }
+ // Handle child fields
+ if (dsft.isHasChildren()) {
+ for (DatasetFieldType child : dsft.getChildDatasetFieldTypes()) {
+ saveInputLevels(listDFTIL, child, dataverse);
+ }
}
}
- if ((!dsft.isHasParent() && !dsft.isInclude())
- || (dsft.isHasParent() && !dsft.getParentDatasetFieldType().isInclude())) {
- listDFTIL.add(new DataverseFieldTypeInputLevel(dsft, dataverse,false, false));
- }
}
}
}
@@ -1030,27 +1003,11 @@ private void refreshAllMetadataBlocks() {
for (DatasetFieldType dsft : mdb.getDatasetFieldTypes()) {
if (!dsft.isChild()) {
- DataverseFieldTypeInputLevel dsfIl = dataverseFieldTypeInputLevelService.findByDataverseIdDatasetFieldTypeId(dataverseIdForInputLevel, dsft.getId());
- if (dsfIl != null) {
- dsft.setRequiredDV(dsfIl.isRequired());
- dsft.setInclude(dsfIl.isInclude());
- } else {
- dsft.setRequiredDV(dsft.isRequired());
- dsft.setInclude(true);
- }
+ loadInputLevels(dsft, dataverseIdForInputLevel);
dsft.setOptionSelectItems(resetSelectItems(dsft));
if (dsft.isHasChildren()) {
for (DatasetFieldType child : dsft.getChildDatasetFieldTypes()) {
- DataverseFieldTypeInputLevel dsfIlChild = dataverseFieldTypeInputLevelService.findByDataverseIdDatasetFieldTypeId(dataverseIdForInputLevel, child.getId());
- if (dsfIlChild != null) {
- child.setRequiredDV(dsfIlChild.isRequired());
- child.setInclude(dsfIlChild.isInclude());
- } else {
- // in the case of conditionally required (child = true, parent = false)
- // we set this to false; i.e this is the default "don't override" value
- child.setRequiredDV(child.isRequired() && dsft.isRequired());
- child.setInclude(true);
- }
+ loadInputLevels(child, dataverseIdForInputLevel);
child.setOptionSelectItems(resetSelectItems(child));
}
}
@@ -1061,6 +1018,22 @@ private void refreshAllMetadataBlocks() {
setAllMetadataBlocks(retList);
}
+ private void loadInputLevels(DatasetFieldType dsft, Long dataverseIdForInputLevel) {
+ DataverseFieldTypeInputLevel dsfIl = dataverseFieldTypeInputLevelService
+ .findByDataverseIdDatasetFieldTypeId(dataverseIdForInputLevel, dsft.getId());
+
+ if (dsfIl != null) {
+ dsft.setRequiredDV(dsfIl.isRequired());
+ dsft.setInclude(dsfIl.isInclude());
+ dsft.setDisplayOnCreate(dsfIl.isDisplayOnCreate());
+ } else {
+ // If there is no input level, use the default values
+ dsft.setRequiredDV(dsft.isRequired());
+ dsft.setInclude(true);
+ dsft.setDisplayOnCreate(false);
+ }
+ }
+
public void validateAlias(FacesContext context, UIComponent toValidate, Object value) {
if (!StringUtils.isEmpty((String) value)) {
String alias = (String) value;
@@ -1337,4 +1310,57 @@ public Set> getPidProviderOptions() {
}
return options;
}
+
+ public void updateDisplayOnCreate(Long mdbId, Long dsftId, boolean currentValue) {
+ for (MetadataBlock mdb : allMetadataBlocks) {
+ if (mdb.getId().equals(mdbId)) {
+ for (DatasetFieldType dsft : mdb.getDatasetFieldTypes()) {
+ if (dsft.getId().equals(dsftId)) {
+ // Update value in memory
+ dsft.setDisplayOnCreate(!currentValue);
+
+ // Update or create input level
+ DataverseFieldTypeInputLevel existingLevel = dataverseFieldTypeInputLevelService
+ .findByDataverseIdDatasetFieldTypeId(dataverse.getId(), dsftId);
+
+ if (existingLevel != null) {
+ existingLevel.setDisplayOnCreate(!currentValue);
+ dataverseFieldTypeInputLevelService.save(existingLevel);
+ } else {
+ DataverseFieldTypeInputLevel newLevel = new DataverseFieldTypeInputLevel(
+ dsft,
+ dataverse,
+ dsft.isRequiredDV(),
+ true, // default include
+ !currentValue // new value of displayOnCreate
+ );
+ dataverseFieldTypeInputLevelService.save(newLevel);
+ }
+ }
+ }
+ }
+ }
+ }
+
+ private void saveInputLevels(List<DataverseFieldTypeInputLevel> listDFTIL, DatasetFieldType dsft, Dataverse dataverse) {
+ // If the field already has an input level, update it
+ DataverseFieldTypeInputLevel existingLevel = dataverseFieldTypeInputLevelService
+ .findByDataverseIdDatasetFieldTypeId(dataverse.getId(), dsft.getId());
+
+ if (existingLevel != null) {
+ existingLevel.setDisplayOnCreate(dsft.isDisplayOnCreate());
+ existingLevel.setInclude(dsft.isInclude());
+ existingLevel.setRequired(dsft.isRequiredDV());
+ listDFTIL.add(existingLevel);
+ } else if (dsft.isInclude() || dsft.isDisplayOnCreate() || dsft.isRequiredDV()) {
+ // Only create new input level if there is any specific configuration
+ listDFTIL.add(new DataverseFieldTypeInputLevel(
+ dsft,
+ dataverse,
+ dsft.isRequiredDV(),
+ dsft.isInclude(),
+ dsft.isDisplayOnCreate()
+ ));
+ }
+ }
}
diff --git a/src/main/java/edu/harvard/iq/dataverse/DataverseServiceBean.java b/src/main/java/edu/harvard/iq/dataverse/DataverseServiceBean.java
index f81266ded90..95c673f01cc 100644
--- a/src/main/java/edu/harvard/iq/dataverse/DataverseServiceBean.java
+++ b/src/main/java/edu/harvard/iq/dataverse/DataverseServiceBean.java
@@ -959,9 +959,11 @@ public String getCollectionDatasetSchema(String dataverseAlias, Map
List<String> childrenRequired = new ArrayList<>();
List<String> childrenAllowed = new ArrayList<>();
@@ -971,11 +973,13 @@ public String getCollectionDatasetSchema(String dataverseAlias, Map
diff --git a/src/main/java/edu/harvard/iq/dataverse/MetadataBlockServiceBean.java b/src/main/java/edu/harvard/iq/dataverse/MetadataBlockServiceBean.java
@@ public List<MetadataBlock> listMetadataBlocksDisplayedOnCreate(Dataverse ownerDa
CriteriaQuery<MetadataBlock> criteriaQuery = criteriaBuilder.createQuery(MetadataBlock.class);
Root<MetadataBlock> metadataBlockRoot = criteriaQuery.from(MetadataBlock.class);
Join<MetadataBlock, DatasetFieldType> datasetFieldTypeJoin = metadataBlockRoot.join("datasetFieldTypes");
- Predicate displayOnCreatePredicate = criteriaBuilder.isTrue(datasetFieldTypeJoin.get("displayOnCreate"));
-
+
if (ownerDataverse != null) {
Root<Dataverse> dataverseRoot = criteriaQuery.from(Dataverse.class);
- Join<Dataverse, DataverseFieldTypeInputLevel> datasetFieldTypeInputLevelJoin = dataverseRoot.join("dataverseFieldTypeInputLevels", JoinType.LEFT);
+ Join<Dataverse, DataverseFieldTypeInputLevel> datasetFieldTypeInputLevelJoin =
+ dataverseRoot.join("dataverseFieldTypeInputLevels", JoinType.LEFT);
+
+ // Subquery to check if the input level exists
+ Subquery<Long> inputLevelSubquery = criteriaQuery.subquery(Long.class);
+ Root<DataverseFieldTypeInputLevel> subqueryRoot = inputLevelSubquery.from(DataverseFieldTypeInputLevel.class);
+ inputLevelSubquery.select(criteriaBuilder.literal(1L))
+ .where(
+ criteriaBuilder.equal(subqueryRoot.get("dataverse"), dataverseRoot),
+ criteriaBuilder.equal(subqueryRoot.get("datasetFieldType"), datasetFieldTypeJoin)
+ );
+ // Predicate for displayOnCreate in the input level
+ Predicate displayOnCreateInputLevelPredicate = criteriaBuilder.and(
+ datasetFieldTypeInputLevelJoin.get("datasetFieldType").in(metadataBlockRoot.get("datasetFieldTypes")),
+ criteriaBuilder.isTrue(datasetFieldTypeInputLevelJoin.get("displayOnCreate")));
+
+ // Predicate for required fields
Predicate requiredPredicate = criteriaBuilder.and(
- datasetFieldTypeInputLevelJoin.get("datasetFieldType").in(metadataBlockRoot.get("datasetFieldTypes")),
- criteriaBuilder.isTrue(datasetFieldTypeInputLevelJoin.get("required")));
+ datasetFieldTypeInputLevelJoin.get("datasetFieldType").in(metadataBlockRoot.get("datasetFieldTypes")),
+ criteriaBuilder.isTrue(datasetFieldTypeInputLevelJoin.get("required")));
+
+ // Predicate for default displayOnCreate (when there is no input level)
+ Predicate defaultDisplayOnCreatePredicate = criteriaBuilder.and(
+ criteriaBuilder.not(criteriaBuilder.exists(inputLevelSubquery)),
+ criteriaBuilder.isTrue(datasetFieldTypeJoin.get("displayOnCreate")));
- Predicate unionPredicate = criteriaBuilder.or(displayOnCreatePredicate, requiredPredicate);
+ Predicate unionPredicate = criteriaBuilder.or(
+ displayOnCreateInputLevelPredicate,
+ requiredPredicate,
+ defaultDisplayOnCreatePredicate
+ );
criteriaQuery.where(criteriaBuilder.and(
- criteriaBuilder.equal(dataverseRoot.get("id"), ownerDataverse.getId()),
- metadataBlockRoot.in(dataverseRoot.get("metadataBlocks")),
- unionPredicate
+ criteriaBuilder.equal(dataverseRoot.get("id"), ownerDataverse.getId()),
+ metadataBlockRoot.in(dataverseRoot.get("metadataBlocks")),
+ unionPredicate
));
} else {
- criteriaQuery.where(displayOnCreatePredicate);
+ criteriaQuery.where(criteriaBuilder.isTrue(datasetFieldTypeJoin.get("displayOnCreate")));
}
criteriaQuery.select(metadataBlockRoot).distinct(true);
- TypedQuery<MetadataBlock> typedQuery = em.createQuery(criteriaQuery);
- return typedQuery.getResultList();
+ return em.createQuery(criteriaQuery).getResultList();
}
}
diff --git a/src/main/java/edu/harvard/iq/dataverse/TemplatePage.java b/src/main/java/edu/harvard/iq/dataverse/TemplatePage.java
index 44070dcbb41..94ab9e70330 100644
--- a/src/main/java/edu/harvard/iq/dataverse/TemplatePage.java
+++ b/src/main/java/edu/harvard/iq/dataverse/TemplatePage.java
@@ -166,11 +166,16 @@ private void updateDatasetFieldInputLevels(){
}
for (DatasetField dsf: template.getFlatDatasetFields()){
- DataverseFieldTypeInputLevel dsfIl = dataverseFieldTypeInputLevelService.findByDataverseIdDatasetFieldTypeId(dvIdForInputLevel, dsf.getDatasetFieldType().getId());
- if (dsfIl != null){
+ DataverseFieldTypeInputLevel dsfIl = dataverseFieldTypeInputLevelService.findByDataverseIdDatasetFieldTypeId(
+ dvIdForInputLevel,
+ dsf.getDatasetFieldType().getId()
+ );
+ if (dsfIl != null) {
dsf.setInclude(dsfIl.isInclude());
+ dsf.getDatasetFieldType().setDisplayOnCreate(dsfIl.isDisplayOnCreate());
} else {
dsf.setInclude(true);
+ dsf.getDatasetFieldType().setDisplayOnCreate(false);
}
}
}
diff --git a/src/main/java/edu/harvard/iq/dataverse/api/Dataverses.java b/src/main/java/edu/harvard/iq/dataverse/api/Dataverses.java
index 18c8cfac61a..d677ced2ffe 100644
--- a/src/main/java/edu/harvard/iq/dataverse/api/Dataverses.java
+++ b/src/main/java/edu/harvard/iq/dataverse/api/Dataverses.java
@@ -803,13 +803,14 @@ private List<DataverseFieldTypeInputLevel> parseInputLevels(JsonArray inputLevel
boolean required = inputLevel.getBoolean("required");
boolean include = inputLevel.getBoolean("include");
+ boolean displayOnCreate = inputLevel.getBoolean("displayOnCreate", false);
if (required && !include) {
String errorMessage = MessageFormat.format(BundleUtil.getStringFromBundle("dataverse.inputlevels.error.cannotberequiredifnotincluded"), datasetFieldTypeName);
throw new WrappedResponse(badRequest(errorMessage));
}
- newInputLevels.add(new DataverseFieldTypeInputLevel(datasetFieldType, dataverse, required, include));
+ newInputLevels.add(new DataverseFieldTypeInputLevel(datasetFieldType, dataverse, required, include, displayOnCreate));
}
return newInputLevels;
diff --git a/src/main/java/edu/harvard/iq/dataverse/util/json/JsonPrinter.java b/src/main/java/edu/harvard/iq/dataverse/util/json/JsonPrinter.java
index 2caefee1589..f510c03de82 100644
--- a/src/main/java/edu/harvard/iq/dataverse/util/json/JsonPrinter.java
+++ b/src/main/java/edu/harvard/iq/dataverse/util/json/JsonPrinter.java
@@ -686,8 +686,10 @@ public static JsonObjectBuilder json(MetadataBlock metadataBlock, boolean printO
DatasetFieldType parentDatasetFieldType = datasetFieldType.getParentDatasetFieldType();
boolean isRequired = parentDatasetFieldType == null ? datasetFieldType.isRequired() : parentDatasetFieldType.isRequired();
+ boolean displayOnCreateInOwnerDataverse = ownerDataverse != null && ownerDataverse.isDatasetFieldTypeDisplayOnCreateAsInputLevel(datasetFieldTypeId);
+
boolean displayCondition = printOnlyDisplayedOnCreateDatasetFieldTypes
- ? (datasetFieldType.isDisplayOnCreate() || isRequired || requiredAsInputLevelInOwnerDataverse)
+ ? (displayOnCreateInOwnerDataverse || isRequired || requiredAsInputLevelInOwnerDataverse)
: ownerDataverse == null || includedAsInputLevelInOwnerDataverse || isNotInputLevelInOwnerDataverse;
if (displayCondition) {
@@ -1458,6 +1460,7 @@ public static JsonArrayBuilder jsonDataverseFieldTypeInputLevels(List<DataverseFieldTypeInputLevel> inputLevels)
Date: Thu, 6 Mar 2025 14:49:35 +0000
Subject: [PATCH 03/21] Add displayOnCreate option for dataset field types
- Added @Column annotation for displayOnCreate field in DatasetFieldType
- Updated JsonPrinter to include displayOnCreate in display condition
---
src/main/java/edu/harvard/iq/dataverse/DatasetFieldType.java | 1 +
.../java/edu/harvard/iq/dataverse/util/json/JsonPrinter.java | 2 +-
2 files changed, 2 insertions(+), 1 deletion(-)
diff --git a/src/main/java/edu/harvard/iq/dataverse/DatasetFieldType.java b/src/main/java/edu/harvard/iq/dataverse/DatasetFieldType.java
index 16adf8e36bc..be276d36581 100644
--- a/src/main/java/edu/harvard/iq/dataverse/DatasetFieldType.java
+++ b/src/main/java/edu/harvard/iq/dataverse/DatasetFieldType.java
@@ -273,6 +273,7 @@ public void setValidationFormat(String validationFormat) {
* Determines whether this field type is displayed in the form when creating
* the Dataset (or only later when editing after the initial creation).
*/
+ @Column(name = "displayoncreate", nullable = false)
private boolean displayOnCreate;
public boolean isDisplayOnCreate() {
diff --git a/src/main/java/edu/harvard/iq/dataverse/util/json/JsonPrinter.java b/src/main/java/edu/harvard/iq/dataverse/util/json/JsonPrinter.java
index f510c03de82..413c80c0323 100644
--- a/src/main/java/edu/harvard/iq/dataverse/util/json/JsonPrinter.java
+++ b/src/main/java/edu/harvard/iq/dataverse/util/json/JsonPrinter.java
@@ -689,7 +689,7 @@ public static JsonObjectBuilder json(MetadataBlock metadataBlock, boolean printO
boolean displayOnCreateInOwnerDataverse = ownerDataverse != null && ownerDataverse.isDatasetFieldTypeDisplayOnCreateAsInputLevel(datasetFieldTypeId);
boolean displayCondition = printOnlyDisplayedOnCreateDatasetFieldTypes
- ? (displayOnCreateInOwnerDataverse || isRequired || requiredAsInputLevelInOwnerDataverse)
+ ? (displayOnCreateInOwnerDataverse || isRequired || requiredAsInputLevelInOwnerDataverse || datasetFieldType.isDisplayOnCreate())
: ownerDataverse == null || includedAsInputLevelInOwnerDataverse || isNotInputLevelInOwnerDataverse;
if (displayCondition) {
From 3c4e5ccb3e268f5f49c992f91f8b9a9c92da3873 Mon Sep 17 00:00:00 2001
From: Alexis Guanche <74431162+Saixel@users.noreply.github.com>
Date: Thu, 6 Mar 2025 17:12:50 +0000
Subject: [PATCH 04/21] Update DataversesIT test to modify metadata block
listing parameter
---
src/test/java/edu/harvard/iq/dataverse/api/DataversesIT.java | 2 +-
1 file changed, 1 insertion(+), 1 deletion(-)
diff --git a/src/test/java/edu/harvard/iq/dataverse/api/DataversesIT.java b/src/test/java/edu/harvard/iq/dataverse/api/DataversesIT.java
index 6a5307eafe7..9ec266138cb 100644
--- a/src/test/java/edu/harvard/iq/dataverse/api/DataversesIT.java
+++ b/src/test/java/edu/harvard/iq/dataverse/api/DataversesIT.java
@@ -1963,7 +1963,7 @@ public void testUpdateInputLevelDisplayOnCreate() {
.statusCode(OK.getStatusCode());
// Verify initial state
- Response listMetadataBlocks = UtilIT.listMetadataBlocks(dataverseAlias, true, true, apiToken);
+ Response listMetadataBlocks = UtilIT.listMetadataBlocks(dataverseAlias, false, true, apiToken);
listMetadataBlocks.then().assertThat()
.statusCode(OK.getStatusCode())
.body("data.size()", equalTo(1))
From 190f604b8277448b0356b18598659ee80ab50236 Mon Sep 17 00:00:00 2001
From: Alexis Guanche <74431162+Saixel@users.noreply.github.com>
Date: Thu, 6 Mar 2025 17:43:01 +0000
Subject: [PATCH 05/21] Update native API documentation for displayOnCreate
field option
- Clarify that required fields are always displayed regardless of displayOnCreate setting
---
doc/sphinx-guides/source/api/native-api.rst | 5 ++++-
1 file changed, 4 insertions(+), 1 deletion(-)
diff --git a/doc/sphinx-guides/source/api/native-api.rst b/doc/sphinx-guides/source/api/native-api.rst
index 350cd2020ca..35a63e73537 100644
--- a/doc/sphinx-guides/source/api/native-api.rst
+++ b/doc/sphinx-guides/source/api/native-api.rst
@@ -1117,7 +1117,7 @@ This endpoint expects a JSON with the following format::
"datasetFieldTypeName": "datasetFieldTypeName1",
"required": true,
"include": true,
- "displayOnCreate": false
+ "displayOnCreate": false // Note: This setting is ignored for required fields
},
{
"datasetFieldTypeName": "datasetFieldTypeName2",
@@ -1127,6 +1127,9 @@ This endpoint expects a JSON with the following format::
}
]
+.. note::
+ Required fields will always be displayed regardless of their displayOnCreate setting, as this is necessary for dataset creation.
+
Parameters:
- ``datasetFieldTypeName``: Name of the metadata field
From 15fc52ce8ad1776f78ee0711b9a3f69c3e70a775 Mon Sep 17 00:00:00 2001
From: Alexis Guanche <74431162+Saixel@users.noreply.github.com>
Date: Thu, 6 Mar 2025 19:23:37 +0000
Subject: [PATCH 06/21] Implement null support for displayOnCreate field option
- Change displayOnCreate to nullable Boolean in DatasetFieldType
- Update API and service methods to handle null displayOnCreate values
- Modify native API documentation to explain null displayOnCreate behavior
- Add null checks in MetadataBlockServiceBean queries
---
doc/sphinx-guides/source/api/native-api.rst | 3 ++-
.../java/edu/harvard/iq/dataverse/DatasetFieldType.java | 8 ++++----
.../harvard/iq/dataverse/MetadataBlockServiceBean.java | 2 ++
.../harvard/iq/dataverse/api/DatasetFieldServiceApi.java | 2 +-
4 files changed, 9 insertions(+), 6 deletions(-)
diff --git a/doc/sphinx-guides/source/api/native-api.rst b/doc/sphinx-guides/source/api/native-api.rst
index 35a63e73537..7a03b7cf487 100644
--- a/doc/sphinx-guides/source/api/native-api.rst
+++ b/doc/sphinx-guides/source/api/native-api.rst
@@ -1117,7 +1117,7 @@ This endpoint expects a JSON with the following format::
"datasetFieldTypeName": "datasetFieldTypeName1",
"required": true,
"include": true,
- "displayOnCreate": false // Note: This setting is ignored for required fields
+ "displayOnCreate": null
},
{
"datasetFieldTypeName": "datasetFieldTypeName2",
@@ -1129,6 +1129,7 @@ This endpoint expects a JSON with the following format::
.. note::
Required fields will always be displayed regardless of their displayOnCreate setting, as this is necessary for dataset creation.
+ When displayOnCreate is null, the field's default display behavior is used.
Parameters:
diff --git a/src/main/java/edu/harvard/iq/dataverse/DatasetFieldType.java b/src/main/java/edu/harvard/iq/dataverse/DatasetFieldType.java
index be276d36581..ac661b2ada0 100644
--- a/src/main/java/edu/harvard/iq/dataverse/DatasetFieldType.java
+++ b/src/main/java/edu/harvard/iq/dataverse/DatasetFieldType.java
@@ -273,14 +273,14 @@ public void setValidationFormat(String validationFormat) {
* Determines whether this field type is displayed in the form when creating
* the Dataset (or only later when editing after the initial creation).
*/
- @Column(name = "displayoncreate", nullable = false)
- private boolean displayOnCreate;
+ @Column(name = "displayoncreate", nullable = true)
+ private Boolean displayOnCreate;
- public boolean isDisplayOnCreate() {
+ public Boolean isDisplayOnCreate() {
return displayOnCreate;
}
- public void setDisplayOnCreate(boolean displayOnCreate) {
+ public void setDisplayOnCreate(Boolean displayOnCreate) {
this.displayOnCreate = displayOnCreate;
}
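With `displayOnCreate` now a nullable `Boolean` on `DatasetFieldType`, the display condition assembled across these patches can be summarized null-safely. The sketch below is illustrative only — `DisplayOnCreateResolver` is a hypothetical class, not part of Dataverse, and it collapses the two required-related terms of the real condition into a single `required` parameter for brevity:

```java
// Illustrative only: a null-safe summary (hypothetical class, not part of
// Dataverse) of the display condition in the patched JsonPrinter: a field
// appears on the create form if its input level says so, if it is required,
// or if its (now nullable) field-type default is explicitly true.
public class DisplayOnCreateResolver {

    public static boolean shouldDisplayOnCreate(boolean displayOnCreateInInputLevel,
                                                boolean required,
                                                Boolean fieldTypeDefault) {
        // Boolean.TRUE.equals(x) is the idiomatic null-safe form of
        // (x != null && x): a null default counts as "not displayed".
        return displayOnCreateInInputLevel
                || required
                || Boolean.TRUE.equals(fieldTypeDefault);
    }

    public static void main(String[] args) {
        // A null field-type default behaves like "not displayed on create":
        System.out.println(shouldDisplayOnCreate(false, false, null)); // false
        // Required fields are always displayed:
        System.out.println(shouldDisplayOnCreate(false, true, null));  // true
    }
}
```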
diff --git a/src/main/java/edu/harvard/iq/dataverse/MetadataBlockServiceBean.java b/src/main/java/edu/harvard/iq/dataverse/MetadataBlockServiceBean.java
index 920791796a7..aac5ba9ff10 100644
--- a/src/main/java/edu/harvard/iq/dataverse/MetadataBlockServiceBean.java
+++ b/src/main/java/edu/harvard/iq/dataverse/MetadataBlockServiceBean.java
@@ -72,6 +72,7 @@ public List<MetadataBlock> listMetadataBlocksDisplayedOnCreate(Dataverse ownerDa
// Predicate for displayOnCreate in the input level
Predicate displayOnCreateInputLevelPredicate = criteriaBuilder.and(
datasetFieldTypeInputLevelJoin.get("datasetFieldType").in(metadataBlockRoot.get("datasetFieldTypes")),
+ criteriaBuilder.isNotNull(datasetFieldTypeInputLevelJoin.get("displayOnCreate")),
criteriaBuilder.isTrue(datasetFieldTypeInputLevelJoin.get("displayOnCreate")));
// Predicate for required fields
@@ -82,6 +83,7 @@ public List<MetadataBlock> listMetadataBlocksDisplayedOnCreate(Dataverse ownerDa
// Predicate for default displayOnCreate (when there is no input level)
Predicate defaultDisplayOnCreatePredicate = criteriaBuilder.and(
criteriaBuilder.not(criteriaBuilder.exists(inputLevelSubquery)),
+ criteriaBuilder.isNotNull(datasetFieldTypeJoin.get("displayOnCreate")),
criteriaBuilder.isTrue(datasetFieldTypeJoin.get("displayOnCreate")));
Predicate unionPredicate = criteriaBuilder.or(
diff --git a/src/main/java/edu/harvard/iq/dataverse/api/DatasetFieldServiceApi.java b/src/main/java/edu/harvard/iq/dataverse/api/DatasetFieldServiceApi.java
index cbb0f4ffcfd..f29387aeb56 100644
--- a/src/main/java/edu/harvard/iq/dataverse/api/DatasetFieldServiceApi.java
+++ b/src/main/java/edu/harvard/iq/dataverse/api/DatasetFieldServiceApi.java
@@ -551,7 +551,7 @@ public static String getDataverseLangDirectory() {
*/
@POST
@Path("/setDisplayOnCreate")
- public Response setDisplayOnCreate(@QueryParam("datasetFieldType") String datasetFieldTypeIn, @QueryParam("setDisplayOnCreate") boolean setDisplayOnCreateIn) {
+ public Response setDisplayOnCreate(@QueryParam("datasetFieldType") String datasetFieldTypeIn, @QueryParam("setDisplayOnCreate") Boolean setDisplayOnCreateIn) {
DatasetFieldType dft = datasetFieldService.findByName(datasetFieldTypeIn);
if (dft == null) {
return error(Status.NOT_FOUND, "Cound not find a DatasetFieldType by looking up " + datasetFieldTypeIn);
From df39fb49a3886ae988382221a8a164e8edcd092e Mon Sep 17 00:00:00 2001
From: Alexis Guanche <74431162+Saixel@users.noreply.github.com>
Date: Thu, 6 Mar 2025 19:44:25 +0000
Subject: [PATCH 07/21] fix: make displayOnCreate nullable in DatasetFieldType
- Modified JsonPrinter to handle null values for displayOnCreate
- Updated BriefJsonPrinter to use null-safe comparison
---
.../iq/dataverse/util/json/BriefJsonPrinter.java | 16 ++++++++--------
.../iq/dataverse/util/json/JsonPrinter.java | 3 ++-
2 files changed, 10 insertions(+), 9 deletions(-)
diff --git a/src/main/java/edu/harvard/iq/dataverse/util/json/BriefJsonPrinter.java b/src/main/java/edu/harvard/iq/dataverse/util/json/BriefJsonPrinter.java
index c16a46a1765..85bfe49846c 100644
--- a/src/main/java/edu/harvard/iq/dataverse/util/json/BriefJsonPrinter.java
+++ b/src/main/java/edu/harvard/iq/dataverse/util/json/BriefJsonPrinter.java
@@ -24,14 +24,14 @@ public JsonObjectBuilder json( DatasetVersion dsv ) {
}
public JsonObjectBuilder json( MetadataBlock blk ) {
- return ( blk==null )
- ? null
- : jsonObjectBuilder().add("id", blk.getId())
- .add("displayName", blk.getDisplayName())
- .add("displayOnCreate", blk.isDisplayOnCreate())
- .add("name", blk.getName())
- ;
- }
+ if (blk == null) return null;
+ Boolean displayOnCreate = blk.isDisplayOnCreate();
+ return jsonObjectBuilder().add("id", blk.getId())
+ .add("displayName", blk.getDisplayName())
+ .add("displayOnCreate", displayOnCreate == null ? false : displayOnCreate)
+ .add("name", blk.getName())
+ ;
+ }
public JsonObjectBuilder json( Workflow wf ) {
return jsonObjectBuilder().add("id", wf.getId())
diff --git a/src/main/java/edu/harvard/iq/dataverse/util/json/JsonPrinter.java b/src/main/java/edu/harvard/iq/dataverse/util/json/JsonPrinter.java
index 413c80c0323..6daff82827c 100644
--- a/src/main/java/edu/harvard/iq/dataverse/util/json/JsonPrinter.java
+++ b/src/main/java/edu/harvard/iq/dataverse/util/json/JsonPrinter.java
@@ -687,9 +687,10 @@ public static JsonObjectBuilder json(MetadataBlock metadataBlock, boolean printO
boolean isRequired = parentDatasetFieldType == null ? datasetFieldType.isRequired() : parentDatasetFieldType.isRequired();
boolean displayOnCreateInOwnerDataverse = ownerDataverse != null && ownerDataverse.isDatasetFieldTypeDisplayOnCreateAsInputLevel(datasetFieldTypeId);
+ Boolean fieldDisplayOnCreate = datasetFieldType.isDisplayOnCreate();
boolean displayCondition = printOnlyDisplayedOnCreateDatasetFieldTypes
- ? (displayOnCreateInOwnerDataverse || isRequired || requiredAsInputLevelInOwnerDataverse || datasetFieldType.isDisplayOnCreate())
+ ? (displayOnCreateInOwnerDataverse || isRequired || requiredAsInputLevelInOwnerDataverse || (fieldDisplayOnCreate != null && fieldDisplayOnCreate))
: ownerDataverse == null || includedAsInputLevelInOwnerDataverse || isNotInputLevelInOwnerDataverse;
if (displayCondition) {
From 944911b21312026518de7e8f1bf8e24a41b20828 Mon Sep 17 00:00:00 2001
From: Philip Durbin
Date: Thu, 6 Mar 2025 14:44:54 -0500
Subject: [PATCH 08/21] setDisplayOnCreate astroInstrument back to false #10476
---
.../java/edu/harvard/iq/dataverse/api/DatasetTypesIT.java | 7 +++++++
1 file changed, 7 insertions(+)
diff --git a/src/test/java/edu/harvard/iq/dataverse/api/DatasetTypesIT.java b/src/test/java/edu/harvard/iq/dataverse/api/DatasetTypesIT.java
index 7c73498dead..9c4ad6ffbbd 100644
--- a/src/test/java/edu/harvard/iq/dataverse/api/DatasetTypesIT.java
+++ b/src/test/java/edu/harvard/iq/dataverse/api/DatasetTypesIT.java
@@ -14,6 +14,7 @@
import static org.hamcrest.CoreMatchers.equalTo;
import static org.hamcrest.CoreMatchers.is;
import static org.hamcrest.CoreMatchers.nullValue;
+import org.junit.jupiter.api.AfterAll;
import static org.junit.jupiter.api.Assertions.assertEquals;
import org.junit.jupiter.api.BeforeAll;
import org.junit.jupiter.api.Test;
@@ -36,6 +37,12 @@ public static void setUpClass() {
ensureDatasetTypeIsPresent(INSTRUMENT, apiToken);
}
+ @AfterAll
+ public static void afterClass() {
+ // Other tests make assertions about displayOnCreate so revert it back to how it was.
+ UtilIT.setDisplayOnCreate("astroInstrument", false);
+ }
+
private static void ensureDatasetTypeIsPresent(String datasetType, String apiToken) {
Response getDatasetType = UtilIT.getDatasetType(datasetType);
getDatasetType.prettyPrint();
From 1db344bde52a8a9207b2dd3e6e6cda2216678102 Mon Sep 17 00:00:00 2001
From: Alexis Guanche <74431162+Saixel@users.noreply.github.com>
Date: Thu, 6 Mar 2025 19:59:53 +0000
Subject: [PATCH 09/21] Handle null displayOnCreate in MetadataBlock and
JsonPrinter
- Update MetadataBlock to safely check displayOnCreate with null values
- Modify JsonPrinter to default displayOnCreate to false when null
- Ensure consistent null-safe handling of displayOnCreate across components
---
.../java/edu/harvard/iq/dataverse/MetadataBlock.java | 3 ++-
.../edu/harvard/iq/dataverse/util/json/JsonPrinter.java | 9 ++++++---
2 files changed, 8 insertions(+), 4 deletions(-)
diff --git a/src/main/java/edu/harvard/iq/dataverse/MetadataBlock.java b/src/main/java/edu/harvard/iq/dataverse/MetadataBlock.java
index 0fd7c2efbc7..ea0eb1aeaa2 100644
--- a/src/main/java/edu/harvard/iq/dataverse/MetadataBlock.java
+++ b/src/main/java/edu/harvard/iq/dataverse/MetadataBlock.java
@@ -102,7 +102,8 @@ public void setDatasetFieldTypes(List datasetFieldTypes) {
public boolean isDisplayOnCreate() {
for (DatasetFieldType dsfType : datasetFieldTypes) {
- if (dsfType.isDisplayOnCreate()) {
+ Boolean displayOnCreate = dsfType.isDisplayOnCreate();
+ if (displayOnCreate != null && displayOnCreate) {
return true;
}
}
diff --git a/src/main/java/edu/harvard/iq/dataverse/util/json/JsonPrinter.java b/src/main/java/edu/harvard/iq/dataverse/util/json/JsonPrinter.java
index 6daff82827c..c178e715c5e 100644
--- a/src/main/java/edu/harvard/iq/dataverse/util/json/JsonPrinter.java
+++ b/src/main/java/edu/harvard/iq/dataverse/util/json/JsonPrinter.java
@@ -654,8 +654,10 @@ public static JsonObjectBuilder json(MetadataBlock metadataBlock, boolean printO
JsonObjectBuilder jsonObjectBuilder = jsonObjectBuilder()
.add("id", metadataBlock.getId())
.add("name", metadataBlock.getName())
- .add("displayName", metadataBlock.getDisplayName())
- .add("displayOnCreate", metadataBlock.isDisplayOnCreate());
+ .add("displayName", metadataBlock.getDisplayName());
+
+ Boolean displayOnCreate = metadataBlock.isDisplayOnCreate();
+ jsonObjectBuilder.add("displayOnCreate", displayOnCreate == null ? false : displayOnCreate);
List datasetFieldTypesList;
@@ -729,7 +731,8 @@ public static JsonObjectBuilder json(DatasetFieldType fld, Dataverse ownerDatave
JsonObjectBuilder fieldsBld = jsonObjectBuilder();
fieldsBld.add("name", fld.getName());
fieldsBld.add("displayName", fld.getDisplayName());
- fieldsBld.add("displayOnCreate", fld.isDisplayOnCreate());
+ Boolean displayOnCreate = fld.isDisplayOnCreate();
+ fieldsBld.add("displayOnCreate", displayOnCreate == null ? false : displayOnCreate);
fieldsBld.add("title", fld.getTitle());
fieldsBld.add("type", fld.getFieldType().toString());
fieldsBld.add("typeClass", typeClassString(fld));
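The `MetadataBlock.isDisplayOnCreate()` change in this patch applies the same null-safe rule while iterating field types: a block displays on create if any field has a non-null `TRUE` flag, and null flags count as false. A minimal self-contained sketch of that loop (using a hypothetical `Field` record in place of `DatasetFieldType`):

```java
import java.util.List;

public class MetadataBlockSketch {
    // Hypothetical stand-in for DatasetFieldType; only the nullable flag matters here.
    record Field(Boolean displayOnCreate) {}

    // Mirrors the patched MetadataBlock.isDisplayOnCreate(): return true if any
    // field carries a non-null TRUE flag; nulls are treated as false.
    static boolean isDisplayOnCreate(List<Field> fields) {
        for (Field f : fields) {
            Boolean d = f.displayOnCreate();
            if (d != null && d) {
                return true;
            }
        }
        return false;
    }

    public static void main(String[] args) {
        System.out.println(isDisplayOnCreate(List.of(new Field(null), new Field(false))));
        System.out.println(isDisplayOnCreate(List.of(new Field(null), new Field(true))));
    }
}
```

The key point is that the loop never unboxes the nullable flag directly in the `if` condition, so a field with an unset (null) flag is simply skipped rather than throwing.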
From 973be9d03af9f372d3621c3a44b5557131ff3db3 Mon Sep 17 00:00:00 2001
From: Alexis Guanche <74431162+Saixel@users.noreply.github.com>
Date: Thu, 6 Mar 2025 20:44:50 +0000
Subject: [PATCH 10/21] Refine displayOnCreate logic in JsonPrinter
---
.../java/edu/harvard/iq/dataverse/util/json/JsonPrinter.java | 4 +++-
1 file changed, 3 insertions(+), 1 deletion(-)
diff --git a/src/main/java/edu/harvard/iq/dataverse/util/json/JsonPrinter.java b/src/main/java/edu/harvard/iq/dataverse/util/json/JsonPrinter.java
index c178e715c5e..1bf031e5423 100644
--- a/src/main/java/edu/harvard/iq/dataverse/util/json/JsonPrinter.java
+++ b/src/main/java/edu/harvard/iq/dataverse/util/json/JsonPrinter.java
@@ -692,7 +692,9 @@ public static JsonObjectBuilder json(MetadataBlock metadataBlock, boolean printO
Boolean fieldDisplayOnCreate = datasetFieldType.isDisplayOnCreate();
boolean displayCondition = printOnlyDisplayedOnCreateDatasetFieldTypes
- ? (displayOnCreateInOwnerDataverse || isRequired || requiredAsInputLevelInOwnerDataverse || (fieldDisplayOnCreate != null && fieldDisplayOnCreate))
+ ? (isRequired || requiredAsInputLevelInOwnerDataverse ||
+ displayOnCreateInOwnerDataverse ||
+ (ownerDataverse == null && fieldDisplayOnCreate != null && fieldDisplayOnCreate))
: ownerDataverse == null || includedAsInputLevelInOwnerDataverse || isNotInputLevelInOwnerDataverse;
if (displayCondition) {
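Patch 10 refines when the field-level `displayOnCreate` default applies: it now only contributes when no owner dataverse is set, so collection-level input levels take precedence. A hedged sketch of that condition (parameter names are illustrative; `hasOwnerDataverse` stands in for `ownerDataverse != null`):

```java
public class DisplayConditionSketch {
    // Mirrors the refined "printOnlyDisplayedOnCreate" branch from this patch:
    // required fields and owner-dataverse input levels always win, while the
    // field-level nullable flag is consulted only when there is no owner dataverse.
    static boolean displayCondition(boolean isRequired,
                                    boolean requiredAsInputLevel,
                                    boolean displayOnCreateInOwner,
                                    boolean hasOwnerDataverse,
                                    Boolean fieldDisplayOnCreate) {
        return isRequired || requiredAsInputLevel || displayOnCreateInOwner
                || (!hasOwnerDataverse
                    && fieldDisplayOnCreate != null
                    && fieldDisplayOnCreate);
    }

    public static void main(String[] args) {
        // Field-level flag is honored only without an owner dataverse:
        System.out.println(displayCondition(false, false, false, false, true));
        System.out.println(displayCondition(false, false, false, true, true));
        // A null flag never triggers display on its own:
        System.out.println(displayCondition(false, false, false, false, null));
    }
}
```

Compared to the previous revision, the practical difference is the second case: a field whose global `displayOnCreate` is true no longer forces display inside a collection that has its own input-level configuration.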
From fe5abdb349bf3207ca576e288c7e53a98c3e3197 Mon Sep 17 00:00:00 2001
From: Alexis Guanche <74431162+Saixel@users.noreply.github.com>
Date: Thu, 6 Mar 2025 21:04:27 +0000
Subject: [PATCH 11/21] Update displayOnCreate method call in
metadataFragment.xhtml
- Change method call from property access to method invocation for displayOnCreate
- Maintain consistent null-safe handling of displayOnCreate field option
---
src/main/webapp/metadataFragment.xhtml | 4 +---
1 file changed, 1 insertion(+), 3 deletions(-)
diff --git a/src/main/webapp/metadataFragment.xhtml b/src/main/webapp/metadataFragment.xhtml
index f8367ce01f8..49547c14c43 100755
--- a/src/main/webapp/metadataFragment.xhtml
+++ b/src/main/webapp/metadataFragment.xhtml
@@ -244,9 +244,7 @@