Merge branch 'develop' into 10341-croissant #10341
pdurbin committed Jun 4, 2024
2 parents c7a7057 + 23a4d9b commit 30f560e
Showing 11 changed files with 60 additions and 54 deletions.
1 change: 1 addition & 0 deletions doc/release-notes/10568-Fix File Reingest.md
@@ -0,0 +1 @@
A bug that prevented the Ingest option in the File page Edit File menu from working has been fixed.
1 change: 1 addition & 0 deletions doc/release-notes/5621_dataset image in header.md
@@ -0,0 +1 @@
Dataverse will use the Dataset thumbnail, if one is defined, rather than the generic Dataverse logo in the Open Graph metadata header. This means the image will be seen when, for example, the dataset is referenced on Facebook.
5 changes: 5 additions & 0 deletions doc/sphinx-guides/source/admin/harvestclients.rst
@@ -47,3 +47,8 @@ What if a Run Fails?
Each harvesting client run logs a separate file per run to the app server's default logging directory (``/usr/local/payara6/glassfish/domains/domain1/logs/`` unless you've changed it). Look for filenames in the format ``harvest_TARGET_YYYY_MM_DD_timestamp.log`` to get a better idea of what's going wrong.

Note that you'll want to run a minimum of Dataverse Software 4.6, optimally 4.18 or beyond, for the best OAI-PMH interoperability.

Harvesting Non-OAI-PMH
~~~~~~~~~~~~~~~~~~~~~~

`DOI2PMH <https://github.com/IQSS/doi2pmh-server>`__ is a community-driven project intended to allow OAI-PMH harvesting from non-OAI-PMH sources.
7 changes: 7 additions & 0 deletions doc/sphinx-guides/source/api/apps.rst
@@ -133,6 +133,13 @@ https://github.com/libis/rdm-integration
PHP
---

DOI2PMH
~~~~~~~

The DOI2PMH server allows Dataverse instances to harvest DOIs through OAI-PMH from otherwise unharvestable sources.

https://github.com/IQSS/doi2pmh-server

OJS
~~~

2 changes: 1 addition & 1 deletion doc/sphinx-guides/source/api/native-api.rst
@@ -1179,7 +1179,7 @@ See also :ref:`batch-exports-through-the-api` and the note below:
export PERSISTENT_IDENTIFIER=doi:10.5072/FK2/J8SJZB
export METADATA_FORMAT=ddi
curl "$SERVER_URL/api/datasets/export?exporter=$METADATA_FORMAT&persistentId=PERSISTENT_IDENTIFIER"
curl "$SERVER_URL/api/datasets/export?exporter=$METADATA_FORMAT&persistentId=$PERSISTENT_IDENTIFIER"
The fully expanded example above (without environment variables) looks like this:

12 changes: 3 additions & 9 deletions doc/sphinx-guides/source/developers/deployment.rst
@@ -91,17 +91,11 @@ Download `ec2-create-instance.sh`_ and put it somewhere reasonable. For the purp

.. _ec2-create-instance.sh: https://raw.githubusercontent.com/GlobalDataverseCommunityConsortium/dataverse-ansible/master/ec2/ec2-create-instance.sh

To run it with default values you just need the script, but you may also want a current copy of the ansible `group vars <https://raw.githubusercontent.com/GlobalDataverseCommunityConsortium/dataverse-ansible/master/defaults/main.yml>`_ file.
To run the script, you can make it executable (``chmod 755 ec2-create-instance.sh``) or run it with bash, like this with ``-h`` as an argument to print the help:

ec2-create-instance accepts a number of command-line switches, including:
``bash ~/Downloads/ec2-create-instance.sh -h``

* -r: GitHub Repository URL (defaults to https://github.com/IQSS/dataverse.git)
* -b: branch to build (defaults to develop)
* -p: pemfile directory (defaults to $HOME)
* -g: Ansible GroupVars file (if you wish to override role defaults)
* -h: help (displays usage for each available option)

``bash ~/Downloads/ec2-create-instance.sh -b develop -r https://github.com/scholarsportal/dataverse.git -g main.yml``
If you run the script without any arguments, it should spin up the latest version of Dataverse.

You will need to wait for 15 minutes or so until the deployment is finished, longer if you've enabled sample data and/or the API test suite. Eventually, the output should tell you how to access the Dataverse installation in a web browser or via SSH. It will also provide instructions on how to delete the instance when you are finished with it. Please be aware that AWS charges per minute for a running instance. You may also delete your instance from https://console.aws.amazon.com/console/home?region=us-east-1 .

53 changes: 27 additions & 26 deletions src/main/java/edu/harvard/iq/dataverse/DatasetServiceBean.java
@@ -19,8 +19,6 @@
import edu.harvard.iq.dataverse.export.ExportService;
import edu.harvard.iq.dataverse.globus.GlobusServiceBean;
import edu.harvard.iq.dataverse.harvest.server.OAIRecordServiceBean;
import edu.harvard.iq.dataverse.pidproviders.PidProvider;
import edu.harvard.iq.dataverse.pidproviders.PidUtil;
import edu.harvard.iq.dataverse.search.IndexServiceBean;
import edu.harvard.iq.dataverse.settings.SettingsServiceBean;
import edu.harvard.iq.dataverse.util.BundleUtil;
@@ -41,11 +39,10 @@
import jakarta.ejb.TransactionAttributeType;
import jakarta.inject.Named;
import jakarta.persistence.EntityManager;
import jakarta.persistence.LockModeType;
import jakarta.persistence.NoResultException;
import jakarta.persistence.NonUniqueResultException;
import jakarta.persistence.PersistenceContext;
import jakarta.persistence.Query;
import jakarta.persistence.StoredProcedureQuery;
import jakarta.persistence.TypedQuery;
import org.apache.commons.lang3.StringUtils;

@@ -115,28 +112,32 @@ public Dataset find(Object pk) {
* @return a dataset with pre-fetched file objects
*/
public Dataset findDeep(Object pk) {
return (Dataset) em.createNamedQuery("Dataset.findById")
.setParameter("id", pk)
// Optimization hints: retrieve all data in one query; this prevents point queries when iterating over the files
.setHint("eclipselink.left-join-fetch", "o.files.ingestRequest")
.setHint("eclipselink.left-join-fetch", "o.files.thumbnailForDataset")
.setHint("eclipselink.left-join-fetch", "o.files.dataTables")
.setHint("eclipselink.left-join-fetch", "o.files.auxiliaryFiles")
.setHint("eclipselink.left-join-fetch", "o.files.ingestReports")
.setHint("eclipselink.left-join-fetch", "o.files.dataFileTags")
.setHint("eclipselink.left-join-fetch", "o.files.fileMetadatas")
.setHint("eclipselink.left-join-fetch", "o.files.fileMetadatas.fileCategories")
.setHint("eclipselink.left-join-fetch", "o.files.fileMetadatas.varGroups")
//.setHint("eclipselink.left-join-fetch", "o.files.guestbookResponses
.setHint("eclipselink.left-join-fetch", "o.files.embargo")
.setHint("eclipselink.left-join-fetch", "o.files.retention")
.setHint("eclipselink.left-join-fetch", "o.files.fileAccessRequests")
.setHint("eclipselink.left-join-fetch", "o.files.owner")
.setHint("eclipselink.left-join-fetch", "o.files.releaseUser")
.setHint("eclipselink.left-join-fetch", "o.files.creator")
.setHint("eclipselink.left-join-fetch", "o.files.alternativePersistentIndentifiers")
.setHint("eclipselink.left-join-fetch", "o.files.roleAssignments")
.getSingleResult();
try {
return (Dataset) em.createNamedQuery("Dataset.findById")
.setParameter("id", pk)
// Optimization hints: retrieve all data in one query; this prevents point queries when iterating over the files
.setHint("eclipselink.left-join-fetch", "o.files.ingestRequest")
.setHint("eclipselink.left-join-fetch", "o.files.thumbnailForDataset")
.setHint("eclipselink.left-join-fetch", "o.files.dataTables")
.setHint("eclipselink.left-join-fetch", "o.files.auxiliaryFiles")
.setHint("eclipselink.left-join-fetch", "o.files.ingestReports")
.setHint("eclipselink.left-join-fetch", "o.files.dataFileTags")
.setHint("eclipselink.left-join-fetch", "o.files.fileMetadatas")
.setHint("eclipselink.left-join-fetch", "o.files.fileMetadatas.fileCategories")
.setHint("eclipselink.left-join-fetch", "o.files.fileMetadatas.varGroups")
//.setHint("eclipselink.left-join-fetch", "o.files.guestbookResponses
.setHint("eclipselink.left-join-fetch", "o.files.embargo")
.setHint("eclipselink.left-join-fetch", "o.files.retention")
.setHint("eclipselink.left-join-fetch", "o.files.fileAccessRequests")
.setHint("eclipselink.left-join-fetch", "o.files.owner")
.setHint("eclipselink.left-join-fetch", "o.files.releaseUser")
.setHint("eclipselink.left-join-fetch", "o.files.creator")
.setHint("eclipselink.left-join-fetch", "o.files.alternativePersistentIndentifiers")
.setHint("eclipselink.left-join-fetch", "o.files.roleAssignments")
.getSingleResult();
} catch (NoResultException | NonUniqueResultException ex) {
return null;
}
}

public List<Dataset> findByOwnerId(Long ownerId) {
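With this change, findDeep() catches NoResultException and NonUniqueResultException and returns null instead of letting them propagate, so callers are expected to check the result. A minimal caller-side sketch under that assumption (the surrounding method and its error handling are hypothetical illustrations, not code from this commit):

    @EJB
    DatasetServiceBean datasetService;

    public Dataset loadDatasetWithFiles(Long datasetId) {
        // findDeep() prefetches the dataset's files and related objects in one query,
        // and now returns null when no (or more than one) matching dataset is found.
        Dataset dataset = datasetService.findDeep(datasetId);
        if (dataset == null) {
            // Hypothetical handling -- a real caller might return a 404 or redirect instead.
            throw new IllegalArgumentException("No dataset found for id " + datasetId);
        }
        // Iterating the files here should not trigger per-file queries, thanks to the
        // eclipselink.left-join-fetch hints set in findDeep().
        for (DataFile dataFile : dataset.getFiles()) {
            // ... work with the prefetched DataFile objects ...
        }
        return dataset;
    }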
21 changes: 10 additions & 11 deletions src/main/java/edu/harvard/iq/dataverse/FilePage.java
@@ -522,10 +522,9 @@ public String ingestFile() throws CommandException{
return null;
}

DataFile dataFile = fileMetadata.getDataFile();
editDataset = dataFile.getOwner();
editDataset = file.getOwner();

if (dataFile.isTabularData()) {
if (file.isTabularData()) {
JH.addMessage(FacesMessage.SEVERITY_WARN, BundleUtil.getStringFromBundle("file.ingest.alreadyIngestedWarning"));
return null;
}
@@ -537,25 +536,25 @@ public String ingestFile() throws CommandException{
return null;
}

if (!FileUtil.canIngestAsTabular(dataFile)) {
if (!FileUtil.canIngestAsTabular(file)) {
JH.addMessage(FacesMessage.SEVERITY_WARN, BundleUtil.getStringFromBundle("file.ingest.cantIngestFileWarning"));
return null;

}

dataFile.SetIngestScheduled();
file.SetIngestScheduled();

if (dataFile.getIngestRequest() == null) {
dataFile.setIngestRequest(new IngestRequest(dataFile));
if (file.getIngestRequest() == null) {
file.setIngestRequest(new IngestRequest(file));
}

dataFile.getIngestRequest().setForceTypeCheck(true);
file.getIngestRequest().setForceTypeCheck(true);

// update the datafile, to save the newIngest request in the database:
datafileService.save(file);

// queue the data ingest job for asynchronous execution:
String status = ingestService.startIngestJobs(editDataset.getId(), new ArrayList<>(Arrays.asList(dataFile)), (AuthenticatedUser) session.getUser());
String status = ingestService.startIngestJobs(editDataset.getId(), new ArrayList<>(Arrays.asList(file)), (AuthenticatedUser) session.getUser());

if (!StringUtil.isEmpty(status)) {
// This most likely indicates some sort of a problem (for example,
@@ -565,9 +564,9 @@ public String ingestFile() throws CommandException{
// successfully gone through the process of trying to schedule the
// ingest job...

logger.warning("Ingest Status for file: " + dataFile.getId() + " : " + status);
logger.warning("Ingest Status for file: " + file.getId() + " : " + status);
}
logger.fine("File: " + dataFile.getId() + " ingest queued");
logger.fine("File: " + file.getId() + " ingest queued");

init();
JsfHelper.addInfoMessage(BundleUtil.getStringFromBundle("file.ingest.ingestQueued"));
src/main/java/edu/harvard/iq/dataverse/engine/command/impl/MergeInAccountCommand.java
@@ -14,7 +14,6 @@
import edu.harvard.iq.dataverse.UserNotification;
import edu.harvard.iq.dataverse.authorization.AuthenticatedUserLookup;
import edu.harvard.iq.dataverse.authorization.providers.builtin.BuiltinUser;
import edu.harvard.iq.dataverse.authorization.providers.oauth2.OAuth2TokenData;
import edu.harvard.iq.dataverse.authorization.users.ApiToken;
import edu.harvard.iq.dataverse.authorization.users.AuthenticatedUser;
import edu.harvard.iq.dataverse.batch.util.LoggingUtil;
@@ -25,7 +24,6 @@
import edu.harvard.iq.dataverse.engine.command.RequiredPermissions;
import edu.harvard.iq.dataverse.engine.command.exception.CommandException;
import edu.harvard.iq.dataverse.engine.command.exception.IllegalCommandException;
import edu.harvard.iq.dataverse.passwordreset.PasswordResetData;
import edu.harvard.iq.dataverse.search.IndexResponse;
import edu.harvard.iq.dataverse.search.savedsearch.SavedSearch;
import edu.harvard.iq.dataverse.workflows.WorkflowComment;
@@ -177,6 +175,7 @@ protected void executeImpl(CommandContext ctxt) throws CommandException {

ctxt.em().createNativeQuery("Delete from OAuth2TokenData where user_id ="+consumedAU.getId()).executeUpdate();

ctxt.em().createNativeQuery("DELETE FROM explicitgroup_authenticateduser consumed USING explicitgroup_authenticateduser ongoing WHERE consumed.containedauthenticatedusers_id="+ongoingAU.getId()+" AND ongoing.containedauthenticatedusers_id="+consumedAU.getId()).executeUpdate();
ctxt.em().createNativeQuery("UPDATE explicitgroup_authenticateduser SET containedauthenticatedusers_id="+ongoingAU.getId()+" WHERE containedauthenticatedusers_id="+consumedAU.getId()).executeUpdate();

ctxt.actionLog().changeUserIdentifierInHistory(consumedAU.getIdentifier(), ongoingAU.getIdentifier());
2 changes: 1 addition & 1 deletion src/main/webapp/dataset.xhtml
@@ -86,7 +86,7 @@
<meta property="og:title" content="#{DatasetPage.title}" />
<meta property="og:type" content="article" />
<meta property="og:url" content="#{DatasetPage.dataverseSiteUrl}/dataset.xhtml?persistentId=#{dataset.globalId}" />
<meta property="og:image" content="#{DatasetPage.dataverseSiteUrl.concat(resource['images/dataverse-icon-1200.png'])}" />
<meta property="og:image" content="#{DatasetPage.dataset.getDatasetThumbnail(ImageThumbConverter.DEFAULT_PREVIEW_SIZE) == null ? DatasetPage.dataverseSiteUrl.concat(resource['images/dataverse-icon-1200.png']): DatasetPage.dataverseSiteUrl.concat('/api/datasets/:persistentId/thumbnail?persistentId=').concat(DatasetPage.dataset.getGlobalId().asString())}" />
<meta property="og:site_name" content="#{DatasetPage.publisher}" />
<meta property="og:description" content="#{(DatasetPage.description.length()>150 ? DatasetPage.description.substring(0,147).concat('...') : DatasetPage.description)}" />
<ui:repeat var="author" value="#{DatasetPage.datasetAuthors}">
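The new og:image expression above prefers a dataset-specific thumbnail and only falls back to the generic Dataverse icon when none is defined. A rough plain-Java sketch of that EL ternary (illustrative only; genericIconUrl is a placeholder for the JSF resource URL of images/dataverse-icon-1200.png):

    String siteUrl = datasetPage.getDataverseSiteUrl();
    String genericIconUrl = "<resource URL for images/dataverse-icon-1200.png>"; // resolved by resource[] in the real page
    String ogImageUrl;
    if (dataset.getDatasetThumbnail(ImageThumbConverter.DEFAULT_PREVIEW_SIZE) == null) {
        // No thumbnail defined for this dataset: keep the generic Dataverse icon.
        ogImageUrl = siteUrl + genericIconUrl;
    } else {
        // Thumbnail defined: point Open Graph consumers at the dataset thumbnail API endpoint.
        ogImageUrl = siteUrl + "/api/datasets/:persistentId/thumbnail?persistentId="
                + dataset.getGlobalId().asString();
    }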
7 changes: 3 additions & 4 deletions src/test/java/edu/harvard/iq/dataverse/api/UsersIT.java
@@ -8,6 +8,7 @@
import edu.harvard.iq.dataverse.authorization.DataverseRole;
import edu.harvard.iq.dataverse.settings.SettingsServiceBean;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.UUID;
import jakarta.json.Json;
@@ -206,15 +207,13 @@ public void testMergeAccounts(){
String aliasInOwner = "groupFor" + dataverseAlias;
String displayName = "Group for " + dataverseAlias;
String user2identifier = "@" + usernameConsumed;
String target2identifier = "@" + targetname;
Response createGroup = UtilIT.createGroup(dataverseAlias, aliasInOwner, displayName, superuserApiToken);
createGroup.prettyPrint();
createGroup.then().assertThat()
.statusCode(CREATED.getStatusCode());

String groupIdentifier = JsonPath.from(createGroup.asString()).getString("data.identifier");

List<String> roleAssigneesToAdd = new ArrayList<>();
roleAssigneesToAdd.add(user2identifier);
List<String> roleAssigneesToAdd = Arrays.asList(user2identifier, target2identifier);
Response addToGroup = UtilIT.addToGroup(dataverseAlias, aliasInOwner, roleAssigneesToAdd, superuserApiToken);
addToGroup.prettyPrint();
addToGroup.then().assertThat()
