Merge branch 'develop' into 5093-datacite #5093
Conflicts:
src/main/java/propertyFiles/Bundle.properties
pdurbin committed Jun 1, 2020
2 parents 27da57a + 777461f commit c700f61
Showing 26 changed files with 998 additions and 208 deletions.
2 changes: 1 addition & 1 deletion .github/PULL_REQUEST_TEMPLATE.md
Original file line number Diff line number Diff line change
@@ -8,7 +8,7 @@ Closes #

**Suggestions on how to test this**:

**Does this PR introduce a user interface change?**:
**Does this PR introduce a user interface change? If mockups are available, please link/include them here**:

**Is there a release notes update needed for this change?**:

5 changes: 5 additions & 0 deletions doc/release-notes/6895-payara-release-note
@@ -0,0 +1,5 @@
## Payara 5

A major upgrade of the application server will provide security updates and access to new features such as the MicroProfile Config API, and will enable upgrades to other core technologies.

Note that moving from Glassfish to Payara will be required as part of the move to Dataverse 5.
4 changes: 4 additions & 0 deletions doc/release-notes/6896-relpub-astro.md
@@ -0,0 +1,4 @@
The astrophysics metadata block must be reloaded.

`curl http://localhost:8080/api/admin/datasetfield/load -X POST --data-binary @astrophysics.tsv -H "Content-type: text/tab-separated-values"`

17 changes: 10 additions & 7 deletions doc/sphinx-guides/source/installation/prerequisites.rst
@@ -108,13 +108,16 @@ Version 9.6 is strongly recommended because it is the version developers and QA
# /usr/bin/systemctl start postgresql-9.6
# /usr/bin/systemctl enable postgresql-9.6

Note that the steps above are specific to RHEL/CentOS 7. For RHEL/CentOS 8 use::
The above steps are specific to RHEL/CentOS 7. For RHEL/CentOS 8 you must install Postgres 10 or higher::

# yum install -y https://download.postgresql.org/pub/repos/yum/reporpms/EL-8-x86_64/pgdg-redhat-repo-latest.noarch.rpm
# yum makecache fast
# yum install -y postgresql96-server
# service postgresql-9.6 initdb
# service postgresql-9.6 start
# yum install -y postgresql10-server
# /usr/pgsql-10/bin/postgresql-10-setup initdb
# systemctl start postgresql-10
# systemctl enable postgresql-10

Note that the Dataverse installer includes its own Postgres JDBC driver. If you choose to install the newest version of Postgres (12 as of this writing), you may need to grab a current JDBC driver from https://jdbc.postgresql.org/download.html before launching the install script.
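A manual driver update might look like the following sketch (the version number 42.2.12 and the Payara domain path are illustrative assumptions; check the download page for the current release and your own server layout):

```shell
# Download a current PostgreSQL JDBC driver (42.2.12 is an example
# version, not a recommendation -- check jdbc.postgresql.org for the latest).
curl -L -O https://jdbc.postgresql.org/download/postgresql-42.2.12.jar

# Place it where the application server can load it; this path is an
# assumption for a default Payara/Glassfish domain layout.
cp postgresql-42.2.12.jar /usr/local/payara5/glassfish/domains/domain1/lib/
```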

Configuring Database Access for the Dataverse Application (and the Dataverse Installer)
=======================================================================================
@@ -234,11 +237,11 @@ For systems using init.d (like CentOS 6), download this :download:`Solr init scr
Securing Solr
=============

Our sample init script and systemd service file linked above tell Solr to only listen on localhost (127.0.0.1). We strongly recommend that you also use a firewall to block access to the Solr port (8983) from outside networks, for added redundancy.

It is **very important** not to allow direct access to the Solr API from outside networks! Otherwise, any host that can reach the Solr port (8983 by default) can add or delete data, search unpublished data, and even reconfigure Solr. For more information, please see https://lucene.apache.org/solr/guide/7_3/securing-solr.html. A particularly serious security issue that has been identified recently allows a potential intruder to remotely execute arbitrary code on the system. See `RCE in Solr via Velocity Template <https://github.com/veracode-research/solr-injection#7-cve-2019-xxxx-rce-via-velocity-template-by-_s00py>`_ for more information.
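As one illustration, on a host running firewalld, access to port 8983 could be limited to a single application-server address (the 10.0.1.5 source address is a placeholder, and the commands assume root; adapt both to your network):

```shell
# Allow connections to Solr's port only from the Dataverse application
# server (10.0.1.5 is a placeholder address, not a recommendation).
firewall-cmd --permanent --add-rich-rule='rule family="ipv4" source address="10.0.1.5" port port="8983" protocol="tcp" accept'

# Apply the persistent rule to the running firewall.
firewall-cmd --reload
```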

If you're running your Dataverse instance across multiple service hosts you'll want to remove the jetty.host argument (``-j jetty.host=127.0.0.1``) from the startup command line, but make sure Solr is behind a firewall and only accessible by the Dataverse web application host(s), by specific IP address(es).

We additionally recommend that the Solr service account's shell be disabled, as it isn't necessary for daily operation:

@@ -252,7 +255,7 @@ or simply prepend each command you would run as the Solr user with "sudo -u solr":

# sudo -u solr command
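As a sketch of disabling the service account's shell (the account name ``solr`` and the RHEL-style ``nologin`` path are assumptions; the Solr install path below is purely illustrative):

```shell
# Disable interactive logins for the solr service account
# (account name and nologin path are assumptions; verify with
# `grep solr /etc/passwd` first).
usermod -s /sbin/nologin solr

# Commands can still be run as that user via sudo, e.g.
# (the Solr install path here is a hypothetical example):
sudo -u solr /usr/local/solr/bin/solr status
```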

Finally, we would like to reiterate that it is simply never a good idea to run Solr as root! Running the process as a non-privileged user would substantially minimize any potential damage even in the event that the instance is compromised.

jq
--
2 changes: 1 addition & 1 deletion scripts/api/data/metadatablocks/astrophysics.tsv
@@ -6,7 +6,7 @@
astroInstrument Instrument The instrument used to collect the data. text 2 TRUE TRUE TRUE TRUE FALSE FALSE astrophysics
astroObject Object Astronomical Objects represented in the data (Given as SIMBAD recognizable names preferred). text 3 TRUE FALSE TRUE TRUE FALSE FALSE astrophysics
resolution.Spatial Spatial Resolution The spatial (angular) resolution that is typical of the observations, in decimal degrees. text 4 TRUE FALSE FALSE TRUE FALSE FALSE astrophysics
resolution.Spectral Spectral Resolution The spectral resolution that is typical of the observations, given as the ratio λ/Δλ. text 5 TRUE FALSE FALSE TRUE FALSE FALSE astrophysics
resolution.Spectral Spectral Resolution The spectral resolution that is typical of the observations, given as the ratio \u03bb/\u0394\u03bb. text 5 TRUE FALSE FALSE TRUE FALSE FALSE astrophysics
resolution.Temporal Time Resolution The temporal resolution that is typical of the observations, given in seconds. text 6 FALSE FALSE FALSE FALSE FALSE FALSE astrophysics
coverage.Spectral.Bandpass Bandpass Conventional bandpass name text 7 TRUE TRUE TRUE TRUE FALSE FALSE astrophysics
coverage.Spectral.CentralWavelength Central Wavelength (m) The central wavelength of the spectral bandpass, in meters. Enter a floating-point number. float 8 TRUE FALSE TRUE TRUE FALSE FALSE astrophysics
48 changes: 48 additions & 0 deletions src/main/java/edu/harvard/iq/dataverse/DatasetPage.java
@@ -115,6 +115,7 @@
import javax.servlet.http.HttpServletResponse;

import org.apache.commons.lang.StringEscapeUtils;
import org.apache.commons.lang3.mutable.MutableBoolean;
import org.apache.commons.io.IOUtils;

import org.primefaces.component.tabview.TabView;
@@ -246,6 +247,9 @@ public enum DisplayMode {
private Long versionId;
private int selectedTabIndex;
private List<DataFile> newFiles = new ArrayList<>();
private List<DataFile> uploadedFiles = new ArrayList<>();
private MutableBoolean uploadInProgress = new MutableBoolean(false);

private DatasetVersion workingVersion;
private DatasetVersion clone;
private int releaseRadio = 1;
@@ -1114,6 +1118,22 @@ public void setNewFiles(List<DataFile> newFiles) {
this.newFiles = newFiles;
}

public List<DataFile> getUploadedFiles() {
return uploadedFiles;
}

public void setUploadedFiles(List<DataFile> uploadedFiles) {
this.uploadedFiles = uploadedFiles;
}

public MutableBoolean getUploadInProgress() {
return uploadInProgress;
}

public void setUploadInProgress(MutableBoolean inProgress) {
this.uploadInProgress = inProgress;
}

public Dataverse getLinkingDataverse() {
return linkingDataverse;
}
@@ -3555,6 +3575,34 @@ private String returnToDraftVersion(){
public String cancel() {
return returnToLatestVersion();
}

public void cancelCreate() {
//Stop any uploads in progress (so that uploadedFiles doesn't change)
uploadInProgress.setValue(false);

logger.fine("Cancelling: " + newFiles.size() + " : " + uploadedFiles.size());

//Files that have been finished and are now in the lower list on the page
for (DataFile newFile : newFiles.toArray(new DataFile[0])) {
FileUtil.deleteTempFile(newFile, dataset, ingestService);
}
logger.fine("Deleted newFiles");

//Files in the upload process but not yet finished
//ToDo - if files are added to uploadedFiles after we access it, those files are not being deleted. With uploadInProgress being set false above, this should be a fairly rare race condition.
for (DataFile newFile : uploadedFiles.toArray(new DataFile[0])) {
FileUtil.deleteTempFile(newFile, dataset, ingestService);
}
logger.fine("Deleted uploadedFiles");

try {
String alias = dataset.getOwner().getAlias();
logger.info("alias: " + alias);
FacesContext.getCurrentInstance().getExternalContext().redirect("/dataverse.xhtml?alias=" + alias);
} catch (IOException ex) {
logger.info("Failed to issue a redirect to the dataverse page.");
}
}

private HttpClient getClient() {
// TODO:
4 changes: 2 additions & 2 deletions src/main/java/edu/harvard/iq/dataverse/DatasetVersionUI.java
@@ -110,8 +110,8 @@ public DatasetVersionUI initDatasetVersionUI(DatasetVersion datasetVersion, boo
if (this.datasetRelPublications.isEmpty()) {
for (DatasetFieldCompoundValue relPubVal : dsf.getDatasetFieldCompoundValues()) {
DatasetRelPublication datasetRelPublication = new DatasetRelPublication();
datasetRelPublication.setTitle(dsf.getDatasetFieldType().getTitle());
datasetRelPublication.setDescription(dsf.getDatasetFieldType().getDescription());
datasetRelPublication.setTitle(dsf.getDatasetFieldType().getLocaleTitle());
datasetRelPublication.setDescription(dsf.getDatasetFieldType().getLocaleDescription());
for (DatasetField subField : relPubVal.getChildDatasetFields()) {
if (subField.getDatasetFieldType().getName().equals(DatasetFieldConstant.publicationCitation)) {
datasetRelPublication.setText(subField.getValue());
