* **Category**: Select a category that best describes the type of dataverse this will be. For example, if this is a dataverse for an individual researcher's datasets, select Researcher. If this is a dataverse for an institution, select Organization & Institution.
- * **Choose the sets of Metadata Elements for datasets in this dataverse**: by default the metadata elements will be from the host dataverse that this new dataverse is created in. Dataverse offers metadata standards for multiple domains. To learn more about the metadata standards in Dataverse please check out the appendix (insert link here)
+ * **Choose the sets of Metadata Elements for datasets in this dataverse**: by default the metadata elements will be from the host dataverse that this new dataverse is created in. Dataverse offers metadata standards for multiple domains. To learn more about the metadata standards in Dataverse please check out the :doc:`/user/appendix`.
* **Select facets for this dataverse**: by default the facets that will appear on your dataverse landing page will be from the host dataverse that this new dataverse was created in. The facets are simply metadata fields that can be used to help others easily find dataverses and datasets within this dataverse. You can select as many facets as you would like.
#. Selected metadata elements are also used to pick which metadata fields you would like to use for creating templates for your datasets. Metadata fields can be hidden, or selected as required or optional. Once you have selected all the fields you would like to use, you can create your template(s) after you finish creating your dataverse.
#. Click the "Create Dataverse" button and you're done!
@@ -92,19 +92,24 @@ Permissions
When you access a dataverse's permissions page, you will see there are three sections: Permissions, Users/Groups, and Roles.
|image2|
+
Clicking on Permissions will bring you to this page:
+
|image3|
+
By clicking the Edit Access button, you can control whether no one or anyone may add dataverses or datasets to this dataverse.
+
|image4|
+
The Edit Access pop-up also lets you select whether someone adding a dataset to this dataverse should be allowed to publish it (Curator role) or whether the dataset will be submitted to the administrator of this dataverse to be reviewed and then published (Contributor role). These Access settings can be changed at any time.
Assign Role
-----------------------
-You can also give access to a Dataverse user to allow them to access an unpublished dataverse as well as other roles. To do this, click on the Assign Roles to Users/Groups button in the Users/Groups section. You can also give multiple users the same role at one time.
+You can also give access to a Dataverse user to allow them to access an unpublished dataverse as well as other roles. To do this, click on the Assign Roles to Users/Groups button in the Users/Groups section. You can also give multiple users the same role at one time. These roles can be removed at any time.
+
|image5|
-|image6|
-This roles can be removed at any time.
+|image6|
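The role assignment described above can also be scripted against Dataverse's native API. Below is a minimal Python sketch; the endpoint path (`/api/dataverses/{alias}/assignments`), payload keys, and the role alias `curator` are assumptions drawn from the Dataverse native API, not something this diff itself defines.

```python
import json

# Hypothetical helper: builds a request for Dataverse's "assign role"
# endpoint (POST /api/dataverses/{alias}/assignments). The endpoint
# path, payload keys, and role alias are assumptions.
def build_role_assignment_request(server, alias, assignee, role, api_token):
    url = f"{server}/api/dataverses/{alias}/assignments"
    headers = {"X-Dataverse-key": api_token,
               "Content-Type": "application/json"}
    body = json.dumps({"assignee": assignee, "role": role})
    return url, headers, body

url, headers, body = build_role_assignment_request(
    "http://localhost:8080", "mydv", "@finch", "curator", "tok")
print(url)
```

The same request can be sent with any HTTP client; passing the token in the `X-Dataverse-key` header matches the convention used by the API scripts elsewhere in this changeset.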
.. _dataset-templates:
@@ -176,10 +181,15 @@ is made public, it can no longer be unpublished.
.. |image1| image:: ./img/Dataverse-Diagram.png
.. |image2| image:: ./img/dvperms1.png
+ :class: img-responsive
.. |image3| image:: ./img/dv2.png
+ :class: img-responsive
.. |image4| image:: ./img/dv3.png
+ :class: img-responsive
.. |image5| image:: ./img/dv4.png
+ :class: img-responsive
.. |image6| image:: ./img/dv5.png
+ :class: img-responsive
diff --git a/pom.xml b/pom.xml
index 7812b866812..dd12a75e337 100644
--- a/pom.xml
+++ b/pom.xml
@@ -4,7 +4,7 @@
edu.harvard.iq
dataverse
- 4.5.1
+ 4.6
war
dataverse
diff --git a/scripts/api/data/metadatablocks/social_science.tsv b/scripts/api/data/metadatablocks/social_science.tsv
index b9fec245a1f..7ef714c0599 100644
--- a/scripts/api/data/metadatablocks/social_science.tsv
+++ b/scripts/api/data/metadatablocks/social_science.tsv
@@ -1,7 +1,7 @@
#metadataBlock name dataverseAlias displayName
socialscience Social Science and Humanities Metadata
#datasetField name title description watermark fieldType displayOrder displayFormat advancedSearchField allowControlledVocabulary allowmultiples facetable displayoncreate required parent metadatablock_id
- unitOfAnalysis Unit of Analysis Basic unit of analysis or observation that this Dataset describes, such as individuals, families/households, groups, institutions/organizations, administrative units, and more. For information about the DDI's controlled vocabulary for this element, please refer to the DDI web page at http://www.ddialliance.org/Specification/DDI-CV/. textbox 0 TRUE FALSE TRUE TRUE FALSE FALSE socialscience
+ unitOfAnalysis Unit of Analysis Basic unit of analysis or observation that this Dataset describes, such as individuals, families/households, groups, institutions/organizations, administrative units, and more. For information about the DDI's controlled vocabulary for this element, please refer to the DDI web page at http://www.ddialliance.org/controlled-vocabularies. textbox 0 TRUE FALSE TRUE TRUE FALSE FALSE socialscience
universe Universe Description of the population covered by the data in the file; the group of people or other elements that are the object of the study and to which the study results refer. Age, nationality, and residence commonly help to delineate a given universe, but any number of other factors may be used, such as age limits, sex, marital status, race, ethnic group, nationality, income, veteran status, criminal convictions, and more. The universe may consist of elements other than persons, such as housing units, court cases, deaths, countries, and so on. In general, it should be possible to tell from the description of the universe whether a given individual or element is a member of the population under study. Also known as the universe of interest, population of interest, and target population. textbox 1 TRUE FALSE TRUE TRUE FALSE FALSE socialscience
timeMethod Time Method The time method or time dimension of the data collection, such as panel, cross-sectional, trend, time- series, or other. text 2 TRUE FALSE FALSE TRUE FALSE FALSE socialscience
dataCollector Data Collector Individual, agency or organization responsible for administering the questionnaire or interview or compiling the data. FamilyName, GivenName or Organization text 3 FALSE FALSE FALSE FALSE FALSE FALSE socialscience
@@ -26,4 +26,4 @@
socialScienceNotes Notes General notes about this Dataset. none 22 FALSE FALSE FALSE FALSE FALSE FALSE socialscience
socialScienceNotesType Type Type of note. text 23 FALSE FALSE FALSE FALSE FALSE FALSE socialScienceNotes socialscience
socialScienceNotesSubject Subject Note subject. text 24 FALSE FALSE FALSE FALSE FALSE FALSE socialScienceNotes socialscience
- socialScienceNotesText Text Text for this note. textbox 25 FALSE FALSE FALSE FALSE FALSE FALSE socialScienceNotes socialscience
\ No newline at end of file
+ socialScienceNotesText Text Text for this note. textbox 25 FALSE FALSE FALSE FALSE FALSE FALSE socialScienceNotes socialscience
diff --git a/scripts/database/homebrew/delete-all b/scripts/database/homebrew/delete-all
index 9a2d0239b45..2d2c2110c6b 100755
--- a/scripts/database/homebrew/delete-all
+++ b/scripts/database/homebrew/delete-all
@@ -1,5 +1,6 @@
#!/bin/sh
/Applications/NetBeans/glassfish4/glassfish/bin/asadmin stop-domain
+rm -rf /Applications/NetBeans/glassfish4/glassfish/domains/domain1/generated
scripts/database/homebrew/drop-database
scripts/search/clear
rm -rf ~/dataverse/files
diff --git a/scripts/database/upgrades/upgrade_v4.5.1_to_v4.6.sql b/scripts/database/upgrades/upgrade_v4.5.1_to_v4.6.sql
new file mode 100644
index 00000000000..eb7d95477b6
--- /dev/null
+++ b/scripts/database/upgrades/upgrade_v4.5.1_to_v4.6.sql
@@ -0,0 +1,8 @@
+ALTER TABLE datafile ADD COLUMN checksumtype character varying(255);
+UPDATE datafile SET checksumtype = 'MD5';
+ALTER TABLE datafile ALTER COLUMN checksumtype SET NOT NULL;
+-- alternate statement for sbgrid.org and others interested in SHA-1 support
+-- note that in the database we use "SHA1" (no hyphen) but the GUI will show "SHA-1"
+--UPDATE datafile SET checksumtype = 'SHA1';
+ALTER TABLE datafile RENAME md5 TO checksumvalue;
+ALTER TABLE filemetadata ADD COLUMN directorylabel character varying(255);
diff --git a/scripts/installer/install b/scripts/installer/install
index 071b8e7558b..1c0289feb21 100755
--- a/scripts/installer/install
+++ b/scripts/installer/install
@@ -807,7 +807,7 @@ else
print TMPCMD $sql_command;
close TMPCMD;
- my $psql_commandline = $psql_exec . "/psql -h " . $CONFIG_DEFAULTS{'POSTGRES_SERVER'} . " -U postgres -d postgres -f /tmp/pgcmd.$$.tmp >/dev/null 2>&1";
+ my $psql_commandline = $psql_admin_exec . "/psql -h " . $CONFIG_DEFAULTS{'POSTGRES_SERVER'} . " -U postgres -d postgres -f /tmp/pgcmd.$$.tmp >/dev/null 2>&1";
my $out = qx($psql_commandline 2>&1);
my $exitcode = $?;
@@ -829,8 +829,8 @@ else
print "\nCreating Postgres database:\n";
my $psql_command =
- $psql_admin_exec
- . "/createdb -h " . $CONFIG_DEFAULTS{'POSTGRES_SERVER'} . " -U postgres "
+ $psql_exec
+ . "/createdb -h " . $CONFIG_DEFAULTS{'POSTGRES_SERVER'} . " -U $CONFIG_DEFAULTS{'POSTGRES_USER'} "
. $CONFIG_DEFAULTS{'POSTGRES_DATABASE'}
. " --owner="
. $CONFIG_DEFAULTS{'POSTGRES_USER'};
diff --git a/scripts/issues/3354/createDatasetWithSha1Files.sh b/scripts/issues/3354/createDatasetWithSha1Files.sh
new file mode 100755
index 00000000000..1792a9e33a3
--- /dev/null
+++ b/scripts/issues/3354/createDatasetWithSha1Files.sh
@@ -0,0 +1,5 @@
+#!/bin/sh
+# existing, works, no files, commenting out
+#curl -s -X POST -H "Content-type:application/json" -d @scripts/search/tests/data/dataset-finch1.json "http://localhost:8080/api/dataverses/root/datasets/?key=$API_TOKEN"
+# new, has files
+curl -s -X POST -H "Content-type:application/json" -d @scripts/issues/3354/datasetWithSha1Files.json "http://localhost:8080/api/dataverses/root/datasets/?key=$API_TOKEN"
diff --git a/scripts/issues/3354/datasetWithSha1Files.json b/scripts/issues/3354/datasetWithSha1Files.json
new file mode 100644
index 00000000000..95a4d3b88d0
--- /dev/null
+++ b/scripts/issues/3354/datasetWithSha1Files.json
@@ -0,0 +1,86 @@
+{
+ "datasetVersion": {
+ "files": [
+ {
+ "label": "foo.txt",
+ "dataFile": {
+ "filename": "foo.txt",
+ "contentType": "text/plain",
+ "storageIdentifier": "157484f9d6c-c36006fa39e5",
+ "originalFormatLabel": "UNKNOWN",
+ "checksum": {
+ "type": "SHA-1",
+ "value": "f1d2d2f924e986ac86fdf7b36c94bcdf32beec15"
+ }
+ }
+ }
+ ],
+ "metadataBlocks": {
+ "citation": {
+ "fields": [
+ {
+ "value": "Dataset with SHA-1 files",
+ "typeClass": "primitive",
+ "multiple": false,
+ "typeName": "title"
+ },
+ {
+ "value": [
+ {
+ "authorName": {
+ "value": "Finch, Fiona",
+ "typeClass": "primitive",
+ "multiple": false,
+ "typeName": "authorName"
+ },
+ "authorAffiliation": {
+ "value": "Birds Inc.",
+ "typeClass": "primitive",
+ "multiple": false,
+ "typeName": "authorAffiliation"
+ }
+ }
+ ],
+ "typeClass": "compound",
+ "multiple": true,
+ "typeName": "author"
+ },
+ {
+ "value": [
+ { "datasetContactEmail" : {
+ "typeClass": "primitive",
+ "multiple": false,
+ "typeName": "datasetContactEmail",
+ "value" : "finch@mailinator.com"
+ }
+ }],
+ "typeClass": "compound",
+ "multiple": true,
+ "typeName": "datasetContact"
+ },
+ {
+ "value": [ {
+ "dsDescriptionValue":{
+ "value": "Some people prefer SHA-1 to MD5 for file fixity.",
+ "multiple":false,
+ "typeClass": "primitive",
+ "typeName": "dsDescriptionValue"
+ }}],
+ "typeClass": "compound",
+ "multiple": true,
+ "typeName": "dsDescription"
+ },
+ {
+ "value": [
+ "Other"
+ ],
+ "typeClass": "controlledVocabulary",
+ "multiple": true,
+ "typeName": "subject"
+ }
+ ],
+ "displayName": "Citation Metadata"
+ }
+ }
+ }
+}
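The `checksum` object in the JSON above carries the user-facing type label ("SHA-1") plus the hex digest. As a sanity check, the sketch below computes both supported digests with Python's `hashlib`; it assumes the `foo.txt` in the example contains the bytes `foo\n`, which is what produces the SHA-1 value shown in the fixture.

```python
import hashlib

def checksum(data: bytes, checksum_type: str) -> str:
    # Map the GUI/API label ("SHA-1") to hashlib's algorithm name ("sha1").
    algo = checksum_type.replace("-", "").lower()
    return hashlib.new(algo, data).hexdigest()

content = b"foo\n"  # assumed contents of foo.txt in the fixture above
sha1_value = checksum(content, "SHA-1")
md5_value = checksum(content, "MD5")
print(sha1_value)  # matches the "value" field in datasetWithSha1Files.json
```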
diff --git a/scripts/issues/3354/mydata b/scripts/issues/3354/mydata
new file mode 100755
index 00000000000..eb76d06e3f9
--- /dev/null
+++ b/scripts/issues/3354/mydata
@@ -0,0 +1,3 @@
+#!/bin/sh
+# FIXME: Make this into a REST Assured test.
+curl -s "http://localhost:8080/api/mydata/retrieve?key=$API_TOKEN&role_ids=1&dvobject_types=DataFile&published_states=Published&published_states=Unpublished&published_states=Draft&published_states=In+Review&published_states=Deaccessioned" | jq .data.items
diff --git a/scripts/migration/migrate_passwords.sql b/scripts/migration/migrate_passwords.sql
new file mode 100644
index 00000000000..ec2176ea8a6
--- /dev/null
+++ b/scripts/migration/migrate_passwords.sql
@@ -0,0 +1,5 @@
+update builtinuser
+set passwordencryptionversion = 0,
+encryptedpassword= _dvn3_vdcuser.encryptedpassword
+from _dvn3_vdcuser
+where _dvn3_vdcuser.username=builtinuser.username;
diff --git a/scripts/migration/migration_instructions.txt b/scripts/migration/migration_instructions.txt
index 814face5c90..d30eb77dc26 100644
--- a/scripts/migration/migration_instructions.txt
+++ b/scripts/migration/migration_instructions.txt
@@ -110,7 +110,7 @@ datafile_pub_date.sql
12. (when ready for users to log in) add user passwords
-[how?]
+migrate_passwords.sql
__________________________________________________
diff --git a/scripts/search/search b/scripts/search/search
index b058dbdacb9..ac14596ac2a 100755
--- a/scripts/search/search
+++ b/scripts/search/search
@@ -1,11 +1,11 @@
#!/bin/sh
if [ -z "$1" ]; then
- curl -s 'http://localhost:8080/api/search?q=*'
+ curl -H "X-Dataverse-key: $API_TOKEN" -s 'http://localhost:8080/api/search?q=*'
#curl -s 'http://localhost:8080/api/search?q=*&key=pete'
else
# i.e. ./search 'q=*&fq=filetype_s:"image"&fq=dvtype:files'
# i.e. ./search 'q=*&start=10'
# i.e. ./search 'q=*&sort=name_sort&order=asc'
# i.e. ./search 'q=*&sort=name_sort&order=asc' | jq '.itemsJson[] | {name_sort}'
- curl -s "http://localhost:8080/api/search?$1"
+ curl -H "X-Dataverse-key: $API_TOKEN" -s "http://localhost:8080/api/search?$1"
fi
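The change above adds the `X-Dataverse-key` header to the search script's curl calls. A minimal Python equivalent, for anyone scripting against the search API without curl (reads the same `API_TOKEN` environment variable the shell script uses):

```python
import os
import urllib.request

# Equivalent of:
#   curl -H "X-Dataverse-key: $API_TOKEN" "http://localhost:8080/api/search?q=*"
def build_search_request(base="http://localhost:8080", query="q=*"):
    token = os.environ.get("API_TOKEN", "")  # same env var the script uses
    req = urllib.request.Request(f"{base}/api/search?{query}")
    req.add_header("X-Dataverse-key", token)
    return req

req = build_search_request()
print(req.full_url)
# urllib.request.urlopen(req) would perform the call against a running server
```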
diff --git a/scripts/vagrant/install-dataverse.sh b/scripts/vagrant/install-dataverse.sh
index 9e141f231ab..0b0d72193db 100644
--- a/scripts/vagrant/install-dataverse.sh
+++ b/scripts/vagrant/install-dataverse.sh
@@ -11,3 +11,4 @@ if [ ! -f $WAR ]; then
fi
cd /dataverse/scripts/installer
./install --hostname localhost $MAILSERVER_ARG --gfdir /home/glassfish/glassfish4 -y --force
+echo "If \"vagrant up\" was successful (check output above) Dataverse is running on port 8080 of the Linux machine running within Vagrant, but this port has been forwarded to port 8888 of the computer you ran \"vagrant up\" on. For this reason you should go to http://localhost:8888 to see the Dataverse app running."
diff --git a/scripts/vagrant/setup.sh b/scripts/vagrant/setup.sh
index 611ba780f6f..5789450aa51 100644
--- a/scripts/vagrant/setup.sh
+++ b/scripts/vagrant/setup.sh
@@ -11,7 +11,10 @@ sudo mv jq /usr/bin/jq
echo "Adding Shibboleth yum repo"
cp /dataverse/conf/vagrant/etc/yum.repos.d/shibboleth.repo /etc/yum.repos.d
cp /dataverse/conf/vagrant/etc/yum.repos.d/epel-apache-maven.repo /etc/yum.repos.d
-yum install -y java-1.8.0-openjdk-devel postgresql-server apache-maven httpd mod_ssl shibboleth shibboleth-embedded-ds
+# Uncomment this (and other shib stuff below) if you want
+# to use Vagrant (and maybe PageKite) to test Shibboleth.
+#yum install -y shibboleth shibboleth-embedded-ds
+yum install -y java-1.8.0-openjdk-devel postgresql-server apache-maven httpd mod_ssl
alternatives --set java /usr/lib/jvm/jre-1.8.0-openjdk.x86_64/bin/java
alternatives --set javac /usr/lib/jvm/java-1.8.0-openjdk.x86_64/bin/javac
java -version
@@ -44,17 +47,17 @@ if [ ! -d $GLASSFISH_ROOT ]; then
else
echo "$GLASSFISH_ROOT already exists"
fi
-service shibd start
+#service shibd start
service httpd stop
cp /dataverse/conf/httpd/conf.d/dataverse.conf /etc/httpd/conf.d/dataverse.conf
mkdir -p /var/www/dataverse/error-documents
cp /dataverse/conf/vagrant/var/www/dataverse/error-documents/503.html /var/www/dataverse/error-documents
service httpd start
-curl -k --sslv3 https://pdurbin.pagekite.me/Shibboleth.sso/Metadata > /tmp/pdurbin.pagekite.me
-cp -a /etc/shibboleth/shibboleth2.xml /etc/shibboleth/shibboleth2.xml.orig
-cp -a /etc/shibboleth/attribute-map.xml /etc/shibboleth/attribute-map.xml.orig
+#curl -k --sslv3 https://pdurbin.pagekite.me/Shibboleth.sso/Metadata > /tmp/pdurbin.pagekite.me
+#cp -a /etc/shibboleth/shibboleth2.xml /etc/shibboleth/shibboleth2.xml.orig
+#cp -a /etc/shibboleth/attribute-map.xml /etc/shibboleth/attribute-map.xml.orig
# need more attributes, such as sn, givenName, mail
-cp /dataverse/conf/vagrant/etc/shibboleth/attribute-map.xml /etc/shibboleth/attribute-map.xml
+#cp /dataverse/conf/vagrant/etc/shibboleth/attribute-map.xml /etc/shibboleth/attribute-map.xml
# FIXME: automate this?
#curl 'https://www.testshib.org/cgi-bin/sp2config.cgi?dist=Others&hostname=pdurbin.pagekite.me' > /etc/shibboleth/shibboleth2.xml
#cp /dataverse/conf/vagrant/etc/shibboleth/shibboleth2.xml /etc/shibboleth/shibboleth2.xml
diff --git a/src/main/java/Bundle.properties b/src/main/java/Bundle.properties
index ca19a51533e..e768e4c6b99 100755
--- a/src/main/java/Bundle.properties
+++ b/src/main/java/Bundle.properties
@@ -1060,7 +1060,7 @@ dataset.cite.title.released=DRAFT VERSION will be replaced in the citation with
dataset.cite.title.draft=DRAFT VERSION will be replaced in the citation with the selected version once the dataset has been published.
dataset.cite.title.deassessioned=DEACCESSIONED VERSION has been added to the citation for this version since it is no longer available.
dataset.cite.standards.tip=Learn about Data Citation Standards .
-dataset.cite.downloadBtn=Cite Data
+dataset.cite.downloadBtn=Cite Dataset
dataset.cite.downloadBtn.xml=EndNote XML
dataset.cite.downloadBtn.ris=RIS
dataset.cite.downloadBtn.bib=BibTeX
@@ -1104,7 +1104,6 @@ dataset.metadata.persistentId.tip=The unique persistent identifier for a Dataset
dataset.versionDifferences.termsOfUseAccess=Terms of Use and Access
dataset.versionDifferences.termsOfUseAccessChanged=Terms of Use/Access Changed
file.viewDiffDialog.restricted=Restricted
-file.viewDiffDialog.md5=MD5
dataset.template.tip=Changing the template will clear any fields you may have entered data into.
dataset.noTemplate.label=None
@@ -1162,9 +1161,8 @@ file.download.header=Download
file.preview=Preview:
file.fileName=File Name
file.type.tabularData=Tabular Data
-file.MD5=MD5
-file.MD5.origal=Original File MD5
-file.MD5.exists.tip=A file with this MD5 already exists in the dataset.
+file.originalChecksumType=Original File {0}
+file.checksum.exists.tip=A file with this checksum already exists in the dataset.
file.selectedThumbnail=Thumbnail
file.selectedThumbnail.tip=The thumbnail for this file is used as the default thumbnail for the dataset. Click 'Advanced Options' button of another file to select that file.
@@ -1203,6 +1201,7 @@ file.spss-porExtraLabels=Variable Labels
file.spss-porExtraLabels.title=Upload an additional text file with extra variable labels.
file.spss-porExtraLabels.selectToAddBtn=Select File to Add
file.ingestFailed=Tabular Data Ingest Failed
+file.explore.twoRavens=TwoRavens
file.mapData=Map Data
file.mapData.viewMap=WorldMap
file.mapData.unpublished.header=Data Not Published
@@ -1220,7 +1219,7 @@ file.requestAccess.dialog.msg=You need to Sign Up or Log In to request access to this file.
file.accessRequested=Access Requested
-file.ingestInproGress=Ingest in progress...
+file.ingestInProgress=Ingest in progress...
file.dataFilesTab.metadata.header=Metadata
file.dataFilesTab.metadata.addBtn=Add + Edit Metadata
@@ -1392,19 +1391,25 @@ dataset.widgets.advanced.success.message=Successfully updated your Personal Webs
dataset.widgets.advanced.failure.message=The dataverse Personal Website URL has not been updated.
# file.xhtml
+file.share.fileShare=Share File
+file.share.fileShare.tip=Share this file on your favorite social media networks.
+file.share.fileShare.shareText=View this file.
file.title.label=Title
file.citation.label=Citation
+file.cite.downloadBtn=Cite Data File
file.general.metadata.label=General Metadata
file.description.label=Description
file.tags.label=Tags
+file.lastupdated.label=Last Updated
file.metadataTab.fileMetadata.header=File Metadata
file.metadataTab.fileMetadata.persistentid.label=Data File Persistent ID
-file.metadataTab.fileMetadata.md5.label=MD5
file.metadataTab.fileMetadata.unf.label=UNF
file.metadataTab.fileMetadata.size.label=Size
file.metadataTab.fileMetadata.type.label=Type
file.metadataTab.fileMetadata.description.label=Description
+file.metadataTab.fileMetadata.publicationDate.label=Publication Date
+file.metadataTab.fileMetadata.depositDate.label=Deposit Date
file.metadataTab.fitsMetadata.header=FITS Metadata
file.metadataTab.provenance.header=File Provenance
file.metadataTab.provenance.body=File Provenance information coming in a later release...
diff --git a/src/main/java/Bundle_zh_CN.properties b/src/main/java/Bundle_zh_CN.properties
index d2a8589d2e7..640f504a274 100644
--- a/src/main/java/Bundle_zh_CN.properties
+++ b/src/main/java/Bundle_zh_CN.properties
@@ -361,7 +361,7 @@ file.downloadBtn.format.original=\u539f\u59cb\u6587\u4ef6\u683c\u5f0f\uff08{0}\u
file.downloadBtn.format.rdata=RDATA\u683c\u5f0f
file.downloadBtn.format.var=\u53ef\u53d8\u5143
file.downloadBtn.format.citation=\u6570\u636e\u6587\u4ef6\u5f15\u7528
-file.ingestInproGress=\u6444\u53d6\u4e2d...
+file.ingestInProgress=\u6444\u53d6\u4e2d...
file.dataFilesTab.metadata.header=\u5143\u6570\u636e
file.dataFilesTab.metadata.addBtn=\u6dfb\u52a0+\u7f16\u8f91\u5143\u6570\u636e
file.dataFilesTab.terms.header=\u8bb8\u53ef+\u6761\u6b3e
diff --git a/src/main/java/edu/harvard/iq/dataverse/DataCitation.java b/src/main/java/edu/harvard/iq/dataverse/DataCitation.java
index 3e6bba7e434..c969d4c29a1 100644
--- a/src/main/java/edu/harvard/iq/dataverse/DataCitation.java
+++ b/src/main/java/edu/harvard/iq/dataverse/DataCitation.java
@@ -172,7 +172,7 @@ public String toString(boolean html) {
// append UNF
if (!StringUtils.isEmpty(UNF)) {
- citation.append(" [").append(UNF).append("]");
+ citation.append(", ").append(UNF);
}
for (DatasetField dsf : optionalValues) {
diff --git a/src/main/java/edu/harvard/iq/dataverse/DataFile.java b/src/main/java/edu/harvard/iq/dataverse/DataFile.java
index 057faf4211e..50b2a1ce869 100644
--- a/src/main/java/edu/harvard/iq/dataverse/DataFile.java
+++ b/src/main/java/edu/harvard/iq/dataverse/DataFile.java
@@ -7,6 +7,7 @@
import edu.harvard.iq.dataverse.dataaccess.DataFileIO;
import edu.harvard.iq.dataverse.ingest.IngestReport;
import edu.harvard.iq.dataverse.ingest.IngestRequest;
+import edu.harvard.iq.dataverse.util.BundleUtil;
import edu.harvard.iq.dataverse.util.FileUtil;
import edu.harvard.iq.dataverse.util.ShapefileHandler;
import java.io.IOException;
@@ -16,12 +17,15 @@
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.Files;
-import java.util.Comparator;
+import java.text.SimpleDateFormat;
+import java.util.Arrays;
import javax.persistence.Entity;
import javax.persistence.OneToMany;
import javax.persistence.OneToOne;
import javax.persistence.CascadeType;
import javax.persistence.Column;
+import javax.persistence.EnumType;
+import javax.persistence.Enumerated;
import javax.persistence.Index;
import javax.persistence.JoinColumn;
import javax.persistence.JoinTable;
@@ -42,7 +46,7 @@
})
@Entity
@Table(indexes = {@Index(columnList="ingeststatus")
- , @Index(columnList="md5")
+ , @Index(columnList="checksumvalue")
, @Index(columnList="contenttype")
, @Index(columnList="restricted")})
public class DataFile extends DvObject implements Comparable {
@@ -62,9 +66,56 @@ public class DataFile extends DvObject implements Comparable {
@Column( nullable = false )
private String fileSystemName;
-
- @Column( nullable = false )
- private String md5;
+
+ /**
+ * End users will see "SHA-1" (with a hyphen) rather than "SHA1" in the GUI
+ * and API but in the "datafile" table we persist "SHA1" (no hyphen) for
+ * type safety (using keys of the enum). In the "setting" table, we persist
+ * "SHA-1" (with a hyphen) to match the GUI and the "Algorithm Name" list at
+ * https://docs.oracle.com/javase/8/docs/technotes/guides/security/StandardNames.html#MessageDigest
+ *
+ * The list of types should be limited to the list above in the technote
+ * because the string gets passed into MessageDigest.getInstance() and you
+ * can't just pass in any old string.
+ */
+ public enum ChecksumType {
+
+ MD5("MD5"),
+ SHA1("SHA-1");
+
+ private final String text;
+
+ private ChecksumType(final String text) {
+ this.text = text;
+ }
+
+ public static ChecksumType fromString(String text) {
+ if (text != null) {
+ for (ChecksumType checksumType : ChecksumType.values()) {
+ if (text.equals(checksumType.text)) {
+ return checksumType;
+ }
+ }
+ }
+ throw new IllegalArgumentException("ChecksumType must be one of these values: " + Arrays.asList(ChecksumType.values()) + ".");
+ }
+
+ @Override
+ public String toString() {
+ return text;
+ }
+ }
+
+ @Column(nullable = false)
+ @Enumerated(EnumType.STRING)
+ private ChecksumType checksumType;
+
+ /**
+ * Examples include "f622da34d54bdc8ee541d6916ac1c16f" as an MD5 value or
+ * "3a484dfdb1b429c2e15eb2a735f1f5e4d5b04ec6" as a SHA-1 value.
+ */
+ @Column(nullable = false)
+ private String checksumValue;
@Column(nullable=true)
private Long filesize; // Number of bytes in file. Allows 0 and null, negative numbers not permitted
@@ -370,15 +421,26 @@ public void setRestricted(boolean restricted) {
this.restricted = restricted;
}
+ public ChecksumType getChecksumType() {
+ return checksumType;
+ }
- public String getmd5() {
- return this.md5;
+ public void setChecksumType(ChecksumType checksumType) {
+ this.checksumType = checksumType;
}
-
- public void setmd5(String md5) {
- this.md5 = md5;
+
+ public String getChecksumValue() {
+ return this.checksumValue;
}
-
+
+ public void setChecksumValue(String checksumValue) {
+ this.checksumValue = checksumValue;
+ }
+
+ public String getOriginalChecksumType() {
+ return BundleUtil.getStringFromBundle("file.originalChecksumType", Arrays.asList(this.checksumType.toString()) );
+ }
+
public DataFileIO getAccessObject() throws IOException {
DataFileIO dataAccess = DataAccess.createDataAccessObject(this);
@@ -632,4 +694,19 @@ public boolean hasGeospatialTag(){
}
return false;
}
-}
\ No newline at end of file
+
+ public String getPublicationDateFormattedYYYYMMDD() {
+ if (getPublicationDate() != null){
+ return new SimpleDateFormat("yyyy-MM-dd").format(getPublicationDate());
+ }
+ return null;
+ }
+
+ public String getCreateDateFormattedYYYYMMDD() {
+ if (getCreateDate() != null){
+ return new SimpleDateFormat("yyyy-MM-dd").format(getCreateDate());
+ }
+ return null;
+ }
+
+}
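The `ChecksumType` enum above deliberately keeps two spellings: the enum key ("SHA1", persisted in the `datafile` table) and the display label ("SHA-1", shown in the GUI and stored in the `setting` table). A small Python sketch of that mapping, mirroring `fromString` (this is illustrative, not Dataverse code):

```python
# Enum key (persisted in the datafile table) -> display label (GUI/setting
# table), mirroring the Java ChecksumType enum.
CHECKSUM_TYPES = {"MD5": "MD5", "SHA1": "SHA-1"}

def from_string(text):
    # Like ChecksumType.fromString: display label ("SHA-1") -> key ("SHA1").
    for key, label in CHECKSUM_TYPES.items():
        if text == label:
            return key
    raise ValueError(
        "ChecksumType must be one of these values: "
        + str(list(CHECKSUM_TYPES.values())))
```

`DataFile.ChecksumType.valueOf("SHA1")` in the service bean goes the other direction, converting the database string back to the enum; both directions must round-trip for the SQL upgrade script in this changeset to work.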
diff --git a/src/main/java/edu/harvard/iq/dataverse/DataFileServiceBean.java b/src/main/java/edu/harvard/iq/dataverse/DataFileServiceBean.java
index c2ecce1543e..e393848f68f 100644
--- a/src/main/java/edu/harvard/iq/dataverse/DataFileServiceBean.java
+++ b/src/main/java/edu/harvard/iq/dataverse/DataFileServiceBean.java
@@ -255,7 +255,7 @@ public DataFile findCheapAndEasy(Long id) {
Object[] result = null;
try {
- result = (Object[]) em.createNativeQuery("SELECT t0.ID, t0.CREATEDATE, t0.INDEXTIME, t0.MODIFICATIONTIME, t0.PERMISSIONINDEXTIME, t0.PERMISSIONMODIFICATIONTIME, t0.PUBLICATIONDATE, t0.CREATOR_ID, t0.RELEASEUSER_ID, t0.PREVIEWIMAGEAVAILABLE, t1.CONTENTTYPE, t1.FILESYSTEMNAME, t1.FILESIZE, t1.INGESTSTATUS, t1.MD5, t1.RESTRICTED, t3.ID, t3.AUTHORITY, t3.IDENTIFIER FROM DVOBJECT t0, DATAFILE t1, DVOBJECT t2, DATASET t3 WHERE ((t0.ID = " + id + ") AND (t0.OWNER_ID = t2.ID) AND (t2.ID = t3.ID) AND (t1.ID = t0.ID))").getSingleResult();
+ result = (Object[]) em.createNativeQuery("SELECT t0.ID, t0.CREATEDATE, t0.INDEXTIME, t0.MODIFICATIONTIME, t0.PERMISSIONINDEXTIME, t0.PERMISSIONMODIFICATIONTIME, t0.PUBLICATIONDATE, t0.CREATOR_ID, t0.RELEASEUSER_ID, t0.PREVIEWIMAGEAVAILABLE, t1.CONTENTTYPE, t1.FILESYSTEMNAME, t1.FILESIZE, t1.INGESTSTATUS, t1.CHECKSUMVALUE, t1.RESTRICTED, t3.ID, t3.AUTHORITY, t3.IDENTIFIER, t1.CHECKSUMTYPE FROM DVOBJECT t0, DATAFILE t1, DVOBJECT t2, DATASET t3 WHERE ((t0.ID = " + id + ") AND (t0.OWNER_ID = t2.ID) AND (t2.ID = t3.ID) AND (t1.ID = t0.ID))").getSingleResult();
} catch (Exception ex) {
return null;
}
@@ -267,6 +267,7 @@ public DataFile findCheapAndEasy(Long id) {
Integer file_id = (Integer) result[0];
dataFile = new DataFile();
+ dataFile.setMergeable(false);
dataFile.setId(file_id.longValue());
@@ -346,14 +347,14 @@ public DataFile findCheapAndEasy(Long id) {
String md5 = (String) result[14];
if (md5 != null) {
- dataFile.setmd5(md5);
+ dataFile.setChecksumValue(md5);
}
Boolean restricted = (Boolean) result[15];
if (restricted != null) {
dataFile.setRestricted(restricted);
}
-
+
Dataset owner = new Dataset();
@@ -362,6 +363,17 @@ public DataFile findCheapAndEasy(Long id) {
owner.setId((Long)result[16]);
owner.setAuthority((String)result[17]);
owner.setIdentifier((String)result[18]);
+
+ String checksumType = (String) result[19];
+ if (checksumType != null) {
+ try {
+ // In the database we store "SHA1" rather than "SHA-1".
+ DataFile.ChecksumType typeFromStringInDatabase = DataFile.ChecksumType.valueOf(checksumType);
+ dataFile.setChecksumType(typeFromStringInDatabase);
+ } catch (IllegalArgumentException ex) {
+ logger.info("Exception trying to convert " + checksumType + " to enum: " + ex);
+ }
+ }
dataFile.setOwner(owner);
@@ -424,7 +436,7 @@ public void findFileMetadataOptimizedExperimental(Dataset owner, DatasetVersion
int i = 0;
- List dataTableResults = em.createNativeQuery("SELECT t0.ID, t0.DATAFILE_ID, t0.UNF, t0.CASEQUANTITY, t0.VARQUANTITY, t0.ORIGINALFILEFORMAT FROM dataTable t0, dataFile t1, dvObject t2 WHERE ((t0.DATAFILE_ID = t1.ID) AND (t1.ID = t2.ID) AND (t2.OWNER_ID = " + owner.getId() + "))").getResultList();
+ List dataTableResults = em.createNativeQuery("SELECT t0.ID, t0.DATAFILE_ID, t0.UNF, t0.CASEQUANTITY, t0.VARQUANTITY, t0.ORIGINALFILEFORMAT FROM dataTable t0, dataFile t1, dvObject t2 WHERE ((t0.DATAFILE_ID = t1.ID) AND (t1.ID = t2.ID) AND (t2.OWNER_ID = " + owner.getId() + ")) ORDER BY t0.ID").getResultList();
for (Object[] result : dataTableResults) {
DataTable dataTable = new DataTable();
@@ -465,12 +477,13 @@ public void findFileMetadataOptimizedExperimental(Dataset owner, DatasetVersion
i = 0;
- List fileResults = em.createNativeQuery("SELECT t0.ID, t0.CREATEDATE, t0.INDEXTIME, t0.MODIFICATIONTIME, t0.PERMISSIONINDEXTIME, t0.PERMISSIONMODIFICATIONTIME, t0.PUBLICATIONDATE, t0.CREATOR_ID, t0.RELEASEUSER_ID, t1.CONTENTTYPE, t1.FILESYSTEMNAME, t1.FILESIZE, t1.INGESTSTATUS, t1.MD5, t1.RESTRICTED FROM DVOBJECT t0, DATAFILE t1 WHERE ((t0.OWNER_ID = " + owner.getId() + ") AND ((t1.ID = t0.ID) AND (t0.DTYPE = 'DataFile')))").getResultList();
+ List fileResults = em.createNativeQuery("SELECT t0.ID, t0.CREATEDATE, t0.INDEXTIME, t0.MODIFICATIONTIME, t0.PERMISSIONINDEXTIME, t0.PERMISSIONMODIFICATIONTIME, t0.PUBLICATIONDATE, t0.CREATOR_ID, t0.RELEASEUSER_ID, t1.CONTENTTYPE, t1.FILESYSTEMNAME, t1.FILESIZE, t1.INGESTSTATUS, t1.CHECKSUMVALUE, t1.RESTRICTED, t1.CHECKSUMTYPE FROM DVOBJECT t0, DATAFILE t1 WHERE ((t0.OWNER_ID = " + owner.getId() + ") AND ((t1.ID = t0.ID) AND (t0.DTYPE = 'DataFile'))) ORDER BY t0.ID").getResultList();
for (Object[] result : fileResults) {
Integer file_id = (Integer) result[0];
DataFile dataFile = new DataFile();
+ dataFile.setMergeable(false);
dataFile.setId(file_id.longValue());
@@ -544,13 +557,24 @@ public void findFileMetadataOptimizedExperimental(Dataset owner, DatasetVersion
String md5 = (String) result[13];
if (md5 != null) {
- dataFile.setmd5(md5);
+ dataFile.setChecksumValue(md5);
}
Boolean restricted = (Boolean) result[14];
if (restricted != null) {
dataFile.setRestricted(restricted);
}
+
+ String checksumType = (String) result[15];
+ if (checksumType != null) {
+ try {
+ // In the database we store "SHA1" rather than "SHA-1".
+ DataFile.ChecksumType typeFromStringInDatabase = DataFile.ChecksumType.valueOf(checksumType);
+ dataFile.setChecksumType(typeFromStringInDatabase);
+ } catch (IllegalArgumentException ex) {
+ logger.info("Exception trying to convert " + checksumType + " to enum: " + ex);
+ }
+ }
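Review note: the `valueOf` round-trip added above relies on the database storing the exact enum constant name ("SHA1"), not the display label ("SHA-1"), which is why the call is wrapped in a try/catch. A minimal standalone sketch of that contract (the enum here is a hypothetical mirror of `DataFile.ChecksumType`, not the real class):

```java
// Hypothetical mirror of DataFile.ChecksumType: the constant name ("SHA1")
// is what gets persisted, while toString() carries the display label ("SHA-1").
public class ChecksumTypeDemo {
    enum ChecksumType {
        MD5("MD5"), SHA1("SHA-1");
        private final String label;
        ChecksumType(String label) { this.label = label; }
        @Override public String toString() { return label; }
    }

    public static void main(String[] args) {
        // Round-trip through the persisted constant name succeeds...
        ChecksumType fromDb = ChecksumType.valueOf("SHA1");
        if (fromDb != ChecksumType.SHA1) throw new AssertionError();
        // ...but the display label would throw, which is why the patch
        // wraps valueOf in a try/catch for IllegalArgumentException.
        try {
            ChecksumType.valueOf("SHA-1");
            throw new AssertionError("expected IllegalArgumentException");
        } catch (IllegalArgumentException expected) {
            // logged and skipped in the patch above
        }
    }
}
```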
// TODO:
// - if ingest status is "bad", look up the ingest report;
@@ -577,7 +601,6 @@ public void findFileMetadataOptimizedExperimental(Dataset owner, DatasetVersion
filesMap.put(dataFile.getId(), i++);
}
- owner.setFiles(dataFiles);
fileResults = null;
logger.fine("Retrieved and cached "+i+" datafiles.");
@@ -591,13 +614,14 @@ public void findFileMetadataOptimizedExperimental(Dataset owner, DatasetVersion
logger.fine("Retrieved "+i+" file categories attached to the dataset.");
if (requestedVersion != null) {
- requestedVersion.setFileMetadatas(retrieveFileMetadataForVersion(owner, requestedVersion, filesMap, categoryMap));
+ requestedVersion.setFileMetadatas(retrieveFileMetadataForVersion(owner, requestedVersion, dataFiles, filesMap, categoryMap));
} else {
for (DatasetVersion version : owner.getVersions()) {
- version.setFileMetadatas(retrieveFileMetadataForVersion(owner, version, filesMap, categoryMap));
+ version.setFileMetadatas(retrieveFileMetadataForVersion(owner, version, dataFiles, filesMap, categoryMap));
logger.fine("Retrieved "+version.getFileMetadatas().size()+" filemetadatas for the version "+version.getId());
}
}
+ owner.setFiles(dataFiles);
}
private List retrieveFileAccessRequesters(DataFile fileIn){
@@ -616,7 +640,7 @@ private List retrieveFileAccessRequesters(DataFile fileIn){
return retList;
}
- private List retrieveFileMetadataForVersion(Dataset dataset, DatasetVersion version, Map filesMap, Map categoryMap) {
+ private List retrieveFileMetadataForVersion(Dataset dataset, DatasetVersion version, List dataFiles, Map filesMap, Map categoryMap) {
List retList = new ArrayList<>();
Map> categoryMetaMap = new HashMap<>();
@@ -634,7 +658,7 @@ private List retrieveFileMetadataForVersion(Dataset dataset, Datas
logger.fine("Retrieved and mapped "+i+" file categories attached to files in the version "+version.getId());
categoryResults = null;
- List metadataResults = em.createNativeQuery("select id, datafile_id, DESCRIPTION, LABEL, RESTRICTED from FileMetadata where datasetversion_id = "+version.getId()).getResultList();
+ List metadataResults = em.createNativeQuery("select id, datafile_id, DESCRIPTION, LABEL, RESTRICTED, DIRECTORYLABEL from FileMetadata where datasetversion_id = "+version.getId() + " ORDER BY LABEL").getResultList();
for (Object[] result : metadataResults) {
Integer filemeta_id = (Integer) result[0];
@@ -666,7 +690,8 @@ private List retrieveFileMetadataForVersion(Dataset dataset, Datas
fileMetadata.setDatasetVersion(version);
- fileMetadata.setDataFile(dataset.getFiles().get(file_list_id));
+ //fileMetadata.setDataFile(dataset.getFiles().get(file_list_id));
+ fileMetadata.setDataFile(dataFiles.get(file_list_id));
String description = (String) result[2];
@@ -685,6 +710,11 @@ private List retrieveFileMetadataForVersion(Dataset dataset, Datas
fileMetadata.setRestricted(restricted);
}
+ String dirLabel = (String) result[5];
+ if (dirLabel != null){
+ fileMetadata.setDirectoryLabel(dirLabel);
+ }
+
retList.add(fileMetadata);
}
@@ -693,7 +723,14 @@ private List retrieveFileMetadataForVersion(Dataset dataset, Datas
logger.fine("Retrieved "+retList.size()+" file metadatas for version "+version.getId()+" (inside the retrieveFileMetadataForVersion method).");
- Collections.sort(retList, FileMetadata.compareByLabel);
+ /*
+ We no longer perform this sort here, just to keep this filemetadata
+ list as identical as possible to when it's produced by the "traditional"
+ EJB method. When it's necessary to have the filemetadatas sorted by
+ FileMetadata.compareByLabel, the DatasetVersion.getFileMetadatasSorted()
+ method should be called.
+
+ Collections.sort(retList, FileMetadata.compareByLabel); */
return retList;
}
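Review note: per the comment block above, the sort by `FileMetadata.compareByLabel` is deferred to `DatasetVersion.getFileMetadatasSorted()`, leaving the cached list in query order. A hedged sketch of that deferred-sort pattern, with plain label strings standing in for `FileMetadata` objects:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Comparator;
import java.util.List;

// Hypothetical sketch: the sorted accessor sorts a copy on demand,
// so the cached list keeps its original (query) order.
public class DeferredSortDemo {
    static List<String> sortedByLabel(List<String> labels) {
        List<String> copy = new ArrayList<>(labels);
        copy.sort(Comparator.naturalOrder()); // stand-in for FileMetadata.compareByLabel
        return copy;
    }

    public static void main(String[] args) {
        List<String> queryOrder = Arrays.asList("b.csv", "a.csv");
        if (!sortedByLabel(queryOrder).equals(Arrays.asList("a.csv", "b.csv")))
            throw new AssertionError();
        // the cached list itself is left untouched
        if (!queryOrder.equals(Arrays.asList("b.csv", "a.csv")))
            throw new AssertionError();
    }
}
```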
@@ -721,9 +758,12 @@ public List findAll() {
}
public DataFile save(DataFile dataFile) {
-
- DataFile savedDataFile = em.merge(dataFile);
- return savedDataFile;
+ if (dataFile.isMergeable()) {
+ DataFile savedDataFile = em.merge(dataFile);
+ return savedDataFile;
+ } else {
+ throw new IllegalArgumentException("This DataFile object has been set to NOT MERGEABLE; please ensure a MERGEABLE object is passed to the save method.");
+ }
}
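Review note: the `isMergeable()` guard above makes `save()` fail fast, presumably so that `DataFile` objects assembled by the optimized native-query path (marked with `setMergeable(false)`) are never passed to `em.merge()`. A self-contained sketch of that fail-fast pattern, with hypothetical minimal types in place of the real entity and `EntityManager`:

```java
// Sketch of the fail-fast "mergeable" guard from the patch, using
// simplified stand-in types (no JPA EntityManager involved).
public class MergeableGuardDemo {
    static class DataFile {
        private boolean mergeable = true;
        void setMergeable(boolean m) { mergeable = m; }
        boolean isMergeable() { return mergeable; }
    }

    static DataFile save(DataFile df) {
        if (!df.isMergeable()) {
            throw new IllegalArgumentException(
                "This DataFile object has been set to NOT MERGEABLE");
        }
        return df; // stand-in for em.merge(df)
    }

    public static void main(String[] args) {
        DataFile managed = new DataFile();
        if (save(managed) != managed) throw new AssertionError();

        DataFile detached = new DataFile();
        detached.setMergeable(false); // as done in the optimized lookup path
        try {
            save(detached);
            throw new AssertionError("expected IllegalArgumentException");
        } catch (IllegalArgumentException expected) { }
    }
}
```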
public Boolean isPreviouslyPublished(Long fileId){
@@ -759,7 +799,7 @@ public List findHarvestedFilesByClient(HarvestingClient harvestingClie
return query.getResultList();
}
- /**/
+ /* moving to FileUtil */
public void generateStorageIdentifier(DataFile dataFile) {
dataFile.setStorageIdentifier(generateStorageIdentifier());
diff --git a/src/main/java/edu/harvard/iq/dataverse/Dataset.java b/src/main/java/edu/harvard/iq/dataverse/Dataset.java
index 39a953d0de7..5b0dc443942 100644
--- a/src/main/java/edu/harvard/iq/dataverse/Dataset.java
+++ b/src/main/java/edu/harvard/iq/dataverse/Dataset.java
@@ -10,6 +10,7 @@
import java.util.Date;
import java.util.List;
import java.util.Objects;
+import java.util.logging.Logger;
import javax.persistence.CascadeType;
import javax.persistence.Column;
import javax.persistence.Entity;
@@ -41,12 +42,14 @@
@Index(columnList = "thumbnailfile_id")},
uniqueConstraints = @UniqueConstraint(columnNames = {"authority,protocol,identifier,doiseparator"}))
public class Dataset extends DvObjectContainer {
+ private static final Logger logger = Logger.getLogger(Dataset.class.getCanonicalName());
// public static final String REDIRECT_URL = "/dataset.xhtml?persistentId=";
public static final String TARGET_URL = "/citation?persistentId=";
private static final long serialVersionUID = 1L;
@OneToMany(mappedBy = "owner", cascade = CascadeType.MERGE)
+ @OrderBy("id")
private List files = new ArrayList();
private String protocol;
@@ -185,10 +188,12 @@ public String getGlobalId() {
}
public List getFiles() {
+ //logger.info("getFiles() on dataset "+this.getId());
return files;
}
public void setFiles(List files) {
+ logger.info("setFiles() on dataset "+this.getId());
this.files = files;
}
@@ -214,6 +219,10 @@ public boolean isDeaccessioned() {
if (testDsv.isReleased()) {
return false;
}
+ //Also check for draft version
+ if (testDsv.isDraft()) {
+ return false;
+ }
if (testDsv.isDeaccessioned()) {
hasDeaccessionedVersions = true;
}
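Review note: with the draft check added above, a dataset counts as deaccessioned only when no version is released or draft and at least one is deaccessioned. A hedged standalone sketch of that scan, with a simplified state enum standing in for `DatasetVersion`:

```java
import java.util.Arrays;
import java.util.List;

// Sketch of the isDeaccessioned() scan including the new draft check.
// The enum and list are simplified stand-ins for DatasetVersion objects.
public class DeaccessionCheckDemo {
    enum State { RELEASED, DRAFT, DEACCESSIONED }

    static boolean isDeaccessioned(List<State> versions) {
        boolean hasDeaccessionedVersions = false;
        for (State s : versions) {
            if (s == State.RELEASED) return false;
            if (s == State.DRAFT) return false; // the check added in this patch
            if (s == State.DEACCESSIONED) hasDeaccessionedVersions = true;
        }
        return hasDeaccessionedVersions;
    }

    public static void main(String[] args) {
        if (isDeaccessioned(Arrays.asList(State.DEACCESSIONED, State.DRAFT)))
            throw new AssertionError("a draft version should block the deaccessioned state");
        if (!isDeaccessioned(Arrays.asList(State.DEACCESSIONED)))
            throw new AssertionError();
    }
}
```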
@@ -282,6 +291,7 @@ private DatasetVersion createNewDatasetVersion(Template template) {
newFm.setCategories(fm.getCategories());
newFm.setDescription(fm.getDescription());
newFm.setLabel(fm.getLabel());
+ newFm.setDirectoryLabel(fm.getDirectoryLabel());
newFm.setRestricted(fm.isRestricted());
newFm.setDataFile(fm.getDataFile());
newFm.setDatasetVersion(dsv);
diff --git a/src/main/java/edu/harvard/iq/dataverse/DatasetPage.java b/src/main/java/edu/harvard/iq/dataverse/DatasetPage.java
index 55811ac8b1a..a232052b6ee 100644
--- a/src/main/java/edu/harvard/iq/dataverse/DatasetPage.java
+++ b/src/main/java/edu/harvard/iq/dataverse/DatasetPage.java
@@ -3,16 +3,12 @@
import edu.harvard.iq.dataverse.authorization.AuthenticationServiceBean;
import edu.harvard.iq.dataverse.authorization.Permission;
import edu.harvard.iq.dataverse.authorization.providers.builtin.BuiltinUserServiceBean;
-import edu.harvard.iq.dataverse.authorization.users.ApiToken;
-import edu.harvard.iq.dataverse.authorization.users.User;
import edu.harvard.iq.dataverse.authorization.users.AuthenticatedUser;
import edu.harvard.iq.dataverse.authorization.users.PrivateUrlUser;
-import edu.harvard.iq.dataverse.authorization.users.GuestUser;
import edu.harvard.iq.dataverse.datavariable.VariableServiceBean;
import edu.harvard.iq.dataverse.engine.command.Command;
import edu.harvard.iq.dataverse.engine.command.exception.CommandException;
import edu.harvard.iq.dataverse.engine.command.impl.CreateDatasetCommand;
-import edu.harvard.iq.dataverse.engine.command.impl.CreateGuestbookResponseCommand;
import edu.harvard.iq.dataverse.engine.command.impl.CreatePrivateUrlCommand;
import edu.harvard.iq.dataverse.engine.command.impl.DeaccessionDatasetVersionCommand;
import edu.harvard.iq.dataverse.engine.command.impl.DeleteDatasetVersionCommand;
@@ -37,6 +33,7 @@
import edu.harvard.iq.dataverse.settings.SettingsServiceBean;
import edu.harvard.iq.dataverse.util.BundleUtil;
import edu.harvard.iq.dataverse.util.FileSortFieldAndOrder;
+import edu.harvard.iq.dataverse.util.FileUtil;
import edu.harvard.iq.dataverse.util.JsfHelper;
import static edu.harvard.iq.dataverse.util.JsfHelper.JH;
import edu.harvard.iq.dataverse.util.StringUtil;
@@ -68,18 +65,15 @@
import javax.inject.Named;
import org.primefaces.event.FileUploadEvent;
import org.primefaces.model.UploadedFile;
-import javax.servlet.ServletOutputStream;
-import javax.servlet.http.HttpServletResponse;
import javax.validation.ConstraintViolation;
import org.apache.commons.httpclient.HttpClient;
import org.primefaces.context.RequestContext;
-import java.text.DateFormat;
import java.util.Arrays;
import java.util.HashSet;
import javax.faces.model.SelectItem;
import java.util.logging.Level;
-import javax.faces.component.UIComponent;
-import javax.faces.component.UIInput;
+import edu.harvard.iq.dataverse.datasetutility.TwoRavensHelper;
+import edu.harvard.iq.dataverse.datasetutility.WorldMapPermissionHelper;
import javax.faces.event.AjaxBehaviorEvent;
@@ -148,6 +142,8 @@ public enum DisplayMode {
@EJB
GuestbookResponseServiceBean guestbookResponseService;
@EJB
+ FileDownloadServiceBean fileDownloadService;
+ @EJB
DataverseLinkingServiceBean dvLinkingService;
@EJB
DatasetLinkingServiceBean dsLinkingService;
@@ -162,8 +158,11 @@ public enum DisplayMode {
@Inject
DatasetVersionUI datasetVersionUI;
@Inject PermissionsWrapper permissionsWrapper;
+ @Inject FileDownloadHelper fileDownloadHelper;
+ @Inject TwoRavensHelper twoRavensHelper;
+ @Inject WorldMapPermissionHelper worldMapPermissionHelper;
+
- private final DateFormat displayDateFormat = DateFormat.getDateInstance(DateFormat.MEDIUM);
private Dataset dataset = new Dataset();
private EditMode editMode;
@@ -186,6 +185,7 @@ public enum DisplayMode {
private List dataverseTemplates = new ArrayList();
private Template defaultTemplate;
private Template selectedTemplate;
+ private String globalId;
private String persistentId;
private String version;
private String protocol = "";
@@ -210,10 +210,8 @@ public enum DisplayMode {
// Used to store results of permissions checks
private final Map datasetPermissionMap = new HashMap<>(); // { Permission human_name : Boolean }
- private final Map fileDownloadPermissionMap = new HashMap<>(); // { FileMetadata.id : Boolean }
+
- private final Map fileMetadataTwoRavensExploreMap = new HashMap<>(); // { FileMetadata.id : Boolean }
- private final Map fileMetadataWorldMapExplore = new HashMap<>(); // { FileMetadata.id : Boolean }
private DataFile selectedDownloadFile;
@@ -266,7 +264,7 @@ public void setFileMetadatasSearch(List fileMetadatasSearch) {
}
public void updateFileSearch(){
- logger.info("updading file search list");
+ logger.info("updating file search list");
if (readOnly) {
this.fileMetadatasSearch = selectFileMetadatasForDisplay(this.fileLabelSearchTerm);
} else {
@@ -301,7 +299,7 @@ private List selectFileMetadatasForDisplay(String searchTerm) {
List retList = new ArrayList<>();
- for (FileMetadata fileMetadata : workingVersion.getFileMetadatas()) {
+ for (FileMetadata fileMetadata : workingVersion.getFileMetadatasSorted()) {
if (searchResultsIdSet == null || searchResultsIdSet.contains(fileMetadata.getId())) {
retList.add(fileMetadata);
}
@@ -320,7 +318,11 @@ public Long getMaxFileUploadSizeInBytes(){
}
public boolean isUnlimitedUploadFileSize(){
- return (this.maxFileUploadSizeInBytes == null);
+
+ if (this.maxFileUploadSizeInBytes == null){
+ return true;
+ }
+ return false;
}
public boolean isMetadataExportEnabled() {
@@ -445,88 +447,15 @@ public boolean isNoDVsRemaining() {
return noDVsRemaining;
}
- /**
- * Convenience method for "Download File" button display logic
- *
- * Used by the dataset.xhtml render logic when listing files
- * Assumes user already has view access to the file list.
- *
- * @param fileMetadata
- * @return boolean
- */
- public boolean canDownloadFile(FileMetadata fileMetadata){
-
- if (fileMetadata == null){
- return false;
- }
-
- if ((fileMetadata.getId() == null) || (fileMetadata.getDataFile().getId() == null)){
- return false;
- }
-
- // --------------------------------------------------------------------
- // Grab the fileMetadata.id and restriction flag
- // --------------------------------------------------------------------
- Long fid = fileMetadata.getId();
- //logger.info("calling candownloadfile on filemetadata "+fid);
- boolean isRestrictedFile = fileMetadata.isRestricted();
-
-
- // --------------------------------------------------------------------
- // Has this file been checked? Look at the DatasetPage hash
- // --------------------------------------------------------------------
- if (this.fileDownloadPermissionMap.containsKey(fid)){
- // Yes, return previous answer
- //logger.info("using cached result for candownloadfile on filemetadata "+fid);
- return this.fileDownloadPermissionMap.get(fid);
- }
- // --------------------------------------------------------------------
- // (1) Is the file Unrestricted ?
- // --------------------------------------------------------------------
- if (!isRestrictedFile){
- // Yes, save answer and return true
- this.fileDownloadPermissionMap.put(fid, true);
- return true;
- }
-
- // --------------------------------------------------------------------
- // Conditions (2) through (3) are for Restricted files
- // --------------------------------------------------------------------
-
- // --------------------------------------------------------------------
- // (2) Does the User have DownloadFile Permission at the **Dataset** level
- // Michael: Leaving this in for now, but shouldn't this be alredy resolved
- // by the premission system, given that files are never permission roots?
- // --------------------------------------------------------------------
- if (this.doesSessionUserHaveDataSetPermission(Permission.DownloadFile)){
- // Yes, save answer and return true
- this.fileDownloadPermissionMap.put(fid, true);
- return true;
- }
-
- // --------------------------------------------------------------------
- // (3) Does the user has DownloadFile permission on the DataFile
- // --------------------------------------------------------------------
- if (this.permissionService.on(fileMetadata.getDataFile()).has(Permission.DownloadFile)){
- this.fileDownloadPermissionMap.put(fid, true);
- return true;
- }
-
- // --------------------------------------------------------------------
- // (4) No download for you! Come back with permissions!
- // --------------------------------------------------------------------
- this.fileDownloadPermissionMap.put(fid, false);
-
- return false;
- }
public boolean isThumbnailAvailable(FileMetadata fileMetadata) {
// new and optimized logic:
// - check download permission here (should be cached - so it's free!)
// - only then ask the file service if the thumbnail is available/exists.
// the service itself no longer checks download permissions.
- if (!this.canDownloadFile(fileMetadata)) {
+
+ if (!this.fileDownloadHelper.canDownloadFile(fileMetadata)) {
return false;
}
@@ -537,28 +466,14 @@ public boolean isThumbnailAvailable(FileMetadata fileMetadata) {
public boolean canUpdateDataset() {
return permissionsWrapper.canUpdateDataset(dvRequestService.getDataverseRequest(), this.dataset);
}
-
public boolean canPublishDataverse() {
return permissionsWrapper.canIssuePublishDataverseCommand(dataset.getOwner());
}
-
- //public boolean canIssuePublishDatasetCommand() {
- // replacing permissionsWrapper.canIssuePublishDatasetCommand(DatasetPage.dataset) on the page
- // return true;
- //}
-
- //public boolean canIssueDeleteCommand() {
- // return true;
- //}
-
- //public boolean canManagePermissions() {
- // return true;
- //}
-
+
public boolean canViewUnpublishedDataset() {
return permissionsWrapper.canViewUnpublishedDataset( dvRequestService.getDataverseRequest(), dataset);
}
-
+
/*
* 4.2.1 optimization.
* HOWEVER, this doesn't appear to be saving us anything!
@@ -657,15 +572,7 @@ public void reset() {
dataset.setGuestbook(null);
}
- public void guestbookResponseValidator(FacesContext context, UIComponent toValidate, Object value) {
- String response = (String) value;
- if (response != null && response.length() > 255) {
- ((UIInput) toValidate).setValid(false);
- FacesMessage message = new FacesMessage(FacesMessage.SEVERITY_ERROR, JH.localize("dataset.guestbookResponse.guestbook.responseTooLong"), null);
- context.addMessage(toValidate.getClientId(context), message);
- }
- }
public String getGlobalId() {
return persistentId;
@@ -1025,213 +932,6 @@ private void msg(String s){
// System.out.println(s);
}
- /**
- * See table in: https://github.com/IQSS/dataverse/issues/1618
- *
- * Can the user see a reminder to publish button?
- * (0) The application has to be set to Create Edit Maps - true
- * (1) Logged in user
- * (2) Is geospatial file?
- * (3) File has NOT been released
- * (4) No existing Map
- * (5) Can Edit Dataset
- *
- * @param FileMetadata fm
- * @return boolean
- */
- public boolean canSeeMapButtonReminderToPublish(FileMetadata fm){
- if (fm==null){
-
- return false;
- }
-
- // (0) Is the view GeoconnectViewMaps
- if (!settingsService.isTrueForKey(SettingsServiceBean.Key.GeoconnectCreateEditMaps, false)){
- return false;
- }
-
-
- // (1) Is there an authenticated user?
- //
- if (!(isSessionUserAuthenticated())){
- return false;
- }
-
-
- // Is this file a Shapefile or a Tabular file tagged as Geospatial?
- //
- if (!(this.isPotentiallyMappableFileType(fm))){
- return false;
- }
-
- // (3) Is this DataFile released? Yes, don't need reminder
- //
- if (fm.getDataFile().isReleased()){
- return false;
- }
-
- // (4) Does a map already exist? Yes, don't need reminder
- //
- if (this.hasMapLayerMetadata(fm)){
- return false;
- }
-
- // (5) If so, can the logged in user edit the Dataset to which this FileMetadata belongs?
- if (!this.doesSessionUserHaveDataSetPermission(Permission.EditDataset)){
- return false;
- }
-
- // Looks good
- //
- return true;
- }
-
- /**
- * Should there be a Map Data Button for this file?
- * see table in: https://github.com/IQSS/dataverse/issues/1618
- * (1) Is the user logged in?
- * (2) Is this file a Shapefile or a Tabular file tagged as Geospatial?
- * (3) Does the logged in user have permission to edit the Dataset to which this FileMetadata belongs?
- * (4) Is the create Edit Maps flag set to true?
- * (5) Any of these conditions:
- * 9a) File Published
- * (b) Draft: File Previously published
- * @param fm FileMetadata
- * @return boolean
- */
- public boolean canUserSeeMapDataButton(FileMetadata fm){
-
- if (fm==null){
- return false;
- }
-
-
- // (1) Is there an authenticated user?
- if (!(isSessionUserAuthenticated())){
- return false;
- }
-
- // (2) Is this file a Shapefile or a Tabular file tagged as Geospatial?
- // TO DO: EXPAND FOR TABULAR FILES TAGGED AS GEOSPATIAL!
- //
- if (!(this.isPotentiallyMappableFileType(fm))){
- return false;
- }
-
- // (3) Does the user have Edit Dataset permissions?
- //
- if (!this.doesSessionUserHaveDataSetPermission(Permission.EditDataset)){
- return false;
- }
-
- // (4) Is the view GeoconnectViewMaps
- if (!settingsService.isTrueForKey(SettingsServiceBean.Key.GeoconnectCreateEditMaps, false)){
- return false;
- }
-
- // (5) Is File released?
- //
- if (fm.getDataFile().isReleased()){
- return true;
- }
-
- // Nope
- return false;
- }
-
-
- /**
- * Used in the .xhtml file to check whether a tabular file
- * may be viewed via TwoRavens
- *
- * @param fm
- * @return
- */
- public boolean canSeeTwoRavensExploreButton(FileMetadata fm){
-
- if (fm == null){
- return false;
- }
-
- // Has this already been checked?
- if (this.fileMetadataTwoRavensExploreMap.containsKey(fm.getId())){
- // Yes, return previous answer
- //logger.info("using cached result for candownloadfile on filemetadata "+fid);
- return this.fileMetadataTwoRavensExploreMap.get(fm.getId());
- }
-
-
- // (1) Is TwoRavens active via the "setting" table?
- // Nope: get out
- //
- if (!settingsService.isTrueForKey(SettingsServiceBean.Key.TwoRavensTabularView, false)){
- this.fileMetadataTwoRavensExploreMap.put(fm.getId(), false);
- return false;
- }
-
- // (2) Does the user have download permission?
- // Nope: get out
- //
- if (!(this.canDownloadFile(fm))){
- this.fileMetadataTwoRavensExploreMap.put(fm.getId(), false);
- return false;
- }
- // (3) Is the DataFile object there and persisted?
- // Nope: scat
- //
- if ((fm.getDataFile() == null)||(fm.getDataFile().getId()==null)){
- this.fileMetadataTwoRavensExploreMap.put(fm.getId(), false);
- return false;
- }
-
- // (4) Is there tabular data or is the ingest in progress?
- // Yes: great
- //
- if ((fm.getDataFile().isTabularData())||(fm.getDataFile().isIngestInProgress())){
- this.fileMetadataTwoRavensExploreMap.put(fm.getId(), true);
- return true;
- }
-
- // Nope
- this.fileMetadataTwoRavensExploreMap.put(fm.getId(), false);
- return false;
-
- // (empty fileMetadata.dataFile.id) and (fileMetadata.dataFile.tabularData or fileMetadata.dataFile.ingestInProgress)
- // and DatasetPage.canDownloadFile(fileMetadata)
- }
-
- /**
- * Check if this is a mappable file type.
- *
- * Currently (2/2016)
- * - Shapefile (zipped shapefile)
- * - Tabular file with Geospatial Data tag
- *
- * @param fm
- * @return
- */
- private boolean isPotentiallyMappableFileType(FileMetadata fm){
- if (fm==null){
- return false;
- }
-
- // Yes, it's a shapefile
- //
- if (this.isShapefileType(fm)){
- return true;
- }
-
- // Yes, it's tabular with a geospatial tag
- //
- if (fm.getDataFile().isTabularData()){
- if (fm.getDataFile().hasGeospatialTag()){
- return true;
- }
- }
- return false;
- }
-
-
/**
* For development
*
@@ -1253,61 +953,6 @@ public boolean isGeoconnectDebugAvailable(){
return false;
}
-
-
- /**
- * Should there be a Explore WorldMap Button for this file?
- * See table in: https://github.com/IQSS/dataverse/issues/1618
- *
- * (1) Does the file have MapLayerMetadata?
- * (2) Is there DownloadFile permission for this file?
- *
- * @param fm FileMetadata
- * @return boolean
- */
- public boolean canUserSeeExploreWorldMapButton(FileMetadata fm){
- if (fm==null){
- return false;
- }
-
- if (this.fileMetadataWorldMapExplore.containsKey(fm.getId())){
- // Yes, return previous answer
- //logger.info("using cached result for candownloadfile on filemetadata "+fid);
- return this.fileMetadataWorldMapExplore.get(fm.getId());
- }
-
- /* -----------------------------------------------------
- Does a Map Exist?
- ----------------------------------------------------- */
- if (!(this.hasMapLayerMetadata(fm))){
- // Nope: no button
- this.fileMetadataWorldMapExplore.put(fm.getId(), false);
- return false;
- }
-
- /*
- Is setting for GeoconnectViewMaps true?
- Nope? no button
- */
- if (!settingsService.isTrueForKey(SettingsServiceBean.Key.GeoconnectViewMaps, false)){
- this.fileMetadataWorldMapExplore.put(fm.getId(), false);
- return false;
- }
-
- /* -----------------------------------------------------
- Does user have DownloadFile permission for this file?
- Yes: User can view button!
- ----------------------------------------------------- */
- if (this.canDownloadFile(fm)){
- this.fileMetadataWorldMapExplore.put(fm.getId(), true);
- return true;
- }
-
- // Nope: Can't see button
- //
- this.fileMetadataWorldMapExplore.put(fm.getId(), false);
- return false;
- }
/**
* Create a hashmap consisting of { DataFile.id : MapLayerMetadata object}
@@ -1317,7 +962,6 @@ public boolean canUserSeeExploreWorldMapButton(FileMetadata fm){
*/
private void loadMapLayerMetadataLookup() {
if (this.dataset == null) {
return;
}
if (this.dataset.getId() == null) {
return;
@@ -1471,22 +1115,19 @@ private String init(boolean initFull) {
// init the citation
displayCitation = dataset.getCitation(true, workingVersion);
-
+
if (initFull) {
- // init the files
- //fileMetadatas = populateFileMetadatas();
+ // init the list of FileMetadatas
if (workingVersion.isDraft() && canUpdateDataset()) {
readOnly = false;
- fileMetadatasSearch = workingVersion.getFileMetadatasSorted();
} else {
// an attempt to retrieve both the filemetadatas and datafiles early on, so that
// we don't have to do so later (possibly, many more times than necessary):
-
datafileService.findFileMetadataOptimizedExperimental(dataset);
- fileMetadatasSearch = workingVersion.getFileMetadatas();
}
-
+ fileMetadatasSearch = workingVersion.getFileMetadatasSorted();
+
ownerId = dataset.getOwner().getId();
datasetNextMajorVersion = this.dataset.getNextMajorVersionString();
datasetNextMinorVersion = this.dataset.getNextMinorVersionString();
@@ -1501,6 +1142,8 @@ private String init(boolean initFull) {
// lazyModel = new LazyFileMetadataDataModel(workingVersion.getId(), datafileService );
// populate MapLayerMetadata
this.loadMapLayerMetadataLookup(); // A DataFile may have a related MapLayerMetadata object
+ this.guestbookResponse = guestbookResponseService.initGuestbookResponseForFragment(dataset, null, session);
+
}
} else if (ownerId != null) {
// create mode for a new child dataset
@@ -1565,181 +1208,6 @@ public boolean isReadOnly() {
return readOnly;
}
- public String saveGuestbookResponse(String type) {
- boolean valid = true;
- if (dataset.getGuestbook() != null) {
- if (dataset.getGuestbook().isNameRequired()) {
- if (this.guestbookResponse.getName() == null) {
- valid = false;
- } else {
- valid &= !this.guestbookResponse.getName().isEmpty();
- }
- }
- if (dataset.getGuestbook().isEmailRequired()) {
- if (this.guestbookResponse.getEmail() == null) {
- valid = false;
- } else {
- valid &= !this.guestbookResponse.getEmail().isEmpty();
- }
- }
- if (dataset.getGuestbook().isInstitutionRequired()) {
- if (this.guestbookResponse.getInstitution() == null) {
- valid = false;
- } else {
- valid &= !this.guestbookResponse.getInstitution().isEmpty();
- }
- }
- if (dataset.getGuestbook().isPositionRequired()) {
- if (this.guestbookResponse.getPosition() == null) {
- valid = false;
- } else {
- valid &= !this.guestbookResponse.getPosition().isEmpty();
- }
- }
- }
-
- if (dataset.getGuestbook() != null && !dataset.getGuestbook().getCustomQuestions().isEmpty()) {
- for (CustomQuestion cq : dataset.getGuestbook().getCustomQuestions()) {
- if (cq.isRequired()) {
- for (CustomQuestionResponse cqr : this.guestbookResponse.getCustomQuestionResponses()) {
- if (cqr.getCustomQuestion().equals(cq)) {
- valid &= (cqr.getResponse() != null && !cqr.getResponse().isEmpty());
- }
- }
- }
- }
- }
-
- if (!valid) {
- FacesContext.getCurrentInstance().addMessage(null, new FacesMessage(FacesMessage.SEVERITY_ERROR, "Validation Error", "Please complete required fields for download and re-try."));
- return "";
- }
-
- Command cmd;
- try {
- if (this.guestbookResponse != null) {
- if (!type.equals("multiple")) {
- cmd = new CreateGuestbookResponseCommand(dvRequestService.getDataverseRequest(), this.guestbookResponse, dataset);
- commandEngine.submit(cmd);
- } else {
- for (FileMetadata fmd : this.selectedDownloadableFiles) {
- DataFile df = fmd.getDataFile();
- if (df != null) {
- this.guestbookResponse.setDataFile(df);
- cmd = new CreateGuestbookResponseCommand(dvRequestService.getDataverseRequest(), this.guestbookResponse, dataset);
- commandEngine.submit(cmd);
- }
- }
- }
- }
- } catch (CommandException ex) {
- FacesContext.getCurrentInstance().addMessage(null, new FacesMessage(FacesMessage.SEVERITY_ERROR, "Guestbook Response Save Failed", " - " + ex.toString()));
- logger.severe(ex.getMessage());
- }
-
- if (type.equals("multiple")){
- //return callDownloadServlet(getSelectedFilesIdsString());
- callDownloadServlet(getDownloadableFilesIdsString());
- }
-
- if ((type.equals("download") || type.isEmpty())) {
- //return callDownloadServlet(downloadFormat, this.selectedDownloadFile.getId());
- if(type.isEmpty()){
- downloadFormat = "download";
- }
- callDownloadServlet(downloadFormat, this.selectedDownloadFile.getId());
- }
-
- if (type.equals("explore")) {
- String retVal = getDataExploreURLComplete(this.selectedDownloadFile.getId());
- try {
- FacesContext.getCurrentInstance().getExternalContext().redirect(retVal);
- return retVal;
- } catch (IOException ex) {
- logger.info("Failed to issue a redirect to file download url.");
- }
- }
- return "";
- }
-
- public String exploreOutputLink(FileMetadata fm, String type){
- createSilentGuestbookEntry(fm, type);
- String retVal = getDataExploreURLComplete(fm.getDataFile().getId());
- try {
- FacesContext.getCurrentInstance().getExternalContext().redirect(retVal);
- } catch (IOException ex) {
- logger.info("Failed to issue a redirect to file download url.");
- }
- return "";
- }
-
- //private String callDownloadServlet(String multiFileString){
- private void callDownloadServlet(String multiFileString){
-
- String fileDownloadUrl = "/api/access/datafiles/" + multiFileString;
- try {
- FacesContext.getCurrentInstance().getExternalContext().redirect(fileDownloadUrl);
- } catch (IOException ex) {
- logger.info("Failed to issue a redirect to file download url.");
- }
-
- //return fileDownloadUrl;
- }
-
- //private String callDownloadServlet( String downloadType, Long fileId){
- private void callDownloadServlet( String downloadType, Long fileId){
-
- String fileDownloadUrl = "/api/access/datafile/" + fileId;
-
- if (downloadType != null && downloadType.equals("bundle")){
- fileDownloadUrl = "/api/access/datafile/bundle/" + this.selectedDownloadFile.getId();
- }
- if (downloadType != null && downloadType.equals("original")){
- fileDownloadUrl = "/api/access/datafile/" + this.selectedDownloadFile.getId() + "?format=original";
- }
- if (downloadType != null && downloadType.equals("RData")){
- fileDownloadUrl = "/api/access/datafile/" + this.selectedDownloadFile.getId() + "?format=RData";
- }
- if (downloadType != null && downloadType.equals("var")){
- fileDownloadUrl = "/api/meta/datafile/" + this.selectedDownloadFile.getId();
- }
- if (downloadType != null && downloadType.equals("tab")){
- fileDownloadUrl = "/api/access/datafile/" + this.selectedDownloadFile.getId()+ "?format=tab";
- }
- logger.fine("Returning file download url: " + fileDownloadUrl);
- try {
- FacesContext.getCurrentInstance().getExternalContext().redirect(fileDownloadUrl);
- } catch (IOException ex) {
- logger.info("Failed to issue a redirect to file download url.");
- }
- //return fileDownloadUrl;
- }
-
- public String getApiTokenKey() {
- ApiToken apiToken;
-
- if (session.getUser() == null) {
- // ?
- return null;
- }
- if (isSessionUserAuthenticated()) {
- AuthenticatedUser au = (AuthenticatedUser) session.getUser();
- apiToken = authService.findApiTokenByUser(au);
- if (apiToken != null) {
- return "key=" + apiToken.getTokenString();
- }
- // Generate if not available?
- // Or should it just be generated inside the authService
- // automatically?
- apiToken = authService.generateApiTokenForUser(au);
- if (apiToken != null) {
- return "key=" + apiToken.getTokenString();
- }
- }
- return "";
-
- }
-
private void resetVersionUI() {
datasetVersionUI = datasetVersionUI.initDatasetVersionUI(workingVersion, true);
@@ -2133,13 +1601,10 @@ public void refresh() {
-
if (readOnly) {
datafileService.findFileMetadataOptimizedExperimental(dataset);
- fileMetadatasSearch = workingVersion.getFileMetadatas();
- } else {
- fileMetadatasSearch = workingVersion.getFileMetadatasSorted();
- }
+ }
+ fileMetadatasSearch = workingVersion.getFileMetadatasSorted();
displayCitation = dataset.getCitation(true, workingVersion);
stateChanged = false;
@@ -2241,7 +1706,6 @@ public void setSelectedNonDownloadableFiles(List selectedNonDownlo
this.selectedNonDownloadableFiles = selectedNonDownloadableFiles;
}
-
public void validateFilesForDownload(boolean guestbookRequired){
setSelectedDownloadableFiles(new ArrayList<>());
@@ -2255,7 +1719,7 @@ public void validateFilesForDownload(boolean guestbookRequired){
for (FileMetadata fmd : this.selectedFiles){
- if(canDownloadFile(fmd)){
+ if(this.fileDownloadHelper.canDownloadFile(fmd)){
getSelectedDownloadableFiles().add(fmd);
} else {
getSelectedNonDownloadableFiles().add(fmd);
@@ -2264,9 +1728,9 @@ public void validateFilesForDownload(boolean guestbookRequired){
if(!getSelectedDownloadableFiles().isEmpty() && getSelectedNonDownloadableFiles().isEmpty()){
if (guestbookRequired){
- initGuestbookMultipleResponse();
+ modifyGuestbookMultipleResponse();
} else{
- startMultipleFileDownload();
+ startMultipleFileDownload(false);
}
}
@@ -2331,6 +1795,18 @@ public String getSelectedFilesIdsString() {
return downloadIdString;
}
+ // helper Method
+ public String getSelectedDownloadableFilesIdsString() {
+ String downloadIdString = "";
+ for (FileMetadata fmd : this.selectedDownloadableFiles){
+ if (!StringUtil.isEmpty(downloadIdString)) {
+ downloadIdString += ",";
+ }
+ downloadIdString += fmd.getDataFile().getId();
+ }
+ return downloadIdString;
+ }
+
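The comma-joining loop in the new `getSelectedDownloadableFilesIdsString()` can also be written with `java.util.StringJoiner`, which makes the per-iteration empty-string check implicit. A minimal sketch over bare `Long` ids (an illustration of the idiom, not the bean's actual code, which pulls the ids from `FileMetadata` objects):

```java
import java.util.List;
import java.util.StringJoiner;

public class IdJoiner {
    // Join datafile ids with commas; StringJoiner only inserts the
    // separator between elements, so no isEmpty() check is needed.
    public static String joinIds(List<Long> ids) {
        StringJoiner joiner = new StringJoiner(",");
        for (Long id : ids) {
            joiner.add(id.toString());
        }
        return joiner.toString();
    }

    public static void main(String[] args) {
        System.out.println(joinIds(List.of(12L, 34L, 56L))); // prints 12,34,56
    }
}
```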
// helper Method
public String getSelectedFilesIdsStringForDownload() {
String downloadIdString = "";
@@ -2613,10 +2089,10 @@ public void deleteFiles() {
// local filesystem:
try {
- Files.delete(Paths.get(ingestService.getFilesTempDirectory() + "/" + dfn.getStorageIdentifier()));
+ Files.delete(Paths.get(FileUtil.getFilesTempDirectory() + "/" + dfn.getStorageIdentifier()));
} catch (IOException ioEx) {
// safe to ignore - it's just a temp file.
- logger.warning("Failed to delete temporary file " + ingestService.getFilesTempDirectory() + "/" + dfn.getStorageIdentifier());
+ logger.warning("Failed to delete temporary file " + FileUtil.getFilesTempDirectory() + "/" + dfn.getStorageIdentifier());
}
dfIt.remove();
@@ -2660,12 +2136,6 @@ public String save() {
return "";
}
-
-
- // One last check before we save the files - go through the newly-uploaded
- // ones and modify their names so that there are no duplicates.
- // (but should we really be doing it here? - maybe a better approach to do it
- // in the ingest service bean, when the files get uploaded.)
// Finally, save the files permanently:
ingestService.addFiles(workingVersion, newFiles);
@@ -2788,37 +2258,6 @@ public String cancel() {
return returnToLatestVersion();
}
- public boolean isDuplicate(FileMetadata fileMetadata) {
- String thisMd5 = fileMetadata.getDataFile().getmd5();
- if (thisMd5 == null) {
- return false;
- }
-
- Map MD5Map = new HashMap();
-
- // TODO:
- // think of a way to do this that doesn't involve populating this
- // map for every file on the page?
- // man not be that much of a problem, if we paginate and never display
- // more than a certain number of files... Still, needs to be revisited
- // before the final 4.0.
- // -- L.A. 4.0
- Iterator fmIt = workingVersion.getFileMetadatas().iterator();
- while (fmIt.hasNext()) {
- FileMetadata fm = fmIt.next();
- String md5 = fm.getDataFile().getmd5();
- if (md5 != null) {
- if (MD5Map.get(md5) != null) {
- MD5Map.put(md5, MD5Map.get(md5).intValue() + 1);
- } else {
- MD5Map.put(md5, 1);
- }
- }
- }
-
- return MD5Map.get(thisMd5) != null && MD5Map.get(thisMd5).intValue() > 1;
- }
-
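For reference, the duplicate-detection idea in the removed `isDuplicate()` above (build a frequency map of checksums, then flag any value counted more than once) can be sketched in a standalone, generic form. The raw strings here stand in for the DataFile MD5 values, so this is an illustration rather than the page bean's code:

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class DuplicateCheck {
    // Returns true if thisChecksum occurs more than once in allChecksums.
    // Mirrors the removed isDuplicate(): count every checksum, then test
    // whether this file's checksum was seen at least twice.
    public static boolean isDuplicate(String thisChecksum, List<String> allChecksums) {
        if (thisChecksum == null) {
            return false;
        }
        Map<String, Integer> counts = new HashMap<>();
        for (String checksum : allChecksums) {
            if (checksum != null) {
                counts.merge(checksum, 1, Integer::sum);
            }
        }
        Integer count = counts.get(thisChecksum);
        return count != null && count > 1;
    }

    public static void main(String[] args) {
        System.out.println(isDuplicate("abc", List.of("abc", "abc", "def"))); // prints true
    }
}
```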
private HttpClient getClient() {
// TODO:
// cache the http client? -- L.A. 4.0 alpha
@@ -2906,7 +2345,7 @@ public List getVersionTabListForPostLoad(){
}
public void setVersionTabListForPostLoad(List versionTabListForPostLoad) {
-
+
this.versionTabListForPostLoad = versionTabListForPostLoad;
}
@@ -2927,8 +2366,13 @@ public Integer getCompareVersionsCount() {
* See: dataset-versions.xhtml, remoteCommand 'postLoadVersionTablList'
*/
public void postLoadSetVersionTabList(){
-
+
+ if (this.getVersionTabList().isEmpty() && workingVersion.isDeaccessioned()){
+ setVersionTabList(resetVersionTabList());
+ }
this.setVersionTabListForPostLoad(this.getVersionTabList());
+
+
//this.versionTabList = this.resetVersionTabList();
}
@@ -2979,60 +2423,13 @@ public DatasetVersionDifference getDatasetVersionDifference() {
public void setDatasetVersionDifference(DatasetVersionDifference datasetVersionDifference) {
this.datasetVersionDifference = datasetVersionDifference;
}
-
- private void createSilentGuestbookEntry(FileMetadata fileMetadata, String format){
- initGuestbookResponse(fileMetadata, format, null);
- Command cmd;
- try {
- if (this.guestbookResponse != null) {
- cmd = new CreateGuestbookResponseCommand(dvRequestService.getDataverseRequest(), guestbookResponse, dataset);
- commandEngine.submit(cmd);
- } else {
- logger.severe("No Silent/Default Guestbook response made. No response to save - probably because version is DRAFT - not certain ");
- }
- } catch (CommandException ex) {
- FacesContext.getCurrentInstance().addMessage(null, new FacesMessage(FacesMessage.SEVERITY_ERROR, "Guestbook Response Save Failed", " - " + ex.toString()));
- logger.severe(ex.getMessage());
- }
-
- }
-
- //public String startMultipleFileDownload (){
- public void startMultipleFileDownload (){
-
- for (FileMetadata fmd : this.selectedDownloadableFiles) {
- if (canDownloadFile(fmd)) {
- // todo: cleanup this: "create" method doesn't necessarily
- // mean a response wikk be created (e.g. when dataset in draft)
- createSilentGuestbookEntry(fmd, "");
- }
- }
+ public void startMultipleFileDownload (Boolean writeGuestbook){
- //return
- callDownloadServlet(getDownloadableFilesIdsString());
-
- }
-
- //public String startFileDownload(FileMetadata fileMetadata, String format) {
- public void startFileDownload(FileMetadata fileMetadata, String format) {
- logger.fine("starting file download for filemetadata "+fileMetadata.getId()+", datafile "+fileMetadata.getDataFile().getId());
- createSilentGuestbookEntry(fileMetadata, format);
- logger.fine("created guestbook entry for filemetadata "+fileMetadata.getId()+", datafile "+fileMetadata.getDataFile().getId());
- callDownloadServlet(format, fileMetadata.getDataFile().getId());
- logger.fine("issued file download redirect for filemetadata "+fileMetadata.getId()+", datafile "+fileMetadata.getDataFile().getId());
- }
-
- private String downloadFormat;
+ fileDownloadService.callDownloadServlet(getDownloadableFilesIdsString(), writeGuestbook);
- public String getDownloadFormat() {
- return downloadFormat;
}
-
- public void setDownloadFormat(String downloadFormat) {
- this.downloadFormat = downloadFormat;
- }
-
+
private String downloadType = "";
public String getDownloadType() {
@@ -3043,23 +2440,17 @@ public void setDownloadType(String downloadType) {
this.downloadType = downloadType;
}
-
- public void initGuestbookResponse(FileMetadata fileMetadata){
- initGuestbookResponse(fileMetadata, "", null);
- }
-
- public void initGuestbookResponse(FileMetadata fileMetadata, String downloadType){
- initGuestbookResponse(fileMetadata, downloadType, null);
- }
- public void initGuestbookMultipleResponse(){
+ public void modifyGuestbookMultipleResponse(){
if (this.selectedFiles.isEmpty()) {
RequestContext requestContext = RequestContext.getCurrentInstance();
requestContext.execute("PF('selectFilesForDownload').show()");
return;
}
- initGuestbookResponse(null, "download", null);
+ this.guestbookResponse = this.guestbookResponseService.modifySelectedFileIds(guestbookResponse, getSelectedDownloadableFilesIdsString());
+ this.guestbookResponse.setDownloadtype("Download");
+ this.guestbookResponse.setFileFormat("Download");
RequestContext requestContext = RequestContext.getCurrentInstance();
requestContext.execute("PF('downloadPopup').show();handleResizeDialog('downloadPopup');");
}
@@ -3069,84 +2460,11 @@ public void initGuestbookMultipleResponse(String selectedFileIds){
}
public void initGuestbookResponse(FileMetadata fileMetadata, String downloadFormat, String selectedFileIds) {
- if (fileMetadata != null){
- this.setSelectedDownloadFile(fileMetadata.getDataFile());
- }
- setDownloadFormat(downloadFormat);
- if (fileMetadata == null){
- setDownloadType("multiple");
- } else {
- setDownloadType("download");
- }
- if(this.workingVersion != null && this.workingVersion.isDraft()){
- this.guestbookResponse = null;
- return;
- }
- this.guestbookResponse = new GuestbookResponse();
- User user = session.getUser();
- if (this.dataset.getGuestbook() != null) {
- this.guestbookResponse.setGuestbook(this.dataset.getGuestbook());
- this.guestbookResponse.setName("");
- this.guestbookResponse.setEmail("");
- this.guestbookResponse.setInstitution("");
- this.guestbookResponse.setPosition("");
- this.guestbookResponse.setSessionId(session.toString());
- if (user.isAuthenticated()) {
- AuthenticatedUser aUser = (AuthenticatedUser) user;
- this.guestbookResponse.setName(aUser.getName());
- this.guestbookResponse.setAuthenticatedUser(aUser);
- this.guestbookResponse.setEmail(aUser.getEmail());
- this.guestbookResponse.setInstitution(aUser.getAffiliation());
- this.guestbookResponse.setPosition(aUser.getPosition());
- this.guestbookResponse.setSessionId(session.toString());
- }
- if (fileMetadata != null){
- this.guestbookResponse.setDataFile(fileMetadata.getDataFile());
- }
- } else {
- if (fileMetadata != null){
- this.guestbookResponse = guestbookResponseService.initDefaultGuestbookResponse(dataset, fileMetadata.getDataFile(), user, session);
- } else {
- this.guestbookResponse = guestbookResponseService.initDefaultGuestbookResponse(dataset, null, user, session);
- }
- }
- if (this.dataset.getGuestbook() != null && !this.dataset.getGuestbook().getCustomQuestions().isEmpty()) {
- this.guestbookResponse.setCustomQuestionResponses(new ArrayList());
- for (CustomQuestion cq : this.dataset.getGuestbook().getCustomQuestions()) {
- CustomQuestionResponse cqr = new CustomQuestionResponse();
- cqr.setGuestbookResponse(guestbookResponse);
- cqr.setCustomQuestion(cq);
- cqr.setResponse("");
- if (cq.getQuestionType().equals("options")) {
- //response select Items
- cqr.setResponseSelectItems(setResponseUISelectItems(cq));
- }
- this.guestbookResponse.getCustomQuestionResponses().add(cqr);
- }
- }
- this.guestbookResponse.setDownloadtype("Download");
- if(downloadFormat.toLowerCase().equals("subset")){
- this.guestbookResponse.setDownloadtype("Subset");
- setDownloadFormat("subset");
- setDownloadType("subset");
- }
- if(downloadFormat.toLowerCase().equals("explore")){
- setDownloadFormat("explore");
- setDownloadType("explore");
- this.guestbookResponse.setDownloadtype("Explore");
- }
- this.guestbookResponse.setDataset(dataset);
+ this.guestbookResponse = guestbookResponseService.initGuestbookResponse(fileMetadata, downloadFormat, selectedFileIds, session);
}
- private List setResponseUISelectItems(CustomQuestion cq) {
- List retList = new ArrayList();
- for (CustomQuestionValue cqv : cq.getCustomQuestionValues()) {
- SelectItem si = new SelectItem(cqv.getValueString(), cqv.getValueString());
- retList.add(si);
- }
- return retList;
- }
+
public void compareVersionDifferences() {
RequestContext requestContext = RequestContext.getCurrentInstance();
@@ -3241,118 +2559,6 @@ private List resetReleasedVersionTabList() {
return retList;
}
- public void downloadDatasetCitationXML() {
- downloadCitationXML(null);
- }
-
- public void downloadDatafileCitationXML(FileMetadata fileMetadata) {
- downloadCitationXML(fileMetadata);
- }
-
- public void downloadCitationXML(FileMetadata fileMetadata) {
-
- String xml = datasetService.createCitationXML(workingVersion, fileMetadata);
- FacesContext ctx = FacesContext.getCurrentInstance();
- HttpServletResponse response = (HttpServletResponse) ctx.getExternalContext().getResponse();
- response.setContentType("text/xml");
- String fileNameString = "";
- if (fileMetadata == null || fileMetadata.getLabel() == null) {
- // Dataset-level citation:
- fileNameString = "attachment;filename=" + getFileNameDOI() + ".xml";
- } else {
- // Datafile-level citation:
- fileNameString = "attachment;filename=" + getFileNameDOI() + "-" + fileMetadata.getLabel().replaceAll("\\.tab$", "-endnote.xml");
- }
- response.setHeader("Content-Disposition", fileNameString);
- try {
- ServletOutputStream out = response.getOutputStream();
- out.write(xml.getBytes());
- out.flush();
- ctx.responseComplete();
- } catch (Exception e) {
-
- }
- }
-
- private String getFileNameDOI() {
- Dataset ds = workingVersion.getDataset();
- return "DOI:" + ds.getAuthority() + "_" + ds.getIdentifier().toString();
- }
-
- public void downloadDatasetCitationRIS() {
-
- downloadCitationRIS(null);
-
- }
-
- public void downloadDatafileCitationRIS(FileMetadata fileMetadata) {
- downloadCitationRIS(fileMetadata);
- }
-
- public void downloadCitationRIS(FileMetadata fileMetadata) {
-
- String risFormatDowload = datasetService.createCitationRIS(workingVersion, fileMetadata);
- FacesContext ctx = FacesContext.getCurrentInstance();
- HttpServletResponse response = (HttpServletResponse) ctx.getExternalContext().getResponse();
- response.setContentType("application/download");
-
- String fileNameString = "";
- if (fileMetadata == null || fileMetadata.getLabel() == null) {
- // Dataset-level citation:
- fileNameString = "attachment;filename=" + getFileNameDOI() + ".ris";
- } else {
- // Datafile-level citation:
- fileNameString = "attachment;filename=" + getFileNameDOI() + "-" + fileMetadata.getLabel().replaceAll("\\.tab$", ".ris");
- }
- response.setHeader("Content-Disposition", fileNameString);
-
- try {
- ServletOutputStream out = response.getOutputStream();
- out.write(risFormatDowload.getBytes());
- out.flush();
- ctx.responseComplete();
- } catch (Exception e) {
-
- }
- }
-
- public void downloadDatasetCitationBibtex() {
-
- downloadCitationBibtex(null);
-
- }
-
- public void downloadDatafileCitationBibtex(FileMetadata fileMetadata) {
- downloadCitationBibtex(fileMetadata);
- }
-
- public void downloadCitationBibtex(FileMetadata fileMetadata) {
-
- String bibFormatDowload = new BibtexCitation(workingVersion).toString();
- FacesContext ctx = FacesContext.getCurrentInstance();
- HttpServletResponse response = (HttpServletResponse) ctx.getExternalContext().getResponse();
- response.setContentType("application/download");
-
- String fileNameString = "";
- if (fileMetadata == null || fileMetadata.getLabel() == null) {
- // Dataset-level citation:
- fileNameString = "attachment;filename=" + getFileNameDOI() + ".bib";
- } else {
- // Datafile-level citation:
- fileNameString = "attachment;filename=" + getFileNameDOI() + "-" + fileMetadata.getLabel().replaceAll("\\.tab$", ".bib");
- }
- response.setHeader("Content-Disposition", fileNameString);
-
- try {
- ServletOutputStream out = response.getOutputStream();
- out.write(bibFormatDowload.getBytes());
- out.flush();
- ctx.responseComplete();
- } catch (Exception e) {
-
- }
- }
-
public String getDatasetPublishCustomText(){
String datasetPublishCustomText = settingsService.getValueForKey(SettingsServiceBean.Key.DatasetPublishPopupCustomText);
if( datasetPublishCustomText!= null && !datasetPublishCustomText.isEmpty()){
@@ -3366,41 +2572,6 @@ public Boolean isDatasetPublishPopupCustomTextOnAllVersions(){
return settingsService.isTrueForKey(SettingsServiceBean.Key.DatasetPublishPopupCustomTextOnAllVersions, false);
}
- public String getDataExploreURL() {
- String TwoRavensUrl = settingsService.getValueForKey(SettingsServiceBean.Key.TwoRavensUrl);
-
- if (TwoRavensUrl != null && !TwoRavensUrl.equals("")) {
- return TwoRavensUrl;
- }
-
- return "";
- }
-
- public String getDataExploreURLComplete(Long fileid) {
- String TwoRavensUrl = settingsService.getValueForKey(SettingsServiceBean.Key.TwoRavensUrl);
- String TwoRavensDefaultLocal = "/dataexplore/gui.html?dfId=";
-
- if (TwoRavensUrl != null && !TwoRavensUrl.equals("")) {
- // If we have TwoRavensUrl set up as, as an optional
- // configuration service, it must mean that TwoRavens is sitting
- // on some remote server. And that in turn means that we must use
- // full URLs to pass data and metadata to it.
- // update: actually, no we don't want to use this "dataurl" notation.
- // switching back to the dfId=:
- // -- L.A. 4.1
- /*
- String tabularDataURL = getTabularDataFileURL(fileid);
- String tabularMetaURL = getVariableMetadataURL(fileid);
- return TwoRavensUrl + "?ddiurl=" + tabularMetaURL + "&dataurl=" + tabularDataURL + "&" + getApiTokenKey();
- */
- return TwoRavensUrl + "?dfId=" + fileid + "&" + getApiTokenKey();
- }
-
- // For a local TwoRavens setup it's enough to call it with just
- // the file id:
- return TwoRavensDefaultLocal + fileid + "&" + getApiTokenKey();
- }
-
public String getVariableMetadataURL(Long fileid) {
String myHostURL = getDataverseSiteUrl();
String metaURL = myHostURL + "/api/meta/datafile/" + fileid;
@@ -3463,7 +2634,15 @@ public void setFileMetadataSelected(FileMetadata fm){
public void setFileMetadataSelected(FileMetadata fm, String guestbook) {
if (guestbook != null) {
if (guestbook.equals("create")) {
- createSilentGuestbookEntry(fm, "Subset");
+                /*
+                 * FIXME: guestbook entry for subsetting
+                 */
+               // guestbookResponseService.createSilentGuestbookEntry(fm, "Subset");
} else {
initGuestbookResponse(fm, "Subset", null);
}
@@ -4023,33 +3202,10 @@ public void saveAdvancedOptions() {
fileMetadataSelectedForIngestOptionsPopup = null;
}
- public String getFileDateToDisplay(FileMetadata fileMetadata) {
- Date fileDate = null;
- DataFile datafile = fileMetadata.getDataFile();
- if (datafile != null) {
- boolean fileHasBeenReleased = datafile.isReleased();
- if (fileHasBeenReleased) {
- Timestamp filePublicationTimestamp = datafile.getPublicationDate();
- if (filePublicationTimestamp != null) {
- fileDate = filePublicationTimestamp;
- }
- } else {
- Timestamp fileCreateTimestamp = datafile.getCreateDate();
- if (fileCreateTimestamp != null) {
- fileDate = fileCreateTimestamp;
- }
- }
- }
- if (fileDate != null) {
- return displayDateFormat.format(fileDate);
- }
-
- return "";
- }
public boolean isDownloadButtonAvailable(){
for (FileMetadata fmd : workingVersion.getFileMetadatas()) {
- if (canDownloadFile(fmd)) {
+ if (this.fileDownloadHelper.canDownloadFile(fmd)) {
return true;
}
}
@@ -4067,7 +3223,7 @@ public boolean isFileAccessRequestMultiButtonRequired(){
// return false;
}
for (FileMetadata fmd : workingVersion.getFileMetadatas()){
- if (!canDownloadFile(fmd)){
+ if (!this.fileDownloadHelper.canDownloadFile(fmd)){
return true;
}
}
@@ -4082,7 +3238,7 @@ public boolean isFileAccessRequestMultiButtonEnabled(){
return false;
}
for (FileMetadata fmd : this.selectedRestrictedFiles){
- if (!canDownloadFile(fmd)){
+ if (!this.fileDownloadHelper.canDownloadFile(fmd)){
return true;
}
}
@@ -4095,7 +3251,7 @@ public boolean isDownloadAllButtonEnabled() {
if (downloadButtonAllEnabled == null) {
for (FileMetadata fmd : workingVersion.getFileMetadatas()) {
- if (!canDownloadFile(fmd)) {
+ if (!this.fileDownloadHelper.canDownloadFile(fmd)) {
downloadButtonAllEnabled = false;
break;
}
@@ -4111,7 +3267,7 @@ public boolean isDownloadSelectedButtonEnabled(){
return false;
}
for (FileMetadata fmd : this.selectedFiles){
- if (canDownloadFile(fmd)){
+ if (this.fileDownloadHelper.canDownloadFile(fmd)){
return true;
}
}
@@ -4127,7 +3283,7 @@ public boolean isFileAccessRequestMultiSignUpButtonRequired(){
return false;
}
for (FileMetadata fmd : workingVersion.getFileMetadatas()){
- if (!canDownloadFile(fmd)){
+ if (!this.fileDownloadHelper.canDownloadFile(fmd)){
return true;
}
}
@@ -4146,7 +3302,7 @@ public boolean isFileAccessRequestMultiSignUpButtonEnabled(){
return false;
}
for (FileMetadata fmd : this.selectedRestrictedFiles){
- if (!canDownloadFile(fmd)){
+ if (!this.fileDownloadHelper.canDownloadFile(fmd)){
return true;
}
}
@@ -4154,36 +3310,10 @@ public boolean isFileAccessRequestMultiSignUpButtonEnabled(){
}
public boolean isDownloadPopupRequired() {
- // Each of these conditions is sufficient reason to have to
- // present the user with the popup:
-
- //0. if version is draft then Popup "not required"
- if (!workingVersion.isReleased()){
- return false;
- }
- // 1. License and Terms of Use:
- if (workingVersion.getTermsOfUseAndAccess() != null) {
- if (!TermsOfUseAndAccess.License.CC0.equals(workingVersion.getTermsOfUseAndAccess().getLicense())
- && !(workingVersion.getTermsOfUseAndAccess().getTermsOfUse() == null
- || workingVersion.getTermsOfUseAndAccess().getTermsOfUse().equals(""))) {
- return true;
- }
-
- // 2. Terms of Access:
- if (!(workingVersion.getTermsOfUseAndAccess().getTermsOfAccess() == null) && !workingVersion.getTermsOfUseAndAccess().getTermsOfAccess().equals("")) {
- return true;
- }
- }
-
- // 3. Guest Book:
- if (dataset.getGuestbook() != null && dataset.getGuestbook().isEnabled() && dataset.getGuestbook().getDataverse() != null ) {
- return true;
- }
-
- return false;
+ return fileDownloadService.isDownloadPopupRequired(workingVersion);
}
- public String requestAccessMultipleFiles(String fileIdString) {
+ public String requestAccessMultipleFiles(String fileIdString) {
if (fileIdString.isEmpty()) {
RequestContext requestContext = RequestContext.getCurrentInstance();
requestContext.execute("PF('selectFilesForRequestAccess').show()");
@@ -4202,37 +3332,18 @@ public String requestAccessMultipleFiles(String fileIdString) {
test = null;
}
if (test != null) {
- DataFile request = datafileService.find(test);
idForNotification = test;
- requestAccess(request, false);
+ fileDownloadService.requestAccess(test);
}
}
}
if (idForNotification.intValue() > 0) {
- sendRequestFileAccessNotification(idForNotification);
+ fileDownloadService.sendRequestFileAccessNotification(dataset,idForNotification);
}
return returnToDatasetOnly();
}
+
- public void requestAccess(DataFile file, boolean sendNotification) {
- if (!file.getFileAccessRequesters().contains((AuthenticatedUser) session.getUser())) {
- file.getFileAccessRequesters().add((AuthenticatedUser) session.getUser());
- datafileService.save(file);
-
- // create notifications
- if (sendNotification) {
- sendRequestFileAccessNotification(file.getId());
-
- }
- }
- }
-
- private void sendRequestFileAccessNotification(Long fileId) {
- for (AuthenticatedUser au : permissionService.getUsersWithPermissionOn(Permission.ManageDatasetPermissions, dataset)) {
- userNotificationService.sendNotification(au, new Timestamp(new Date().getTime()), UserNotification.Type.REQUESTFILEACCESS, fileId);
- }
-
- }
public boolean isSortButtonEnabled() {
/**
@@ -4369,5 +3480,49 @@ public boolean isUserCanCreatePrivateURL() {
public String getPrivateUrlLink(PrivateUrl privateUrl) {
return privateUrl.getLink();
}
+
+
+ public FileDownloadHelper getFileDownloadHelper() {
+ return fileDownloadHelper;
+ }
+
+ public void setFileDownloadHelper(FileDownloadHelper fileDownloadHelper) {
+ this.fileDownloadHelper = fileDownloadHelper;
+ }
+
+
+ public FileDownloadServiceBean getFileDownloadService() {
+ return fileDownloadService;
+ }
+
+ public void setFileDownloadService(FileDownloadServiceBean fileDownloadService) {
+ this.fileDownloadService = fileDownloadService;
+ }
+
+
+ public GuestbookResponseServiceBean getGuestbookResponseService() {
+ return guestbookResponseService;
+ }
+
+ public void setGuestbookResponseService(GuestbookResponseServiceBean guestbookResponseService) {
+ this.guestbookResponseService = guestbookResponseService;
+ }
+
+
+ public WorldMapPermissionHelper getWorldMapPermissionHelper() {
+ return worldMapPermissionHelper;
+ }
+
+ public void setWorldMapPermissionHelper(WorldMapPermissionHelper worldMapPermissionHelper) {
+ this.worldMapPermissionHelper = worldMapPermissionHelper;
+ }
+
+ public TwoRavensHelper getTwoRavensHelper() {
+ return twoRavensHelper;
+ }
+
+ public void setTwoRavensHelper(TwoRavensHelper twoRavensHelper) {
+ this.twoRavensHelper = twoRavensHelper;
+ }
}
diff --git a/src/main/java/edu/harvard/iq/dataverse/DatasetVersion.java b/src/main/java/edu/harvard/iq/dataverse/DatasetVersion.java
index 3be005a252d..437c42ba8b3 100644
--- a/src/main/java/edu/harvard/iq/dataverse/DatasetVersion.java
+++ b/src/main/java/edu/harvard/iq/dataverse/DatasetVersion.java
@@ -52,6 +52,8 @@
uniqueConstraints = @UniqueConstraint(columnNames = {"dataset_id,versionnumber,minorversionnumber"}))
public class DatasetVersion implements Serializable {
+ private static final Logger logger = Logger.getLogger(DatasetVersion.class.getCanonicalName());
+
/**
* Convenience comparator to compare dataset versions by their version number.
* The draft version is considered the latest.
@@ -1034,6 +1036,26 @@ public Set validate() {
}
}
}
+ List<FileMetadata> dsvfileMetadatas = this.getFileMetadatas();
+ if (dsvfileMetadatas != null) {
+ for (FileMetadata fileMetadata : dsvfileMetadatas) {
+ Set<ConstraintViolation<FileMetadata>> constraintViolations = validator.validate(fileMetadata);
+ if (constraintViolations.size() > 0) {
+ // currently only support one message
+ ConstraintViolation<FileMetadata> violation = constraintViolations.iterator().next();
+ /**
+ * @todo How can we expose this more detailed message
+ * containing the invalid value to the user?
+ */
+ String message = "Constraint violation found in FileMetadata. "
+ + violation.getMessage() + " "
+ + "The invalid value is \"" + violation.getInvalidValue().toString() + "\".";
+ logger.info(message);
+ returnSet.add(violation);
+ break; // currently only support one message, so we can break out of the loop after the first constraint violation
+ }
+ }
+ }
return returnSet;
}
}
diff --git a/src/main/java/edu/harvard/iq/dataverse/DatasetVersionDifference.java b/src/main/java/edu/harvard/iq/dataverse/DatasetVersionDifference.java
index 5fe3c9cd61e..20df6c378a4 100644
--- a/src/main/java/edu/harvard/iq/dataverse/DatasetVersionDifference.java
+++ b/src/main/java/edu/harvard/iq/dataverse/DatasetVersionDifference.java
@@ -69,6 +69,17 @@ public DatasetVersionDifference(DatasetVersion newVersion, DatasetVersion origin
}
}
+
+ // TODO: ?
+ // It looks like we are going through the filemetadatas in both versions,
+ // *sequentially* (i.e. at the cost of O(N*M)), to select the lists of
+ // changed, deleted and added files between the 2 versions... But why
+ // are we doing it, if we are doing virtually the same thing inside
+ // the initDatasetFilesDifferenceList(), below - but in a more efficient
+ // way (sorting both lists, then going through them in parallel, at the
+ // cost of (N+M) max)?
+ // -- 4.6 Nov. 2016
+
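The (N+M) sorted-merge walk that the comment above contrasts with the O(N*M) nested scan can be sketched in isolation. Plain `Long` ids stand in for the DataFile ids that `initDatasetFilesDifferenceList()` compares, so this is a hedged illustration of the technique, not the class's actual code:

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public class SortedDiff {
    // Classify ids into deleted (only in original), added (only in new),
    // and shared (in both) with one parallel walk over sorted copies:
    // O(N log N + M log M) for the sorts, then O(N + M) for the walk.
    public static List<List<Long>> diff(List<Long> original, List<Long> updated) {
        List<Long> orig = new ArrayList<>(original);   // defensive copies, as in the patch
        List<Long> next = new ArrayList<>(updated);
        Collections.sort(orig);
        Collections.sort(next);
        List<Long> deleted = new ArrayList<>();
        List<Long> added = new ArrayList<>();
        List<Long> shared = new ArrayList<>();
        int i = 0, j = 0;
        while (i < orig.size() && j < next.size()) {
            int cmp = orig.get(i).compareTo(next.get(j));
            if (cmp == 0) {
                shared.add(orig.get(i)); i++; j++;
            } else if (cmp < 0) {
                deleted.add(orig.get(i)); i++;   // only in the original version
            } else {
                added.add(next.get(j)); j++;     // only in the new version
            }
        }
        // Whatever remains on either list is automatically "different".
        while (i < orig.size()) deleted.add(orig.get(i++));
        while (j < next.size()) added.add(next.get(j++));
        return List.of(deleted, added, shared);
    }

    public static void main(String[] args) {
        System.out.println(diff(List.of(1L, 2L, 3L), List.of(2L, 3L, 4L))); // prints [[1], [4], [2, 3]]
    }
}
```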
for (FileMetadata fmdo : originalVersion.getFileMetadatas()) {
boolean deleted = true;
for (FileMetadata fmdn : newVersion.getFileMetadatas()) {
@@ -648,6 +659,7 @@ private void initDatasetFilesDifferencesList() {
// same study file, the file metadatas ARE version-specific, so some of
// the fields there (filename, etc.) may be different. If this is the
// case, we want to display these differences as well.
+
if (originalVersion.getFileMetadatas().size() == 0 && newVersion.getFileMetadatas().size() == 0) {
noFileDifferencesFoundLabel = "No data files in either version of the study";
return;
@@ -659,50 +671,62 @@ private void initDatasetFilesDifferencesList() {
FileMetadata fm1;
FileMetadata fm2;
- Collections.sort(originalVersion.getFileMetadatas(), new Comparator() {
- public int compare(FileMetadata l1, FileMetadata l2) {
- FileMetadata fm1 = l1; //(DatasetField[]) l1.get(0);
- FileMetadata fm2 = l2;
- int a = fm1.getDataFile().getId().intValue();
- int b = fm2.getDataFile().getId().intValue();
- return Integer.valueOf(a).compareTo(Integer.valueOf(b));
- }
- });
-
- // Here's a potential problem: this new version may have been created
- // specifically because new files are being added to the dataset.
- // In which case there may be files associated with this new version
- // with no database ids - since they haven't been saved yet.
- // So if we try to sort the files in the version the way we did above,
- // by ID, it may fail with a null pointer.
- // To solve this, we should simply check if the file has the id; and if not,
- // sort it higher than any file with an id - because it is a most recently
- // added file. Since we are only doing this for the purposes of generating
- // version differences, this should be OK.
- // -- L.A. Aug. 2014
-
- Collections.sort(newVersion.getFileMetadatas(), new Comparator() {
- public int compare(FileMetadata l1, FileMetadata l2) {
- FileMetadata fm1 = l1; //(DatasetField[]) l1.get(0);
- FileMetadata fm2 = l2;
- Long a = fm1.getDataFile().getId();
- Long b = fm2.getDataFile().getId();
-
- if (a == null && b == null) {
- return 0;
- } else if (a == null) {
- return 1;
- } else if (b == null) {
- return -1;
- }
- return a.compareTo(b);
+ // We also have to be careful sorting these FileMetadatas. If we sort the
+ // lists as they are still attached to their respective versions, we may end
+ // up messing up the page, which was rendered based on the specific order
+ // of these in the working version!
+ // So the right way of doing this is to create defensive copies of the
+ // lists; extra memory, but safer.
+ // -- L.A. Nov. 2016
+
+ List<FileMetadata> fileMetadatasNew = new ArrayList<>(newVersion.getFileMetadatas());
+ List<FileMetadata> fileMetadatasOriginal = new ArrayList<>(originalVersion.getFileMetadatas());
+
+ Collections.sort(fileMetadatasOriginal, new Comparator<FileMetadata>() {
+ public int compare(FileMetadata l1, FileMetadata l2) {
+ FileMetadata fm1 = l1; //(DatasetField[]) l1.get(0);
+ FileMetadata fm2 = l2;
+ int a = fm1.getDataFile().getId().intValue();
+ int b = fm2.getDataFile().getId().intValue();
+ return Integer.valueOf(a).compareTo(Integer.valueOf(b));
+ }
+ });
+
+ // Here's a potential problem: this new version may have been created
+ // specifically because new files are being added to the dataset.
+ // In which case there may be files associated with this new version
+ // with no database ids - since they haven't been saved yet.
+ // So if we try to sort the files in the version the way we did above,
+ // by ID, it may fail with a null pointer.
+ // To solve this, we should simply check if the file has the id; and if not,
+ // sort it higher than any file with an id - because it is a most recently
+ // added file. Since we are only doing this for the purposes of generating
+ // version differences, this should be OK.
+ // -- L.A. Aug. 2014
+
+
+ Collections.sort(fileMetadatasNew, new Comparator<FileMetadata>() {
+ public int compare(FileMetadata l1, FileMetadata l2) {
+ FileMetadata fm1 = l1; //(DatasetField[]) l1.get(0);
+ FileMetadata fm2 = l2;
+ Long a = fm1.getDataFile().getId();
+ Long b = fm2.getDataFile().getId();
+
+ if (a == null && b == null) {
+ return 0;
+ } else if (a == null) {
+ return 1;
+ } else if (b == null) {
+ return -1;
}
- });
+ return a.compareTo(b);
+ }
+ });
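The hand-written null-last ordering above (unsaved files with null database ids sort after everything else) is also available directly in the standard library since Java 8 via `Comparator.nullsLast`. A sketch over bare `Long` ids, assuming the ids have been extracted from the FileMetadatas first:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Comparator;
import java.util.List;

public class NullsLastSort {
    // Sort ids ascending, placing null ids (files not yet saved to the
    // database) after every non-null id -- the same ordering as the
    // hand-written anonymous Comparator in the patch.
    public static List<Long> sortIdsNullsLast(List<Long> ids) {
        List<Long> copy = new ArrayList<>(ids);  // defensive copy, as in the patch
        copy.sort(Comparator.nullsLast(Comparator.<Long>naturalOrder()));
        return copy;
    }

    public static void main(String[] args) {
        System.out.println(sortIdsNullsLast(Arrays.asList(3L, null, 1L))); // prints [1, 3, null]
    }
}
```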
- while (i < originalVersion.getFileMetadatas().size()
- && j < newVersion.getFileMetadatas().size()) {
- fm1 = originalVersion.getFileMetadatas().get(i);
- fm2 = newVersion.getFileMetadatas().get(j);
+ while (i < fileMetadatasOriginal.size()
+ && j < fileMetadatasNew.size()) {
+ fm1 = fileMetadatasOriginal.get(i);
+ fm2 = fileMetadatasNew.get(j);
if (fm2.getDataFile().getId() != null && fm1.getDataFile().getId().compareTo(fm2.getDataFile().getId()) == 0) {
// The 2 versions share the same study file;
@@ -711,7 +735,8 @@ public int compare(FileMetadata l1, FileMetadata l2) {
if (fileMetadataIsDifferent(fm1, fm2)) {
datasetFileDifferenceItem fdi = selectFileMetadataDiffs(fm1, fm2);
fdi.setFileId(fm1.getDataFile().getId().toString());
- fdi.setFileMD5(fm1.getDataFile().getmd5());
+ fdi.setFileChecksumType(fm1.getDataFile().getChecksumType());
+ fdi.setFileChecksumValue(fm1.getDataFile().getChecksumValue());
datasetFilesDiffList.add(fdi);
}
i++;
@@ -719,14 +744,16 @@ public int compare(FileMetadata l1, FileMetadata l2) {
} else if (fm2.getDataFile().getId() != null && fm1.getDataFile().getId().compareTo(fm2.getDataFile().getId()) > 0) {
datasetFileDifferenceItem fdi = selectFileMetadataDiffs(null, fm2);
fdi.setFileId(fm2.getDataFile().getId().toString());
- fdi.setFileMD5(fm2.getDataFile().getmd5());
+ fdi.setFileChecksumType(fm2.getDataFile().getChecksumType());
+ fdi.setFileChecksumValue(fm2.getDataFile().getChecksumValue());
datasetFilesDiffList.add(fdi);
j++;
} else if (fm2.getDataFile().getId() == null || fm1.getDataFile().getId().compareTo(fm2.getDataFile().getId()) < 0) {
datasetFileDifferenceItem fdi = selectFileMetadataDiffs(fm1, null);
fdi.setFileId(fm1.getDataFile().getId().toString());
- fdi.setFileMD5(fm1.getDataFile().getmd5());
+ fdi.setFileChecksumType(fm1.getDataFile().getChecksumType());
+ fdi.setFileChecksumValue(fm1.getDataFile().getChecksumValue());
datasetFilesDiffList.add(fdi);
i++;
@@ -736,28 +763,37 @@ public int compare(FileMetadata l1, FileMetadata l2) {
// We've reached the end of at least one file list.
// Whatever files are left on either of the 2 lists are automatically "different"
// between the 2 versions.
- while (i < originalVersion.getFileMetadatas().size()) {
- fm1 = originalVersion.getFileMetadatas().get(i);
+ while (i < fileMetadatasOriginal.size()) {
+ fm1 = fileMetadatasOriginal.get(i);
datasetFileDifferenceItem fdi = selectFileMetadataDiffs(fm1, null);
fdi.setFileId(fm1.getDataFile().getId().toString());
- fdi.setFileMD5(fm1.getDataFile().getmd5());
+ fdi.setFileChecksumType(fm1.getDataFile().getChecksumType());
+ fdi.setFileChecksumValue(fm1.getDataFile().getChecksumValue());
datasetFilesDiffList.add(fdi);
i++;
}
- while (j < newVersion.getFileMetadatas().size()) {
- fm2 = newVersion.getFileMetadatas().get(j);
+ while (j < fileMetadatasNew.size()) {
+ fm2 = fileMetadatasNew.get(j);
datasetFileDifferenceItem fdi = selectFileMetadataDiffs(null, fm2);
if (fm2.getDataFile().getId() != null) {
fdi.setFileId(fm2.getDataFile().getId().toString());
} else {
fdi.setFileId("[UNASSIGNED]");
}
- if (fm2.getDataFile().getmd5() != null) {
- fdi.setFileMD5(fm2.getDataFile().getmd5());
+ if (fm2.getDataFile().getChecksumValue() != null) {
+ fdi.setFileChecksumType(fm2.getDataFile().getChecksumType());
+ fdi.setFileChecksumValue(fm2.getDataFile().getChecksumValue());
} else {
- fdi.setFileMD5("[UNASSIGNED]");
+ /**
+ * @todo What should we do here? checksumValue is set to
+ * "nullable = false" so it should never be null. Let's set
+ * it to "null" and see if this code path is ever reached. If
+ * not, the null check above can probably be safely removed.
+ */
+ fdi.setFileChecksumType(null);
+ fdi.setFileChecksumValue("[UNASSIGNED]");
}
datasetFilesDiffList.add(fdi);
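The loops above are a classic two-pointer merge over two id-sorted lists: matching ids get a field-by-field comparison, and whatever remains on either side once one list is exhausted is "different" by definition. A standalone sketch of that walk, simplified to bare ids (names hypothetical):

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class SortedListDiff {
    // Two-pointer comparison of two ascending id lists, mirroring the
    // file-metadata diff loop: shared ids, ids only in a, ids only in b.
    public static Map<String, List<Long>> diff(List<Long> a, List<Long> b) {
        List<Long> onlyA = new ArrayList<>();
        List<Long> onlyB = new ArrayList<>();
        List<Long> both = new ArrayList<>();
        int i = 0, j = 0;
        while (i < a.size() && j < b.size()) {
            int cmp = a.get(i).compareTo(b.get(j));
            if (cmp == 0) { both.add(a.get(i)); i++; j++; }
            else if (cmp < 0) { onlyA.add(a.get(i)); i++; }
            else { onlyB.add(b.get(j)); j++; }
        }
        // Leftovers on either side are automatically "different"
        while (i < a.size()) onlyA.add(a.get(i++));
        while (j < b.size()) onlyB.add(b.get(j++));
        Map<String, List<Long>> out = new LinkedHashMap<>();
        out.put("onlyA", onlyA);
        out.put("onlyB", onlyB);
        out.put("both", both);
        return out;
    }
}
```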
@@ -993,7 +1029,8 @@ public datasetFileDifferenceItem() {
}
private String fileId;
- private String fileMD5;
+ private DataFile.ChecksumType fileChecksumType;
+ private String fileChecksumValue;
private String fileName1;
private String fileType1;
@@ -1131,13 +1168,21 @@ public void setFile1Empty(boolean state) {
public void setFile2Empty(boolean state) {
file2Empty = state;
}
-
- public String getFileMD5() {
- return fileMD5;
+
+ public DataFile.ChecksumType getFileChecksumType() {
+ return fileChecksumType;
+ }
+
+ public void setFileChecksumType(DataFile.ChecksumType fileChecksumType) {
+ this.fileChecksumType = fileChecksumType;
+ }
+
+ public String getFileChecksumValue() {
+ return fileChecksumValue;
}
- public void setFileMD5(String fileMD5) {
- this.fileMD5 = fileMD5;
+ public void setFileChecksumValue(String fileChecksumValue) {
+ this.fileChecksumValue = fileChecksumValue;
}
}
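The change above generalizes the hard-coded MD5 field into a checksum-type/checksum-value pair, so the stored value is an algorithm name (`DataFile.ChecksumType`) plus a hex digest. As a hedged sketch of how such a hex value is typically computed (the helper name is hypothetical), `java.security.MessageDigest` covers MD5 and SHA-1 alike:

```java
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public class ChecksumDemo {
    // Computes a lowercase hex digest for the given algorithm ("MD5", "SHA-1", ...),
    // analogous to the checksumType/checksumValue pair stored on a DataFile.
    public static String hexDigest(String algorithm, byte[] data) {
        try {
            byte[] digest = MessageDigest.getInstance(algorithm).digest(data);
            StringBuilder sb = new StringBuilder(digest.length * 2);
            for (byte b : digest) {
                sb.append(String.format("%02x", b));
            }
            return sb.toString();
        } catch (NoSuchAlgorithmException e) {
            throw new IllegalArgumentException("Unknown digest algorithm: " + algorithm, e);
        }
    }
}
```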
diff --git a/src/main/java/edu/harvard/iq/dataverse/Dataverse.java b/src/main/java/edu/harvard/iq/dataverse/Dataverse.java
index 44f0a7a77b4..c1c1d26cc13 100644
--- a/src/main/java/edu/harvard/iq/dataverse/Dataverse.java
+++ b/src/main/java/edu/harvard/iq/dataverse/Dataverse.java
@@ -45,8 +45,7 @@
@NamedQuery(name = "Dataverse.filterByAliasNameAffiliation", query="SELECT dv FROM Dataverse dv WHERE (LOWER(dv.alias) LIKE :alias) OR (LOWER(dv.name) LIKE :name) OR (LOWER(dv.affiliation) LIKE :affiliation) order by dv.alias")
})
@Entity
-@Table(indexes = {@Index(columnList="fk_dataverse_id")
- , @Index(columnList="defaultcontributorrole_id")
+@Table(indexes = {@Index(columnList="defaultcontributorrole_id")
, @Index(columnList="defaulttemplate_id")
, @Index(columnList="alias")
, @Index(columnList="affiliation")
diff --git a/src/main/java/edu/harvard/iq/dataverse/DataverseEntity.java b/src/main/java/edu/harvard/iq/dataverse/DataverseEntity.java
new file mode 100644
index 00000000000..0e41cfc04fa
--- /dev/null
+++ b/src/main/java/edu/harvard/iq/dataverse/DataverseEntity.java
@@ -0,0 +1,27 @@
+/*
+ * To change this license header, choose License Headers in Project Properties.
+ * To change this template file, choose Tools | Templates
+ * and open the template in the editor.
+ */
+package edu.harvard.iq.dataverse;
+
+
+/**
+ * This is a non-persistent superclass to be used by entities in Dataverse
+ * for any shared non-persistent properties; for example, "mergeable", which should
+ * be set to false when an entity is manually loaded through native queries
+ */
+public abstract class DataverseEntity {
+
+ private boolean mergeable = true;
+
+ public boolean isMergeable() {
+ return mergeable;
+ }
+
+ public void setMergeable(boolean mergeable) {
+ this.mergeable = mergeable;
+ }
+
+
+}
diff --git a/src/main/java/edu/harvard/iq/dataverse/DataverseHeaderFragment.java b/src/main/java/edu/harvard/iq/dataverse/DataverseHeaderFragment.java
index 0e3e9e6d2e9..a28db9367ba 100644
--- a/src/main/java/edu/harvard/iq/dataverse/DataverseHeaderFragment.java
+++ b/src/main/java/edu/harvard/iq/dataverse/DataverseHeaderFragment.java
@@ -88,6 +88,31 @@ public void initBreadcrumbs(DvObject dvObject) {
dvObject instanceof Dataset ? JH.localize("newDataset") : null );
}
}
+
+ public void initBreadcrumbsForFileMetadata(FileMetadata fmd) {
+ if (fmd == null) {
+ return;
+ }
+
+ breadcrumbs.clear();
+
+ //First Add regular breadcrumb for the data file
+ DvObject dvObject = fmd.getDataFile();
+ breadcrumbs.add(0, new Breadcrumb(dvObject, dvObject.getDisplayName()));
+
+ //Get the Dataset Owning the Datafile and add version to the breadcrumb
+ dvObject = dvObject.getOwner();
+ String optionalUrlExtension = "&version=" + fmd.getDatasetVersion().getSemanticVersion();
+ breadcrumbs.add(0, new Breadcrumb(dvObject, dvObject.getDisplayName(), optionalUrlExtension));
+
+ // now get Dataverse Owner of the dataset and proceed as usual
+ dvObject = dvObject.getOwner();
+ while (dvObject != null) {
+ breadcrumbs.add(0, new Breadcrumb(dvObject, dvObject.getDisplayName()));
+ dvObject = dvObject.getOwner();
+ }
+
+ }
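`initBreadcrumbsForFileMetadata` seeds the trail with the file and its dataset (the dataset crumb carrying a version query-string extension), then walks the owner chain upward, prepending each ancestor so the root dataverse ends up first. The owner walk in isolation, with hypothetical stand-in types:

```java
import java.util.ArrayList;
import java.util.List;

public class BreadcrumbWalk {
    // Minimal stand-in for the DvObject owner chain (illustrative only).
    record Node(String name, Node owner) {}

    // Prepend each ancestor's name so the root ends up first,
    // matching the add(0, ...) pattern in initBreadcrumbs.
    public static List<String> trail(Node leaf) {
        List<String> crumbs = new ArrayList<>();
        for (Node n = leaf; n != null; n = n.owner()) {
            crumbs.add(0, n.name());
        }
        return crumbs;
    }
}
```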
public Long getUnreadNotificationCount(Long userId){
@@ -222,8 +247,15 @@ public void addBreadcrumb (String linkString){
public static class Breadcrumb {
private final String breadcrumbText;
- private DvObject dvObject = null;
- private String url = null;
+ private DvObject dvObject = null;
+ private String url = null;
+ private String optionalUrlExtension = null;
+
+ public Breadcrumb( DvObject dvObject, String breadcrumbText, String optionalUrlExtension ) {
+ this.breadcrumbText = breadcrumbText;
+ this.dvObject = dvObject;
+ this.optionalUrlExtension = optionalUrlExtension;
+ }
public Breadcrumb( DvObject dvObject, String breadcrumbText) {
this.breadcrumbText = breadcrumbText;
@@ -250,6 +282,9 @@ public DvObject getDvObject() {
public String getUrl() {
return url;
}
-
+
+ public String getOptionalUrlExtension() {
+ return optionalUrlExtension;
+ }
}
}
\ No newline at end of file
diff --git a/src/main/java/edu/harvard/iq/dataverse/DvObject.java b/src/main/java/edu/harvard/iq/dataverse/DvObject.java
index 47d4878f4cc..1bae78189cb 100644
--- a/src/main/java/edu/harvard/iq/dataverse/DvObject.java
+++ b/src/main/java/edu/harvard/iq/dataverse/DvObject.java
@@ -31,7 +31,7 @@
, @Index(columnList="owner_id")
, @Index(columnList="creator_id")
, @Index(columnList="releaseuser_id")})
-public abstract class DvObject implements java.io.Serializable {
+public abstract class DvObject extends DataverseEntity implements java.io.Serializable {
public static final String DATAVERSE_DTYPE_STRING = "Dataverse";
public static final String DATASET_DTYPE_STRING = "Dataset";
@@ -64,12 +64,12 @@ public String visit(Dataverse dv) {
@Override
public String visit(Dataset ds) {
- return "[" + ds.getId() + " " + ds.getLatestVersion().getTitle() + "]";
+ return "[" + ds.getId() + (ds.getLatestVersion() != null ? " " + ds.getLatestVersion().getTitle() : "") + "]";
}
@Override
public String visit(DataFile df) {
- return "[" + df.getId() + " " + df.getFileMetadata().getLabel() + "]";
+ return "[" + df.getId() + (df.getFileMetadata() != null ? " " + df.getFileMetadata().getLabel() : "") + "]";
}
};
diff --git a/src/main/java/edu/harvard/iq/dataverse/EditDatafilesPage.java b/src/main/java/edu/harvard/iq/dataverse/EditDatafilesPage.java
index 851fc5a50cc..7a84672a936 100644
--- a/src/main/java/edu/harvard/iq/dataverse/EditDatafilesPage.java
+++ b/src/main/java/edu/harvard/iq/dataverse/EditDatafilesPage.java
@@ -6,34 +6,18 @@
import edu.harvard.iq.dataverse.authorization.AuthenticationServiceBean;
import edu.harvard.iq.dataverse.authorization.Permission;
-import edu.harvard.iq.dataverse.authorization.providers.builtin.BuiltinUserServiceBean;
-import edu.harvard.iq.dataverse.authorization.users.ApiToken;
-import edu.harvard.iq.dataverse.authorization.users.User;
import edu.harvard.iq.dataverse.authorization.users.AuthenticatedUser;
-import edu.harvard.iq.dataverse.datavariable.VariableServiceBean;
+import edu.harvard.iq.dataverse.dataaccess.ImageThumbConverter;
import edu.harvard.iq.dataverse.engine.command.Command;
import edu.harvard.iq.dataverse.engine.command.exception.CommandException;
-import edu.harvard.iq.dataverse.engine.command.impl.CreateDatasetCommand;
-import edu.harvard.iq.dataverse.engine.command.impl.CreateGuestbookResponseCommand;
-import edu.harvard.iq.dataverse.engine.command.impl.DeaccessionDatasetVersionCommand;
import edu.harvard.iq.dataverse.engine.command.impl.DeleteDataFileCommand;
-import edu.harvard.iq.dataverse.engine.command.impl.DeleteDatasetVersionCommand;
-import edu.harvard.iq.dataverse.engine.command.impl.DestroyDatasetCommand;
-import edu.harvard.iq.dataverse.engine.command.impl.LinkDatasetCommand;
-import edu.harvard.iq.dataverse.engine.command.impl.PublishDatasetCommand;
-import edu.harvard.iq.dataverse.engine.command.impl.PublishDataverseCommand;
import edu.harvard.iq.dataverse.engine.command.impl.UpdateDatasetCommand;
import edu.harvard.iq.dataverse.ingest.IngestRequest;
import edu.harvard.iq.dataverse.ingest.IngestServiceBean;
-import edu.harvard.iq.dataverse.metadataimport.ForeignMetadataImportServiceBean;
-import edu.harvard.iq.dataverse.search.FacetCategory;
+import edu.harvard.iq.dataverse.ingest.IngestUtil;
import edu.harvard.iq.dataverse.search.FileView;
-import edu.harvard.iq.dataverse.search.SearchFilesServiceBean;
-import edu.harvard.iq.dataverse.search.SolrSearchResult;
-import edu.harvard.iq.dataverse.search.SortBy;
import edu.harvard.iq.dataverse.settings.SettingsServiceBean;
-import edu.harvard.iq.dataverse.util.BundleUtil;
-import edu.harvard.iq.dataverse.util.FileSortFieldAndOrder;
+import edu.harvard.iq.dataverse.util.FileUtil;
import edu.harvard.iq.dataverse.util.JsfHelper;
import static edu.harvard.iq.dataverse.util.JsfHelper.JH;
import edu.harvard.iq.dataverse.util.StringUtil;
@@ -44,10 +28,8 @@
import java.io.InputStream;
import java.io.StringReader;
import java.nio.file.Files;
-import java.nio.file.Path;
import java.nio.file.Paths;
import java.sql.Timestamp;
-import java.text.SimpleDateFormat;
import java.util.ArrayList;
import java.util.Date;
import java.util.HashMap;
@@ -61,7 +43,6 @@
import javax.faces.application.FacesMessage;
import javax.faces.context.FacesContext;
import javax.faces.event.ActionEvent;
-import javax.faces.event.ValueChangeEvent;
import javax.faces.view.ViewScoped;
import javax.inject.Inject;
import javax.inject.Named;
@@ -71,16 +52,11 @@
import javax.json.JsonObject;
import javax.json.JsonArray;
import javax.json.JsonReader;
-import javax.servlet.ServletOutputStream;
-import javax.servlet.http.HttpServletResponse;
-import javax.validation.ConstraintViolation;
import org.apache.commons.httpclient.HttpClient;
import org.apache.commons.httpclient.methods.GetMethod;
-import org.primefaces.context.RequestContext;
import java.text.DateFormat;
import java.util.Arrays;
import java.util.HashSet;
-import javax.faces.model.SelectItem;
import java.util.logging.Level;
import javax.faces.event.AjaxBehaviorEvent;
@@ -135,12 +111,13 @@ public enum FileEditMode {
private String selectedFileIdsString = null;
private FileEditMode mode = FileEditMode.EDIT;
private List selectedFileIdsList = new ArrayList<>();
- private List<FileMetadata> fileMetadatas = new ArrayList<>();
+ private List<FileMetadata> fileMetadatas = new ArrayList<>();;
private Long ownerId;
private Long versionId;
- private List<DataFile> newFiles = new ArrayList();
+ private List<DataFile> newFiles = new ArrayList<>();;
+ private List<DataFile> uploadedFiles = new ArrayList<>();;
private DatasetVersion workingVersion;
private String dropBoxSelection = "";
private String displayCitation;
@@ -149,13 +126,19 @@ public enum FileEditMode {
private String persistentId;
+ private String versionString = "";
+
+
private boolean saveEnabled = false;
// Used to store results of permissions checks
private final Map datasetPermissionMap = new HashMap<>(); // { Permission human_name : Boolean }
private Long maxFileUploadSizeInBytes = null;
-
+ private Integer multipleUploadFilesLimit = null;
+
+ private final int NUMBER_OF_SCROLL_ROWS = 25;
+
public String getSelectedFileIds() {
return selectedFileIdsString;
}
@@ -173,11 +156,15 @@ public void setMode(FileEditMode mode) {
}
public List<FileMetadata> getFileMetadatas() {
- if (fileMetadatas != null) {
- logger.fine("Returning a list of "+fileMetadatas.size()+" file metadatas.");
- } else {
- logger.fine("File metadatas list hasn't been initialized yet.");
- }
+ // [experimental]
+ // this would be a way to hide any already-uploaded files from the page
+ // while a new upload is happening:
+ // (the uploadStarted button on the page needs the update="filesTable"
+ // attribute added for this to work)
+ //if (uploadInProgress) {
+ // return null;
+ //}
+
return fileMetadatas;
}
@@ -185,6 +172,54 @@ public void setFileMetadatas(List<FileMetadata> fileMetadatas) {
this.fileMetadatas = fileMetadatas;
}
+ /*
+ The 2 methods below are for setting up the PrimeFaces:dataTable component
+ used to display the uploaded files, or the files selected for editing.
+
+ - isScrollable():
+ this supplies the value of the component attribute "scrollable".
+ When we have more than NUMBER_OF_SCROLL_ROWS worth of files (currently
+ set to 25), we will add a scroller to the table, showing NUMBER_OF_SCROLL_ROWS
+ at a time; thus making the page a little bit more usable.
+ When there are fewer rows, however, the attribute needs to be set to
+ "false" - because otherwise some (idiosyncratic) amount of white space
+ is added to the bottom of the table, making the page look silly.
+
+ - getScrollHeightPercentage():
+ this method calculates the *percentage* of the total length of the
+ list of files, such that the resulting table is always NUMBER_OF_SCROLL_ROWS
+ high. This is *the only way* to keep the number of files shown in the
+ table fixed as the size of the list grows! (the "scrollRows" attribute
+ of the p:dataTable component only applies when "liveScroll=true" is being
+ used).
+ */
+
+ public boolean isScrollable() {
+ if (fileMetadatas == null || fileMetadatas.size() <= NUMBER_OF_SCROLL_ROWS + 1) {
+ return false;
+ }
+
+ return true;
+ }
+
+ public String getScrollHeightPercentage() {
+ int perc;
+ if (fileMetadatas == null || fileMetadatas.size() < NUMBER_OF_SCROLL_ROWS) {
+ perc = 100;
+ } else {
+ perc = NUMBER_OF_SCROLL_ROWS * 100 / fileMetadatas.size();
+ }
+
+ if (perc == 0) {
+ perc = 1;
+ } else if (perc > 100) {
+ perc = 100;
+ }
+
+ logger.info("scroll height percentage: "+perc);
+ return perc + "%";
+ }
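The integer arithmetic in `getScrollHeightPercentage()` can be checked in isolation; a self-contained sketch of the same computation (constant copied from the page bean, class name hypothetical):

```java
public class ScrollHeight {
    static final int NUMBER_OF_SCROLL_ROWS = 25;

    // Same arithmetic as getScrollHeightPercentage(): the percentage of the
    // total list height that keeps roughly NUMBER_OF_SCROLL_ROWS rows visible.
    public static String scrollHeightPercentage(int listSize) {
        int perc;
        if (listSize < NUMBER_OF_SCROLL_ROWS) {
            perc = 100;
        } else {
            perc = NUMBER_OF_SCROLL_ROWS * 100 / listSize;
        }
        // Clamp: integer division can hit 0 on very long lists
        if (perc == 0) {
            perc = 1;
        } else if (perc > 100) {
            perc = 100;
        }
        return perc + "%";
    }
}
```

Note how the clamp to 1% matters: past 2,500 files the integer division would otherwise return "0%" and collapse the table.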
+
/*
Any settings, such as the upload size limits, should be saved locally -
so that the db doesn't get hit repeatedly. (this setting is initialized
@@ -193,11 +228,11 @@ in the init() method)
This may be "null", signifying unlimited download size.
*/
- public Long getMaxFileUploadSizeInBytes(){
+ public Long getMaxFileUploadSizeInBytes() {
return this.maxFileUploadSizeInBytes;
}
- public boolean isUnlimitedUploadFileSize(){
+ public boolean isUnlimitedUploadFileSize() {
if (this.maxFileUploadSizeInBytes == null){
return true;
@@ -205,7 +240,14 @@ public boolean isUnlimitedUploadFileSize(){
return false;
}
-
+ /*
+ The number of files the GUI user is allowed to upload in one batch,
+ via drag-and-drop, or through the file select dialog. Now configurable
+ in the Settings table.
+ */
+ public Integer getMaxNumberOfFiles() {
+ return this.multipleUploadFilesLimit;
+ }
/**
* Check Dataset related permissions
*
@@ -327,10 +369,14 @@ public String initCreateMode(String modeToken, DatasetVersion version, List();
- /*
- protocol = settingsService.getValueForKey(SettingsServiceBean.Key.Protocol, nonNullDefaultIfKeyNotFound);
- authority = settingsService.getValueForKey(SettingsServiceBean.Key.Authority, nonNullDefaultIfKeyNotFound);
- separator = settingsService.getValueForKey(SettingsServiceBean.Key.DoiSeparator, nonNullDefaultIfKeyNotFound);
- */
+ newFiles = new ArrayList();
+ uploadedFiles = new ArrayList();
+
+ this.maxFileUploadSizeInBytes = systemConfig.getMaxFileUploadSize();
+ this.multipleUploadFilesLimit = systemConfig.getMultipleUploadFilesLimit();
if (dataset.getId() != null){
// Set Working Version and Dataset by Datasaet Id and Version
@@ -416,6 +461,13 @@ public String init() {
if (fileMetadatas.size() < 1) {
return permissionsWrapper.notFound();
}
+
+ if (FileEditMode.SINGLE == mode){
+ if (fileMetadatas.get(0).getDatasetVersion().getId() != null){
+ versionString = "DRAFT";
+ }
+ }
+
}
saveEnabled = true;
@@ -744,10 +796,10 @@ public void deleteFiles() {
// local filesystem:
try {
- Files.delete(Paths.get(ingestService.getFilesTempDirectory() + "/" + dfn.getStorageIdentifier()));
+ Files.delete(Paths.get(FileUtil.getFilesTempDirectory() + "/" + dfn.getStorageIdentifier()));
} catch (IOException ioEx) {
// safe to ignore - it's just a temp file.
- logger.warning("Failed to delete temporary file " + ingestService.getFilesTempDirectory() + "/" + dfn.getStorageIdentifier());
+ logger.warning("Failed to delete temporary file " + FileUtil.getFilesTempDirectory() + "/" + dfn.getStorageIdentifier());
}
dfIt.remove();
@@ -988,7 +1040,8 @@ public String save() {
// the individual File Landing page, we want to redirect back to
// the landing page. BUT ONLY if the file still exists - i.e., if
// the user hasn't just deleted it!
- return returnToFileLandingPage(fileMetadatas.get(0).getId());
+ versionString = "DRAFT";
+ return returnToFileLandingPage();
}
//if (newDraftVersion) {
@@ -1009,28 +1062,37 @@ private String returnToDraftVersion(){
return "/dataset.xhtml?persistentId=" + dataset.getGlobalId() + "&version=DRAFT&faces-redirect=true";
}
- private String returnToDraftVersionById() {
- return "/dataset.xhtml?versionId=" + workingVersion.getId() + "&faces-redirect=true";
- }
-
private String returnToDatasetOnly(){
dataset = datasetService.find(dataset.getId());
return "/dataset.xhtml?persistentId=" + dataset.getGlobalId() + "&faces-redirect=true";
}
- private String returnToFileLandingPage(Long fileId) {
- return "/file.xhtml?fileId=" + fileId + "&datasetVersionId=" + workingVersion.getId() + "&faces-redirect=true";
+ private String returnToFileLandingPage() {
+
+ Long fileId = fileMetadatas.get(0).getDataFile().getId();
+ if (versionString.equals("DRAFT")){
+ return "/file.xhtml?fileId=" + fileId + "&version=DRAFT&faces-redirect=true";
+ }
+ return "/file.xhtml?fileId=" + fileId + "&faces-redirect=true";
+
}
public String cancel() {
+ if (mode == FileEditMode.SINGLE) {
+ return returnToFileLandingPage();
+ }
if (workingVersion.getId() != null) {
return returnToDraftVersion();
}
- return returnToDatasetOnly();
+ return returnToDatasetOnly();
}
+
+ /* deprecated; super inefficient, when called repeatedly on a long list
+ of files!
+ leaving the code here, commented out, for illustration purposes. -- 4.6
public boolean isDuplicate(FileMetadata fileMetadata) {
- String thisMd5 = fileMetadata.getDataFile().getmd5();
+ String thisMd5 = fileMetadata.getDataFile().getChecksumValue();
if (thisMd5 == null) {
return false;
}
@@ -1040,7 +1102,7 @@ public boolean isDuplicate(FileMetadata fileMetadata) {
// TODO:
// think of a way to do this that doesn't involve populating this
// map for every file on the page?
- // man not be that much of a problem, if we paginate and never display
+ // may not be that much of a problem, if we paginate and never display
// more than a certain number of files... Still, needs to be revisited
// before the final 4.0.
// -- L.A. 4.0
@@ -1052,7 +1114,7 @@ public boolean isDuplicate(FileMetadata fileMetadata) {
while (fmIt.hasNext()) {
FileMetadata fm = fmIt.next();
- String md5 = fm.getDataFile().getmd5();
+ String md5 = fm.getDataFile().getChecksumValue();
if (md5 != null) {
if (MD5Map.get(md5) != null) {
MD5Map.put(md5, MD5Map.get(md5).intValue() + 1);
@@ -1063,15 +1125,15 @@ public boolean isDuplicate(FileMetadata fileMetadata) {
}
return MD5Map.get(thisMd5) != null && MD5Map.get(thisMd5).intValue() > 1;
- }
-
+ }*/
+
private HttpClient getClient() {
// TODO:
// cache the http client? -- L.A. 4.0 alpha
return new HttpClient();
}
- public boolean showFileUploadFileComponent(){
+ public boolean showFileUploadFileComponent() {
if (mode == FileEditMode.UPLOAD || mode == FileEditMode.CREATE) {
return true;
@@ -1079,7 +1141,6 @@ public boolean showFileUploadFileComponent(){
return false;
}
-
/**
* Download a file from drop box
@@ -1123,6 +1184,8 @@ private InputStream getDropBoxInputStream(String fileLink, GetMethod dropBoxMeth
public void handleDropBoxUpload(ActionEvent event) {
logger.fine("handleDropBoxUpload");
+ uploadComponentId = event.getComponent().getClientId();
+
// -----------------------------------------------------------
// Read JSON object from the output of the DropBox Chooser:
// -----------------------------------------------------------
@@ -1135,6 +1198,7 @@ public void handleDropBoxUpload(ActionEvent event) {
// -----------------------------------------------------------
DataFile dFile = null;
GetMethod dropBoxMethod = null;
+ String localWarningMessage = null;
for (int i = 0; i < dbArray.size(); i++) {
JsonObject dbObject = dbArray.getJsonObject(i);
@@ -1153,10 +1217,15 @@ public void handleDropBoxUpload(ActionEvent event) {
- Max size NOT specified in db: default is unlimited
- Max size specified in db: check to make sure file is within limits
// ---------------------------- */
- if ((!this.isUnlimitedUploadFileSize())&&(fileSize > this.getMaxFileUploadSizeInBytes())){
+ if ((!this.isUnlimitedUploadFileSize()) && (fileSize > this.getMaxFileUploadSizeInBytes())) {
String warningMessage = "Dropbox file \"" + fileName + "\" exceeded the limit of " + fileSize + " bytes and was not uploaded.";
//msg(warningMessage);
- FacesContext.getCurrentInstance().addMessage(event.getComponent().getClientId(), new FacesMessage(FacesMessage.SEVERITY_ERROR, "upload failure", warningMessage));
+ //FacesContext.getCurrentInstance().addMessage(event.getComponent().getClientId(), new FacesMessage(FacesMessage.SEVERITY_ERROR, "upload failure", warningMessage));
+ if (localWarningMessage == null) {
+ localWarningMessage = warningMessage;
+ } else {
+ localWarningMessage = localWarningMessage.concat("; " + warningMessage);
+ }
continue; // skip to next file, and add error message
}
@@ -1179,8 +1248,11 @@ public void handleDropBoxUpload(ActionEvent event) {
// Send it through the ingest service
// -----------------------------------------------------------
try {
- // Note: A single file may be unzipped into multiple files
- datafiles = ingestService.createDataFiles(workingVersion, dropBoxStream, fileName, "application/octet-stream");
+ // Note: A single uploaded file may produce multiple datafiles -
+ // for example, multiple files can be extracted from an uncompressed
+ // zip file.
+ //datafiles = ingestService.createDataFiles(workingVersion, dropBoxStream, fileName, "application/octet-stream");
+ datafiles = FileUtil.createDataFiles(workingVersion, dropBoxStream, fileName, "application/octet-stream", systemConfig);
} catch (IOException ex) {
this.logger.log(Level.SEVERE, "Error during ingest of DropBox file {0} from link {1}", new Object[]{fileName, fileLink});
@@ -1211,38 +1283,126 @@ public void handleDropBoxUpload(ActionEvent event) {
// -----------------------------------------------------------
// Check if there are duplicate files or ingest warnings
// -----------------------------------------------------------
- String warningMessage = processUploadedFileList(datafiles);
- logger.fine("Warning message during upload: " + warningMessage);
- if (warningMessage != null){
+ uploadWarningMessage = processUploadedFileList(datafiles);
+ logger.fine("Warning message during upload: " + uploadWarningMessage);
+ /*if (warningMessage != null){
logger.fine("trying to send faces message to " + event.getComponent().getClientId());
FacesContext.getCurrentInstance().addMessage(event.getComponent().getClientId(), new FacesMessage(FacesMessage.SEVERITY_ERROR, "upload failure", warningMessage));
- }
+ if (uploadWarningMessage == null) {
+ uploadWarningMessage = warningMessage;
+ } else {
+ uploadWarningMessage = uploadWarningMessage.concat("; "+warningMessage);
+ }
+ }*/
+ }
+ }
+
+ if (localWarningMessage != null) {
+ if (uploadWarningMessage == null) {
+ uploadWarningMessage = localWarningMessage;
+ } else {
+ uploadWarningMessage = localWarningMessage.concat("; " + uploadWarningMessage);
}
}
}
+
+ public void uploadStarted() {
+ // uploadStarted() is triggered by the PrimeFaces upload component;
+ uploadInProgress = false;
+ // refresh the warning message below the upload component, if exists:
+ if (uploadWarningMessage != null && uploadComponentId != null) {
+ FacesContext.getCurrentInstance().addMessage(uploadComponentId, new FacesMessage(FacesMessage.SEVERITY_ERROR, "upload warning", uploadWarningMessage));
+ }
+
+ // We clear the following duplicate warning labels, because we want to
+ // only inform the user of the duplicates dropped in the current upload
+ // attempt - for ex., one batch of drag-and-dropped files, or a single
+ // file uploaded through the file chooser.
+
+ dupeFileNamesExisting = null;
+ dupeFileNamesNew = null;
+ multipleDupesExisting = false;
+ multipleDupesNew = false;
+ uploadWarningMessage = null;
+
+ }
+ private String uploadWarningMessage = null;
+ private String uploadComponentId = null;
public void handleFileUpload(FileUploadEvent event) {
+ if (!uploadInProgress) {
+ uploadInProgress = true;
+ }
UploadedFile uFile = event.getFile();
List<DataFile> dFileList = null;
-
try {
- // Note: A single file may be unzipped into multiple files
- dFileList = ingestService.createDataFiles(workingVersion, uFile.getInputstream(), uFile.getFileName(), uFile.getContentType());
+ // Note: A single uploaded file may produce multiple datafiles -
+ // for example, multiple files can be extracted from an uncompressed
+ // zip file.
+ dFileList = FileUtil.createDataFiles(workingVersion, uFile.getInputstream(), uFile.getFileName(), uFile.getContentType(), systemConfig);
+
} catch (IOException ioex) {
logger.warning("Failed to process and/or save the file " + uFile.getFileName() + "; " + ioex.getMessage());
return;
}
// -----------------------------------------------------------
- // Check if there are duplicate files or ingest warnings
+ // These raw datafiles are then post-processed, in order to drop any files
+ // already in the dataset/already uploaded, and to correct duplicate file names, etc.
// -----------------------------------------------------------
String warningMessage = processUploadedFileList(dFileList);
+
if (warningMessage != null){
- logger.fine("trying to send faces message to " + event.getComponent().getClientId());
- FacesContext.getCurrentInstance().addMessage(event.getComponent().getClientId(), new FacesMessage(FacesMessage.SEVERITY_ERROR, "upload failure", warningMessage));
+ uploadWarningMessage = warningMessage;
+ FacesContext.getCurrentInstance().addMessage(event.getComponent().getClientId(), new FacesMessage(FacesMessage.SEVERITY_ERROR, "upload warning", warningMessage));
+ // save the component id of the p:upload widget, so that we could
+ // send an info message there, from elsewhere in the code:
+ uploadComponentId = event.getComponent().getClientId();
}
}
@@ -1251,88 +1411,125 @@ public void handleFileUpload(FileUploadEvent event) {
* check the list of DataFile objects
* @param dFileList
*/
- private String processUploadedFileList(List<DataFile> dFileList){
+
+ private String dupeFileNamesExisting = null;
+ private String dupeFileNamesNew = null;
+ private boolean multipleDupesExisting = false;
+ private boolean multipleDupesNew = false;
+ private boolean uploadInProgress = false;
+
+ private String processUploadedFileList(List<DataFile> dFileList) {
+ if (dFileList == null) {
+ return null;
+ }
- DataFile dataFile = null;
- String duplicateFileNames = null;
- boolean multipleFiles = dFileList.size() > 1;
- boolean multipleDupes = false;
+ DataFile dataFile;
String warningMessage = null;
// -----------------------------------------------------------
// Iterate through list of DataFile objects
// -----------------------------------------------------------
- if (dFileList != null) {
- for (int i = 0; i < dFileList.size(); i++) {
- dataFile = dFileList.get(i);
+ for (int i = 0; i < dFileList.size(); i++) {
+ dataFile = dFileList.get(i);
- //logger.info("dataFile: " + dataFile);
-
- // -----------------------------------------------------------
- // Check for ingest warnings
- // -----------------------------------------------------------
- if (dataFile.isIngestProblem()) {
- if (dataFile.getIngestReportMessage() != null) {
- if (warningMessage == null) {
- warningMessage = dataFile.getIngestReportMessage();
- } else {
- warningMessage = warningMessage.concat("; " + dataFile.getIngestReportMessage());
- }
+ // -----------------------------------------------------------
+ // Check for ingest warnings
+ // -----------------------------------------------------------
+ if (dataFile.isIngestProblem()) {
+ if (dataFile.getIngestReportMessage() != null) {
+ if (warningMessage == null) {
+ warningMessage = dataFile.getIngestReportMessage();
+ } else {
+ warningMessage = warningMessage.concat("; " + dataFile.getIngestReportMessage());
}
- dataFile.setIngestDone();
}
+ dataFile.setIngestDone();
+ }
- // -----------------------------------------------------------
- // Check for duplicates -- e.g. file is already in the dataset
- // -----------------------------------------------------------
- if (!isDuplicate(dataFile.getFileMetadata())) {
- newFiles.add(dataFile); // looks good
- fileMetadatas.add(dataFile.getFileMetadata());
+ // -----------------------------------------------------------
+ // Check for duplicates -- e.g. file is already in the dataset,
+ // or if another file with the same checksum has already been
+ // uploaded.
+ // -----------------------------------------------------------
+ if (isFileAlreadyInDataset(dataFile)) {
+ if (dupeFileNamesExisting == null) {
+ dupeFileNamesExisting = dataFile.getFileMetadata().getLabel();
} else {
- if (duplicateFileNames == null) {
- duplicateFileNames = dataFile.getFileMetadata().getLabel();
- } else {
- duplicateFileNames = duplicateFileNames.concat(", " + dataFile.getFileMetadata().getLabel());
- multipleDupes = true;
- }
+ dupeFileNamesExisting = dupeFileNamesExisting.concat(", " + dataFile.getFileMetadata().getLabel());
+ multipleDupesExisting = true;
+ }
+ // skip
+ } else if (isFileAlreadyUploaded(dataFile)) {
+ if (dupeFileNamesNew == null) {
+ dupeFileNamesNew = dataFile.getFileMetadata().getLabel();
+ } else {
+ dupeFileNamesNew = dupeFileNamesNew.concat(", " + dataFile.getFileMetadata().getLabel());
+ multipleDupesNew = true;
+ }
+ // skip
+ } else {
+ // OK, this one is not a duplicate, we want it.
+ // But let's check if its filename is a duplicate of another
+ // file already uploaded, or already in the dataset:
+ dataFile.getFileMetadata().setLabel(duplicateFilenameCheck(dataFile.getFileMetadata()));
+ if (isTemporaryPreviewAvailable(dataFile.getStorageIdentifier(), dataFile.getContentType())) {
+ dataFile.setPreviewImageAvailable(true);
+ }
+ uploadedFiles.add(dataFile);
+ // We are NOT adding the fileMetadata to the list that is being used
+ // to render the page; we'll do that once we know that all the individual uploads
+ // in this batch (as in, a bunch of drag-and-dropped files) have finished.
+ //fileMetadatas.add(dataFile.getFileMetadata());
+ }
- // remove the file from the dataset (since createDataFiles has already linked
- // it to the dataset!
- // first, through the filemetadata list, then through tht datafiles list:
- Iterator fmIt = dataset.getEditVersion().getFileMetadatas().iterator();
- while (fmIt.hasNext()) {
- FileMetadata fm = fmIt.next();
- if (fm.getId() == null && dataFile.getStorageIdentifier().equals(fm.getDataFile().getStorageIdentifier())) {
- fmIt.remove();
- break;
- }
+ /*
+ preserved old, pre-4.6 code - mainly as an illustration of how we used to do this.
+
+ if (!isDuplicate(dataFile.getFileMetadata())) {
+ newFiles.add(dataFile); // looks good
+ fileMetadatas.add(dataFile.getFileMetadata());
+ } else {
+ if (duplicateFileNames == null) {
+ duplicateFileNames = dataFile.getFileMetadata().getLabel();
+ } else {
+ duplicateFileNames = duplicateFileNames.concat(", " + dataFile.getFileMetadata().getLabel());
+ multipleDupes = true;
+ }
+
+ // remove the file from the dataset (since createDataFiles has already linked
+ // it to the dataset!
+ // first, through the filemetadata list, then through the datafiles list:
+ Iterator fmIt = dataset.getEditVersion().getFileMetadatas().iterator();
+ while (fmIt.hasNext()) {
+ FileMetadata fm = fmIt.next();
+ if (fm.getId() == null && dataFile.getStorageIdentifier().equals(fm.getDataFile().getStorageIdentifier())) {
+ fmIt.remove();
+ break;
}
+ }
- Iterator dfIt = dataset.getFiles().iterator();
- while (dfIt.hasNext()) {
- DataFile dfn = dfIt.next();
- if (dfn.getId() == null && dataFile.getStorageIdentifier().equals(dfn.getStorageIdentifier())) {
- dfIt.remove();
- break;
- }
+ Iterator dfIt = dataset.getFiles().iterator();
+ while (dfIt.hasNext()) {
+ DataFile dfn = dfIt.next();
+ if (dfn.getId() == null && dataFile.getStorageIdentifier().equals(dfn.getStorageIdentifier())) {
+ dfIt.remove();
+ break;
}
}
- }
+ } */
}
// -----------------------------------------------------------
- // Formate error message for duplicate files
+ // Format error message for duplicate files
+ // (note the separate messages for the files already in the dataset,
+ // and the newly uploaded ones)
// -----------------------------------------------------------
- if (duplicateFileNames != null) {
+ if (dupeFileNamesExisting != null) {
String duplicateFilesErrorMessage = null;
- if (multipleDupes) {
- duplicateFilesErrorMessage = "The following files already exist in the dataset: " + duplicateFileNames;
+ if (multipleDupesExisting) {
+ duplicateFilesErrorMessage = "The following files already exist in the dataset: " + dupeFileNamesExisting + " (skipping)";
} else {
- if (multipleFiles) {
- duplicateFilesErrorMessage = "The following file already exists in the dataset: " + duplicateFileNames;
- } else {
- duplicateFilesErrorMessage = "This file already exists in this dataset. Please upload another file.";
- }
+ duplicateFilesErrorMessage = "The following file already exists in the dataset: " + dupeFileNamesExisting;
}
if (warningMessage == null) {
warningMessage = duplicateFilesErrorMessage;
@@ -1340,16 +1537,115 @@ private String processUploadedFileList(List dFileList){
warningMessage = warningMessage.concat("; " + duplicateFilesErrorMessage);
}
}
-
+
+ if (dupeFileNamesNew != null) {
+ String duplicateFilesErrorMessage = null;
+ if (multipleDupesNew) {
+ duplicateFilesErrorMessage = "The following files are duplicates of (an) already uploaded file(s): " + dupeFileNamesNew + " (skipping)";
+ } else {
+ duplicateFilesErrorMessage = "The following file is a duplicate of an already uploaded file: " + dupeFileNamesNew + " (skipping)";
+ }
+
+ if (warningMessage == null) {
+ warningMessage = duplicateFilesErrorMessage;
+ } else {
+ warningMessage = warningMessage.concat("; " + duplicateFilesErrorMessage);
+ }
+ }
+
if (warningMessage != null) {
logger.severe(warningMessage);
- return warningMessage; // there's an issue return error message
- } else {
- return null; // looks good, return null
+ return warningMessage;
+ }
+
+ return null;
+ }
+
+ public boolean isTemporaryPreviewAvailable(String fileSystemId, String mimeType) {
+
+ String filesRootDirectory = System.getProperty("dataverse.files.directory");
+ if (filesRootDirectory == null || filesRootDirectory.equals("")) {
+ filesRootDirectory = "/tmp/files";
+ }
+
+ String fileSystemName = filesRootDirectory + "/temp/" + fileSystemId;
+
+ String imageThumbFileName = null;
+
+ if ("application/pdf".equals(mimeType)) {
+ imageThumbFileName = ImageThumbConverter.generatePDFThumb(fileSystemName);
+ } else if (mimeType != null && mimeType.startsWith("image/")) {
+ imageThumbFileName = ImageThumbConverter.generateImageThumb(fileSystemName);
}
+
+ if (imageThumbFileName != null) {
+ return true;
+ }
+
+ return false;
+ }
+
+ private Set fileLabelsExisting = null;
+
+ private String duplicateFilenameCheck(FileMetadata fileMetadata) {
+ if (fileLabelsExisting == null) {
+ fileLabelsExisting = IngestUtil.existingPathNamesAsSet(workingVersion);
+ }
+
+ return IngestUtil.duplicateFilenameCheck(fileMetadata, fileLabelsExisting);
}
+
+ private Map checksumMapOld = null; // checksums of the files already in the dataset
+ private Map checksumMapNew = null; // checksums of the new files already uploaded
+ private void initChecksumMap() {
+ checksumMapOld = new HashMap<>();
+
+ Iterator fmIt = workingVersion.getFileMetadatas().iterator();
+
+ while (fmIt.hasNext()) {
+ FileMetadata fm = fmIt.next();
+ if (fm.getId() != null && fm.getDataFile() != null) {
+ String chksum = fm.getDataFile().getChecksumValue();
+ if (chksum != null) {
+ checksumMapOld.put(chksum, 1);
+
+ }
+ }
+ }
+ }
+
+ private boolean isFileAlreadyInDataset(DataFile dataFile) {
+ if (checksumMapOld == null) {
+ initChecksumMap();
+ }
+
+ String chksum = dataFile.getChecksumValue();
+
+ return chksum == null ? false : checksumMapOld.get(chksum) != null;
+ }
+
+ private boolean isFileAlreadyUploaded(DataFile dataFile) {
+ if (checksumMapNew == null) {
+ checksumMapNew = new HashMap<>();
+ }
+
+ String chksum = dataFile.getChecksumValue();
+
+ if (chksum == null) {
+ return false;
+ }
+
+ if (checksumMapNew.get(chksum) != null) {
+ return true;
+ }
+
+ checksumMapNew.put(chksum, 1);
+ return false;
+ }
+
+
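The checksum maps above implement a two-level dedupe: a file is skipped if its checksum matches one already in the dataset, or one already claimed by an earlier upload in the same batch. A standalone sketch of that logic follows (class and method names are illustrative, not part of the Dataverse codebase; a `Set` suffices where the page bean uses a `Map`, since only membership matters):

```java
import java.util.HashSet;
import java.util.Set;

// Minimal sketch of the checksum-based duplicate detection above.
class ChecksumDedupe {
    private final Set<String> existing;                 // checksums already in the dataset
    private final Set<String> uploaded = new HashSet<>(); // checksums seen in this batch

    ChecksumDedupe(Set<String> existingChecksums) {
        this.existing = existingChecksums;
    }

    /** Returns true if the file should be kept; false if it is a duplicate. */
    boolean accept(String checksum) {
        if (checksum == null) {
            return true; // no checksum: we cannot prove it is a duplicate
        }
        if (existing.contains(checksum)) {
            return false; // same content is already in the dataset
        }
        // Set.add() returns false if this batch already claimed the checksum
        return uploaded.add(checksum);
    }
}
```

Note how `accept(null)` returns true, mirroring the bean's behavior of never flagging a file without a checksum as a duplicate.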
public boolean isLocked() {
if (dataset != null) {
logger.fine("checking lock status of dataset " + dataset.getId());
@@ -1849,30 +2145,6 @@ public void saveAdvancedOptions() {
fileMetadataSelectedForIngestOptionsPopup = null;
}
- public String getFileDateToDisplay(FileMetadata fileMetadata) {
- Date fileDate = null;
- DataFile datafile = fileMetadata.getDataFile();
- if (datafile != null) {
- boolean fileHasBeenReleased = datafile.isReleased();
- if (fileHasBeenReleased) {
- Timestamp filePublicationTimestamp = datafile.getPublicationDate();
- if (filePublicationTimestamp != null) {
- fileDate = filePublicationTimestamp;
- }
- } else {
- Timestamp fileCreateTimestamp = datafile.getCreateDate();
- if (fileCreateTimestamp != null) {
- fileDate = fileCreateTimestamp;
- }
- }
- }
- if (fileDate != null) {
- return displayDateFormat.format(fileDate);
- }
-
- return "";
- }
-
private void populateFileMetadatas() {
if (selectedFileIdsList != null) {
diff --git a/src/main/java/edu/harvard/iq/dataverse/FileDownloadHelper.java b/src/main/java/edu/harvard/iq/dataverse/FileDownloadHelper.java
new file mode 100644
index 00000000000..d4035314b8f
--- /dev/null
+++ b/src/main/java/edu/harvard/iq/dataverse/FileDownloadHelper.java
@@ -0,0 +1,186 @@
+/*
+ * To change this license header, choose License Headers in Project Properties.
+ * To change this template file, choose Tools | Templates
+ * and open the template in the editor.
+ */
+package edu.harvard.iq.dataverse;
+
+import edu.harvard.iq.dataverse.authorization.Permission;
+import edu.harvard.iq.dataverse.authorization.users.GuestUser;
+import java.util.HashMap;
+import java.util.Map;
+import javax.ejb.EJB;
+import javax.faces.view.ViewScoped;
+import javax.inject.Inject;
+import javax.inject.Named;
+
+/**
+ * @author skraffmi
+ */
+ @ViewScoped
+ @Named
+public class FileDownloadHelper implements java.io.Serializable {
+
+ @Inject
+ DataverseSession session;
+
+ @EJB
+ PermissionServiceBean permissionService;
+
+
+ private final Map fileDownloadPermissionMap = new HashMap<>(); // { FileMetadata.id : Boolean }
+
+
+
+
+ /**
+ * Checks whether the current session user may download the given file.
+ * The answer is computed once per FileMetadata id and cached in
+ * fileDownloadPermissionMap.
+ *
+ * Unrestricted files in released versions may always be downloaded;
+ * restricted files require a non-guest user with the DownloadFile
+ * permission; deaccessioned versions require the EditDataset permission.
+ *
+ * @param fileMetadata the metadata of the file to check
+ * @return true if the session user may download the file
+ */
+ public boolean canDownloadFile(FileMetadata fileMetadata){
+ if (fileMetadata == null){
+ return false;
+ }
+
+ if ((fileMetadata.getId() == null) || (fileMetadata.getDataFile().getId() == null)){
+ return false;
+ }
+
+ // --------------------------------------------------------------------
+ // Grab the fileMetadata.id and restriction flag
+ // --------------------------------------------------------------------
+ Long fid = fileMetadata.getId();
+ //logger.info("calling candownloadfile on filemetadata "+fid);
+ boolean isRestrictedFile = fileMetadata.isRestricted();
+
+ // --------------------------------------------------------------------
+ // Has this file been checked? Look at the DatasetPage hash
+ // --------------------------------------------------------------------
+ if (this.fileDownloadPermissionMap.containsKey(fid)){
+ // Yes, return previous answer
+ //logger.info("using cached result for candownloadfile on filemetadata "+fid);
+ return this.fileDownloadPermissionMap.get(fid);
+ }
+ //----------------------------------------------------------------------
+ //(0) Before we do any testing - if the version is deaccessioned, the
+ // user may download only if they have the EditDataset permission
+ //----------------------------------------------------------------------
+
+ if (fileMetadata.getDatasetVersion().isDeaccessioned()) {
+ if (this.doesSessionUserHavePermission(Permission.EditDataset, fileMetadata)) {
+ // Yes, save answer and return true
+ this.fileDownloadPermissionMap.put(fid, true);
+ return true;
+ } else {
+ this.fileDownloadPermissionMap.put(fid, false);
+ return false;
+ }
+ }
+
+ // --------------------------------------------------------------------
+ // (1) Is the file Unrestricted ?
+ // --------------------------------------------------------------------
+ if (!isRestrictedFile){
+ // Yes, save answer and return true
+ this.fileDownloadPermissionMap.put(fid, true);
+ return true;
+ }
+
+ // --------------------------------------------------------------------
+ // Conditions (2) through (4) are for Restricted files
+ // --------------------------------------------------------------------
+
+ // --------------------------------------------------------------------
+ // (2) In Dataverse 4.3 and earlier we required that users be authenticated
+ // to download files, but in developing the Private URL feature, we have
+ // added a new subclass of "User" called "PrivateUrlUser" that returns false
+ // for isAuthenticated but that should be able to download restricted files
+ // when given the Member role (which includes the DownloadFile permission).
+ // This is consistent with how Builtin and Shib users (both are
+ // AuthenticatedUsers) can download restricted files when they are granted
+ // the Member role. For this reason condition 2 has been changed. Previously,
+ // we required isSessionUserAuthenticated to return true. Now we require
+ // that the User is not an instance of GuestUser, which is similar in
+ // spirit to the previous check.
+ // --------------------------------------------------------------------
+
+ if (session.getUser() instanceof GuestUser){
+ this.fileDownloadPermissionMap.put(fid, false);
+ return false;
+ }
+
+
+ // --------------------------------------------------------------------
+ // (3) Does the User have DownloadFile Permission at the **Dataset** level
+ // --------------------------------------------------------------------
+
+
+ if (this.doesSessionUserHavePermission(Permission.DownloadFile, fileMetadata)){
+ // Yes, save answer and return true
+ this.fileDownloadPermissionMap.put(fid, true);
+ return true;
+ }
+
+
+ // --------------------------------------------------------------------
+ // (4) Does the user have DownloadFile permission on the DataFile
+ // --------------------------------------------------------------------
+ /*
+ if (this.permissionService.on(fileMetadata.getDataFile()).has(Permission.DownloadFile)){
+ this.fileDownloadPermissionMap.put(fid, true);
+ return true;
+ }
+ */
+
+ // --------------------------------------------------------------------
+ // (5) No download...
+ // --------------------------------------------------------------------
+ this.fileDownloadPermissionMap.put(fid, false);
+
+ return false;
+ }
+
+ public boolean doesSessionUserHavePermission(Permission permissionToCheck, FileMetadata fileMetadata){
+ if (permissionToCheck == null){
+ return false;
+ }
+
+ DvObject objectToCheck = null;
+
+ if (permissionToCheck.equals(Permission.EditDataset)){
+ objectToCheck = fileMetadata.getDatasetVersion().getDataset();
+ } else if (permissionToCheck.equals(Permission.DownloadFile)){
+ objectToCheck = fileMetadata.getDataFile();
+ }
+
+ if (objectToCheck == null){
+ return false;
+ }
+
+ boolean hasPermission = this.permissionService.userOn(this.session.getUser(), objectToCheck).has(permissionToCheck);
+
+ // return true/false
+ return hasPermission;
+ }
+
+ public DataverseSession getSession() {
+ return session;
+ }
+
+ public void setSession(DataverseSession session) {
+ this.session = session;
+ }
+
+}
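The fileDownloadPermissionMap pattern above — compute each per-file permission answer at most once per page view, then reuse the cached boolean — can be sketched in isolation. Names here are illustrative, and the `Function` stands in for the (expensive) permission-service call:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

// Sketch of the per-file permission memoization in FileDownloadHelper.
class PermissionCache {
    private final Map<Long, Boolean> cache = new HashMap<>();
    private final Function<Long, Boolean> check; // the expensive permission lookup

    PermissionCache(Function<Long, Boolean> check) {
        this.check = check;
    }

    // computeIfAbsent runs the check only on the first request for a given id
    boolean canDownload(Long fileMetadataId) {
        return cache.computeIfAbsent(fileMetadataId, check);
    }
}
```

A view-scoped bean holding such a cache keeps repeated `rendered=` expressions on the page from hammering the permission service.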
diff --git a/src/main/java/edu/harvard/iq/dataverse/FileDownloadServiceBean.java b/src/main/java/edu/harvard/iq/dataverse/FileDownloadServiceBean.java
new file mode 100644
index 00000000000..ccc8f6847b7
--- /dev/null
+++ b/src/main/java/edu/harvard/iq/dataverse/FileDownloadServiceBean.java
@@ -0,0 +1,436 @@
+package edu.harvard.iq.dataverse;
+
+import edu.harvard.iq.dataverse.authorization.Permission;
+import edu.harvard.iq.dataverse.authorization.users.AuthenticatedUser;
+import edu.harvard.iq.dataverse.datasetutility.TwoRavensHelper;
+import edu.harvard.iq.dataverse.datasetutility.WorldMapPermissionHelper;
+import edu.harvard.iq.dataverse.engine.command.Command;
+import edu.harvard.iq.dataverse.engine.command.exception.CommandException;
+import edu.harvard.iq.dataverse.engine.command.impl.CreateGuestbookResponseCommand;
+import edu.harvard.iq.dataverse.engine.command.impl.RequestAccessCommand;
+import java.io.IOException;
+import java.sql.Timestamp;
+import java.util.ArrayList;
+import java.util.Arrays;
+import java.util.Date;
+import java.util.List;
+import java.util.logging.Logger;
+import javax.ejb.EJB;
+import javax.ejb.Stateless;
+import javax.faces.context.ExternalContext;
+import javax.faces.context.FacesContext;
+import javax.inject.Inject;
+import javax.inject.Named;
+import javax.persistence.EntityManager;
+import javax.persistence.PersistenceContext;
+import javax.servlet.ServletOutputStream;
+import javax.servlet.http.HttpServletResponse;
+import org.primefaces.context.RequestContext;
+
+/**
+ * Handles all file download processes, including guestbook responses.
+ *
+ * @author skraffmi
+ */
+@Stateless
+@Named
+public class FileDownloadServiceBean implements java.io.Serializable {
+
+ @PersistenceContext(unitName = "VDCNet-ejbPU")
+ private EntityManager em;
+
+ @EJB
+ GuestbookResponseServiceBean guestbookResponseService;
+ @EJB
+ DatasetServiceBean datasetService;
+ @EJB
+ DatasetVersionServiceBean datasetVersionService;
+ @EJB
+ DataFileServiceBean datafileService;
+ @EJB
+ PermissionServiceBean permissionService;
+ @EJB
+ DataverseServiceBean dataverseService;
+ @EJB
+ UserNotificationServiceBean userNotificationService;
+
+ @Inject
+ DataverseSession session;
+
+ @EJB
+ EjbDataverseEngine commandEngine;
+
+ @Inject
+ DataverseRequestServiceBean dvRequestService;
+
+ @Inject TwoRavensHelper twoRavensHelper;
+ @Inject WorldMapPermissionHelper worldMapPermissionHelper;
+
+ private static final Logger logger = Logger.getLogger(FileDownloadServiceBean.class.getCanonicalName());
+
+
+ public void writeGuestbookAndStartDownload(GuestbookResponse guestbookResponse){
+ if (guestbookResponse != null && guestbookResponse.getDataFile() != null ){
+ writeGuestbookResponseRecord(guestbookResponse);
+ callDownloadServlet(guestbookResponse.getFileFormat(), guestbookResponse.getDataFile().getId(), guestbookResponse.isWriteResponse());
+ }
+
+ if (guestbookResponse != null && guestbookResponse.getSelectedFileIds() != null ){
+ List list = new ArrayList<>(Arrays.asList(guestbookResponse.getSelectedFileIds().split(",")));
+
+ for (String idAsString : list) {
+ DataFile df = datafileService.findCheapAndEasy(new Long(idAsString)) ;
+ if (df != null) {
+ guestbookResponse.setDataFile(df);
+ writeGuestbookResponseRecord(guestbookResponse);
+ }
+ }
+
+ callDownloadServlet(guestbookResponse.getSelectedFileIds(), true);
+ }
+
+
+ }
+
+ public void writeGuestbookResponseRecord(GuestbookResponse guestbookResponse) {
+
+ try {
+ Command cmd = new CreateGuestbookResponseCommand(dvRequestService.getDataverseRequest(), guestbookResponse, guestbookResponse.getDataset());
+ commandEngine.submit(cmd);
+ } catch (CommandException e) {
+ // if an error occurs here the download won't happen, so there is no need for response records
+
+ }
+
+ }
+
+ public void callDownloadServlet(String multiFileString, Boolean gbRecordsWritten){
+
+ String fileDownloadUrl = "/api/access/datafiles/" + multiFileString;
+ if (gbRecordsWritten){
+ fileDownloadUrl += "?gbrecs=true";
+ }
+ try {
+ FacesContext.getCurrentInstance().getExternalContext().redirect(fileDownloadUrl);
+ } catch (IOException ex) {
+ logger.info("Failed to issue a redirect to file download url.");
+ }
+
+ //return fileDownloadUrl;
+ }
+
+ //private String callDownloadServlet( String downloadType, Long fileId){
+ public void callDownloadServlet( String downloadType, Long fileId, Boolean gbRecordsWritten){
+
+ String fileDownloadUrl = "/api/access/datafile/" + fileId;
+
+ if (downloadType != null && downloadType.equals("bundle")){
+ fileDownloadUrl = "/api/access/datafile/bundle/" + fileId;
+ }
+ if (downloadType != null && downloadType.equals("original")){
+ fileDownloadUrl = "/api/access/datafile/" + fileId + "?format=original";
+ }
+ if (downloadType != null && downloadType.equals("RData")){
+ fileDownloadUrl = "/api/access/datafile/" + fileId + "?format=RData";
+ }
+ if (downloadType != null && downloadType.equals("var")){
+ fileDownloadUrl = "/api/meta/datafile/" + fileId;
+ }
+ if (downloadType != null && downloadType.equals("tab")){
+ fileDownloadUrl = "/api/access/datafile/" + fileId+ "?format=tab";
+ }
+ if (gbRecordsWritten){
+ if(downloadType != null && ( downloadType.equals("original") || downloadType.equals("RData") || downloadType.equals("tab")) ){
+ fileDownloadUrl += "&gbrecs=true";
+ } else {
+ fileDownloadUrl += "?gbrecs=true";
+ }
+
+ }
+ logger.fine("Returning file download url: " + fileDownloadUrl);
+ try {
+ FacesContext.getCurrentInstance().getExternalContext().redirect(fileDownloadUrl);
+ } catch (IOException ex) {
+ logger.info("Failed to issue a redirect to file download url.");
+ }
+ //return fileDownloadUrl;
+ }
+
+ //public String startFileDownload(FileMetadata fileMetadata, String format) {
+ public void startFileDownload(GuestbookResponse guestbookResponse, FileMetadata fileMetadata, String format) {
+ boolean recordsWritten = false;
+ if(!fileMetadata.getDatasetVersion().isDraft()){
+ guestbookResponse = guestbookResponseService.modifyDatafileAndFormat(guestbookResponse, fileMetadata, format);
+ writeGuestbookResponseRecord(guestbookResponse);
+ recordsWritten = true;
+ }
+ callDownloadServlet(format, fileMetadata.getDataFile().getId(), recordsWritten);
+ logger.fine("issued file download redirect for filemetadata "+fileMetadata.getId()+", datafile "+fileMetadata.getDataFile().getId());
+ }
+
+ public String startExploreDownloadLink(GuestbookResponse guestbookResponse, FileMetadata fmd){
+
+ if (guestbookResponse != null && guestbookResponse.isWriteResponse()
+ && (( fmd != null && fmd.getDataFile() != null) || guestbookResponse.getDataFile() != null)){
+ if(guestbookResponse.getDataFile() == null && fmd != null){
+ guestbookResponse.setDataFile(fmd.getDataFile());
+ }
+ if (fmd == null || !fmd.getDatasetVersion().isDraft()){
+ writeGuestbookResponseRecord(guestbookResponse);
+ }
+ }
+
+ Long datafileId;
+
+ if (fmd == null && guestbookResponse != null && guestbookResponse.getDataFile() != null){
+ datafileId = guestbookResponse.getDataFile().getId();
+ } else {
+ datafileId = fmd.getDataFile().getId();
+ }
+ String retVal = twoRavensHelper.getDataExploreURLComplete(datafileId);
+
+ try {
+ FacesContext.getCurrentInstance().getExternalContext().redirect(retVal);
+ return retVal;
+ } catch (IOException ex) {
+ logger.info("Failed to issue a redirect to file download url.");
+ }
+ return retVal;
+ }
+
+ public String startWorldMapDownloadLink(GuestbookResponse guestbookResponse, FileMetadata fmd){
+
+ if (guestbookResponse != null && guestbookResponse.isWriteResponse() && ((fmd != null && fmd.getDataFile() != null) || guestbookResponse.getDataFile() != null)){
+ if(guestbookResponse.getDataFile() == null && fmd != null){
+ guestbookResponse.setDataFile(fmd.getDataFile());
+ }
+ if (fmd == null || !fmd.getDatasetVersion().isDraft()){
+ writeGuestbookResponseRecord(guestbookResponse);
+ }
+ }
+ DataFile file = null;
+ if (fmd != null){
+ file = fmd.getDataFile();
+ }
+ if (guestbookResponse != null && guestbookResponse.getDataFile() != null){
+ file = guestbookResponse.getDataFile();
+ }
+
+
+ String retVal = worldMapPermissionHelper.getMapLayerMetadata(file).getLayerLink();
+
+ try {
+ FacesContext.getCurrentInstance().getExternalContext().redirect(retVal);
+ return retVal;
+ } catch (IOException ex) {
+ logger.info("Failed to issue a redirect to file download url.");
+ }
+ return retVal;
+ }
+
+ public boolean isDownloadPopupRequired(DatasetVersion datasetVersion) {
+ // Each of these conditions is sufficient reason to have to
+ // present the user with the popup:
+ if (datasetVersion == null){
+ return false;
+ }
+ // 0. if the version is a draft, the popup is not required
+ if (!datasetVersion.isReleased()){
+ return false;
+ }
+ // 1. License and Terms of Use:
+ if (datasetVersion.getTermsOfUseAndAccess() != null) {
+ if (!TermsOfUseAndAccess.License.CC0.equals(datasetVersion.getTermsOfUseAndAccess().getLicense())
+ && !(datasetVersion.getTermsOfUseAndAccess().getTermsOfUse() == null
+ || datasetVersion.getTermsOfUseAndAccess().getTermsOfUse().equals(""))) {
+ return true;
+ }
+
+ // 2. Terms of Access:
+ if (!(datasetVersion.getTermsOfUseAndAccess().getTermsOfAccess() == null) && !datasetVersion.getTermsOfUseAndAccess().getTermsOfAccess().equals("")) {
+ return true;
+ }
+ }
+
+ // 3. Guest Book:
+ if (datasetVersion.getDataset().getGuestbook() != null && datasetVersion.getDataset().getGuestbook().isEnabled() && datasetVersion.getDataset().getGuestbook().getDataverse() != null ) {
+ return true;
+ }
+
+ return false;
+ }
+
+ public Boolean canSeeTwoRavensExploreButton(){
+ return false;
+ }
+
+
+ public Boolean canUserSeeExploreWorldMapButton(){
+ return false;
+ }
+
+ public void downloadDatasetCitationXML(Dataset dataset) {
+ downloadCitationXML(null, dataset);
+ }
+
+ public void downloadDatafileCitationXML(FileMetadata fileMetadata) {
+ downloadCitationXML(fileMetadata, null);
+ }
+
+ public void downloadCitationXML(FileMetadata fileMetadata, Dataset dataset) {
+ DatasetVersion workingVersion;
+ if (dataset != null){
+ workingVersion = dataset.getLatestVersion();
+ } else {
+ workingVersion = fileMetadata.getDatasetVersion();
+ }
+ String xml = datasetService.createCitationXML(workingVersion, fileMetadata);
+ FacesContext ctx = FacesContext.getCurrentInstance();
+ HttpServletResponse response = (HttpServletResponse) ctx.getExternalContext().getResponse();
+ response.setContentType("text/xml");
+ String fileNameString = "";
+ if (fileMetadata == null || fileMetadata.getLabel() == null) {
+ // Dataset-level citation:
+ fileNameString = "attachment;filename=" + getFileNameDOI(workingVersion) + ".xml";
+ } else {
+ // Datafile-level citation:
+ fileNameString = "attachment;filename=" + getFileNameDOI(workingVersion) + "-" + fileMetadata.getLabel().replaceAll("\\.tab$", "-endnote.xml");
+ }
+ response.setHeader("Content-Disposition", fileNameString);
+ try {
+ ServletOutputStream out = response.getOutputStream();
+ out.write(xml.getBytes());
+ out.flush();
+ ctx.responseComplete();
+ } catch (Exception e) {
+
+ }
+ }
+
+ public void downloadDatasetCitationRIS(Dataset dataset) {
+
+ downloadCitationRIS(null, dataset);
+
+ }
+
+ public void downloadDatafileCitationRIS(FileMetadata fileMetadata) {
+ downloadCitationRIS(fileMetadata, null);
+ }
+
+ public void downloadCitationRIS(FileMetadata fileMetadata, Dataset dataset) {
+ DatasetVersion workingVersion;
+ if (dataset != null){
+ workingVersion = dataset.getLatestVersion();
+ } else {
+ workingVersion = fileMetadata.getDatasetVersion();
+ }
+ String risFormatDowload = datasetService.createCitationRIS(workingVersion, fileMetadata);
+ FacesContext ctx = FacesContext.getCurrentInstance();
+ HttpServletResponse response = (HttpServletResponse) ctx.getExternalContext().getResponse();
+ response.setContentType("application/download");
+
+ String fileNameString = "";
+ if (fileMetadata == null || fileMetadata.getLabel() == null) {
+ // Dataset-level citation:
+ fileNameString = "attachment;filename=" + getFileNameDOI(workingVersion) + ".ris";
+ } else {
+ // Datafile-level citation:
+ fileNameString = "attachment;filename=" + getFileNameDOI(workingVersion) + "-" + fileMetadata.getLabel().replaceAll("\\.tab$", ".ris");
+ }
+ response.setHeader("Content-Disposition", fileNameString);
+
+ try {
+ ServletOutputStream out = response.getOutputStream();
+ out.write(risFormatDowload.getBytes());
+ out.flush();
+ ctx.responseComplete();
+ } catch (Exception e) {
+
+ }
+ }
+
+ private String getFileNameDOI(DatasetVersion workingVersion) {
+ Dataset ds = workingVersion.getDataset();
+ return "DOI:" + ds.getAuthority() + "_" + ds.getIdentifier().toString();
+ }
+
+ public void downloadDatasetCitationBibtex(Dataset dataset) {
+
+ downloadCitationBibtex(null, dataset);
+
+ }
+
+ public void downloadDatafileCitationBibtex(FileMetadata fileMetadata) {
+ downloadCitationBibtex(fileMetadata, null);
+ }
+
+ public void downloadCitationBibtex(FileMetadata fileMetadata, Dataset dataset) {
+ DatasetVersion workingVersion;
+ if (dataset != null){
+ workingVersion = dataset.getLatestVersion();
+ } else {
+ workingVersion = fileMetadata.getDatasetVersion();
+ }
+ String bibFormatDowload = new BibtexCitation(workingVersion).toString();
+ FacesContext ctx = FacesContext.getCurrentInstance();
+ HttpServletResponse response = (HttpServletResponse) ctx.getExternalContext().getResponse();
+ response.setContentType("application/download");
+
+ String fileNameString = "";
+ if (fileMetadata == null || fileMetadata.getLabel() == null) {
+ // Dataset-level citation:
+ fileNameString = "attachment;filename=" + getFileNameDOI(workingVersion) + ".bib";
+ } else {
+ // Datafile-level citation:
+ fileNameString = "attachment;filename=" + getFileNameDOI(workingVersion) + "-" + fileMetadata.getLabel().replaceAll("\\.tab$", ".bib");
+ }
+ response.setHeader("Content-Disposition", fileNameString);
+
+ try {
+ ServletOutputStream out = response.getOutputStream();
+ out.write(bibFormatDowload.getBytes());
+ out.flush();
+ ctx.responseComplete();
+ } catch (Exception e) {
+
+ }
+ }
+
+
+
+ public void requestAccess(DataFile file) {
+ if (requestAccess(file.getId())) {
+ // update the local file object so that the page properly updates
+ file.getFileAccessRequesters().add((AuthenticatedUser) session.getUser());
+
+ // create notifications
+ sendRequestFileAccessNotification(file.getOwner(), file.getId());
+ }
+
+ }
+
+ public boolean requestAccess(Long fileId) {
+ DataFile file = datafileService.find(fileId);
+ if (!file.getFileAccessRequesters().contains((AuthenticatedUser) session.getUser())) {
+ try {
+ commandEngine.submit(new RequestAccessCommand(dvRequestService.getDataverseRequest(), file));
+ return true;
+ } catch (CommandException ex) {
+ logger.info("Unable to request access for file id " + fileId + ". Exception: " + ex);
+ }
+ }
+
+ return false;
+ }
+
+ public void sendRequestFileAccessNotification(Dataset dataset, Long fileId) {
+ permissionService.getUsersWithPermissionOn(Permission.ManageDatasetPermissions, dataset).stream().forEach((au) -> {
+ userNotificationService.sendNotification(au, new Timestamp(new Date().getTime()), UserNotification.Type.REQUESTFILEACCESS, fileId);
+ });
+
+ }
+
+
+
+}
\ No newline at end of file
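The query-string handling in callDownloadServlet above has to choose between `?` and `&` when appending `gbrecs=true`, depending on whether a `format` parameter was already added. A minimal sketch of that URL construction (a hypothetical helper, not part of the codebase):

```java
// Sketch of the download-URL construction in callDownloadServlet:
// gbrecs=true is appended with '&' when a format query parameter is
// already present, and with '?' otherwise.
class DownloadUrl {
    static String build(Long fileId, String downloadType, boolean gbRecordsWritten) {
        String url = "/api/access/datafile/" + fileId;
        boolean hasQuery = false;
        if ("bundle".equals(downloadType)) {
            url = "/api/access/datafile/bundle/" + fileId;
        } else if ("original".equals(downloadType) || "RData".equals(downloadType)
                || "tab".equals(downloadType)) {
            url += "?format=" + downloadType;
            hasQuery = true;
        } else if ("var".equals(downloadType)) {
            url = "/api/meta/datafile/" + fileId;
        }
        if (gbRecordsWritten) {
            url += (hasQuery ? "&" : "?") + "gbrecs=true";
        }
        return url;
    }
}
```

Tracking a `hasQuery` flag avoids the chain of string comparisons repeated in the separator-selection branch above.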
diff --git a/src/main/java/edu/harvard/iq/dataverse/FileMetadata.java b/src/main/java/edu/harvard/iq/dataverse/FileMetadata.java
index 7231a457264..e203196c4a5 100644
--- a/src/main/java/edu/harvard/iq/dataverse/FileMetadata.java
+++ b/src/main/java/edu/harvard/iq/dataverse/FileMetadata.java
@@ -1,9 +1,12 @@
package edu.harvard.iq.dataverse;
import java.io.Serializable;
+import java.sql.Timestamp;
+import java.text.DateFormat;
import java.util.ArrayList;
import java.util.Collection;
import java.util.Comparator;
+import java.util.Date;
import java.util.LinkedList;
import java.util.List;
import java.util.logging.Level;
@@ -35,13 +38,18 @@
@Entity
public class FileMetadata implements Serializable {
private static final long serialVersionUID = 1L;
-
+ private static final DateFormat displayDateFormat = DateFormat.getDateInstance(DateFormat.MEDIUM);
private static final Logger logger = Logger.getLogger(FileMetadata.class.getCanonicalName());
- @Pattern(regexp="^[^:<>;#/\"\\*\\|\\?\\\\]*$", message = "File Name cannot contain any of the following characters: \\ / : * ? \" < > | ; # .")
+ @Pattern(regexp="^[^:<>;#/\"\\*\\|\\?\\\\]*$",
+ message = "File Name cannot contain any of the following characters: \\ / : * ? \" < > | ; # .")
@NotBlank(message = "Please specify a file name.")
@Column( nullable=false )
private String label = "";
+ @Pattern(regexp="|[^/\\\\]|^[^/\\\\]+.*[^/\\\\]+$",
+ message = "Directory Name cannot contain leading or trailing file separators.")
+ @Column ( nullable=true )
+ private String directoryLabel;
@Column(columnDefinition = "TEXT")
private String description = "";
@@ -80,6 +88,13 @@ public void setLabel(String label) {
this.label = label;
}
+ public String getDirectoryLabel() {
+ return directoryLabel;
+ }
+
+ public void setDirectoryLabel(String directoryLabel) {
+ this.directoryLabel = directoryLabel;
+ }
public String getDescription() {
return description;
@@ -224,8 +239,49 @@ public void addCategoryByName(String newCategoryName) {
}
}
-
-
+ public String getFileDateToDisplay() {
+ Date fileDate = null;
+ DataFile datafile = this.getDataFile();
+ if (datafile != null) {
+ boolean fileHasBeenReleased = datafile.isReleased();
+ if (fileHasBeenReleased) {
+ Timestamp filePublicationTimestamp = datafile.getPublicationDate();
+ if (filePublicationTimestamp != null) {
+ fileDate = filePublicationTimestamp;
+ }
+ } else {
+ Timestamp fileCreateTimestamp = datafile.getCreateDate();
+ if (fileCreateTimestamp != null) {
+ fileDate = fileCreateTimestamp;
+ }
+ }
+ }
+ if (fileDate != null) {
+ return displayDateFormat.format(fileDate);
+ }
+ return "";
+ }
+
+ public String getFileCitation(){
+ return getFileCitation(false);
+ }
+
+ public String getFileCitation(boolean html){
+ String citation = this.getDatasetVersion().getCitation(html);
+        // append the file label, e.g. "; mydata.tab [fileName]"
+ citation += "; " + this.getLabel() + " [fileName]" ;
+ if (this.dataFile.isTabularData() && this.dataFile.getUnf() != null && !this.dataFile.getUnf().isEmpty()){
+ citation += ", " + this.dataFile.getUnf() + " [fileUNF]";
+ }
+ return citation;
+ }
+
public DatasetVersion getDatasetVersion() {
return datasetVersion;
}
@@ -344,6 +400,14 @@ public boolean contentEquals(FileMetadata other) {
} else if (other.getLabel() != null) {
return false;
}
+
+ if (this.getDirectoryLabel() != null) {
+ if (!this.getDirectoryLabel().equals(other.getDirectoryLabel())) {
+ return false;
+ }
+ } else if (other.getDirectoryLabel() != null) {
+ return false;
+ }
if (this.getDescription() != null) {
if (!this.getDescription().equals(other.getDescription())) {
@@ -353,14 +417,6 @@ public boolean contentEquals(FileMetadata other) {
return false;
}
- /*
- * we could also compare the sets of file categories; but since this
- * functionality is for deciding whether to index an extra filemetadata,
- * we're not doing it, as of now; because the categories are not indexed
- * and not displayed on the search cards.
- * -- L.A. 4.0 beta12
- */
-
return true;
}
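The new `directoryLabel` field above is validated by a `@Pattern` that allows an empty value and forbids leading or trailing file separators while permitting internal ones. A minimal standalone sketch of that check (the class and method names below are illustrative, not part of the patch):

```java
import java.util.regex.Pattern;

public class DirectoryLabelCheck {
    // Same regular expression as the @Pattern on directoryLabel: matches the
    // empty string, a single non-separator character, or any longer string
    // whose first and last characters are not '/' or '\'.
    static final Pattern DIR_LABEL =
            Pattern.compile("|[^/\\\\]|^[^/\\\\]+.*[^/\\\\]+$");

    static boolean isValidDirectoryLabel(String label) {
        return DIR_LABEL.matcher(label).matches();
    }

    public static void main(String[] args) {
        System.out.println(isValidDirectoryLabel("data/raw")); // internal separator: valid
        System.out.println(isValidDirectoryLabel("/data"));    // leading separator: invalid
        System.out.println(isValidDirectoryLabel("data/"));    // trailing separator: invalid
    }
}
```

Note that internal separators like `data/raw` pass, matching the validation message, which only rules out leading and trailing separators.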
diff --git a/src/main/java/edu/harvard/iq/dataverse/FilePage.java b/src/main/java/edu/harvard/iq/dataverse/FilePage.java
index 9ff1cdf1389..9dfe0649d21 100644
--- a/src/main/java/edu/harvard/iq/dataverse/FilePage.java
+++ b/src/main/java/edu/harvard/iq/dataverse/FilePage.java
@@ -5,18 +5,22 @@
*/
package edu.harvard.iq.dataverse;
+import edu.harvard.iq.dataverse.DatasetVersionServiceBean.RetrieveDatasetVersionResponse;
+import edu.harvard.iq.dataverse.authorization.AuthenticationServiceBean;
import edu.harvard.iq.dataverse.authorization.Permission;
-import edu.harvard.iq.dataverse.authorization.users.AuthenticatedUser;
+import edu.harvard.iq.dataverse.authorization.users.GuestUser;
+import edu.harvard.iq.dataverse.datasetutility.TwoRavensHelper;
+import edu.harvard.iq.dataverse.datasetutility.WorldMapPermissionHelper;
import edu.harvard.iq.dataverse.engine.command.Command;
import edu.harvard.iq.dataverse.engine.command.exception.CommandException;
-import edu.harvard.iq.dataverse.engine.command.impl.CreateDatasetCommand;
import edu.harvard.iq.dataverse.engine.command.impl.UpdateDatasetCommand;
+import edu.harvard.iq.dataverse.export.ExportException;
+import edu.harvard.iq.dataverse.export.ExportService;
+import edu.harvard.iq.dataverse.export.spi.Exporter;
import edu.harvard.iq.dataverse.settings.SettingsServiceBean;
import edu.harvard.iq.dataverse.util.JsfHelper;
import static edu.harvard.iq.dataverse.util.JsfHelper.JH;
-import java.io.IOException;
-import java.nio.file.Files;
-import java.nio.file.Paths;
+import edu.harvard.iq.dataverse.util.SystemConfig;
import java.util.ArrayList;
import java.util.Iterator;
import java.util.List;
@@ -43,16 +47,32 @@ public class FilePage implements java.io.Serializable {
private FileMetadata fileMetadata;
private Long fileId;
- private Long datasetVersionId;
- private DataFile file;
-
+ private String version;
+ private DataFile file;
+ private GuestbookResponse guestbookResponse;
+ private int selectedTabIndex;
private Dataset editDataset;
@EJB
DataFileServiceBean datafileService;
+
+ @EJB
+ DatasetVersionServiceBean datasetVersionService;
@EJB
PermissionServiceBean permissionService;
+ @EJB
+ SettingsServiceBean settingsService;
+ @EJB
+ FileDownloadServiceBean fileDownloadService;
+ @EJB
+ GuestbookResponseServiceBean guestbookResponseService;
+ @EJB
+ AuthenticationServiceBean authService;
+
+ @EJB
+ SystemConfig systemConfig;
+
@Inject
DataverseSession session;
@@ -63,9 +83,14 @@ public class FilePage implements java.io.Serializable {
DataverseRequestServiceBean dvRequestService;
@Inject
PermissionsWrapper permissionsWrapper;
+ @Inject
+ FileDownloadHelper fileDownloadHelper;
+ @Inject
+ TwoRavensHelper twoRavensHelper;
+ @Inject WorldMapPermissionHelper worldMapPermissionHelper;
public String init() {
-
+
if ( fileId != null ) {
@@ -82,24 +107,32 @@ public String init() {
if (file == null){
return permissionsWrapper.notFound();
}
-
- fileMetadata = datafileService.findFileMetadataByDatasetVersionIdAndDataFileId(datasetVersionId, fileId);
+ RetrieveDatasetVersionResponse retrieveDatasetVersionResponse;
+ retrieveDatasetVersionResponse = datasetVersionService.selectRequestedVersion(file.getOwner().getVersions(), version);
+ Long datasetVersionId = retrieveDatasetVersionResponse.getDatasetVersion().getId();
+ fileMetadata = datafileService.findFileMetadataByDatasetVersionIdAndDataFileId(datasetVersionId, fileId);
+
+
if (fileMetadata == null){
return permissionsWrapper.notFound();
}
-
-
// If this DatasetVersion is unpublished and the user doesn't have permission to view it:
// > Go to the Login page
//
-
- if ( !permissionService.on(file).has(Permission.DownloadFile)) {
- return permissionsWrapper.notAuthorized();
- }
-
+ // Check permissions
+
+ boolean authorized = fileMetadata.getDatasetVersion().isReleased()
+ || this.canViewUnpublishedDataset()
+ || fileMetadata.getDatasetVersion().isDeaccessioned();
+
+ if (!authorized ) {
+ return permissionsWrapper.notAuthorized();
+ }
+
+ this.guestbookResponse = this.guestbookResponseService.initGuestbookResponseForFragment(fileMetadata, session);
} else {
return permissionsWrapper.notFound();
@@ -107,11 +140,24 @@ public String init() {
return null;
}
-
+
+ private boolean canViewUnpublishedDataset() {
+ return permissionsWrapper.canViewUnpublishedDataset( dvRequestService.getDataverseRequest(), fileMetadata.getDatasetVersion().getDataset());
+ }
+
public FileMetadata getFileMetadata() {
return fileMetadata;
}
+
+
+ public boolean isDownloadPopupRequired() {
+ if(fileMetadata.getId() == null || fileMetadata.getDatasetVersion().getId() == null ){
+ return false;
+ }
+ return fileDownloadService.isDownloadPopupRequired(fileMetadata.getDatasetVersion());
+ }
+
public void setFileMetadata(FileMetadata fileMetadata) {
this.fileMetadata = fileMetadata;
@@ -132,17 +178,42 @@ public Long getFileId() {
public void setFileId(Long fileId) {
this.fileId = fileId;
}
-
- public Long getDatasetVersionId() {
- return datasetVersionId;
+
+ public String getVersion() {
+ return version;
}
- public void setDatasetVersionId(Long datasetVersionId) {
- this.datasetVersionId = datasetVersionId;
+ public void setVersion(String version) {
+ this.version = version;
}
+ public List<String[]> getExporters(){
+ List<String[]> retList = new ArrayList<>();
+ String myHostURL = systemConfig.getDataverseSiteUrl();
+ for (String [] provider : ExportService.getInstance().getExportersLabels() ){
+ String formatName = provider[1];
+ String formatDisplayName = provider[0];
+
+ Exporter exporter = null;
+ try {
+ exporter = ExportService.getInstance().getExporter(formatName);
+ } catch (ExportException ex) {
+ exporter = null;
+ }
+ if (exporter != null && exporter.isAvailableToUsers()) {
+ // Not all metadata exports should be presented to the web users!
+ // Some are only for harvesting clients.
+
+ String[] temp = new String[2];
+ temp[0] = formatDisplayName;
+ temp[1] = myHostURL + "/api/datasets/export?exporter=" + formatName + "&persistentId=" + fileMetadata.getDatasetVersion().getDataset().getGlobalId();
+ retList.add(temp);
+ }
+ }
+ return retList;
+ }
+
public String restrictFile(boolean restricted){
-
String fileNames = null;
editDataset = this.file.getOwner();
@@ -151,6 +222,7 @@ public String restrictFile(boolean restricted){
for (FileMetadata fmw: editDataset.getEditVersion().getFileMetadatas()){
if (fmw.getDataFile().equals(this.fileMetadata.getDataFile())){
+ fileNames = (fileNames == null) ? fmw.getLabel() : fileNames + ", " + fmw.getLabel();
fmw.setRestricted(restricted);
}
}
@@ -161,6 +233,7 @@ public String restrictFile(boolean restricted){
JsfHelper.addFlashMessage(successMessage);
}
save();
+ init();
return returnToDraftVersion();
}
@@ -225,9 +298,8 @@ public String save() {
Command cmd;
try {
- System.out.print(filesToBeDeleted.size());
cmd = new UpdateDatasetCommand(editDataset, dvRequestService.getDataverseRequest(), filesToBeDeleted);
- commandEngine.submit(cmd);
+ commandEngine.submit(cmd);
} catch (EJBException ex) {
@@ -249,20 +321,72 @@ public String save() {
}
- JsfHelper.addSuccessMessage(JH.localize("dataset.message.filesSuccess"));
-
- setDatasetVersionId(editDataset.getEditVersion().getId());
+ JsfHelper.addSuccessMessage(JH.localize("dataset.message.filesSuccess"));
+ setVersion("DRAFT");
return "";
}
+ public boolean isThumbnailAvailable(FileMetadata fileMetadata) {
+ // new and optimized logic:
+ // - check download permission here (should be cached - so it's free!)
+ // - only then ask the file service if the thumbnail is available/exists.
+ // the service itself no longer checks download permissions.
+
+ if (!fileDownloadHelper.canDownloadFile(fileMetadata)) {
+ return false;
+ }
+
+ return datafileService.isThumbnailAvailable(fileMetadata.getDataFile());
+ }
+
private String returnToDatasetOnly(){
return "/dataset.xhtml?persistentId=" + editDataset.getGlobalId() + "&version=DRAFT" + "&faces-redirect=true";
}
private String returnToDraftVersion(){
+
+ return "/file.xhtml?fileId=" + fileId + "&version=DRAFT&faces-redirect=true";
+ }
+
+ public FileDownloadServiceBean getFileDownloadService() {
+ return fileDownloadService;
+ }
- return "/file.xhtml?fileId=" + fileId + "&datasetVersionId=" + editDataset.getEditVersion().getId() + "&faces-redirect=true";
+ public void setFileDownloadService(FileDownloadServiceBean fileDownloadService) {
+ this.fileDownloadService = fileDownloadService;
}
+
+ public GuestbookResponseServiceBean getGuestbookResponseService() {
+ return guestbookResponseService;
+ }
+
+ public void setGuestbookResponseService(GuestbookResponseServiceBean guestbookResponseService) {
+ this.guestbookResponseService = guestbookResponseService;
+ }
+
+
+ public GuestbookResponse getGuestbookResponse() {
+ return guestbookResponse;
+ }
+
+ public void setGuestbookResponse(GuestbookResponse guestbookResponse) {
+ this.guestbookResponse = guestbookResponse;
+ }
+
+
+ public boolean canUpdateDataset() {
+ return permissionsWrapper.canUpdateDataset(dvRequestService.getDataverseRequest(), this.file.getOwner());
+ }
+
+ public int getSelectedTabIndex() {
+ return selectedTabIndex;
+ }
+
+ public void setSelectedTabIndex(int selectedTabIndex) {
+ this.selectedTabIndex = selectedTabIndex;
+ }
+
+
}
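The `getExporters()` method added above pairs each exporter's display name with a metadata export API URL. A minimal sketch of the URL it assembles (the site URL and persistent identifier below are made-up examples):

```java
public class ExportUrlSketch {
    // Mirrors the string concatenation in FilePage.getExporters():
    // formatName comes from the exporter provider, persistentId from
    // the dataset that owns the file.
    static String exportUrl(String siteUrl, String formatName, String persistentId) {
        return siteUrl + "/api/datasets/export?exporter=" + formatName
                + "&persistentId=" + persistentId;
    }

    public static void main(String[] args) {
        System.out.println(exportUrl("https://demo.example.edu", "ddi",
                "doi:10.5072/FK2/EXAMPLE"));
    }
}
```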
diff --git a/src/main/java/edu/harvard/iq/dataverse/GuestbookPage.java b/src/main/java/edu/harvard/iq/dataverse/GuestbookPage.java
index a07678472ea..a4d9067e584 100644
--- a/src/main/java/edu/harvard/iq/dataverse/GuestbookPage.java
+++ b/src/main/java/edu/harvard/iq/dataverse/GuestbookPage.java
@@ -15,6 +15,7 @@
import java.util.Date;
import java.util.Iterator;
import java.util.List;
+import java.util.logging.Logger;
import javax.ejb.EJB;
import javax.ejb.EJBException;
import javax.faces.application.FacesMessage;
@@ -32,6 +33,8 @@
@Named("GuestbookPage")
public class GuestbookPage implements java.io.Serializable {
+ private static final Logger logger = Logger.getLogger(GuestbookPage.class.getCanonicalName());
+
@EJB
GuestbookServiceBean guestbookService;
@@ -309,13 +312,12 @@ public String save() {
}
//
FacesContext.getCurrentInstance().addMessage(null, new FacesMessage(FacesMessage.SEVERITY_FATAL, "Guestbook Save Failed", " - " + error.toString()));
- System.out.print("dataverse " + dataverse.getName());
- System.out.print("Ejb exception");
- System.out.print(error.toString());
+ logger.info("Guestbook Page EJB Exception. Dataverse: " + dataverse.getName());
+ logger.info(error.toString());
return null;
} catch (CommandException ex) {
- System.out.print("command exception");
- System.out.print(ex.toString());
+ logger.info("Guestbook Page Command Exception. Dataverse: " + dataverse.getName());
+ logger.info(ex.toString());
FacesContext.getCurrentInstance().addMessage(null, new FacesMessage(FacesMessage.SEVERITY_FATAL, "Guestbook Save Failed", " - " + ex.toString()));
//logger.severe(ex.getMessage());
}
diff --git a/src/main/java/edu/harvard/iq/dataverse/GuestbookResponse.java b/src/main/java/edu/harvard/iq/dataverse/GuestbookResponse.java
index 53d7876ff11..a3790fd32ce 100644
--- a/src/main/java/edu/harvard/iq/dataverse/GuestbookResponse.java
+++ b/src/main/java/edu/harvard/iq/dataverse/GuestbookResponse.java
@@ -7,7 +7,6 @@
import edu.harvard.iq.dataverse.authorization.users.AuthenticatedUser;
import java.io.Serializable;
-import java.sql.Timestamp;
import java.text.SimpleDateFormat;
import java.util.ArrayList;
import java.util.Date;
@@ -61,9 +60,51 @@ public class GuestbookResponse implements Serializable {
private String position;
private String downloadtype;
private String sessionId;
-
+
@Temporal(value = TemporalType.TIMESTAMP)
private Date responseTime;
+
+ /*
+ Transient values carry information that assists the download process
+ but is not persisted to the database:
+ - selectedFileIds is a comma-delimited list of the file ids for a multiple-file download
+ - fileFormat tells the download API which format a subsettable file should be downloaded as
+ - writeResponse is set to false when the dataset version is a draft.
+ */
+
+ @Transient
+ private String selectedFileIds;
+
+ @Transient
+ private String fileFormat;
+
+ @Transient
+ private boolean writeResponse = true;
+
+ public boolean isWriteResponse() {
+ return writeResponse;
+ }
+
+ public void setWriteResponse(boolean writeResponse) {
+ this.writeResponse = writeResponse;
+ }
+
+ public String getSelectedFileIds() {
+ return selectedFileIds;
+ }
+
+ public void setSelectedFileIds(String selectedFileIds) {
+ this.selectedFileIds = selectedFileIds;
+ }
+
+
+ public String getFileFormat() {
+ return fileFormat;
+ }
+
+ public void setFileFormat(String downloadFormat) {
+ this.fileFormat = downloadFormat;
+ }
public GuestbookResponse(){
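The transient `selectedFileIds` field above holds a comma-delimited list of file ids. One plausible way a consumer could split it back into numeric ids (this helper is purely illustrative; the patch itself only stores the string):

```java
import java.util.ArrayList;
import java.util.List;

public class SelectedFileIdsSketch {
    // Splits the comma-delimited id string carried by the transient
    // selectedFileIds field back into numeric file ids, tolerating
    // whitespace around the commas.
    static List<Long> parseSelectedFileIds(String selectedFileIds) {
        List<Long> ids = new ArrayList<>();
        if (selectedFileIds == null || selectedFileIds.isEmpty()) {
            return ids;
        }
        for (String token : selectedFileIds.split(",")) {
            ids.add(Long.valueOf(token.trim()));
        }
        return ids;
    }
}
```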
diff --git a/src/main/java/edu/harvard/iq/dataverse/GuestbookResponseServiceBean.java b/src/main/java/edu/harvard/iq/dataverse/GuestbookResponseServiceBean.java
index 23a33b0cbf6..fa3e8a67d9e 100644
--- a/src/main/java/edu/harvard/iq/dataverse/GuestbookResponseServiceBean.java
+++ b/src/main/java/edu/harvard/iq/dataverse/GuestbookResponseServiceBean.java
@@ -8,24 +8,26 @@
import edu.harvard.iq.dataverse.authorization.providers.builtin.BuiltinUser;
import edu.harvard.iq.dataverse.authorization.users.AuthenticatedUser;
import edu.harvard.iq.dataverse.authorization.users.User;
+import static edu.harvard.iq.dataverse.util.JsfHelper.JH;
import java.text.SimpleDateFormat;
import java.util.ArrayList;
import java.util.Calendar;
import java.util.Date;
-import java.util.HashSet;
import java.util.Iterator;
import java.util.List;
import javax.ejb.Stateless;
import javax.ejb.TransactionAttribute;
import javax.ejb.TransactionAttributeType;
+import javax.faces.application.FacesMessage;
+import javax.faces.component.UIComponent;
+import javax.faces.component.UIInput;
import javax.faces.context.FacesContext;
+import javax.faces.model.SelectItem;
import javax.inject.Named;
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;
import javax.persistence.Query;
import javax.persistence.TypedQuery;
-import javax.servlet.ServletOutputStream;
-import javax.servlet.http.HttpServletResponse;
/**
*
@@ -432,17 +434,126 @@ public AuthenticatedUser getAuthenticatedUser(User user) {
}
return null;
}
+
+
+ public GuestbookResponse initGuestbookResponseForFragment(Dataset dataset, FileMetadata fileMetadata, DataverseSession session){
+
+ DatasetVersion workingVersion = null;
+ if (fileMetadata != null){
+ workingVersion = fileMetadata.getDatasetVersion();
+ } else {
+ workingVersion = dataset.getLatestVersion();
+ }
- public GuestbookResponse initDefaultGuestbookResponse(Dataset dataset, DataFile dataFile, User user, DataverseSession session) {
+
GuestbookResponse guestbookResponse = new GuestbookResponse();
- guestbookResponse.setGuestbook(findDefaultGuestbook());
- if (dataFile != null){
- guestbookResponse.setDataFile(dataFile);
- }
+
+ if(workingVersion != null && workingVersion.isDraft()){
+ guestbookResponse.setWriteResponse(false);
+ }
+
+ if (fileMetadata != null){
+ guestbookResponse.setDataFile(fileMetadata.getDataFile());
+ }
+
+ if (dataset.getGuestbook() != null) {
+ guestbookResponse.setGuestbook(workingVersion.getDataset().getGuestbook());
+ setUserDefaultResponses(guestbookResponse, session);
+ if (fileMetadata != null){
+ guestbookResponse.setDataFile(fileMetadata.getDataFile());
+ }
+ } else {
+ if (fileMetadata != null){
+ guestbookResponse = initDefaultGuestbookResponse(dataset, fileMetadata.getDataFile(), session);
+ } else {
+ guestbookResponse = initDefaultGuestbookResponse(dataset, null, session);
+ }
+ }
+ if (dataset.getGuestbook() != null && !dataset.getGuestbook().getCustomQuestions().isEmpty()) {
+ initCustomQuestions(guestbookResponse, dataset);
+ }
+ guestbookResponse.setDownloadtype("Download");
+
guestbookResponse.setDataset(dataset);
- guestbookResponse.setResponseTime(new Date());
- guestbookResponse.setSessionId(session.toString());
+
+ return guestbookResponse;
+ }
+
+ public GuestbookResponse initGuestbookResponseForFragment(FileMetadata fileMetadata, DataverseSession session){
+ return initGuestbookResponseForFragment(fileMetadata.getDatasetVersion().getDataset(), fileMetadata, session);
+ }
+
+ public void initGuestbookResponse(FileMetadata fileMetadata, String downloadType, DataverseSession session){
+ initGuestbookResponse(fileMetadata, downloadType, null, session);
+ }
+
+ public GuestbookResponse initGuestbookResponse(FileMetadata fileMetadata, String downloadFormat, String selectedFileIds, DataverseSession session) {
+ Dataset dataset;
+ DatasetVersion workingVersion = null;
+ if (fileMetadata != null){
+ workingVersion = fileMetadata.getDatasetVersion();
+ }
+
+
+
+ GuestbookResponse guestbookResponse = new GuestbookResponse();
+
+ if(workingVersion != null && workingVersion.isDraft()){
+ guestbookResponse.setWriteResponse(false);
+ }
+
+ dataset = workingVersion.getDataset();
+
+ if (fileMetadata != null){
+ guestbookResponse.setDataFile(fileMetadata.getDataFile());
+ }
+
+ if (dataset.getGuestbook() != null) {
+ guestbookResponse.setGuestbook(workingVersion.getDataset().getGuestbook());
+ setUserDefaultResponses(guestbookResponse, session);
+ if (fileMetadata != null){
+ guestbookResponse.setDataFile(fileMetadata.getDataFile());
+ }
+ } else {
+ if (fileMetadata != null){
+ guestbookResponse = initDefaultGuestbookResponse(dataset, fileMetadata.getDataFile(), session);
+ } else {
+ guestbookResponse = initDefaultGuestbookResponse(dataset, null, session);
+ }
+ }
+ if (dataset.getGuestbook() != null && !dataset.getGuestbook().getCustomQuestions().isEmpty()) {
+ initCustomQuestions(guestbookResponse, dataset);
+ }
+ guestbookResponse.setDownloadtype("Download");
+ if ("subset".equalsIgnoreCase(downloadFormat)) {
+ guestbookResponse.setDownloadtype("Subset");
+ }
+ if ("explore".equalsIgnoreCase(downloadFormat)) {
+ guestbookResponse.setDownloadtype("Explore");
+ }
+ guestbookResponse.setDataset(dataset);
+
+ return guestbookResponse;
+ }
+
+ private void initCustomQuestions(GuestbookResponse guestbookResponse, Dataset dataset) {
+ guestbookResponse.setCustomQuestionResponses(new ArrayList<>());
+ for (CustomQuestion cq : dataset.getGuestbook().getCustomQuestions()) {
+ CustomQuestionResponse cqr = new CustomQuestionResponse();
+ cqr.setGuestbookResponse(guestbookResponse);
+ cqr.setCustomQuestion(cq);
+ cqr.setResponse("");
+ if (cq.getQuestionType().equals("options")) {
+ //response select Items
+ cqr.setResponseSelectItems(setResponseUISelectItems(cq));
+ }
+ guestbookResponse.getCustomQuestionResponses().add(cqr);
+ }
+ }
+
+ private void setUserDefaultResponses(GuestbookResponse guestbookResponse, DataverseSession session) {
+ User user = session.getUser();
if (user != null) {
guestbookResponse.setEmail(getUserEMail(user));
guestbookResponse.setName(getUserName(user));
@@ -456,8 +567,125 @@ public GuestbookResponse initDefaultGuestbookResponse(Dataset dataset, DataFile
guestbookResponse.setPosition("");
guestbookResponse.setAuthenticatedUser(null);
}
+ guestbookResponse.setSessionId(session.toString());
+ }
+
+ public GuestbookResponse initDefaultGuestbookResponse(Dataset dataset, DataFile dataFile, DataverseSession session) {
+ GuestbookResponse guestbookResponse = new GuestbookResponse();
+ guestbookResponse.setGuestbook(findDefaultGuestbook());
+ if(dataset.getLatestVersion() != null && dataset.getLatestVersion().isDraft()){
+ guestbookResponse.setWriteResponse(false);
+ }
+ if (dataFile != null){
+ guestbookResponse.setDataFile(dataFile);
+ }
+ guestbookResponse.setDataset(dataset);
+ guestbookResponse.setResponseTime(new Date());
+ guestbookResponse.setSessionId(session.toString());
+ guestbookResponse.setDownloadtype("Download");
+ setUserDefaultResponses(guestbookResponse, session);
return guestbookResponse;
}
+
+ public void guestbookResponseValidator(FacesContext context, UIComponent toValidate, Object value) {
+ String response = (String) value;
+
+ if (response != null && response.length() > 255) {
+ ((UIInput) toValidate).setValid(false);
+ FacesMessage message = new FacesMessage(FacesMessage.SEVERITY_ERROR, JH.localize("dataset.guestbookResponse.guestbook.responseTooLong"), null);
+ context.addMessage(toValidate.getClientId(context), message);
+ }
+ }
+
+ public GuestbookResponse modifyDatafile(GuestbookResponse in, FileMetadata fm) {
+ if (in != null && fm.getDataFile() != null) {
+ in.setDataFile(fm.getDataFile());
+ }
+ if (in != null && fm.getDatasetVersion() != null && fm.getDatasetVersion().isDraft() ) {
+ in.setWriteResponse(false);
+ }
+ return in;
+ }
+
+ public GuestbookResponse modifySelectedFileIds(GuestbookResponse in, String fileIds) {
+ if (in != null && fileIds != null) {
+ in.setSelectedFileIds(fileIds);
+ }
+ return in;
+ }
+
+ public GuestbookResponse modifyDatafileAndFormat(GuestbookResponse in, FileMetadata fm, String format) {
+ if (in != null && fm.getDataFile() != null) {
+ in.setFileFormat(format);
+ in.setDataFile(fm.getDataFile());
+ }
+ if (in != null && fm.getDatasetVersion() != null && fm.getDatasetVersion().isDraft() ) {
+ in.setWriteResponse(false);
+ }
+
+ return in;
+ }
+
+ public Boolean validateGuestbookResponse(GuestbookResponse guestbookResponse, String type) {
+ boolean valid = true;
+ Dataset dataset = guestbookResponse.getDataset();
+ if (dataset.getGuestbook() != null) {
+ if (dataset.getGuestbook().isNameRequired()) {
+ if (guestbookResponse.getName() == null) {
+ valid = false;
+ } else {
+ valid &= !guestbookResponse.getName().isEmpty();
+ }
+ }
+ if (dataset.getGuestbook().isEmailRequired()) {
+ if (guestbookResponse.getEmail() == null) {
+ valid = false;
+ } else {
+ valid &= !guestbookResponse.getEmail().isEmpty();
+ }
+ }
+ if (dataset.getGuestbook().isInstitutionRequired()) {
+ if (guestbookResponse.getInstitution() == null) {
+ valid = false;
+ } else {
+ valid &= !guestbookResponse.getInstitution().isEmpty();
+ }
+ }
+ if (dataset.getGuestbook().isPositionRequired()) {
+ if (guestbookResponse.getPosition() == null) {
+ valid = false;
+ } else {
+ valid &= !guestbookResponse.getPosition().isEmpty();
+ }
+ }
+ }
+
+ if (dataset.getGuestbook() != null && !dataset.getGuestbook().getCustomQuestions().isEmpty()) {
+ for (CustomQuestion cq : dataset.getGuestbook().getCustomQuestions()) {
+ if (cq.isRequired()) {
+ for (CustomQuestionResponse cqr : guestbookResponse.getCustomQuestionResponses()) {
+ if (cqr.getCustomQuestion().equals(cq)) {
+ valid &= (cqr.getResponse() != null && !cqr.getResponse().isEmpty());
+ }
+ }
+ }
+ }
+ }
+
+ return valid;
+ }
+
+ private List<SelectItem> setResponseUISelectItems(CustomQuestion cq) {
+ List<SelectItem> retList = new ArrayList<>();
+ for (CustomQuestionValue cqv : cq.getCustomQuestionValues()) {
+ SelectItem si = new SelectItem(cqv.getValueString(), cqv.getValueString());
+ retList.add(si);
+ }
+ return retList;
+ }
+
+
+
public GuestbookResponse findById(Long id) {
return em.find(GuestbookResponse.class, id);
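The `validateGuestbookResponse` method added above repeats the same null-or-empty check for name, email, institution, position, and each required custom question. The per-field rule it applies reduces to the following (a sketch; the patch itself keeps the expanded form):

```java
public class RequiredFieldRule {
    // A field passes when it is not required, or when it is present and
    // non-empty; this is the predicate validateGuestbookResponse ANDs
    // together across all required guestbook fields.
    static boolean fieldOk(boolean required, String value) {
        return !required || (value != null && !value.isEmpty());
    }
}
```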
diff --git a/src/main/java/edu/harvard/iq/dataverse/NavigationWrapper.java b/src/main/java/edu/harvard/iq/dataverse/NavigationWrapper.java
index 989b69d10b0..beee1a81173 100644
--- a/src/main/java/edu/harvard/iq/dataverse/NavigationWrapper.java
+++ b/src/main/java/edu/harvard/iq/dataverse/NavigationWrapper.java
@@ -51,7 +51,7 @@ public String getPageFromContext() {
// that we don't want, so we filter through a list of parameters we do allow
// @todo verify what needs to be in this list of available parameters (for example do we want to repeat searches when you login?)
List acceptableParameters = new ArrayList();
- acceptableParameters.addAll(Arrays.asList("id", "alias", "version", "q", "ownerId", "persistentId", "versionId", "datasetId", "selectedFileIds", "mode", "dataverseId"));
+ acceptableParameters.addAll(Arrays.asList("id", "alias", "version", "q", "ownerId", "persistentId", "versionId", "datasetId", "selectedFileIds", "mode", "dataverseId", "fileId", "datasetVersionId"));
if (req.getParameterMap() != null) {
StringBuilder queryString = new StringBuilder();
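The NavigationWrapper change above extends the whitelist of query parameters that survive the redirect with `fileId` and `datasetVersionId`. A standalone sketch of the filtering it performs (assuming a simple single-valued parameter map rather than the servlet `getParameterMap()`):

```java
import java.util.Arrays;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

public class RedirectParams {
    // The allowed parameter names from getPageFromContext(), including the
    // fileId and datasetVersionId entries added by this patch.
    static final Set<String> ACCEPTABLE = new HashSet<>(Arrays.asList(
            "id", "alias", "version", "q", "ownerId", "persistentId", "versionId",
            "datasetId", "selectedFileIds", "mode", "dataverseId",
            "fileId", "datasetVersionId"));

    // Keeps only whitelisted parameters when rebuilding the query string.
    static String filterQueryString(Map<String, String> params) {
        StringBuilder sb = new StringBuilder();
        for (Map.Entry<String, String> e : params.entrySet()) {
            if (ACCEPTABLE.contains(e.getKey())) {
                sb.append(sb.length() == 0 ? "?" : "&")
                  .append(e.getKey()).append('=').append(e.getValue());
            }
        }
        return sb.toString();
    }
}
```

Anything not on the list, such as one-off JSF or tracking parameters, is silently dropped rather than replayed after login.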
diff --git a/src/main/java/edu/harvard/iq/dataverse/api/AbstractApiBean.java b/src/main/java/edu/harvard/iq/dataverse/api/AbstractApiBean.java
index 6068a2c6e1e..388bafb58c2 100644
--- a/src/main/java/edu/harvard/iq/dataverse/api/AbstractApiBean.java
+++ b/src/main/java/edu/harvard/iq/dataverse/api/AbstractApiBean.java
@@ -37,6 +37,7 @@
import edu.harvard.iq.dataverse.validation.BeanValidationServiceBean;
import java.io.StringReader;
import java.net.URI;
+import java.util.UUID;
import java.util.concurrent.Callable;
import java.util.logging.Level;
import java.util.logging.Logger;
@@ -99,7 +100,7 @@ public Response refineResponse( String message ) {
final Throwable cause = getCause();
baseMessage = (cause!=null ? cause.getMessage() : "");
}
- return errorResponse(statusCode, message+" "+baseMessage);
+ return error(statusCode, message+" "+baseMessage);
}
/**
@@ -203,7 +204,59 @@ public JsonParser call() throws Exception {
return new JsonParser(datasetFieldSvc, metadataBlockSvc,settingsSvc);
}
});
-
+
+ /**
+ * Functional interface for handling HTTP requests in the APIs.
+ *
+ * @see #response(edu.harvard.iq.dataverse.api.AbstractApiBean.DataverseRequestHandler)
+ */
+ protected static interface DataverseRequestHandler {
+ Response handle( DataverseRequest u ) throws WrappedResponse;
+ }
+
+
+ /* ===================== *\
+ * Utility Methods *
+ * Get that DSL feelin' *
+ \* ===================== */
+
+ protected JsonParser jsonParser() {
+ return jsonParserRef.get();
+ }
+
+ protected boolean isNumeric( String str ) {
+ return Util.isNumeric(str);
+ }
+
+ protected boolean parseBooleanOrDie( String input ) throws WrappedResponse {
+ if (input == null ) throw new WrappedResponse( badRequest("Boolean value missing"));
+ input = input.trim();
+ if ( Util.isBoolean(input) ) {
+ return Util.isTrue(input);
+ } else {
+ throw new WrappedResponse( badRequest("Illegal boolean value '" + input + "'"));
+ }
+ }
+
+ /**
+ * Returns the {@code key} query parameter from the current request, or {@code null} if
+ * the request has no such parameter.
+ * @param key Name of the requested parameter.
+ * @return Value of the requested parameter in the current request.
+ */
+ protected String getRequestParameter( String key ) {
+ return httpRequest.getParameter(key);
+ }
+
+ protected String getRequestApiKey() {
+ String headerParamApiKey = httpRequest.getHeader(DATAVERSE_KEY_HEADER_NAME);
+ String queryParamApiKey = httpRequest.getParameter("key");
+ return headerParamApiKey!=null ? headerParamApiKey : queryParamApiKey;
+ }
+
+ /* ========= *\
+ * Finders *
+ \* ========= */
protected RoleAssignee findAssignee(String identifier) {
try {
RoleAssignee roleAssignee = roleAssigneeSvc.getRoleAssignee(identifier);
@@ -213,7 +266,7 @@ protected RoleAssignee findAssignee(String identifier) {
while (cause.getCause() != null) {
cause = cause.getCause();
}
- logger.info("Exception caught looking up RoleAssignee based on identifier '" + identifier + "': " + cause.getMessage());
+ logger.log(Level.INFO, "Exception caught looking up RoleAssignee based on identifier ''{0}'': {1}", new Object[]{identifier, cause.getMessage()});
return null;
}
}
@@ -228,22 +281,6 @@ protected AuthenticatedUser findUserByApiToken( String apiKey ) {
return authSvc.lookupUser(apiKey);
}
- /**
- * Returns the {@code key} query parameter from the current request, or {@code null} if
- * the request has no such parameter.
- * @param key Name of the requested parameter.
- * @return Value of the requested parameter in the current request.
- */
- protected String getRequestParameter( String key ) {
- return httpRequest.getParameter(key);
- }
-
- protected String getRequestApiKey() {
- String headerParamApiKey = httpRequest.getHeader(DATAVERSE_KEY_HEADER_NAME);
- String queryParamApiKey = httpRequest.getParameter("key");
- return headerParamApiKey!=null ? headerParamApiKey : queryParamApiKey;
- }
-
/**
* Returns the user pointed to by the API key, or the guest user.
* @return a user, may be a guest user.
@@ -285,7 +322,13 @@ private AuthenticatedUser findAuthenticatedUserOrDie( String key ) throws Wrappe
throw new WrappedResponse( badApiKey(key) );
}
-
+ protected Dataverse findDataverseOrDie( String dvIdtf ) throws WrappedResponse {
+ Dataverse dv = findDataverse(dvIdtf);
+ if ( dv == null ) {
+ throw new WrappedResponse(error( Response.Status.NOT_FOUND, "Can't find dataverse with identifier='" + dvIdtf + "'"));
+ }
+ return dv;
+ }
protected DataverseRequest createDataverseRequest( User u ) {
return new DataverseRequest(u, httpRequest);
@@ -322,7 +365,7 @@ protected DvObject findDvo( String id ) {
protected <T> T failIfNull( T t, String errorMessage ) throws WrappedResponse {
if ( t != null ) return t;
- throw new WrappedResponse( errorResponse( Response.Status.BAD_REQUEST,errorMessage) );
+ throw new WrappedResponse( error( Response.Status.BAD_REQUEST,errorMessage) );
}
protected MetadataBlock findMetadataBlock(Long id) {
@@ -337,12 +380,24 @@ protected DatasetFieldType findDatasetFieldType(String idtf) throws NumberFormat
: datasetFieldSvc.findByNameOpt(idtf);
}
+ /* =================== *\
+ * Command Execution *
+ \* =================== */
+
+ /**
+ * Executes a command, and returns the appropriate result/HTTP response.
+ * @param <T> Return type for the command
+ * @param cmd The command to execute.
+ * @return Value from the command
+ * @throws edu.harvard.iq.dataverse.api.AbstractApiBean.WrappedResponse Unwrap and return.
+ * @see #response(java.util.concurrent.Callable)
+ */
protected <T> T execCommand( Command<T> cmd ) throws WrappedResponse {
try {
return engineSvc.submit(cmd);
} catch (IllegalCommandException ex) {
- throw new WrappedResponse( ex, errorResponse(Response.Status.BAD_REQUEST, ex.getMessage() ) );
+ throw new WrappedResponse( ex, error(Response.Status.FORBIDDEN, ex.getMessage() ) );
} catch (PermissionException ex) {
/**
@@ -350,31 +405,82 @@ protected T execCommand( Command cmd ) throws WrappedResponse {
* There's valuable information in there that can help people reason
* about permissions!
*/
- throw new WrappedResponse(errorResponse(Response.Status.UNAUTHORIZED,
+ throw new WrappedResponse(error(Response.Status.UNAUTHORIZED,
"User " + cmd.getRequest().getUser().getIdentifier() + " is not permitted to perform requested action.") );
} catch (CommandException ex) {
Logger.getLogger(AbstractApiBean.class.getName()).log(Level.SEVERE, "Error while executing command " + cmd, ex);
- throw new WrappedResponse(ex, errorResponse(Status.INTERNAL_SERVER_ERROR, ex.getMessage()));
+ throw new WrappedResponse(ex, error(Status.INTERNAL_SERVER_ERROR, ex.getMessage()));
}
}
- protected Response okResponse( JsonArrayBuilder bld ) {
- return Response.ok(Json.createObjectBuilder()
- .add("status", "OK")
- .add("data", bld).build()).build();
+ /**
+ * A syntactically nicer way of using {@link #execCommand(edu.harvard.iq.dataverse.engine.command.Command)}.
+ * @param hdl The block to run.
+ * @return HTTP Response appropriate for the way {@code hdl} executed.
+ */
+ protected Response response( Callable<Response> hdl ) {
+ try {
+ return hdl.call();
+ } catch ( WrappedResponse rr ) {
+ return rr.getResponse();
+ } catch ( Exception ex ) {
+ String incidentId = UUID.randomUUID().toString();
+ logger.log(Level.SEVERE, "API internal error " + incidentId +": " + ex.getMessage(), ex);
+ return Response.status(500)
+ .entity( Json.createObjectBuilder()
+ .add("status", "ERROR")
+ .add("code", 500)
+ .add("message", "Internal server error. More details available at the server logs.")
+ .add("incidentId", incidentId)
+ .build())
+ .type("application/json").build();
+ }
}
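The catch-all pattern introduced above (wrap the business logic in a `Callable`, unwrap a `WrappedResponse` into its prepared payload, and convert any other exception into a 500 carrying a random incident ID for log correlation) can be sketched outside JAX-RS as follows. `SimpleResponse` is a toy stand-in for `javax.ws.rs.core.Response`, not part of the Dataverse API:

```java
import java.util.UUID;
import java.util.concurrent.Callable;

// Minimal sketch of the response(Callable) pattern, assuming a toy
// SimpleResponse type in place of javax.ws.rs.core.Response.
public class ResponseWrapper {

    // Toy stand-in for an HTTP response: just a status code and a body.
    public static final class SimpleResponse {
        public final int status;
        public final String body;
        SimpleResponse(int status, String body) { this.status = status; this.body = body; }
    }

    // Checked exception carrying a ready-made response, like WrappedResponse.
    public static final class WrappedResponse extends Exception {
        final SimpleResponse response;
        public WrappedResponse(SimpleResponse r) { this.response = r; }
    }

    // Run the handler; unwrap WrappedResponse; map anything else to a 500
    // whose body includes a random incident id for log correlation.
    public static SimpleResponse response(Callable<SimpleResponse> hdl) {
        try {
            return hdl.call();
        } catch (WrappedResponse wr) {
            return wr.response;
        } catch (Exception ex) {
            String incidentId = UUID.randomUUID().toString();
            // A real implementation would log ex together with incidentId here.
            return new SimpleResponse(500,
                "{\"status\":\"ERROR\",\"code\":500,\"incidentId\":\"" + incidentId + "\"}");
        }
    }

    public static void main(String[] args) {
        SimpleResponse ok = response(() -> new SimpleResponse(200, "{\"status\":\"OK\"}"));
        System.out.println(ok.status);
        SimpleResponse boom = response(() -> { throw new IllegalStateException("boom"); });
        System.out.println(boom.status);
    }
}
```

The payoff is that endpoint methods never need their own try/catch ladders: they return a `Response` or throw, and this single wrapper guarantees a well-formed JSON error envelope either way.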
- protected Response createdResponse( String uri, JsonObjectBuilder bld ) {
- return Response.created( URI.create(uri) )
+ /**
+ * The preferred way of handling a request that requires a user. The system
+ * looks for the user and, if found, hands it to the handler for doing the
+ * actual work.
+ *
+ * This is a relatively secure way to handle things, since if the user is not
+ * found, the response is about the bad API key, rather than something else
+ * (say, 404 NOT FOUND which leaks information about the existence of the
+ * sought object).
+ *
+ * @param hdl handling code block.
+ * @return HTTP Response appropriate for the way {@code hdl} executed.
+ */
+ protected Response response( DataverseRequestHandler hdl ) {
+ try {
+ return hdl.handle(createDataverseRequest(findUserOrDie()));
+ } catch ( WrappedResponse rr ) {
+ return rr.getResponse();
+ } catch ( Exception ex ) {
+ String incidentId = UUID.randomUUID().toString();
+ logger.log(Level.SEVERE, "API internal error " + incidentId +": " + ex.getMessage(), ex);
+ return Response.status(500)
.entity( Json.createObjectBuilder()
- .add("status", "OK")
- .add("data", bld).build())
- .type(MediaType.APPLICATION_JSON)
- .build();
+ .add("status", "ERROR")
+ .add("code", 500)
+ .add("message", "Internal server error. More details available at the server logs.")
+ .add("incidentId", incidentId)
+ .build())
+ .type("application/json").build();
+ }
+ }
+
+ /* ====================== *\
+ * HTTP Response methods *
+ \* ====================== */
+
+ protected Response ok( JsonArrayBuilder bld ) {
+ return Response.ok(Json.createObjectBuilder()
+ .add("status", "OK")
+ .add("data", bld).build()).build();
}
- protected Response okResponse( JsonObjectBuilder bld ) {
+ protected Response ok( JsonObjectBuilder bld ) {
return Response.ok( Json.createObjectBuilder()
.add("status", "OK")
.add("data", bld).build() )
@@ -382,7 +488,7 @@ protected Response okResponse( JsonObjectBuilder bld ) {
.build();
}
- protected Response okResponse( String msg ) {
+ protected Response ok( String msg ) {
return Response.ok().entity(Json.createObjectBuilder()
.add("status", "OK")
.add("data", Json.createObjectBuilder().add("message",msg)).build() )
@@ -390,24 +496,21 @@ protected Response okResponse( String msg ) {
.build();
}
- /**
- * Returns an OK response (HTTP 200, status:OK) with the passed value
- * in the data field.
- * @param value the value for the data field
- * @return a HTTP OK response with the passed value as data.
- */
- protected Response okResponseWithValue( String value ) {
- return Response.ok(Json.createObjectBuilder()
- .add("status", "OK")
- .add("data", value).build(), MediaType.APPLICATION_JSON_TYPE ).build();
- }
-
- protected Response okResponseWithValue( boolean value ) {
+ protected Response ok( boolean value ) {
return Response.ok().entity(Json.createObjectBuilder()
.add("status", "OK")
.add("data", value).build() ).build();
}
+ protected Response created( String uri, JsonObjectBuilder bld ) {
+ return Response.created( URI.create(uri) )
+ .entity( Json.createObjectBuilder()
+ .add("status", "OK")
+ .add("data", bld).build())
+ .type(MediaType.APPLICATION_JSON)
+ .build();
+ }
+
protected Response accepted() {
return Response.accepted()
.entity(Json.createObjectBuilder()
@@ -415,52 +518,34 @@ protected Response accepted() {
).build();
}
- protected JsonParser jsonParser() {
- return jsonParserRef.get();
- }
-
protected Response notFound( String msg ) {
- return errorResponse(Status.NOT_FOUND, msg);
+ return error(Status.NOT_FOUND, msg);
}
protected Response badRequest( String msg ) {
- return errorResponse( Status.BAD_REQUEST, msg );
+ return error( Status.BAD_REQUEST, msg );
}
protected Response badApiKey( String apiKey ) {
- return errorResponse(Status.UNAUTHORIZED, (apiKey != null ) ? "Bad api key '" + apiKey +"'" : "Please provide a key query parameter (?key=XXX) or via the HTTP header " + DATAVERSE_KEY_HEADER_NAME );
+ return error(Status.UNAUTHORIZED, (apiKey != null ) ? "Bad api key '" + apiKey +"'" : "Please provide a key query parameter (?key=XXX) or use the HTTP header " + DATAVERSE_KEY_HEADER_NAME );
}
protected Response permissionError( PermissionException pe ) {
- return errorResponse( Status.UNAUTHORIZED, pe.getMessage() );
+ return permissionError( pe.getMessage() );
}
-
- protected static Response errorResponse( Status sts ) {
- return errorResponse(sts, null);
+
+ protected Response permissionError( String message ) {
+ return error( Status.UNAUTHORIZED, message );
}
- protected static Response errorResponse( Status sts, String msg ) {
+ protected static Response error( Status sts, String msg ) {
return Response.status(sts)
.entity( NullSafeJsonBuilder.jsonObjectBuilder()
.add("status", "ERROR")
.add( "message", msg ).build()
).type(MediaType.APPLICATION_JSON_TYPE).build();
}
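The `error` helper above builds its envelope with `NullSafeJsonBuilder` so that a `null` message (as passed by the old no-message `errorResponse(sts)` overload) is silently dropped rather than blowing up the builder. A minimal sketch of that idea, using plain string concatenation in place of `javax.json` (all names here are illustrative):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Minimal sketch of a null-safe JSON object builder: add() silently skips
// null values, so building an error with a null message yields
// {"status":"ERROR"} instead of throwing. Hypothetical stand-in for
// NullSafeJsonBuilder + javax.json, string-valued fields only.
public class NullSafeBuilder {
    private final Map<String, String> fields = new LinkedHashMap<>();

    public NullSafeBuilder add(String name, String value) {
        if (value != null) {           // the "null-safe" part
            fields.put(name, value);
        }
        return this;                   // fluent, like JsonObjectBuilder
    }

    public String build() {
        StringBuilder sb = new StringBuilder("{");
        boolean first = true;
        for (Map.Entry<String, String> e : fields.entrySet()) {
            if (!first) sb.append(",");
            sb.append("\"").append(e.getKey()).append("\":\"").append(e.getValue()).append("\"");
            first = false;
        }
        return sb.append("}").toString();
    }

    public static void main(String[] args) {
        // prints {"status":"ERROR"} -- the null message is skipped
        System.out.println(new NullSafeBuilder().add("status", "ERROR").add("message", null).build());
    }
}
```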
-
- protected Response execute( Command c ) {
- try {
- engineSvc.submit( c );
- return accepted();
-
- } catch ( PermissionException pex ) {
- return permissionError( pex );
-
- } catch ( CommandException ce ) {
- return errorResponse(Status.INTERNAL_SERVER_ERROR, ce.getLocalizedMessage());
- }
- }
-
- protected boolean isNumeric( String str ) { return Util.isNumeric(str); };
+
}
class LazyRef<T> {
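The `LazyRef` helper (its body is elided in this hunk) is, in essence, a one-shot lazy initializer: a factory that runs at most once, on first access, with the result cached. A minimal sketch under that assumption; `LazyRefDemo` and the field names are illustrative, not the Dataverse implementation:

```java
import java.util.concurrent.Callable;

// Minimal sketch of a lazy reference: the factory runs at most once, on
// first get(), and the result is cached. Not thread-safe; a sketch only.
class LazyRef<T> {
    private final Callable<T> factory;
    private T value;
    private boolean computed = false;

    LazyRef(Callable<T> factory) { this.factory = factory; }

    T get() {
        if (!computed) {
            try {
                value = factory.call();
            } catch (Exception e) {
                throw new RuntimeException(e);
            }
            computed = true;
        }
        return value;
    }
}

public class LazyRefDemo {
    static int calls = 0;

    public static void main(String[] args) {
        LazyRef<String> ref = new LazyRef<>(() -> { calls++; return "expensive"; });
        System.out.println(ref.get());
        System.out.println(ref.get());
        System.out.println(calls); // factory ran once despite two get() calls
    }
}
```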
diff --git a/src/main/java/edu/harvard/iq/dataverse/api/Access.java b/src/main/java/edu/harvard/iq/dataverse/api/Access.java
index 11746a19c04..2377c459992 100644
--- a/src/main/java/edu/harvard/iq/dataverse/api/Access.java
+++ b/src/main/java/edu/harvard/iq/dataverse/api/Access.java
@@ -18,6 +18,8 @@
import edu.harvard.iq.dataverse.DataverseServiceBean;
import edu.harvard.iq.dataverse.DataverseSession;
import edu.harvard.iq.dataverse.DataverseTheme;
+import edu.harvard.iq.dataverse.GuestbookResponse;
+import edu.harvard.iq.dataverse.GuestbookResponseServiceBean;
import edu.harvard.iq.dataverse.PermissionServiceBean;
import edu.harvard.iq.dataverse.authorization.Permission;
import edu.harvard.iq.dataverse.authorization.users.AuthenticatedUser;
@@ -113,6 +115,8 @@ public class Access extends AbstractApiBean {
WorldMapTokenServiceBean worldMapTokenServiceBean;
@Inject
DataverseRequestServiceBean dvRequestService;
+ @EJB
+ GuestbookResponseServiceBean guestbookResponseService;
private static final String API_KEY_HEADER = "X-Dataverse-key";
@@ -174,10 +178,16 @@ public BundleDownloadInstance datafileBundle(@PathParam("fileId") Long fileId, @
@Path("datafile/{fileId}")
@GET
@Produces({ "application/xml" })
- public DownloadInstance datafile(@PathParam("fileId") Long fileId, @QueryParam("key") String apiToken, @Context UriInfo uriInfo, @Context HttpHeaders headers, @Context HttpServletResponse response) /*throws NotFoundException, ServiceUnavailableException, PermissionDeniedException, AuthorizationRequiredException*/ {
-
+ public DownloadInstance datafile(@PathParam("fileId") Long fileId, @QueryParam("gbrecs") Boolean gbrecs, @QueryParam("key") String apiToken, @Context UriInfo uriInfo, @Context HttpHeaders headers, @Context HttpServletResponse response) /*throws NotFoundException, ServiceUnavailableException, PermissionDeniedException, AuthorizationRequiredException*/ {
DataFile df = dataFileService.find(fileId);
+ GuestbookResponse gbr = null;
+ /*
+ if (gbrecs == null && df.isReleased()){
+ //commenting out for 4.6 SEK
+ // gbr = guestbookResponseService.initDefaultGuestbookResponse(df.getOwner(), df, session);
+ }
+ */
if (df == null) {
logger.warning("Access: datafile service could not locate a DataFile object for id "+fileId+"!");
throw new WebApplicationException(Response.Status.NOT_FOUND);
@@ -206,7 +216,11 @@ public DownloadInstance datafile(@PathParam("fileId") Long fileId, @QueryParam("
dInfo.addServiceAvailable(new OptionalAccessService("subset", "text/tab-separated-values", "variables=<LIST>", "Column-wise Subsetting"));
}
DownloadInstance downloadInstance = new DownloadInstance(dInfo);
-
+ if (gbr != null){
+ downloadInstance.setGbr(gbr);
+ downloadInstance.setDataverseRequestService(dvRequestService);
+ downloadInstance.setCommand(engineSvc);
+ }
for (String key : uriInfo.getQueryParameters().keySet()) {
String value = uriInfo.getQueryParameters().getFirst(key);
diff --git a/src/main/java/edu/harvard/iq/dataverse/api/Admin.java b/src/main/java/edu/harvard/iq/dataverse/api/Admin.java
index ac1128d3aeb..7cad071d341 100644
--- a/src/main/java/edu/harvard/iq/dataverse/api/Admin.java
+++ b/src/main/java/edu/harvard/iq/dataverse/api/Admin.java
@@ -1,6 +1,7 @@
package edu.harvard.iq.dataverse.api;
import edu.harvard.iq.dataverse.Dataverse;
+import edu.harvard.iq.dataverse.DvObject;
import edu.harvard.iq.dataverse.EMailValidator;
import edu.harvard.iq.dataverse.actionlogging.ActionLogRecord;
import edu.harvard.iq.dataverse.api.dto.RoleDTO;
@@ -9,7 +10,6 @@
import edu.harvard.iq.dataverse.authorization.UserIdentifier;
import edu.harvard.iq.dataverse.authorization.exceptions.AuthenticationProviderFactoryNotFoundException;
import edu.harvard.iq.dataverse.authorization.exceptions.AuthorizationSetupException;
-import edu.harvard.iq.dataverse.authorization.providers.AuthenticationProviderFactory;
import edu.harvard.iq.dataverse.authorization.providers.AuthenticationProviderRow;
import edu.harvard.iq.dataverse.authorization.providers.builtin.BuiltinUser;
import edu.harvard.iq.dataverse.authorization.providers.builtin.BuiltinUserServiceBean;
@@ -32,11 +32,9 @@
import javax.ws.rs.Path;
import javax.ws.rs.PathParam;
import javax.ws.rs.core.Response;
-
import static edu.harvard.iq.dataverse.util.json.NullSafeJsonBuilder.jsonObjectBuilder;
import static edu.harvard.iq.dataverse.util.json.JsonPrinter.*;
import java.io.StringReader;
-import java.util.List;
import java.util.Map;
import java.util.logging.Level;
import java.util.logging.Logger;
@@ -48,7 +46,9 @@
import javax.validation.ConstraintViolationException;
import javax.ws.rs.Produces;
import javax.ws.rs.core.Response.Status;
-import static edu.harvard.iq.dataverse.api.AbstractApiBean.errorResponse;
+import static edu.harvard.iq.dataverse.api.AbstractApiBean.error;
+import edu.harvard.iq.dataverse.authorization.RoleAssignee;
+import edu.harvard.iq.dataverse.authorization.users.User;
/**
* Where the secure, setup API calls live.
* @author michael
@@ -70,14 +70,14 @@ public Response listAllSettings() {
JsonObjectBuilder bld = jsonObjectBuilder();
settingsSvc.listAll().forEach(
s -> bld.add(s.getName(), s.getContent()));
- return okResponse(bld);
+ return ok(bld);
}
@Path("settings/{name}")
@PUT
public Response putSetting( @PathParam("name") String name, String content ) {
Setting s = settingsSvc.set(name, content);
- return okResponse( jsonObjectBuilder().add(s.getName(), s.getContent()) );
+ return ok( jsonObjectBuilder().add(s.getName(), s.getContent()) );
}
@Path("settings/{name}")
@@ -86,7 +86,7 @@ public Response getSetting( @PathParam("name") String name ) {
String s = settingsSvc.get(name);
return ( s != null )
- ? okResponse( s )
+ ? ok( s )
: notFound("Setting " + name + " not found");
}
@@ -95,33 +95,27 @@ public Response getSetting( @PathParam("name") String name ) {
public Response deleteSetting( @PathParam("name") String name ) {
settingsSvc.delete(name);
- return okResponse("Setting " + name + " deleted.");
+ return ok("Setting " + name + " deleted.");
}
@Path("authenticationProviderFactories")
@GET
public Response listAuthProviderFactories() {
- JsonArrayBuilder arb = Json.createArrayBuilder();
- for ( AuthenticationProviderFactory f : authSvc.listProviderFactories() ){
- arb.add( jsonObjectBuilder()
+ return ok(authSvc.listProviderFactories()
+ .stream()
+ .map( f -> jsonObjectBuilder()
.add("alias", f.getAlias() )
- .add("info", f.getInfo() ));
- }
-
- return okResponse(arb);
+ .add("info", f.getInfo() ) )
+ .collect( toJsonArray() )
+ );
}
@Path("authenticationProviders")
@GET
public Response listAuthProviders() {
- JsonArrayBuilder arb = Json.createArrayBuilder();
- for ( AuthenticationProviderRow r :
- em.createNamedQuery("AuthenticationProviderRow.findAll", AuthenticationProviderRow.class).getResultList() ){
- arb.add( json(r) );
- }
-
- return okResponse(arb);
+ return ok(em.createNamedQuery("AuthenticationProviderRow.findAll", AuthenticationProviderRow.class).getResultList()
+ .stream().map( r->json(r) ).collect( toJsonArray() ));
}
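The refactor in `listAuthProviderFactories` and `listAuthProviders` replaces explicit loops over a `JsonArrayBuilder` with a stream pipeline ending in a custom collector, `toJsonArray()`. The shape of such a collector can be sketched with `Collector.of`, using `StringJoiner` as a stand-in for `JsonArrayBuilder` (the class and method names here are illustrative, not Dataverse's):

```java
import java.util.StringJoiner;
import java.util.stream.Collector;
import java.util.stream.Stream;

// Sketch of the stream-collector refactor: instead of an explicit loop that
// add()s into a builder, map each element to its JSON form and collect with
// a custom Collector. StringJoiner stands in for JsonArrayBuilder here;
// Dataverse's toJsonArray() plays the same role over javax.json builders.
public class JsonArrayCollector {

    public static Collector<String, StringJoiner, String> toJsonArray() {
        return Collector.of(
            () -> new StringJoiner(",", "[", "]"),  // supplier: empty array builder
            StringJoiner::add,                      // accumulator: add one element
            StringJoiner::merge,                    // combiner: merge partial builders
            StringJoiner::toString);                // finisher: render the array
    }

    public static void main(String[] args) {
        // prints [{"alias":"alias1"},{"alias":"alias2"}]
        String out = Stream.of("alias1", "alias2")
            .map(a -> "{\"alias\":\"" + a + "\"}")
            .collect(toJsonArray());
        System.out.println(out);
    }
}
```

Wrapping the builder in a collector keeps each endpoint to a single expression: fetch, map to JSON, collect, and hand the result straight to `ok(...)`.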
@Path("authenticationProviders")
@@ -140,9 +134,9 @@ public Response addProvider( AuthenticationProviderRow row ) {
authSvc.deregisterProvider(provider.getId());
authSvc.registerProvider(provider);
}
- return createdResponse("/s/authenticationProviders/"+managed.getId(), json(managed));
+ return created("/s/authenticationProviders/"+managed.getId(), json(managed));
} catch ( AuthorizationSetupException e ) {
- return errorResponse(Response.Status.INTERNAL_SERVER_ERROR, e.getMessage() );
+ return error(Response.Status.INTERNAL_SERVER_ERROR, e.getMessage() );
}
}
@@ -150,8 +144,8 @@ public Response addProvider( AuthenticationProviderRow row ) {
@GET
public Response showProvider( @PathParam("id") String id ) {
AuthenticationProviderRow row = em.find(AuthenticationProviderRow.class, id);
- return (row != null ) ? okResponse( json(row) )
- : errorResponse(Status.NOT_FOUND,"Can't find authetication provider with id '" + id + "'");
+ return (row != null ) ? ok( json(row) )
+ : error(Status.NOT_FOUND,"Can't find authentication provider with id '" + id + "'");
}
@POST
@@ -160,13 +154,13 @@ public Response showProvider( @PathParam("id") String id ) {
public Response enableAuthenticationProvider( @PathParam("id")String id, String body ) {
if ( ! Util.isBoolean(body) ) {
- return errorResponse(Response.Status.BAD_REQUEST, "Illegal value '" + body + "'. Try 'true' or 'false'");
+ return error(Response.Status.BAD_REQUEST, "Illegal value '" + body + "'. Use 'true' or 'false'");
}
boolean enable = Util.isTrue(body);
AuthenticationProviderRow row = em.find(AuthenticationProviderRow.class, id);
if ( row == null ) {
- return errorResponse( Status.NOT_FOUND, "Can't find authentication provider with id '" + id + "'");
+ return notFound("Can't find authentication provider with id '" + id + "'");
}
row.setEnabled(enable);
@@ -175,25 +169,24 @@ public Response enableAuthenticationProvider( @PathParam("id")String id, String
if ( enable ) {
// enable a provider
if ( authSvc.getAuthenticationProvider(id) != null ) {
- return okResponse( String.format("Authentication provider '%s' already enabled", id));
+ return ok( String.format("Authentication provider '%s' already enabled", id));
}
try {
authSvc.registerProvider( authSvc.loadProvider(row) );
- return okResponse(String.format("Authentication Provider %s enabled", row.getId()));
+ return ok(String.format("Authentication Provider %s enabled", row.getId()));
} catch (AuthenticationProviderFactoryNotFoundException ex) {
- return errorResponse(Response.Status.BAD_REQUEST,
- String.format("Can't instantiate provider, as there's no factory with alias %s", row.getFactoryAlias()));
+ return notFound(String.format("Can't instantiate provider, as there's no factory with alias %s", row.getFactoryAlias()));
} catch (AuthorizationSetupException ex) {
logger.log(Level.WARNING, "Error instantiating authentication provider: " + ex.getMessage(), ex);
- return errorResponse(Response.Status.BAD_REQUEST,
+ return error(Status.INTERNAL_SERVER_ERROR,
String.format("Can't instantiate provider: %s", ex.getMessage()));
}
} else {
// disable a provider
authSvc.deregisterProvider(id);
- return okResponse("Authentication Provider '" + id + "' disabled. "
+ return ok("Authentication Provider '" + id + "' disabled. "
+ ( authSvc.getAuthenticationProviderIds().isEmpty()
? "WARNING: no enabled authentication providers left." : "") );
}
@@ -208,7 +201,7 @@ public Response deleteAuthenticationProvider( @PathParam("id") String id ) {
em.remove( row );
}
- return okResponse("AuthenticationProvider " + id + " deleted. "
+ return ok("AuthenticationProvider " + id + " deleted. "
+ ( authSvc.getAuthenticationProviderIds().isEmpty()
? "WARNING: no enabled authentication providers left." : ""));
}
@@ -218,9 +211,9 @@ public Response deleteAuthenticationProvider( @PathParam("id") String id ) {
public Response getAuthenticatedUser(@PathParam("identifier") String identifier) {
AuthenticatedUser authenticatedUser = authSvc.getAuthenticatedUser(identifier);
if (authenticatedUser != null) {
- return okResponse(jsonForAuthUser(authenticatedUser));
+ return ok(jsonForAuthUser(authenticatedUser));
}
- return errorResponse(Response.Status.BAD_REQUEST, "User " + identifier + " not found.");
+ return error(Response.Status.BAD_REQUEST, "User " + identifier + " not found.");
}
@DELETE
@@ -229,9 +222,9 @@ public Response deleteAuthenticatedUser(@PathParam("identifier") String identifi
AuthenticatedUser user = authSvc.getAuthenticatedUser(identifier);
if (user!=null) {
authSvc.deleteAuthenticatedUser(user.getId());
- return okResponse("AuthenticatedUser " +identifier + " deleted. ");
+ return ok("AuthenticatedUser " +identifier + " deleted. ");
}
- return errorResponse(Response.Status.BAD_REQUEST, "User "+ identifier+" not found.");
+ return error(Response.Status.BAD_REQUEST, "User "+ identifier+" not found.");
}
@POST
@@ -241,9 +234,9 @@ public Response publishDataverseAsCreator(@PathParam("id") long id) {
Dataverse dataverse = dataverseSvc.find(id);
if (dataverse != null) {
AuthenticatedUser authenticatedUser = dataverse.getCreator();
- return okResponse(json(execCommand(new PublishDataverseCommand(createDataverseRequest(authenticatedUser), dataverse))));
+ return ok(json(execCommand(new PublishDataverseCommand(createDataverseRequest(authenticatedUser), dataverse))));
} else {
- return errorResponse(Status.BAD_REQUEST, "Could not find dataverse with id " + id);
+ return error(Status.BAD_REQUEST, "Could not find dataverse with id " + id);
}
} catch (WrappedResponse wr) {
return wr.getResponse();
@@ -256,16 +249,16 @@ public Response listAuthenticatedUsers() {
try {
AuthenticatedUser user = findAuthenticatedUserOrDie();
if (!user.isSuperuser()) {
- return errorResponse(Response.Status.FORBIDDEN, "Superusers only.");
+ return error(Response.Status.FORBIDDEN, "Superusers only.");
}
} catch (WrappedResponse ex) {
- return errorResponse(Response.Status.FORBIDDEN, "Superusers only.");
+ return error(Response.Status.FORBIDDEN, "Superusers only.");
}
JsonArrayBuilder userArray = Json.createArrayBuilder();
authSvc.findAllAuthenticatedUsers().stream().forEach((user) -> {
userArray.add(jsonForAuthUser(user));
});
- return okResponse(userArray);
+ return ok(userArray);
}
/**
@@ -278,20 +271,20 @@ public Response convertShibUserToBuiltin(@PathParam("id") Long id, String newEma
try {
AuthenticatedUser user = findAuthenticatedUserOrDie();
if (!user.isSuperuser()) {
- return errorResponse(Response.Status.FORBIDDEN, "Superusers only.");
+ return error(Response.Status.FORBIDDEN, "Superusers only.");
}
} catch (WrappedResponse ex) {
- return errorResponse(Response.Status.FORBIDDEN, "Superusers only.");
+ return error(Response.Status.FORBIDDEN, "Superusers only.");
}
try {
BuiltinUser builtinUser = authSvc.convertShibToBuiltIn(id, newEmailAddress);
if (builtinUser == null) {
- return errorResponse(Response.Status.BAD_REQUEST, "User id " + id + " could not be converted from Shibboleth to BuiltIn. An Exception was not thrown.");
+ return error(Response.Status.BAD_REQUEST, "User id " + id + " could not be converted from Shibboleth to BuiltIn. An Exception was not thrown.");
}
JsonObjectBuilder output = Json.createObjectBuilder();
output.add("email", builtinUser.getEmail());
output.add("username", builtinUser.getUserName());
- return okResponse(output);
+ return ok(output);
} catch (Throwable ex) {
StringBuilder sb = new StringBuilder();
sb.append(ex + " ");
@@ -301,7 +294,7 @@ public Response convertShibUserToBuiltin(@PathParam("id") Long id, String newEma
}
String msg = "User id " + id + " could not be converted from Shibboleth to BuiltIn. Details from Exception: " + sb;
logger.info(msg);
- return errorResponse(Response.Status.BAD_REQUEST, msg);
+ return error(Response.Status.BAD_REQUEST, msg);
}
}
@@ -316,14 +309,14 @@ public Response builtin2shib(String content) {
try {
AuthenticatedUser userToRunThisMethod = findAuthenticatedUserOrDie();
if (!userToRunThisMethod.isSuperuser()) {
- return errorResponse(Response.Status.FORBIDDEN, "Superusers only.");
+ return error(Response.Status.FORBIDDEN, "Superusers only.");
}
} catch (WrappedResponse ex) {
- return errorResponse(Response.Status.FORBIDDEN, "Superusers only.");
+ return error(Response.Status.FORBIDDEN, "Superusers only.");
}
boolean disabled = false;
if (disabled) {
- return errorResponse(Response.Status.BAD_REQUEST, "API endpoint disabled.");
+ return error(Response.Status.BAD_REQUEST, "API endpoint disabled.");
}
AuthenticatedUser builtInUserToConvert = null;
String emailToFind;
@@ -337,7 +330,7 @@ public Response builtin2shib(String content) {
newEmailAddressToUse = args[2];
// authuserId = args[666];
} catch (ArrayIndexOutOfBoundsException ex) {
- return errorResponse(Response.Status.BAD_REQUEST, "Problem with content <<<" + content + ">>>: " + ex.toString());
+ return error(Response.Status.BAD_REQUEST, "Problem with content <<<" + content + ">>>: " + ex.toString());
}
AuthenticatedUser existingAuthUserFoundByEmail = shibService.findAuthUserByEmail(emailToFind);
String existing = "NOT FOUND";
@@ -350,7 +343,7 @@ public Response builtin2shib(String content) {
if (specifiedUserToConvert != null) {
builtInUserToConvert = specifiedUserToConvert;
} else {
- return errorResponse(Response.Status.BAD_REQUEST, "No user to convert. We couldn't find a *single* existing user account based on " + emailToFind + " and no user was found using specified id " + longToLookup);
+ return error(Response.Status.BAD_REQUEST, "No user to convert. We couldn't find a *single* existing user account based on " + emailToFind + " and no user was found using specified id " + longToLookup);
}
}
String shibProviderId = ShibAuthenticationProvider.PROVIDER_ID;
@@ -369,7 +362,7 @@ public Response builtin2shib(String content) {
boolean validEmail = EMailValidator.isEmailValid(overwriteEmail, null);
if (!validEmail) {
// See https://github.com/IQSS/dataverse/issues/2998
- return errorResponse(Response.Status.BAD_REQUEST, "invalid email: " + overwriteEmail);
+ return error(Response.Status.BAD_REQUEST, "invalid email: " + overwriteEmail);
}
/**
* @todo If affiliation is not null, put it in RoleAssigneeDisplayInfo
@@ -426,7 +419,7 @@ public Response builtin2shib(String content) {
* @todo Someday we should make an errorResponse method that
* takes JSON arrays and objects.
*/
- return errorResponse(Status.BAD_REQUEST, problems.build().toString());
+ return error(Status.BAD_REQUEST, problems.build().toString());
}
// response.add("knows existing password", knowsExistingPassword);
}
@@ -441,7 +434,7 @@ public Response builtin2shib(String content) {
response.add("affiliation", overwriteAffiliation);
}
response.add("problems", problems);
- return okResponse(response);
+ return ok(response);
}
@DELETE
@@ -450,9 +443,9 @@ public Response deleteAuthenticatedUserById(@PathParam("id") Long id) {
AuthenticatedUser user = authSvc.findByID(id);
if (user != null) {
authSvc.deleteAuthenticatedUser(user.getId());
- return okResponse("AuthenticatedUser " + id + " deleted. ");
+ return ok("AuthenticatedUser " + id + " deleted. ");
}
- return errorResponse(Response.Status.BAD_REQUEST, "User " + id + " not found.");
+ return error(Response.Status.BAD_REQUEST, "User " + id + " not found.");
}
@Path("roles")
@@ -461,11 +454,11 @@ public Response createNewBuiltinRole(RoleDTO roleDto) {
ActionLogRecord alr = new ActionLogRecord(ActionLogRecord.ActionType.Admin, "createBuiltInRole")
.setInfo(roleDto.getAlias() + ":" + roleDto.getDescription() );
try {
- return okResponse(json(rolesSvc.save(roleDto.asRole())));
+ return ok(json(rolesSvc.save(roleDto.asRole())));
} catch (Exception e) {
alr.setActionResult(ActionLogRecord.Result.InternalError);
alr.setInfo( alr.getInfo() + "// " + e.getMessage() );
- return errorResponse(Response.Status.INTERNAL_SERVER_ERROR, e.getMessage());
+ return error(Response.Status.INTERNAL_SERVER_ERROR, e.getMessage());
} finally {
actionLogSvc.log(alr);
}
@@ -475,9 +468,9 @@ public Response createNewBuiltinRole(RoleDTO roleDto) {
@GET
public Response listBuiltinRoles() {
try {
- return okResponse( rolesToJson(rolesSvc.findBuiltinRoles()) );
+ return ok( rolesToJson(rolesSvc.findBuiltinRoles()) );
} catch (Exception e) {
- return errorResponse(Response.Status.INTERNAL_SERVER_ERROR, e.getMessage());
+ return error(Response.Status.INTERNAL_SERVER_ERROR, e.getMessage());
}
}
@@ -492,11 +485,11 @@ public Response toggleSuperuser(@PathParam("identifier") String identifier) {
user.setSuperuser(!user.isSuperuser());
- return okResponse("User " + user.getIdentifier() + " " + (user.isSuperuser() ? "set": "removed") + " as a superuser.");
+ return ok("User " + user.getIdentifier() + " " + (user.isSuperuser() ? "set": "removed") + " as a superuser.");
} catch (Exception e) {
alr.setActionResult(ActionLogRecord.Result.InternalError);
alr.setInfo( alr.getInfo() + "// " + e.getMessage() );
- return errorResponse(Response.Status.INTERNAL_SERVER_ERROR, e.getMessage());
+ return error(Response.Status.INTERNAL_SERVER_ERROR, e.getMessage());
} finally {
actionLogSvc.log(alr);
}
@@ -522,13 +515,13 @@ public Response validate() {
violation.add("entityClassDatabaseTableRowId", databaseRow);
violation.add("field", field);
violation.add("invalidValue", invalidValue);
- return okResponse(violation);
+ return ok(violation);
}
}
cause = cause.getCause();
}
}
- return okResponse(msg);
+ return ok(msg);
}
/**
@@ -544,10 +537,10 @@ public Response getConfirmEmailToken(@PathParam("userId") long userId) {
if (user != null) {
ConfirmEmailData confirmEmailData = confirmEmailSvc.findSingleConfirmEmailDataByUser(user);
if (confirmEmailData != null) {
- return okResponse(Json.createObjectBuilder().add("token", confirmEmailData.getToken()));
+ return ok(Json.createObjectBuilder().add("token", confirmEmailData.getToken()));
}
}
- return errorResponse(Status.BAD_REQUEST, "Could not find confirm email token for user " + userId);
+ return error(Status.BAD_REQUEST, "Could not find confirm email token for user " + userId);
}
/**
@@ -563,16 +556,16 @@ public Response startConfirmEmailProcess(@PathParam("userId") long userId) {
try {
ConfirmEmailInitResponse confirmEmailInitResponse = confirmEmailSvc.beginConfirm(user);
ConfirmEmailData confirmEmailData = confirmEmailInitResponse.getConfirmEmailData();
- return okResponse(
+ return ok(
Json.createObjectBuilder()
.add("tokenCreated", confirmEmailData.getCreated().toString())
.add("identifier", user.getUserIdentifier()
));
} catch (ConfirmEmailException ex) {
- return errorResponse(Status.BAD_REQUEST, "Could not start confirm email process for user " + userId + ": " + ex.getLocalizedMessage());
+ return error(Status.BAD_REQUEST, "Could not start confirm email process for user " + userId + ": " + ex.getLocalizedMessage());
}
}
- return errorResponse(Status.BAD_REQUEST, "Could not find user based on " + userId);
+ return error(Status.BAD_REQUEST, "Could not find user based on " + userId);
}
/**
@@ -588,7 +581,7 @@ public Response convertUserFromBcryptToSha1(String json) {
BuiltinUser builtinUser = builtinUserService.find(new Long(object.getInt("builtinUserId")));
builtinUser.updateEncryptedPassword("4G7xxL9z11/JKN4jHPn4g9iIQck=", 0); // password is "sha-1Pass", 0 means SHA-1
BuiltinUser savedUser = builtinUserService.save(builtinUser);
- return okResponse("foo: " + savedUser);
+ return ok("foo: " + savedUser);
}
@@ -601,6 +594,39 @@ public Response getAssignmentsFor( @PathParam("raIdtf") String raIdtf ) {
JsonArrayBuilder arr = Json.createArrayBuilder();
roleAssigneeSvc.getAssignmentsFor(raIdtf).forEach( a -> arr.add(json(a)));
- return okResponse(arr);
+ return ok(arr);
}
+
+ @Path("permissions/{dvo}")
+ @GET
+ public Response findPermissonsOn(@PathParam("dvo") String dvo) {
+ try {
+ DvObject dvObj = findDvo(dvo);
+ if (dvObj == null) {
+ return notFound("DvObject " + dvo + " not found");
+ }
+ try {
+ User aUser = findUserOrDie();
+ JsonObjectBuilder bld = Json.createObjectBuilder();
+ bld.add("user", aUser.getIdentifier());
+ bld.add("permissions", json(permissionSvc.permissionsFor(createDataverseRequest(aUser), dvObj)));
+ return ok(bld);
+
+ } catch (WrappedResponse wr) {
+ return wr.getResponse();
+ }
+ } catch (Exception e) {
+ logger.log(Level.SEVERE, "Error while testing permissions", e);
+ return error(Response.Status.INTERNAL_SERVER_ERROR, e.getMessage());
+ }
+ }
+
+ @Path("assignee/{idtf}")
+ @GET
+ public Response findRoleAssignee(@PathParam("idtf") String idtf) {
+ RoleAssignee ra = roleAssigneeSvc.getRoleAssignee(idtf);
+ return (ra == null) ? notFound("Role Assignee '" + idtf + "' not found.")
+ : ok(json(ra.getDisplayInfo()));
+ }
+
}
diff --git a/src/main/java/edu/harvard/iq/dataverse/api/ApiBlockingFilter.java b/src/main/java/edu/harvard/iq/dataverse/api/ApiBlockingFilter.java
index c75c16ce943..8f5b8333b8e 100644
--- a/src/main/java/edu/harvard/iq/dataverse/api/ApiBlockingFilter.java
+++ b/src/main/java/edu/harvard/iq/dataverse/api/ApiBlockingFilter.java
@@ -158,7 +158,15 @@ public void doFilter(ServletRequest sr, ServletResponse sr1, FilterChain fc) thr
return;
}
}
- fc.doFilter(sr, sr1);
+ try {
+ fc.doFilter(sr, sr1);
+ } catch ( ServletException se ) {
+ logger.log(Level.WARNING, "Error processing " + requestURI +": " + se.getMessage(), se);
+ HttpServletResponse resp = (HttpServletResponse) sr1;
+ resp.setStatus(500);
+ resp.setHeader("PRODUCER", "ApiBlockingFilter");
+ resp.getWriter().append("Error: " + se.getMessage());
+ }
}
@Override
diff --git a/src/main/java/edu/harvard/iq/dataverse/api/BatchImport.java b/src/main/java/edu/harvard/iq/dataverse/api/BatchImport.java
index a94d36ddd3b..7d29e9e2334 100644
--- a/src/main/java/edu/harvard/iq/dataverse/api/BatchImport.java
+++ b/src/main/java/edu/harvard/iq/dataverse/api/BatchImport.java
@@ -92,15 +92,15 @@ public Response postImport(String body, @QueryParam("dv") String parentIdtf, @Qu
}
Dataverse owner = findDataverse(parentIdtf);
if (owner == null) {
- return errorResponse(Response.Status.NOT_FOUND, "Can't find dataverse with identifier='" + parentIdtf + "'");
+ return error(Response.Status.NOT_FOUND, "Can't find dataverse with identifier='" + parentIdtf + "'");
}
try {
PrintWriter cleanupLog = null; // Cleanup log isn't needed for ImportType == NEW. We don't do any data cleanup in this mode.
String filename = null; // Since this is a single input from a POST, there is no file that we are reading from.
JsonObjectBuilder status = importService.doImport(dataverseRequest, owner, body, filename, ImportType.NEW, cleanupLog);
- return this.okResponse(status);
+ return this.ok(status);
} catch (ImportException | IOException e) {
- return this.errorResponse(Response.Status.BAD_REQUEST, e.getMessage());
+ return this.error(Response.Status.BAD_REQUEST, e.getMessage());
}
}
@@ -140,13 +140,13 @@ private Response startBatchJob(String fileDir, String parentIdtf, String apiKey,
if (createDV) {
owner = importService.createDataverse(parentIdtf, dataverseRequest);
} else {
- return errorResponse(Response.Status.NOT_FOUND, "Can't find dataverse with identifier='" + parentIdtf + "'");
+ return error(Response.Status.NOT_FOUND, "Can't find dataverse with identifier='" + parentIdtf + "'");
}
}
batchService.processFilePath(fileDir, parentIdtf, dataverseRequest, owner, importType, createDV);
} catch (ImportException e) {
- return this.errorResponse(Response.Status.BAD_REQUEST, "Import Exception, " + e.getMessage());
+ return this.error(Response.Status.BAD_REQUEST, "Import Exception, " + e.getMessage());
}
return this.accepted();
}
diff --git a/src/main/java/edu/harvard/iq/dataverse/api/BuiltinUsers.java b/src/main/java/edu/harvard/iq/dataverse/api/BuiltinUsers.java
index 2709e17387e..d1f0e7e1297 100644
--- a/src/main/java/edu/harvard/iq/dataverse/api/BuiltinUsers.java
+++ b/src/main/java/edu/harvard/iq/dataverse/api/BuiltinUsers.java
@@ -69,7 +69,7 @@ public Response getApiToken( @PathParam("username") String username, @QueryParam
ApiToken t = authSvc.findApiTokenByUser(authUser);
- return (t != null ) ? okResponse(t.getTokenString()) : notFound("User " + username + " does not have an API token");
+ return (t != null ) ? ok(t.getTokenString()) : notFound("User " + username + " does not have an API token");
}
/**
@@ -98,7 +98,7 @@ private Response internalSave(BuiltinUser user, String password, String key) {
String expectedKey = settingsSvc.get(API_KEY_IN_SETTINGS);
if (expectedKey == null) {
- return errorResponse(Status.SERVICE_UNAVAILABLE, "Dataverse config issue: No API key defined for built in user management");
+ return error(Status.SERVICE_UNAVAILABLE, "Dataverse config issue: No API key defined for built in user management");
}
if (!expectedKey.equals(key)) {
return badApiKey(key);
@@ -115,7 +115,7 @@ private Response internalSave(BuiltinUser user, String password, String key) {
// Make sure the identifier is unique
if ( (builtinUserSvc.findByUserName(user.getUserName()) != null)
|| ( authSvc.identifierExists(user.getUserName())) ) {
- return errorResponse(Status.BAD_REQUEST, "username '" + user.getUserName() + "' already exists");
+ return error(Status.BAD_REQUEST, "username '" + user.getUserName() + "' already exists");
}
user = builtinUserSvc.save(user);
@@ -150,23 +150,23 @@ private Response internalSave(BuiltinUser user, String password, String key) {
resp.add("apiToken", token.getTokenString());
alr.setInfo("builtinUser:" + user.getUserName() + " authenticatedUser:" + au.getIdentifier() );
- return okResponse(resp);
+ return ok(resp);
} catch ( EJBException ejbx ) {
alr.setActionResult(ActionLogRecord.Result.InternalError);
alr.setInfo( alr.getInfo() + "// " + ejbx.getMessage());
if ( ejbx.getCausedByException() instanceof IllegalArgumentException ) {
- return errorResponse(Status.BAD_REQUEST, "Bad request: can't save user. " + ejbx.getCausedByException().getMessage());
+ return error(Status.BAD_REQUEST, "Bad request: can't save user. " + ejbx.getCausedByException().getMessage());
} else {
logger.log(Level.WARNING, "Error saving user: ", ejbx);
- return errorResponse(Status.INTERNAL_SERVER_ERROR, "Can't save user: " + ejbx.getMessage());
+ return error(Status.INTERNAL_SERVER_ERROR, "Can't save user: " + ejbx.getMessage());
}
} catch (Exception e) {
logger.log(Level.WARNING, "Error saving user", e);
alr.setActionResult(ActionLogRecord.Result.InternalError);
alr.setInfo( alr.getInfo() + "// " + e.getMessage());
- return errorResponse(Status.INTERNAL_SERVER_ERROR, "Can't save user: " + e.getMessage());
+ return error(Status.INTERNAL_SERVER_ERROR, "Can't save user: " + e.getMessage());
} finally {
actionLogSvc.log(alr);
}
diff --git a/src/main/java/edu/harvard/iq/dataverse/api/CrashBoomBangEndpoint.java b/src/main/java/edu/harvard/iq/dataverse/api/CrashBoomBangEndpoint.java
new file mode 100644
index 00000000000..86cfabb57ba
--- /dev/null
+++ b/src/main/java/edu/harvard/iq/dataverse/api/CrashBoomBangEndpoint.java
@@ -0,0 +1,30 @@
+package edu.harvard.iq.dataverse.api;
+
+import javax.ws.rs.GET;
+import javax.ws.rs.Path;
+import javax.ws.rs.core.Response;
+
+/**
+ * An API endpoint that crashes. Used for testing the error handlers. Should
+ * be removed once #3423 is closed.
+ *
+ * @author michael
+ */
+@Path("boom")
+public class CrashBoomBangEndpoint extends AbstractApiBean {
+
+ @GET
+ @Path("aoob")
+ public Response arrayError() {
+ String boom = "abc".split("3")[9];
+ return ok("Not gonna happen");
+ }
+
+ @GET
+ @Path("npe")
+ public Response nullPointer() {
+ String boom = null;
+ boom.length();
+ return ok("Not gonna happen");
+ }
+}
diff --git a/src/main/java/edu/harvard/iq/dataverse/api/DataTagsAPI.java b/src/main/java/edu/harvard/iq/dataverse/api/DataTagsAPI.java
index f81778f0408..063033d4747 100644
--- a/src/main/java/edu/harvard/iq/dataverse/api/DataTagsAPI.java
+++ b/src/main/java/edu/harvard/iq/dataverse/api/DataTagsAPI.java
@@ -83,7 +83,7 @@ public Response receiveTags(JsonObject tags, @PathParam("uniqueCacheId") String
CACHE.put(uniqueCacheId, container);
// return an OK message with the redirect URL for the user to return to Dataverse through postBackTo or unacceptableDataset in DataTags
- return okResponse( USER_REDIRECT_URL );
+ return ok( USER_REDIRECT_URL );
}
}
\ No newline at end of file
diff --git a/src/main/java/edu/harvard/iq/dataverse/api/DatasetFieldServiceApi.java b/src/main/java/edu/harvard/iq/dataverse/api/DatasetFieldServiceApi.java
index 438c076efd9..2095161fc52 100644
--- a/src/main/java/edu/harvard/iq/dataverse/api/DatasetFieldServiceApi.java
+++ b/src/main/java/edu/harvard/iq/dataverse/api/DatasetFieldServiceApi.java
@@ -76,7 +76,7 @@ public Response getAll() {
for ( DatasetFieldType dt : requiredFields ) {
requiredFieldNames.add( dt.getName() );
}
- return okResponse( Json.createObjectBuilder().add("haveParents", asJsonArray(listOfIsHasParentsTrue))
+ return ok( Json.createObjectBuilder().add("haveParents", asJsonArray(listOfIsHasParentsTrue))
.add("noParents", asJsonArray(listOfIsHasParentsFalse))
.add("allowsMultiples", asJsonArray(listOfIsAllowsMultiplesTrue))
@@ -107,7 +107,7 @@ public Response getAll() {
}
}
}
- return errorResponse(Status.INTERNAL_SERVER_ERROR, sb.toString());
+ return error(Status.INTERNAL_SERVER_ERROR, sb.toString());
}
}
@@ -132,7 +132,7 @@ public Response getByName(@PathParam("name") String name) {
parentAllowsMultiplesBoolean = parent.isAllowMultiples();
parentAllowsMultiplesDisplay = Boolean.toString(parentAllowsMultiplesBoolean);
}
- return okResponse(NullSafeJsonBuilder.jsonObjectBuilder()
+ return ok(NullSafeJsonBuilder.jsonObjectBuilder()
.add("name", dsf.getName())
.add("id", id )
.add("title", title)
@@ -163,7 +163,7 @@ public Response getByName(@PathParam("name") String name) {
}
}
}
- return errorResponse( Status.INTERNAL_SERVER_ERROR, sb.toString() );
+ return error( Status.INTERNAL_SERVER_ERROR, sb.toString() );
}
}
@@ -185,7 +185,7 @@ public Response showControlledVocabularyForSubject() {
possibleSubjects.add(subject);
}
}
- return okResponse(possibleSubjects);
+ return ok(possibleSubjects);
}
@@ -202,10 +202,10 @@ public Response loadNAControlledVocabularyValue() {
ControlledVocabularyValue naValue = new ControlledVocabularyValue();
naValue.setStrValue(DatasetField.NA_VALUE);
datasetFieldService.save(naValue);
- return okResponse("NA value created.");
+ return ok("NA value created.");
} else {
- return okResponse("NA value exists.");
+ return ok("NA value exists.");
}
}
@@ -274,13 +274,13 @@ public Response loadDatasetFields(File file) {
} catch (FileNotFoundException e) {
alr.setActionResult(ActionLogRecord.Result.BadRequest);
alr.setInfo( alr.getInfo() + "// file not found");
- return errorResponse(Status.EXPECTATION_FAILED, "File not found");
+ return error(Status.EXPECTATION_FAILED, "File not found");
} catch (Exception e) {
Logger.getLogger(DatasetFieldServiceApi.class.getName()).log(Level.WARNING, "Error parsing dataset fields:" + e.getMessage(), e);
alr.setActionResult(ActionLogRecord.Result.InternalError);
alr.setInfo( alr.getInfo() + "// " + e.getMessage());
- return errorResponse(Status.INTERNAL_SERVER_ERROR, e.getMessage());
+ return error(Status.INTERNAL_SERVER_ERROR, e.getMessage());
} finally {
if (br != null) {
@@ -294,7 +294,7 @@ public Response loadDatasetFields(File file) {
actionLogSvc.log(alr);
}
- return okResponse( Json.createObjectBuilder().add("added", responseArr) );
+ return ok( Json.createObjectBuilder().add("added", responseArr) );
}
private String parseMetadataBlock(String[] values) {
diff --git a/src/main/java/edu/harvard/iq/dataverse/api/Datasets.java b/src/main/java/edu/harvard/iq/dataverse/api/Datasets.java
index f0af8490c3d..5e5670ab57a 100644
--- a/src/main/java/edu/harvard/iq/dataverse/api/Datasets.java
+++ b/src/main/java/edu/harvard/iq/dataverse/api/Datasets.java
@@ -1,29 +1,21 @@
package edu.harvard.iq.dataverse.api;
import edu.harvard.iq.dataverse.DOIEZIdServiceBean;
-import edu.harvard.iq.dataverse.DataFile;
import edu.harvard.iq.dataverse.Dataset;
import edu.harvard.iq.dataverse.DatasetField;
import edu.harvard.iq.dataverse.DatasetFieldServiceBean;
import edu.harvard.iq.dataverse.DatasetFieldType;
-import edu.harvard.iq.dataverse.DatasetFieldValue;
import edu.harvard.iq.dataverse.DatasetServiceBean;
import edu.harvard.iq.dataverse.DatasetVersion;
import edu.harvard.iq.dataverse.Dataverse;
-import edu.harvard.iq.dataverse.DataverseServiceBean;
import edu.harvard.iq.dataverse.MetadataBlock;
import edu.harvard.iq.dataverse.MetadataBlockServiceBean;
-import edu.harvard.iq.dataverse.RoleAssignment;
-import edu.harvard.iq.dataverse.api.imports.ImportException;
-import edu.harvard.iq.dataverse.api.imports.ImportUtil;
import edu.harvard.iq.dataverse.authorization.DataverseRole;
import edu.harvard.iq.dataverse.authorization.RoleAssignee;
import edu.harvard.iq.dataverse.authorization.users.User;
import edu.harvard.iq.dataverse.engine.command.Command;
import edu.harvard.iq.dataverse.engine.command.DataverseRequest;
-import edu.harvard.iq.dataverse.engine.command.exception.CommandException;
import edu.harvard.iq.dataverse.engine.command.impl.AssignRoleCommand;
-import edu.harvard.iq.dataverse.engine.command.impl.CreateDatasetCommand;
import edu.harvard.iq.dataverse.engine.command.impl.CreateDatasetVersionCommand;
import edu.harvard.iq.dataverse.engine.command.impl.CreatePrivateUrlCommand;
import edu.harvard.iq.dataverse.engine.command.impl.DeleteDatasetCommand;
@@ -49,15 +41,12 @@
import edu.harvard.iq.dataverse.settings.SettingsServiceBean;
import edu.harvard.iq.dataverse.util.SystemConfig;
import edu.harvard.iq.dataverse.util.json.JsonParseException;
-import edu.harvard.iq.dataverse.util.json.JsonParser;
import static edu.harvard.iq.dataverse.util.json.JsonPrinter.*;
import java.io.ByteArrayOutputStream;
-import java.io.InputStream;
import java.io.OutputStream;
import java.io.StringReader;
import java.util.List;
import java.util.Map;
-import java.util.Set;
import java.util.logging.Level;
import java.util.logging.Logger;
import javax.ejb.EJB;
@@ -65,11 +54,6 @@
import javax.json.JsonArrayBuilder;
import javax.json.JsonObject;
import javax.json.JsonObjectBuilder;
-import javax.json.JsonReader;
-import javax.validation.ConstraintViolation;
-import javax.validation.Validation;
-import javax.validation.Validator;
-import javax.validation.ValidatorFactory;
import javax.ws.rs.DELETE;
import javax.ws.rs.GET;
import javax.ws.rs.POST;
@@ -91,9 +75,6 @@ public class Datasets extends AbstractApiBean {
@EJB
DatasetServiceBean datasetService;
- @EJB
- DataverseServiceBean dataverseService;
-
@EJB
DOIEZIdServiceBean doiEZIdServiceBean;
@@ -127,100 +108,15 @@ private interface DsVersionHandler<T> {
@GET
@Path("{id}")
public Response getDataset(@PathParam("id") String id) {
-
- try {
- final DataverseRequest r = createDataverseRequest(findUserOrDie());
-
- Dataset retrieved = execCommand(new GetDatasetCommand(r, findDatasetOrDie(id)));
- DatasetVersion latest = execCommand(new GetLatestAccessibleDatasetVersionCommand(r, retrieved));
+ return response( req -> {
+ final Dataset retrieved = execCommand(new GetDatasetCommand(req, findDatasetOrDie(id)));
+ final DatasetVersion latest = execCommand(new GetLatestAccessibleDatasetVersionCommand(req, retrieved));
final JsonObjectBuilder jsonbuilder = json(retrieved);
- return okResponse(jsonbuilder.add("latestVersion", (latest != null) ? json(latest) : null));
- } catch (WrappedResponse ex) {
- return ex.refineResponse("GETting dataset " + id + " failed.");
- }
-
+ return ok(jsonbuilder.add("latestVersion", (latest != null) ? json(latest) : null));
+ });
}
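The recurring change through the rest of Datasets.java replaces per-method try/catch-`WrappedResponse` boilerplate with a `response( req -> ... )` helper, as in `getDataset` above: the lambda does the work, and error responses propagate as exceptions that the helper unwraps in one place. A simplified, self-contained sketch of that pattern (all types are cut-down stand-ins for the real Dataverse classes, not the actual API):

```java
// Sketch of the response(req -> ...) refactoring pattern.
public class ResponseHelperSketch {

    // Stand-in for the JAX-RS Response the real code builds.
    static class Response {
        final int status;
        final String body;
        Response(int status, String body) { this.status = status; this.body = body; }
    }

    // Error responses travel as exceptions, as WrappedResponse does in Dataverse.
    static class WrappedResponse extends Exception {
        final Response response;
        WrappedResponse(Response r) { this.response = r; }
    }

    interface DataverseRequest {}

    // The handler may throw; response() turns the exception into the reply.
    interface RequestHandler {
        Response handle(DataverseRequest req) throws WrappedResponse;
    }

    static Response ok(String msg) { return new Response(200, msg); }

    // Central helper: build the request once, run the lambda, unwrap errors.
    static Response response(RequestHandler h) {
        try {
            DataverseRequest req = new DataverseRequest() {}; // findUserOrDie() would go here
            return h.handle(req);
        } catch (WrappedResponse wr) {
            return wr.response;
        }
    }

    // Endpoint methods shrink to one expression, as in the refactored diff.
    static Response deleteDataset(String id) {
        return response(req -> ok("Dataset " + id + " deleted"));
    }

    public static void main(String[] args) {
        Response r = deleteDataset("42");
        System.out.println(r.status + " " + r.body); // 200 Dataset 42 deleted
    }
}
```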
- /* An experimental method for creating a new dataset, from scratch, all from json metadata file
- @POST
- @Path("")
- public Response createDataset(String jsonBody) {
- Dataset importedDataset = null;
- try {
- final DataverseRequest r = createDataverseRequest(findUserOrDie());
-
- StringReader rdr = new StringReader(jsonBody);
- JsonObject json = Json.createReader(rdr).readObject();
- JsonParser parser = new JsonParser(datasetfieldService, metadataBlockService, settingsService);
- parser.setLenient(true);
- Dataset ds = parser.parseDataset(json);
-
-
- Dataverse owner = dataverseService.find(1L);
- ds.setOwner(owner);
- ds.getLatestVersion().setDatasetFields(ds.getLatestVersion().initDatasetFields());
-
- // Check data against required contraints
- List<ConstraintViolation> violations = ds.getVersions().get(0).validateRequired();
- if (!violations.isEmpty()) {
- // For migration and harvest, add NA for missing required values
- for (ConstraintViolation v : violations) {
- DatasetField f = ((DatasetField) v.getRootBean());
- f.setSingleValue(DatasetField.NA_VALUE);
- }
- }
-
-
- Set<ConstraintViolation> invalidViolations = ds.getVersions().get(0).validate();
- ValidatorFactory factory = Validation.buildDefaultValidatorFactory();
- Validator validator = factory.getValidator();
- if (!invalidViolations.isEmpty()) {
- for (ConstraintViolation v : invalidViolations) {
- DatasetFieldValue f = ((DatasetFieldValue) v.getRootBean());
- boolean fixed = false;
- boolean converted = false;
- // TODO: Is this scrubbing something we want to continue doing?
- //
- //if (settingsService.isTrueForKey(SettingsServiceBean.Key.ScrubMigrationData, false)) {
- // fixed = processMigrationValidationError(f, cleanupLog, metadataFile.getName());
- // converted = true;
- // if (fixed) {
- // Set<ConstraintViolation<DatasetFieldValue>> scrubbedViolations = validator.validate(f);
- // if (!scrubbedViolations.isEmpty()) {
- // fixed = false;
- // }
- // }
- //}
- if (!fixed) {
- String msg = "Field: " + f.getDatasetField().getDatasetFieldType().getDisplayName() + "; "
- + "Invalid value: '" + f.getValue() + "'" + " Converted Value:'" + DatasetField.NA_VALUE + "'";
- Logger.getLogger(Datasets.class.getName()).log(Level.INFO, null, msg);
- f.setValue(DatasetField.NA_VALUE);
- }
- }
- }
-
- //ds.setHarvestedFrom(harvestingClient);
- //ds.setHarvestIdentifier(harvestIdentifier);
-
- importedDataset = engineSvc.submit(new CreateDatasetCommand(ds, r, false, ImportUtil.ImportType.HARVEST));
-
- } catch (JsonParseException ex) {
- Logger.getLogger(Datasets.class.getName()).log(Level.INFO, null, "Error parsing datasetVersion: " + ex.getMessage());
- return errorResponse(Response.Status.NOT_FOUND, "error parsing dataset");
- } catch (CommandException ex) {
- Logger.getLogger(Datasets.class.getName()).log(Level.INFO, null, "Error excuting Create dataset command: " + ex.getMessage());
- return errorResponse(Response.Status.NOT_FOUND, "error executing create dataset command");
- } catch (WrappedResponse ex) {
- return ex.refineResponse("Error: "+ex.getWrappedMessageWhenJson());
- }
-
- final JsonObjectBuilder jsonbuilder = json(importedDataset);
-
- return okResponse(jsonbuilder.add("latestVersion", json(importedDataset.getLatestVersion())));
- } */
-
// TODO:
// This API call should, ideally, call findUserOrDie() and the GetDatasetCommand
// to obtain the dataset that we are trying to export - which would handle
@@ -235,7 +131,7 @@ public Response exportDataset(@QueryParam("persistentId") String persistentId, @
try {
Dataset dataset = datasetService.findByGlobalId(persistentId);
if (dataset == null) {
- return errorResponse(Response.Status.NOT_FOUND, "A dataset with the persistentId " + persistentId + " could not be found.");
+ return error(Response.Status.NOT_FOUND, "A dataset with the persistentId " + persistentId + " could not be found.");
}
ExportService instance = ExportService.getInstance();
@@ -262,132 +158,93 @@ public Response exportDataset(@QueryParam("persistentId") String persistentId, @
.type(mediaType).
build();
} catch (Exception wr) {
- return errorResponse(Response.Status.FORBIDDEN, "Export Failed");
+ return error(Response.Status.FORBIDDEN, "Export Failed");
}
}
@DELETE
@Path("{id}")
public Response deleteDataset( @PathParam("id") String id) {
-
- try {
- execCommand( new DeleteDatasetCommand(createDataverseRequest(findUserOrDie()), findDatasetOrDie(id)));
- return okResponse("Dataset " + id + " deleted");
-
- } catch (WrappedResponse ex) {
- return ex.refineResponse( "Failed to delete dataset " + id );
- }
-
+ return response( req -> {
+ execCommand( new DeleteDatasetCommand(req, findDatasetOrDie(id)));
+ return ok("Dataset " + id + " deleted");
+ });
}
@DELETE
@Path("{id}/destroy")
public Response destroyDataset( @PathParam("id") String id) {
- try {
- execCommand( new DestroyDatasetCommand(findDatasetOrDie(id), createDataverseRequest(findUserOrDie()) ));
- return okResponse("Dataset " + id + " destroyed");
-
- } catch (WrappedResponse ex) {
- return ex.refineResponse( "Failed to detroy dataset " + id );
- }
+ return response( req -> {
+ execCommand( new DestroyDatasetCommand(findDatasetOrDie(id), req) );
+ return ok("Dataset " + id + " destroyed");
+ });
}
@PUT
@Path("{id}/citationdate")
public Response setCitationDate( @PathParam("id") String id, String dsfTypeName) {
- try {
- if ( dsfTypeName.trim().isEmpty() ){
- throw new WrappedResponse( badRequest("Please provide a dataset field type in the requst body.") );
- }
- DatasetFieldType dsfType = null;
- if (!":publicationDate".equals(dsfTypeName)) {
- dsfType = datasetFieldSvc.findByName(dsfTypeName);
- if (dsfType == null) {
- throw new WrappedResponse( badRequest("Dataset Field Type Name " + dsfTypeName + " not found.") );
- }
+ return response( req -> {
+ if ( dsfTypeName.trim().isEmpty() ){
+ return badRequest("Please provide a dataset field type in the request body.");
+ }
+ DatasetFieldType dsfType = null;
+ if (!":publicationDate".equals(dsfTypeName)) {
+ dsfType = datasetFieldSvc.findByName(dsfTypeName);
+ if (dsfType == null) {
+ return badRequest("Dataset Field Type Name " + dsfTypeName + " not found.");
}
-
- execCommand(new SetDatasetCitationDateCommand(createDataverseRequest(findUserOrDie()), findDatasetOrDie(id), dsfType));
-
- return okResponse("Citation Date for dataset " + id + " set to: " + (dsfType != null ? dsfType.getDisplayName() : "default"));
-
- } catch (WrappedResponse ex) {
- return ex.refineResponse("Unable to set citation date for dataset " + id + ".");
}
+
+ execCommand(new SetDatasetCitationDateCommand(req, findDatasetOrDie(id), dsfType));
+ return ok("Citation Date for dataset " + id + " set to: " + (dsfType != null ? dsfType.getDisplayName() : "default"));
+ });
}
@DELETE
@Path("{id}/citationdate")
public Response useDefaultCitationDate( @PathParam("id") String id) {
- try {
- execCommand(new SetDatasetCitationDateCommand(createDataverseRequest(findUserOrDie()), findDatasetOrDie(id), null));
- return okResponse("Citation Date for dataset " + id + " set to default");
- } catch (WrappedResponse ex) {
- return ex.refineResponse("Unable to restore default citation date for dataset " + id + ".");
- }
+ return response( req -> {
+ execCommand(new SetDatasetCitationDateCommand(req, findDatasetOrDie(id), null));
+ return ok("Citation Date for dataset " + id + " set to default");
+ });
}
@GET
@Path("{id}/versions")
public Response listVersions( @PathParam("id") String id ) {
- try {
- JsonArrayBuilder bld = Json.createArrayBuilder();
- for ( DatasetVersion dsv : execCommand(
- new ListVersionsCommand(
- createDataverseRequest(findUserOrDie()), findDatasetOrDie(id)) ) ) {
- bld.add( json(dsv) );
- }
- return okResponse( bld );
-
- } catch (WrappedResponse ex) {
- return ex.getResponse();
- }
+ return response( req -> {
+ return ok(
+ execCommand(
+ new ListVersionsCommand(req, findDatasetOrDie(id)) )
+ .stream()
+ .map( d -> json(d) )
+ .collect(toJsonArray()));});
}
@GET
@Path("{id}/versions/{versionId}")
public Response getVersion( @PathParam("id") String datasetId, @PathParam("versionId") String versionId) {
-
- try {
- DatasetVersion dsv = getDatasetVersionOrDie(createDataverseRequest(findUserOrDie()), versionId, findDatasetOrDie(datasetId));
-
+ return response( req -> {
+ DatasetVersion dsv = getDatasetVersionOrDie(req, versionId, findDatasetOrDie(datasetId));
return (dsv == null || dsv.getId() == null) ? notFound("Dataset version not found")
- : okResponse(json(dsv));
-
- } catch (WrappedResponse ex) {
- return ex.getResponse();
- }
+ : ok(json(dsv));
+ });
}
@GET
@Path("{id}/versions/{versionId}/files")
public Response getVersionFiles( @PathParam("id") String datasetId, @PathParam("versionId") String versionId) {
-
- try {
-
- return okResponse( jsonFileMetadatas(
- getDatasetVersionOrDie(createDataverseRequest(findUserOrDie()),
- versionId,
- findDatasetOrDie(datasetId)).getFileMetadatas()));
-
- } catch (WrappedResponse ex) {
- return ex.getResponse();
- }
+ return response( req -> ok( jsonFileMetadatas(
+ getDatasetVersionOrDie(req, versionId, findDatasetOrDie(datasetId)).getFileMetadatas())) );
}
@GET
@Path("{id}/versions/{versionId}/metadata")
public Response getVersionMetadata( @PathParam("id") String datasetId, @PathParam("versionId") String versionId) {
-
- try {
- return okResponse(
+ return response( req -> ok(
jsonByBlocks(
- getDatasetVersionOrDie( createDataverseRequest(findUserOrDie()), versionId, findDatasetOrDie(datasetId) )
- .getDatasetFields()));
-
- } catch (WrappedResponse ex) {
- return ex.getResponse();
- }
+ getDatasetVersionOrDie(req, versionId, findDatasetOrDie(datasetId) )
+ .getDatasetFields())));
}
@GET
@@ -396,21 +253,17 @@ public Response getVersionMetadataBlock( @PathParam("id") String datasetId,
@PathParam("versionNumber") String versionNumber,
@PathParam("block") String blockName ) {
- try {
- DatasetVersion dsv = getDatasetVersionOrDie(createDataverseRequest(findUserOrDie()), versionNumber, findDatasetOrDie(datasetId) );
+ return response( req -> {
+ DatasetVersion dsv = getDatasetVersionOrDie(req, versionNumber, findDatasetOrDie(datasetId) );
Map<MetadataBlock, List<DatasetField>> fieldsByBlock = DatasetField.groupByBlock(dsv.getDatasetFields());
for ( Map.Entry<MetadataBlock, List<DatasetField>> p : fieldsByBlock.entrySet() ) {
if ( p.getKey().getName().equals(blockName) ) {
- return okResponse( json(p.getKey(), p.getValue()) );
+ return ok( json(p.getKey(), p.getValue()) );
}
}
return notFound("metadata block named " + blockName + " not found");
-
- } catch (WrappedResponse ex) {
- return ex.getResponse();
- }
-
+ });
}
@DELETE
@@ -419,45 +272,36 @@ public Response deleteDraftVersion( @PathParam("id") String id, @PathParam("ver
if ( ! ":draft".equals(versionId) ) {
return badRequest("Only the :draft version can be deleted");
}
-
- try {
- execCommand( new DeleteDatasetVersionCommand(createDataverseRequest(findUserOrDie()), findDatasetOrDie(id)) );
- return okResponse("Draft version of dataset " + id + " deleted");
- } catch (WrappedResponse ex) {
- return ex.getResponse();
- }
+
+ return response( req -> {
+ execCommand( new DeleteDatasetVersionCommand(req, findDatasetOrDie(id)) );
+ return ok("Draft version of dataset " + id + " deleted");
+ });
}
@GET
@Path("{id}/modifyRegistration")
public Response updateDatasetTargetURL(@PathParam("id") String id ) {
-
- try {
- execCommand(new UpdateDatasetTargetURLCommand(findDatasetOrDie(id), createDataverseRequest(findUserOrDie())));
- return okResponse("Dataset " + id + " target url updated");
-
- } catch (WrappedResponse ex) {
- return ex.getResponse();
- }
-
+ return response( req -> {
+ execCommand(new UpdateDatasetTargetURLCommand(findDatasetOrDie(id), req));
+ return ok("Dataset " + id + " target url updated");
+ });
}
@GET
@Path("/modifyRegistrationAll")
public Response updateDatasetTargetURLAll() {
- List<Dataset> allDatasets = datasetService.findAll();
-
- for (Dataset ds : allDatasets){
-
- try {
- execCommand(new UpdateDatasetTargetURLCommand(findDatasetOrDie(ds.getId().toString()), createDataverseRequest(findUserOrDie())));
- } catch (WrappedResponse ex) {
- Logger.getLogger(Datasets.class.getName()).log(Level.SEVERE, null, ex);
- }
-
- }
- return okResponse("Update All Dataset target url completed");
+ return response( req -> {
+ datasetService.findAll().forEach( ds -> {
+ try {
+ execCommand(new UpdateDatasetTargetURLCommand(findDatasetOrDie(ds.getId().toString()), req));
+ } catch (WrappedResponse ex) {
+ Logger.getLogger(Datasets.class.getName()).log(Level.SEVERE, null, ex);
+ }
+ });
+ return ok("Update All Dataset target url completed");
+ });
}
@PUT
@@ -465,7 +309,7 @@ public Response updateDatasetTargetURLAll() {
public Response updateDraftVersion( String jsonBody, @PathParam("id") String id, @PathParam("versionId") String versionId ){
if ( ! ":draft".equals(versionId) ) {
- return errorResponse( Response.Status.BAD_REQUEST, "Only the :draft version can be updated");
+ return error( Response.Status.BAD_REQUEST, "Only the :draft version can be updated");
}
try ( StringReader rdr = new StringReader(jsonBody) ) {
@@ -487,11 +331,11 @@ public Response updateDraftVersion( String jsonBody, @PathParam("id") String id,
DatasetVersion managedVersion = execCommand( updateDraft
? new UpdateDatasetVersionCommand(req, incomingVersion)
: new CreateDatasetVersionCommand(req, ds, incomingVersion));
- return okResponse( json(managedVersion) );
+ return ok( json(managedVersion) );
} catch (JsonParseException ex) {
LOGGER.log(Level.SEVERE, "Semantic error parsing dataset version Json: " + ex.getMessage(), ex);
- return errorResponse( Response.Status.BAD_REQUEST, "Error parsing dataset version: " + ex.getMessage() );
+ return error( Response.Status.BAD_REQUEST, "Error parsing dataset version: " + ex.getMessage() );
} catch (WrappedResponse ex) {
return ex.getResponse();
@@ -504,7 +348,7 @@ public Response updateDraftVersion( String jsonBody, @PathParam("id") String id,
public Response publishDataset( @PathParam("id") String id, @QueryParam("type") String type ) {
try {
if ( type == null ) {
- return errorResponse( Response.Status.BAD_REQUEST, "Missing 'type' parameter (either 'major' or 'minor').");
+ return error( Response.Status.BAD_REQUEST, "Missing 'type' parameter (either 'major' or 'minor').");
}
type = type.toLowerCase();
@@ -512,19 +356,19 @@ public Response publishDataset( @PathParam("id") String id, @QueryParam("type")
switch ( type ) {
case "minor": isMinor = true; break;
case "major": isMinor = false; break;
- default: return errorResponse( Response.Status.BAD_REQUEST, "Illegal 'type' parameter value '" + type + "'. It needs to be either 'major' or 'minor'.");
+ default: return error( Response.Status.BAD_REQUEST, "Illegal 'type' parameter value '" + type + "'. It needs to be either 'major' or 'minor'.");
}
long dsId;
try {
dsId = Long.parseLong(id);
} catch ( NumberFormatException nfe ) {
- return errorResponse( Response.Status.BAD_REQUEST, "Bad dataset id. Please provide a number.");
+ return error( Response.Status.BAD_REQUEST, "Bad dataset id. Please provide a number.");
}
Dataset ds = datasetService.find(dsId);
return ( ds == null ) ? notFound("Can't find dataset with id '" + id + "'")
- : okResponse( json(execCommand(new PublishDatasetCommand(ds,
+ : ok( json(execCommand(new PublishDatasetCommand(ds,
createDataverseRequest(findAuthenticatedUserOrDie()),
isMinor))) );
@@ -539,7 +383,7 @@ public Response getLinks(@PathParam("id") String idSupplied ) {
try {
User u = findUserOrDie();
if (!u.isSuperuser()) {
- return errorResponse(Response.Status.FORBIDDEN, "Not a superuser");
+ return error(Response.Status.FORBIDDEN, "Not a superuser");
}
Dataset dataset = findDatasetOrDie(idSupplied);
@@ -551,99 +395,12 @@ public Response getLinks(@PathParam("id") String idSupplied ) {
}
JsonObjectBuilder response = Json.createObjectBuilder();
response.add("dataverses that link to dataset id " + datasetId, dataversesThatLinkToThisDatasetIdBuilder);
- return okResponse(response);
+ return ok(response);
} catch (WrappedResponse wr) {
return wr.getResponse();
}
}
-
- private <T> T handleVersion( String versionId, DsVersionHandler<T> hdl )
- throws WrappedResponse {
- switch (versionId) {
- case ":latest": return hdl.handleLatest();
- case ":draft": return hdl.handleDraft();
- case ":latest-published": return hdl.handleLatestPublished();
- default:
- try {
- String[] versions = versionId.split("\\.");
- switch (versions.length) {
- case 1:
- return hdl.handleSpecific(Long.parseLong(versions[0]), (long)0.0);
- case 2:
- return hdl.handleSpecific( Long.parseLong(versions[0]), Long.parseLong(versions[1]) );
- default:
- throw new WrappedResponse(errorResponse( Response.Status.BAD_REQUEST, "Illegal version identifier '" + versionId + "'"));
- }
- } catch ( NumberFormatException nfe ) {
- throw new WrappedResponse( errorResponse( Response.Status.BAD_REQUEST, "Illegal version identifier '" + versionId + "'") );
- }
- }
- }
-
- private DatasetVersion getDatasetVersionOrDie( final DataverseRequest req, String versionNumber, final Dataset ds ) throws WrappedResponse {
- DatasetVersion dsv = execCommand( handleVersion(versionNumber, new DsVersionHandler<Command<DatasetVersion>>(){
-
- @Override
- public Command<DatasetVersion> handleLatest() {
- return new GetLatestAccessibleDatasetVersionCommand(req, ds);
- }
-
- @Override
- public Command<DatasetVersion> handleDraft() {
- return new GetDraftDatasetVersionCommand(req, ds);
- }
-
- @Override
- public Command<DatasetVersion> handleSpecific(long major, long minor) {
- return new GetSpecificPublishedDatasetVersionCommand(req, ds, major, minor);
- }
-
- @Override
- public Command<DatasetVersion> handleLatestPublished() {
- return new GetLatestPublishedDatasetVersionCommand(req, ds);
- }
- }));
- if ( dsv == null || dsv.getId() == null ) {
- throw new WrappedResponse( notFound("Dataset version " + versionNumber + " of dataset " + ds.getId() + " not found") );
- }
- return dsv;
- }
-
- Dataset findDatasetOrDie( String id ) throws WrappedResponse {
- Dataset dataset;
- LOGGER.info("Looking for dataset " + id);
- if ( id.equals(PERSISTENT_ID_KEY) ) {
- String persistentId = getRequestParameter(PERSISTENT_ID_KEY.substring(1));
- LOGGER.info("Looking for dataset " + persistentId);
- if ( persistentId == null ) {
- throw new WrappedResponse(
- badRequest("When accessing a dataset based on persistent id, "
- + "a " + PERSISTENT_ID_KEY.substring(1) + " query parameter "
- + "must be present"));
- }
- dataset = datasetService.findByGlobalId(persistentId);
- if (dataset == null) {
- throw new WrappedResponse( notFound("dataset " + persistentId + " not found") );
- }
- return dataset;
-
- } else {
- try {
- dataset = datasetService.find( Long.parseLong(id) );
- if (dataset == null) {
- throw new WrappedResponse( notFound("dataset " + id + " not found") );
- }
- return dataset;
- } catch ( NumberFormatException nfe ) {
- throw new WrappedResponse(
- badRequest("Bad dataset id number: '" + id + "'"));
- }
- }
-
- }
-
-
/**
* @todo Implement this for real as part of
* https://github.com/IQSS/dataverse/issues/2579
@@ -655,18 +412,18 @@ Dataset findDatasetOrDie( String id ) throws WrappedResponse {
public Response getDdi(@QueryParam("id") long id, @QueryParam("persistentId") String persistentId, @QueryParam("dto") boolean dto) {
boolean ddiExportEnabled = systemConfig.isDdiExportEnabled();
if (!ddiExportEnabled) {
- return errorResponse(Response.Status.FORBIDDEN, "Disabled");
+ return error(Response.Status.FORBIDDEN, "Disabled");
}
try {
User u = findUserOrDie();
if (!u.isSuperuser()) {
- return errorResponse(Response.Status.FORBIDDEN, "Not a superuser");
+ return error(Response.Status.FORBIDDEN, "Not a superuser");
}
LOGGER.fine("looking up " + persistentId);
Dataset dataset = datasetService.findByGlobalId(persistentId);
if (dataset == null) {
- return errorResponse(Response.Status.NOT_FOUND, "A dataset with the persistentId " + persistentId + " could not be found.");
+ return error(Response.Status.NOT_FOUND, "A dataset with the persistentId " + persistentId + " could not be found.");
}
String xml = "XML_BEING_COOKED ";
@@ -703,17 +460,17 @@ public Response getDdi(@QueryParam("id") long id, @QueryParam("persistentId") St
public Response createAssignment(String userOrGroup, @PathParam("identifier") String id, @QueryParam("key") String apiKey) {
boolean apiTestingOnly = true;
if (apiTestingOnly) {
- return errorResponse(Response.Status.FORBIDDEN, "This is only for API tests.");
+ return error(Response.Status.FORBIDDEN, "This is only for API tests.");
}
try {
Dataset dataset = findDatasetOrDie(id);
RoleAssignee assignee = findAssignee(userOrGroup);
if (assignee == null) {
- return errorResponse(Response.Status.BAD_REQUEST, "Assignee not found");
+ return error(Response.Status.BAD_REQUEST, "Assignee not found");
}
DataverseRole theRole = rolesSvc.findBuiltinRoleByAlias("admin");
String privateUrlToken = null;
- return okResponse(
+ return ok(
json(execCommand(new AssignRoleCommand(assignee, theRole, dataset, createDataverseRequest(findUserOrDie()), privateUrlToken))));
} catch (WrappedResponse ex) {
LOGGER.log(Level.WARNING, "Can''t create assignment: {0}", ex.getMessage());
@@ -724,59 +481,127 @@ public Response createAssignment(String userOrGroup, @PathParam("identifier") St
@GET
@Path("{identifier}/assignments")
public Response getAssignments(@PathParam("identifier") String id) {
- try {
- JsonArrayBuilder jab = Json.createArrayBuilder();
- for (RoleAssignment ra : execCommand(new ListRoleAssignments(createDataverseRequest(findUserOrDie()), findDatasetOrDie(id)))) {
- jab.add(json(ra));
- }
- return okResponse(jab);
- } catch (WrappedResponse ex) {
- LOGGER.log(Level.WARNING, "Can't list assignments: {0}", ex.getMessage());
- return ex.getResponse();
- }
+ return response( req ->
+ ok( execCommand(
+ new ListRoleAssignments(req, findDatasetOrDie(id)))
+ .stream().map(ra->json(ra)).collect(toJsonArray())) );
}
@GET
@Path("{id}/privateUrl")
public Response getPrivateUrlData(@PathParam("id") String idSupplied) {
- try {
- PrivateUrl privateUrl = execCommand(new GetPrivateUrlCommand(createDataverseRequest(findUserOrDie()), findDatasetOrDie(idSupplied)));
- if (privateUrl != null) {
- return okResponse(json(privateUrl));
- } else {
- return errorResponse(Response.Status.NOT_FOUND, "Private URL not found.");
- }
- } catch (WrappedResponse wr) {
- return wr.getResponse();
- }
+ return response( req -> {
+ PrivateUrl privateUrl = execCommand(new GetPrivateUrlCommand(req, findDatasetOrDie(idSupplied)));
+ return (privateUrl != null) ? ok(json(privateUrl))
+ : error(Response.Status.NOT_FOUND, "Private URL not found.");
+ });
}
@POST
@Path("{id}/privateUrl")
public Response createPrivateUrl(@PathParam("id") String idSupplied) {
- try {
- return okResponse(json(execCommand(new CreatePrivateUrlCommand(createDataverseRequest(findUserOrDie()), findDatasetOrDie(idSupplied)))));
- } catch (WrappedResponse wr) {
- return wr.getResponse();
- }
+ return response( req ->
+ ok(json(execCommand(
+ new CreatePrivateUrlCommand(req, findDatasetOrDie(idSupplied))))));
}
@DELETE
@Path("{id}/privateUrl")
public Response deletePrivateUrl(@PathParam("id") String idSupplied) {
- try {
- User user = findUserOrDie();
+ return response( req -> {
Dataset dataset = findDatasetOrDie(idSupplied);
- PrivateUrl privateUrl = execCommand(new GetPrivateUrlCommand(createDataverseRequest(user), dataset));
+ PrivateUrl privateUrl = execCommand(new GetPrivateUrlCommand(req, dataset));
if (privateUrl != null) {
- execCommand(new DeletePrivateUrlCommand(createDataverseRequest(user), dataset));
- return okResponse("Private URL deleted.");
+ execCommand(new DeletePrivateUrlCommand(req, dataset));
+ return ok("Private URL deleted.");
} else {
- return errorResponse(Response.Status.NOT_FOUND, "No Private URL to delete.");
+ return notFound("No Private URL to delete.");
+ }
+ });
+ }
+
+ private Dataset findDatasetOrDie( String id ) throws WrappedResponse {
+ Dataset dataset;
+ if ( id.equals(PERSISTENT_ID_KEY) ) {
+ String persistentId = getRequestParameter(PERSISTENT_ID_KEY.substring(1));
+ if ( persistentId == null ) {
+ throw new WrappedResponse(
+ badRequest("When accessing a dataset based on persistent id, "
+ + "a " + PERSISTENT_ID_KEY.substring(1) + " query parameter "
+ + "must be present"));
+ }
+ dataset = datasetService.findByGlobalId(persistentId);
+ if (dataset == null) {
+ throw new WrappedResponse( notFound("dataset " + persistentId + " not found") );
+ }
+ return dataset;
+
+ } else {
+ try {
+ dataset = datasetService.find( Long.parseLong(id) );
+ if (dataset == null) {
+ throw new WrappedResponse( notFound("dataset " + id + " not found") );
+ }
+ return dataset;
+ } catch ( NumberFormatException nfe ) {
+ throw new WrappedResponse(
+ badRequest("Bad dataset id number: '" + id + "'"));
}
- } catch (WrappedResponse wr) {
- return wr.getResponse();
}
+
}
+
+
+ private <T> T handleVersion( String versionId, DsVersionHandler<T> hdl )
+ throws WrappedResponse {
+ switch (versionId) {
+ case ":latest": return hdl.handleLatest();
+ case ":draft": return hdl.handleDraft();
+ case ":latest-published": return hdl.handleLatestPublished();
+ default:
+ try {
+ String[] versions = versionId.split("\\.");
+ switch (versions.length) {
+ case 1:
+ return hdl.handleSpecific(Long.parseLong(versions[0]), (long)0.0);
+ case 2:
+ return hdl.handleSpecific( Long.parseLong(versions[0]), Long.parseLong(versions[1]) );
+ default:
+ throw new WrappedResponse(error( Response.Status.BAD_REQUEST, "Illegal version identifier '" + versionId + "'"));
+ }
+ } catch ( NumberFormatException nfe ) {
+ throw new WrappedResponse( error( Response.Status.BAD_REQUEST, "Illegal version identifier '" + versionId + "'") );
+ }
+ }
+ }
+
+ private DatasetVersion getDatasetVersionOrDie( final DataverseRequest req, String versionNumber, final Dataset ds ) throws WrappedResponse {
+ DatasetVersion dsv = execCommand( handleVersion(versionNumber, new DsVersionHandler<Command<DatasetVersion>>(){
+
+ @Override
+ public Command<DatasetVersion> handleLatest() {
+ return new GetLatestAccessibleDatasetVersionCommand(req, ds);
+ }
+
+ @Override
+ public Command<DatasetVersion> handleDraft() {
+ return new GetDraftDatasetVersionCommand(req, ds);
+ }
+ @Override
+ public Command<DatasetVersion> handleSpecific(long major, long minor) {
+ return new GetSpecificPublishedDatasetVersionCommand(req, ds, major, minor);
+ }
+
+ @Override
+ public Command<DatasetVersion> handleLatestPublished() {
+ return new GetLatestPublishedDatasetVersionCommand(req, ds);
+ }
+ }));
+ if ( dsv == null || dsv.getId() == null ) {
+ throw new WrappedResponse( notFound("Dataset version " + versionNumber + " of dataset " + ds.getId() + " not found") );
+ }
+ return dsv;
+ }
+
}
diff --git a/src/main/java/edu/harvard/iq/dataverse/api/Dataverses.java b/src/main/java/edu/harvard/iq/dataverse/api/Dataverses.java
index 3c676f5d665..78c29e81c90 100644
--- a/src/main/java/edu/harvard/iq/dataverse/api/Dataverses.java
+++ b/src/main/java/edu/harvard/iq/dataverse/api/Dataverses.java
@@ -90,7 +90,10 @@
public class Dataverses extends AbstractApiBean {
private static final Logger LOGGER = Logger.getLogger(Dataverses.class.getName());
-
+
+ @EJB
+ ExplicitGroupServiceBean explicitGroupSvc;
+
@POST
public Response addRoot( String body ) {
LOGGER.info("Creating root dataverse");
@@ -108,10 +111,10 @@ public Response addDataverse( String body, @PathParam("identifier") String paren
d = jsonParser().parseDataverse(dvJson);
} catch ( JsonParsingException jpe ) {
LOGGER.log(Level.SEVERE, "Json: {0}", body);
- return errorResponse( Status.BAD_REQUEST, "Error parsing Json: " + jpe.getMessage() );
+ return error( Status.BAD_REQUEST, "Error parsing Json: " + jpe.getMessage() );
} catch (JsonParseException ex) {
Logger.getLogger(Dataverses.class.getName()).log(Level.SEVERE, "Error parsing dataverse from json: " + ex.getMessage(), ex);
- return errorResponse( Response.Status.BAD_REQUEST,
+ return error( Response.Status.BAD_REQUEST,
"Error parsing the POSTed json into a dataverse: " + ex.getMessage() );
}
@@ -128,8 +131,27 @@ public Response addDataverse( String body, @PathParam("identifier") String paren
AuthenticatedUser u = findAuthenticatedUserOrDie();
d = execCommand( new CreateDataverseCommand(d, createDataverseRequest(u), null, null) );
- return createdResponse( "/dataverses/"+d.getAlias(), json(d) );
+ return created( "/dataverses/"+d.getAlias(), json(d) );
} catch ( WrappedResponse ww ) {
+ Throwable cause = ww.getCause();
+ StringBuilder sb = new StringBuilder();
+ while (cause.getCause() != null) {
+ cause = cause.getCause();
+ if (cause instanceof ConstraintViolationException) {
+ ConstraintViolationException constraintViolationException = (ConstraintViolationException) cause;
+ for (ConstraintViolation<?> violation : constraintViolationException.getConstraintViolations()) {
+ sb.append(" Invalid value: <<<").append(violation.getInvalidValue()).append(">>> for ")
+ .append(violation.getPropertyPath()).append(" at ")
+ .append(violation.getLeafBean()).append(" - ")
+ .append(violation.getMessage());
+ }
+ }
+ }
+ String error = sb.toString();
+ if (!error.isEmpty()) {
+ LOGGER.log(Level.INFO, error);
+ return ww.refineResponse(error);
+ }
return ww.getResponse();
} catch (EJBException ex) {
@@ -149,10 +171,10 @@ public Response addDataverse( String body, @PathParam("identifier") String paren
}
}
LOGGER.log(Level.SEVERE, sb.toString());
- return errorResponse( Response.Status.INTERNAL_SERVER_ERROR, "Error creating dataverse: " + sb.toString() );
+ return error( Response.Status.INTERNAL_SERVER_ERROR, "Error creating dataverse: " + sb.toString() );
} catch ( Exception ex ) {
LOGGER.log(Level.SEVERE, "Error creating dataverse", ex);
- return errorResponse( Response.Status.INTERNAL_SERVER_ERROR, "Error creating dataverse: " + ex.getMessage() );
+ return error( Response.Status.INTERNAL_SERVER_ERROR, "Error creating dataverse: " + ex.getMessage() );
}
}
@@ -169,7 +191,7 @@ public Response createDataset( String jsonBody, @PathParam("identifier") String
json = Json.createReader(rdr).readObject();
} catch ( JsonParsingException jpe ) {
LOGGER.log(Level.SEVERE, "Json: {0}", jsonBody);
- return errorResponse( Status.BAD_REQUEST, "Error parsing Json: " + jpe.getMessage() );
+ return error( Status.BAD_REQUEST, "Error parsing Json: " + jpe.getMessage() );
}
Dataset ds = new Dataset();
@@ -177,11 +199,14 @@ public Response createDataset( String jsonBody, @PathParam("identifier") String
JsonObject jsonVersion = json.getJsonObject("datasetVersion");
if ( jsonVersion == null) {
- return errorResponse(Status.BAD_REQUEST, "Json POST data are missing datasetVersion object.");
+ return error(Status.BAD_REQUEST, "Json POST data are missing datasetVersion object.");
}
try {
try {
- DatasetVersion version = jsonParser().parseDatasetVersion(jsonVersion);
+ DatasetVersion version = new DatasetVersion();
+ version.setDataset(ds);
+ // Use the two argument version so that the version knows which dataset it's associated with.
+ version = jsonParser().parseDatasetVersion(jsonVersion, version);
// force "initial version" properties
version.setMinorVersionNumber(null);
@@ -197,14 +222,14 @@ public Response createDataset( String jsonBody, @PathParam("identifier") String
}
} catch (JsonParseException ex) {
LOGGER.log( Level.INFO, "Error parsing dataset version from Json", ex);
- return errorResponse(Status.BAD_REQUEST, "Error parsing datasetVersion: " + ex.getMessage() );
+ return error(Status.BAD_REQUEST, "Error parsing datasetVersion: " + ex.getMessage() );
} catch ( Exception e ) {
LOGGER.log( Level.WARNING, "Error parsing dataset version from Json", e);
- return errorResponse(Status.INTERNAL_SERVER_ERROR, "Error parsing datasetVersion: " + e.getMessage() );
+ return error(Status.INTERNAL_SERVER_ERROR, "Error parsing datasetVersion: " + e.getMessage() );
}
Dataset managedDs = execCommand(new CreateDatasetCommand(ds, createDataverseRequest(u)));
- return createdResponse( "/datasets/" + managedDs.getId(),
+ return created( "/datasets/" + managedDs.getId(),
Json.createObjectBuilder().add("id", managedDs.getId()) );
} catch ( WrappedResponse ex ) {
@@ -215,38 +240,31 @@ public Response createDataset( String jsonBody, @PathParam("identifier") String
@GET
@Path("{identifier}")
public Response viewDataverse( @PathParam("identifier") String idtf ) {
- try {
- Dataverse retrieved = execCommand( new GetDataverseCommand( createDataverseRequest(findUserOrDie()), findDataverseOrDie(idtf)) );
- return okResponse( json(retrieved) );
- } catch ( WrappedResponse ex ) {
- return ex.getResponse();
- }
+ return response( req -> ok(json(execCommand(
+ new GetDataverseCommand(req, findDataverseOrDie(idtf))))));
}
@DELETE
@Path("{identifier}")
public Response deleteDataverse( @PathParam("identifier") String idtf ) {
- try {
- execCommand( new DeleteDataverseCommand(createDataverseRequest(findUserOrDie()), findDataverseOrDie(idtf)) );
- return okResponse( "Dataverse " + idtf +" deleted");
- } catch ( WrappedResponse ex ) {
- return ex.getResponse();
- }
+ return response( req -> {
+ execCommand( new DeleteDataverseCommand(req, findDataverseOrDie(idtf)));
+ return ok( "Dataverse " + idtf +" deleted");
+ });
}
@GET
@Path("{identifier}/metadatablocks")
public Response listMetadataBlocks( @PathParam("identifier") String dvIdtf ) {
try {
- JsonArrayBuilder jab = Json.createArrayBuilder();
- for ( MetadataBlock blk : execCommand( new ListMetadataBlocksCommand(createDataverseRequest(findUserOrDie()), findDataverseOrDie(dvIdtf)) )){
- jab.add( brief.json(blk) );
+ JsonArrayBuilder arr = Json.createArrayBuilder();
+ final List<MetadataBlock> blocks = execCommand( new ListMetadataBlocksCommand(createDataverseRequest(findUserOrDie()), findDataverseOrDie(dvIdtf)));
+ for ( MetadataBlock mdb : blocks) {
+ arr.add( brief.json(mdb) );
}
-
- return okResponse(jab);
-
- } catch (WrappedResponse ex) {
- return ex.refineResponse( "Error listing metadata blocks for dataverse " + dvIdtf + ":");
+ return ok(arr);
+ } catch (WrappedResponse we ){
+ return we.getResponse();
}
}
@@ -262,17 +280,17 @@ public Response setMetadataBlocks( @PathParam("identifier")String dvIdtf, String
? findMetadataBlock( ((JsonNumber)blockId).longValue() )
: findMetadataBlock( ((JsonString)blockId).getString() );
if ( blk == null ) {
- return errorResponse(Response.Status.BAD_REQUEST, "Can't find metadata block '"+ blockId + "'");
+ return error(Response.Status.BAD_REQUEST, "Can't find metadata block '"+ blockId + "'");
}
blocks.add( blk );
}
} catch( Exception e ) {
- return errorResponse(Response.Status.BAD_REQUEST, e.getMessage());
+ return error(Response.Status.BAD_REQUEST, e.getMessage());
}
try {
execCommand( new UpdateDataverseMetadataBlocksCommand.SetBlocks(createDataverseRequest(findUserOrDie()), findDataverseOrDie(dvIdtf), blocks));
- return okResponse("Metadata blocks of dataverse " + dvIdtf + " updated.");
+ return ok("Metadata blocks of dataverse " + dvIdtf + " updated.");
} catch (WrappedResponse ex) {
return ex.getResponse();
@@ -281,54 +299,53 @@ public Response setMetadataBlocks( @PathParam("identifier")String dvIdtf, String
@GET
@Path("{identifier}/metadatablocks/:isRoot")
+ public Response getMetadataRoot_legacy( @PathParam("identifier")String dvIdtf ) {
+ return getMetadataRoot(dvIdtf);
+ }
+
+ @GET
+ @Path("{identifier}/metadatablocks/isRoot")
@Produces(MediaType.APPLICATION_JSON)
public Response getMetadataRoot( @PathParam("identifier")String dvIdtf ) {
-
- try {
- Dataverse dataverse = findDataverseOrDie(dvIdtf);
- if ( permissionSvc.request( createDataverseRequest(findUserOrDie()) )
+ return response( req -> {
+ final Dataverse dataverse = findDataverseOrDie(dvIdtf);
+ if ( permissionSvc.request(req)
.on(dataverse)
.has(Permission.EditDataverse) ) {
- return okResponseWithValue( dataverse.isMetadataBlockRoot() );
+ return ok( dataverse.isMetadataBlockRoot() );
} else {
- return errorResponse( Status.FORBIDDEN, "Not authorized" );
+ return error( Status.FORBIDDEN, "Not authorized" );
}
- } catch (WrappedResponse wr) {
- return wr.getResponse();
- }
+ });
}
@POST
@Path("{identifier}/metadatablocks/:isRoot")
@Produces(MediaType.APPLICATION_JSON)
@Consumes(MediaType.WILDCARD)
+ public Response setMetadataRoot_legacy( @PathParam("identifier")String dvIdtf, String body ) {
+ return setMetadataRoot(dvIdtf, body);
+ }
+
+ @PUT
+ @Path("{identifier}/metadatablocks/isRoot")
+ @Produces(MediaType.APPLICATION_JSON)
+ @Consumes(MediaType.WILDCARD)
public Response setMetadataRoot( @PathParam("identifier")String dvIdtf, String body ) {
-
- if ( ! Util.isBoolean(body) ) {
- return errorResponse(Response.Status.BAD_REQUEST, "Illegal value '" + body + "'. Try 'true' or 'false'");
- }
- boolean root = Util.isTrue(body);
-
- try {
- Dataverse dataverse = findDataverseOrDie(dvIdtf);
- execute(new UpdateDataverseMetadataBlocksCommand.SetRoot(createDataverseRequest(findUserOrDie()), dataverse, root));
- return okResponseWithValue("Dataverse " + dataverse.getName() + " is now a metadata " + (root? "" : "non-") + "root");
- } catch (WrappedResponse wr) {
- return wr.getResponse();
- }
-
+ return response( req -> {
+ final boolean root = parseBooleanOrDie(body);
+ final Dataverse dataverse = findDataverseOrDie(dvIdtf);
+ execCommand(new UpdateDataverseMetadataBlocksCommand.SetRoot(req, dataverse, root));
+ return ok("Dataverse " + dataverse.getName() + " is now a metadata " + (root? "" : "non-") + "root");
+ });
}
@GET
@Path("{identifier}/facets/")
public Response listFacets( @PathParam("identifier") String dvIdtf ) {
- try {
- return okResponse(
- execCommand(new ListFacetsCommand(createDataverseRequest(findUserOrDie()), findDataverseOrDie(dvIdtf)) )
- .stream().map(f->json(f).build()).collect(toJsonArray()));
- } catch (WrappedResponse wr) {
- return wr.getResponse();
- }
+ return response( req -> ok(
+ execCommand(new ListFacetsCommand(req, findDataverseOrDie(dvIdtf)) )
+ .stream().map(f->json(f)).collect(toJsonArray())));
}
@POST
@@ -340,9 +357,9 @@ public Response setFacets( @PathParam("identifier")String dvIdtf, String facetId
for ( JsonString facetId : Util.asJsonArray(facetIds).getValuesAs(JsonString.class) ) {
DatasetFieldType dsfType = findDatasetFieldType(facetId.getString());
if ( dsfType == null ) {
- return errorResponse(Response.Status.BAD_REQUEST, "Can't find dataset field type '"+ facetId + "'");
+ return error(Response.Status.BAD_REQUEST, "Can't find dataset field type '"+ facetId + "'");
} else if (!dsfType.isFacetable()) {
- return errorResponse(Response.Status.BAD_REQUEST, "Dataset field type '"+ facetId + "' is not facetable");
+ return error(Response.Status.BAD_REQUEST, "Dataset field type '"+ facetId + "' is not facetable");
}
facets.add( dsfType );
}
@@ -351,7 +368,7 @@ public Response setFacets( @PathParam("identifier")String dvIdtf, String facetId
Dataverse dataverse = findDataverseOrDie(dvIdtf);
// by passing null for Featured Dataverses and DataverseFieldTypeInputLevel, those are not changed
execCommand( new UpdateDataverseCommand(dataverse, facets, null, createDataverseRequest(findUserOrDie()), null) );
- return okResponse("Facets of dataverse " + dvIdtf + " updated.");
+ return ok("Facets of dataverse " + dvIdtf + " updated.");
} catch (WrappedResponse ex) {
return ex.getResponse();
@@ -361,81 +378,57 @@ public Response setFacets( @PathParam("identifier")String dvIdtf, String facetId
@GET
@Path("{identifier}/contents")
public Response listContent( @PathParam("identifier") String dvIdtf ) {
-
- final JsonArrayBuilder jab = Json.createArrayBuilder();
- DvObject.Visitor<Void> ser = new DvObject.Visitor<Void>() {
+ DvObject.Visitor<JsonObjectBuilder> ser = new DvObject.Visitor<JsonObjectBuilder>() {
@Override
- public Void visit(Dataverse dv) {
- jab.add( Json.createObjectBuilder().add("type", "dataverse")
+ public JsonObjectBuilder visit(Dataverse dv) {
+ return Json.createObjectBuilder().add("type", "dataverse")
.add("id", dv.getId())
- .add("title",dv.getName() ));
- return null;
+ .add("title",dv.getName() );
}
@Override
- public Void visit(Dataset ds) {
- // TODO: check for permission to view drafts
- jab.add( json(ds).add("type", "dataset") );
- return null;
+ public JsonObjectBuilder visit(Dataset ds) {
+ return json(ds).add("type", "dataset");
}
@Override
- public Void visit(DataFile df) { throw new UnsupportedOperationException("Files don't live directly in Dataverses"); }
+ public JsonObjectBuilder visit(DataFile df) { throw new UnsupportedOperationException("Files don't live directly in Dataverses"); }
};
- try {
- Dataverse dataverse = findDataverseOrDie(dvIdtf);
-
- for ( DvObject o : execCommand(new ListDataverseContentCommand(createDataverseRequest(findUserOrDie()), dataverse)) ) {
- o.accept(ser);
- }
- return okResponse(jab);
- } catch (WrappedResponse ex) {
- return ex.getResponse();
- }
+ return response( req -> ok(
+ execCommand(new ListDataverseContentCommand(req, findDataverseOrDie(dvIdtf)))
+ .stream()
+ .map( dvo->(JsonObjectBuilder)dvo.accept(ser))
+ .collect(toJsonArray())
+ ));
}
@GET
@Path("{identifier}/roles")
public Response listRoles( @PathParam("identifier") String dvIdtf ) {
-
- try {
- Dataverse d = findDataverseOrDie(dvIdtf);
- JsonArrayBuilder jab = Json.createArrayBuilder();
- for ( DataverseRole r : execCommand( new ListRolesCommand(createDataverseRequest(findUserOrDie()), d)) ){
- jab.add( json(r) );
- }
- return okResponse(jab);
- } catch (WrappedResponse ex) {
- return ex.getResponse();
- }
+ return response( req -> ok(
+ execCommand( new ListRolesCommand(req, findDataverseOrDie(dvIdtf)) )
+ .stream().map(r->json(r))
+ .collect( toJsonArray() )
+ ));
}
@POST
@Path("{identifier}/roles")
public Response createRole( RoleDTO roleDto, @PathParam("identifier") String dvIdtf ) {
- try {
- Dataverse dataverse = findDataverseOrDie(dvIdtf);
- return okResponse( json(execCommand(new CreateRoleCommand(roleDto.asRole(), createDataverseRequest(findUserOrDie()), dataverse))));
- } catch ( WrappedResponse ce ) {
- return ce.getResponse();
- }
+ return response( req -> ok( json(execCommand(new CreateRoleCommand(roleDto.asRole(), req, findDataverseOrDie(dvIdtf))))));
}
@GET
@Path("{identifier}/assignments")
public Response listAssignments( @PathParam("identifier") String dvIdtf) {
- try {
- JsonArrayBuilder jab = Json.createArrayBuilder();
- for ( RoleAssignment ra : execCommand(new ListRoleAssignments(createDataverseRequest(findUserOrDie()), findDataverseOrDie(dvIdtf))) ){
- jab.add( json(ra) );
- }
- return okResponse(jab);
-
- } catch (WrappedResponse ex) {
- return ex.getResponse();
- }
+ return response( req -> ok(
+ execCommand(new ListRoleAssignments(req, findDataverseOrDie(dvIdtf)))
+ .stream()
+ .map( a -> json(a) )
+ .collect(toJsonArray())
+ ));
}
@POST
@@ -443,11 +436,12 @@ public Response listAssignments( @PathParam("identifier") String dvIdtf) {
public Response createAssignment( RoleAssignmentDTO ra, @PathParam("identifier") String dvIdtf, @QueryParam("key") String apiKey ) {
try {
- Dataverse dataverse = findDataverseOrDie(dvIdtf);
+ final DataverseRequest req = createDataverseRequest(findUserOrDie());
+ final Dataverse dataverse = findDataverseOrDie(dvIdtf);
RoleAssignee assignee = findAssignee(ra.getAssignee());
if ( assignee==null ) {
- return errorResponse( Status.BAD_REQUEST, "Assignee not found" );
+ return error( Status.BAD_REQUEST, "Assignee not found" );
}
DataverseRole theRole;
@@ -463,13 +457,11 @@ public Response createAssignment( RoleAssignmentDTO ra, @PathParam("identifier")
dv = dv.getOwner();
}
if ( theRole == null ) {
- return errorResponse( Status.BAD_REQUEST, "Can't find role named '" + ra.getRole() + "' in dataverse " + dataverse);
+ return error( Status.BAD_REQUEST, "Can't find role named '" + ra.getRole() + "' in dataverse " + dataverse);
}
String privateUrlToken = null;
- return okResponse(
- json(
- execCommand(new AssignRoleCommand(assignee, theRole, dataverse, createDataverseRequest(findUserOrDie()), privateUrlToken))));
+ return ok(json(execCommand(new AssignRoleCommand(assignee, theRole, dataverse, req, privateUrlToken))));
} catch (WrappedResponse ex) {
LOGGER.log(Level.WARNING, "Can''t create assignment: {0}", ex.getMessage());
@@ -485,14 +477,14 @@ public Response deleteAssignment( @PathParam("id") long assignmentId, @PathParam
try {
findDataverseOrDie(dvIdtf);
execCommand( new RevokeRoleCommand(ra, createDataverseRequest(findUserOrDie())));
- return okResponse("Role " + ra.getRole().getName()
+ return ok("Role " + ra.getRole().getName()
+ " revoked for assignee " + ra.getAssigneeIdentifier()
+ " in " + ra.getDefinitionPoint().accept(DvObject.NamePrinter) );
} catch (WrappedResponse ex) {
return ex.getResponse();
}
} else {
- return errorResponse( Status.NOT_FOUND, "Role assignment " + assignmentId + " not found" );
+ return error( Status.NOT_FOUND, "Role assignment " + assignmentId + " not found" );
}
}
@@ -501,72 +493,44 @@ public Response deleteAssignment( @PathParam("id") long assignmentId, @PathParam
public Response publishDataverse( @PathParam("identifier") String dvIdtf ) {
try {
Dataverse dv = findDataverseOrDie(dvIdtf);
- return okResponse( json(execCommand( new PublishDataverseCommand(createDataverseRequest(findAuthenticatedUserOrDie()), dv))) );
+ return ok( json(execCommand( new PublishDataverseCommand(createDataverseRequest(findAuthenticatedUserOrDie()), dv))) );
} catch (WrappedResponse wr) {
return wr.getResponse();
}
}
-
- private Dataverse findDataverseOrDie( String dvIdtf ) throws WrappedResponse {
- Dataverse dv = findDataverse(dvIdtf);
- if ( dv == null ) {
- throw new WrappedResponse(errorResponse( Response.Status.NOT_FOUND, "Can't find dataverse with identifier='" + dvIdtf + "'"));
- }
- return dv;
- }
-
- @EJB
- ExplicitGroupServiceBean explicitGroupSvc;
-
+
@POST
@Path("{identifier}/groups/")
public Response createExplicitGroup( ExplicitGroupDTO dto, @PathParam("identifier") String dvIdtf) {
- try {
-
+ return response( req ->{
ExplicitGroupProvider prv = explicitGroupSvc.getProvider();
ExplicitGroup newGroup = dto.apply(prv.makeGroup());
- newGroup = execCommand( new CreateExplicitGroupCommand(createDataverseRequest(findUserOrDie()), findDataverseOrDie(dvIdtf), newGroup));
+ newGroup = execCommand( new CreateExplicitGroupCommand(req, findDataverseOrDie(dvIdtf), newGroup));
String groupUri = String.format("%s/groups/%s", dvIdtf, newGroup.getGroupAliasInOwner());
- return createdResponse( groupUri, json(newGroup) );
-
- } catch (WrappedResponse wr) {
- return wr.getResponse();
- }
+ return created( groupUri, json(newGroup) );
+ });
}
@GET
@Path("{identifier}/groups/")
public Response listGroups( @PathParam("identifier") String dvIdtf, @QueryParam("key") String apiKey ) {
- try {
- JsonArrayBuilder arr = Json.createArrayBuilder();
- execCommand(new ListExplicitGroupsCommand(createDataverseRequest(findUserOrDie()), findDataverseOrDie(dvIdtf)))
+ return response( req -> ok(
+ execCommand(new ListExplicitGroupsCommand(req, findDataverseOrDie(dvIdtf)))
.stream().map( eg->json(eg))
- .forEach( arr::add );
- return okResponse( arr );
-
- } catch (WrappedResponse wr) {
- return wr.getResponse();
- }
+ .collect( toJsonArray() )
+ ));
}
@GET
@Path("{identifier}/groups/{aliasInOwner}")
public Response getGroupByOwnerAndAliasInOwner( @PathParam("identifier") String dvIdtf,
- @PathParam("aliasInOwner") String grpAliasInOwner )
- {
- try {
- ExplicitGroup eg = findExplicitGroupOrDie(findDataverseOrDie(dvIdtf),
- createDataverseRequest(findUserOrDie()),
- grpAliasInOwner);
-
- return (eg!=null) ? okResponse( json(eg) ) : notFound("Can't find " + grpAliasInOwner + " in dataverse " + dvIdtf);
-
- } catch (WrappedResponse wr) {
- return wr.getResponse();
- }
+ @PathParam("aliasInOwner") String grpAliasInOwner ){
+ return response( req -> ok(json(findExplicitGroupOrDie(findDataverseOrDie(dvIdtf),
+ req,
+ grpAliasInOwner))));
}
@PUT
@@ -575,17 +539,9 @@ public Response updateGroup(ExplicitGroupDTO groupDto,
@PathParam("identifier") String dvIdtf,
@PathParam("aliasInOwner") String grpAliasInOwner )
{
- try {
- final DataverseRequest request = createDataverseRequest(findUserOrDie());
- return okResponse(
- json(
- execCommand(
- new UpdateExplicitGroupCommand(request,
- groupDto.apply( findExplicitGroupOrDie(findDataverseOrDie(dvIdtf), request, grpAliasInOwner))))));
-
- } catch (WrappedResponse wr) {
- return wr.getResponse();
- }
+ return response( req-> ok(json(execCommand(
+ new UpdateExplicitGroupCommand(req,
+ groupDto.apply( findExplicitGroupOrDie(findDataverseOrDie(dvIdtf), req, grpAliasInOwner)))))));
}
@DELETE
@@ -593,16 +549,11 @@ public Response updateGroup(ExplicitGroupDTO groupDto,
public Response deleteGroup(@PathParam("identifier") String dvIdtf,
@PathParam("aliasInOwner") String grpAliasInOwner )
{
- try {
- final DataverseRequest req = createDataverseRequest(findUserOrDie());
+ return response( req -> {
execCommand( new DeleteExplicitGroupCommand(req,
- findExplicitGroupOrDie(findDataverseOrDie(dvIdtf), req, grpAliasInOwner)) );
-
- return okResponse( "Group " + dvIdtf + "/" + grpAliasInOwner + " deleted" );
-
- } catch (WrappedResponse wr) {
- return wr.refineResponse("Error deleting group " + dvIdtf + "/" + grpAliasInOwner);
- }
+ findExplicitGroupOrDie(findDataverseOrDie(dvIdtf), req, grpAliasInOwner)) );
+ return ok( "Group " + dvIdtf + "/" + grpAliasInOwner + " deleted" );
+ });
}
@POST
@@ -612,17 +563,12 @@ public Response addRoleAssingees(List<String> roleAssingeeIdentifiers,
@PathParam("identifier") String dvIdtf,
@PathParam("aliasInOwner") String grpAliasInOwner)
{
- try {
- final DataverseRequest req = createDataverseRequest(findUserOrDie());
- return okResponse(
+ return response( req -> ok(
json(
execCommand(
new AddRoleAssigneesToExplicitGroupCommand(req,
findExplicitGroupOrDie(findDataverseOrDie(dvIdtf), req, grpAliasInOwner),
- new TreeSet<>(roleAssingeeIdentifiers)))));
- } catch (WrappedResponse wr) {
- return wr.refineResponse( "Adding role assignees to group " + dvIdtf + "/" + grpAliasInOwner );
- }
+ new TreeSet<>(roleAssingeeIdentifiers))))));
}
@PUT
@@ -638,18 +584,10 @@ public Response addRoleAssingee( @PathParam("identifier") String dvIdtf,
public Response deleteRoleAssingee( @PathParam("identifier") String dvIdtf,
@PathParam("aliasInOwner") String grpAliasInOwner,
@PathParam("roleAssigneeIdentifier") String roleAssigneeIdentifier ) {
-
- try {
- final DataverseRequest req = createDataverseRequest(findUserOrDie());
- return okResponse(
- json(
- execCommand(
+ return response( req ->ok(json(execCommand(
new RemoveRoleAssigneesFromExplicitGroupCommand(req,
findExplicitGroupOrDie(findDataverseOrDie(dvIdtf), req, grpAliasInOwner),
- Collections.singleton(roleAssigneeIdentifier)))));
- } catch (WrappedResponse wr) {
- return wr.refineResponse( "Adding role assignees to group " + dvIdtf + "/" + grpAliasInOwner );
- }
+ Collections.singleton(roleAssigneeIdentifier))))));
}
private ExplicitGroup findExplicitGroupOrDie( DvObject dv, DataverseRequest req, String groupIdtf ) throws WrappedResponse {
@@ -662,11 +600,10 @@ private ExplicitGroup findExplicitGroupOrDie( DvObject dv, DataverseRequest req,
@Path("{identifier}/links")
public Response listLinks(@PathParam("identifier") String dvIdtf ) {
try {
-
- Dataverse dv = findDataverseOrDie(dvIdtf);
User u = findUserOrDie();
+ Dataverse dv = findDataverseOrDie(dvIdtf);
if (!u.isSuperuser()) {
- return errorResponse(Status.FORBIDDEN, "Not a superuser");
+ return error(Status.FORBIDDEN, "Not a superuser");
}
List dvsThisDvHasLinkedToList = dataverseSvc.findDataversesThisIdHasLinkedTo(dv.getId());
@@ -691,7 +628,7 @@ public Response listLinks(@PathParam("identifier") String dvIdtf ) {
response.add("dataverses that the " + dv.getAlias() + " dataverse has linked to", dvsThisDvHasLinkedToBuilder);
response.add("dataverses that link to the " + dv.getAlias(), dvsThatLinkToThisDvBuilder);
response.add("datasets that the " + dv.getAlias() + " has linked to", datasetsThisDvHasLinkedToBuilder);
- return okResponse(response);
+ return ok(response);
} catch (WrappedResponse wr) {
return wr.getResponse();
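The refactoring above replaces the per-endpoint `try { … } catch (WrappedResponse wr)` boilerplate with a `response( req -> ok(…) )` helper that builds the request once and converts thrown `WrappedResponse`s into error responses. A minimal, dependency-free sketch of that pattern, assuming nothing about the real Dataverse types (`Req`, `WrappedResponse`, `ok`, `error`, and `response` here are illustrative stand-ins):

```java
public class ResponseHelperSketch {

    // Stand-ins for the real request type and the checked exception the
    // real findUserOrDie()/execCommand() helpers throw.
    static class Req { final String user = "alice"; }
    static class WrappedResponse extends Exception {
        final String payload;
        WrappedResponse(String payload) { super(payload); this.payload = payload; }
    }

    // A handler is the body of one endpoint: it gets a request and may
    // throw WrappedResponse instead of catching it locally.
    interface DataverseRequestHandler {
        String handle(Req req) throws WrappedResponse;
    }

    static String ok(String body)    { return "200 " + body; }
    static String error(String body) { return "500 " + body; }

    // The helper: creates the request and centralizes the catch, so each
    // endpoint shrinks to a single lambda expression.
    static String response(DataverseRequestHandler hdl) {
        try {
            Req req = new Req(); // stands in for createDataverseRequest(findUserOrDie())
            return hdl.handle(req);
        } catch (WrappedResponse wr) {
            return error(wr.payload);
        }
    }

    public static void main(String[] args) {
        // Happy path: the endpoint body is one expression.
        System.out.println(response(req -> ok("group deleted by " + req.user)));
        // Failure path: the thrown WrappedResponse becomes an error response.
        System.out.println(response(req -> { throw new WrappedResponse("not found"); }));
    }
}
```

The design win is that forgetting the catch block is no longer possible: any endpoint routed through `response(…)` gets the conversion for free.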
diff --git a/src/main/java/edu/harvard/iq/dataverse/api/DownloadInstance.java b/src/main/java/edu/harvard/iq/dataverse/api/DownloadInstance.java
index b84896739e9..94d7af1ba77 100644
--- a/src/main/java/edu/harvard/iq/dataverse/api/DownloadInstance.java
+++ b/src/main/java/edu/harvard/iq/dataverse/api/DownloadInstance.java
@@ -6,6 +6,9 @@
package edu.harvard.iq.dataverse.api;
//import java.io.ByteArrayOutputStream;
+import edu.harvard.iq.dataverse.DataverseRequestServiceBean;
+import edu.harvard.iq.dataverse.EjbDataverseEngine;
+import edu.harvard.iq.dataverse.GuestbookResponse;
import java.util.List;
import edu.harvard.iq.dataverse.dataaccess.OptionalAccessService;
@@ -40,6 +43,13 @@ public void setExtraArguments(List extraArguments) {
private DownloadInfo downloadInfo = null;
private String conversionParam = null;
private String conversionParamValue = null;
+
+ private EjbDataverseEngine command;
+
+ private DataverseRequestServiceBean dataverseRequestService;
+
+ private GuestbookResponse gbr;
+
public DownloadInstance() {
@@ -142,4 +152,31 @@ public String getServiceFormatType(String serviceArg, String serviceArgValue) {
}
return null;
}
+
+
+ public EjbDataverseEngine getCommand() {
+ return command;
+ }
+
+ public void setCommand(EjbDataverseEngine command) {
+ this.command = command;
+ }
+
+ public GuestbookResponse getGbr() {
+ return gbr;
+ }
+
+ public void setGbr(GuestbookResponse gbr) {
+ this.gbr = gbr;
+ }
+
+
+ public DataverseRequestServiceBean getDataverseRequestService() {
+ return dataverseRequestService;
+ }
+
+ public void setDataverseRequestService(DataverseRequestServiceBean dataverseRequestService) {
+ this.dataverseRequestService = dataverseRequestService;
+ }
+
}
diff --git a/src/main/java/edu/harvard/iq/dataverse/api/DownloadInstanceWriter.java b/src/main/java/edu/harvard/iq/dataverse/api/DownloadInstanceWriter.java
index 76a5d035a13..7b0c6c414dd 100644
--- a/src/main/java/edu/harvard/iq/dataverse/api/DownloadInstanceWriter.java
+++ b/src/main/java/edu/harvard/iq/dataverse/api/DownloadInstanceWriter.java
@@ -22,8 +22,13 @@
import javax.ws.rs.ext.Provider;
import edu.harvard.iq.dataverse.DataFile;
+import edu.harvard.iq.dataverse.DataverseRequestServiceBean;
import edu.harvard.iq.dataverse.dataaccess.*;
import edu.harvard.iq.dataverse.datavariable.DataVariable;
+import edu.harvard.iq.dataverse.engine.command.Command;
+import edu.harvard.iq.dataverse.engine.command.DataverseRequest;
+import edu.harvard.iq.dataverse.engine.command.exception.CommandException;
+import edu.harvard.iq.dataverse.engine.command.impl.CreateGuestbookResponseCommand;
import java.util.ArrayList;
import java.util.List;
import java.util.logging.Logger;
@@ -245,6 +250,26 @@ public void writeTo(DownloadInstance di, Class<?> clazz, Type type, Annotation[]
outstream.write(chunkClosing.getBytes());
}
+
+ logger.fine("di conversion param: "+di.getConversionParam()+", value: "+di.getConversionParamValue());
+
+ // Downloads of thumbnail images (scaled down, low-res versions of graphic image files) and
+ // "preprocessed metadata" records for tabular data files are NOT considered "real" downloads,
+ // so these should not produce guestbook entries:
+
+ if (di.getGbr() != null && !(isThumbnailDownload(di) || isPreprocessedMetadataDownload(di))) {
+ try {
+ logger.fine("writing guestbook response.");
+ Command cmd = new CreateGuestbookResponseCommand(di.getDataverseRequestService().getDataverseRequest(), di.getGbr(), di.getGbr().getDataFile().getOwner());
+ di.getCommand().submit(cmd);
+ } catch (CommandException e) {
+ // if an error occurs here, the download won't happen, so there's no need for guestbook response records
+ }
+ } else {
+ logger.fine("not writing guestbook response");
+ }
+
+
instream.close();
outstream.close();
return;
@@ -256,6 +281,24 @@ public void writeTo(DownloadInstance di, Class<?> clazz, Type type, Annotation[]
}
+ private boolean isThumbnailDownload(DownloadInstance downloadInstance) {
+ if (downloadInstance == null) return false;
+
+ if (downloadInstance.getConversionParam() == null) return false;
+
+ return downloadInstance.getConversionParam().equals("imageThumb");
+ }
+
+ private boolean isPreprocessedMetadataDownload(DownloadInstance downloadInstance) {
+ if (downloadInstance == null) return false;
+
+ if (downloadInstance.getConversionParam() == null) return false;
+
+ if (downloadInstance.getConversionParamValue() == null) return false;
+
+ return downloadInstance.getConversionParam().equals("format") && downloadInstance.getConversionParamValue().equals("prep");
+ }
+
private long getContentSize(DataFileIO accessObject) {
long contentSize = 0;
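The guestbook logic added above hinges on two predicates over the download's conversion parameters: thumbnail requests (`imageThumb`) and preprocessed-metadata requests (`format=prep`) must not count as real downloads. A standalone sketch of that decision, where the parameter values mirror the patch and everything else (class and method names) is illustrative:

```java
public class DownloadKindSketch {

    // Mirrors isThumbnailDownload(): a scaled-down image request.
    static boolean isThumbnail(String param, String value) {
        return "imageThumb".equals(param);
    }

    // Mirrors isPreprocessedMetadataDownload(): a tabular-data prep request.
    static boolean isPreprocessedMetadata(String param, String value) {
        return "format".equals(param) && "prep".equals(value);
    }

    // A guestbook entry should be written only when the request is
    // neither of the above ("real" download of the file content).
    static boolean countsAsRealDownload(String param, String value) {
        return !(isThumbnail(param, value) || isPreprocessedMetadata(param, value));
    }

    public static void main(String[] args) {
        System.out.println(countsAsRealDownload(null, null));         // plain download: true
        System.out.println(countsAsRealDownload("imageThumb", "64")); // thumbnail: false
        System.out.println(countsAsRealDownload("format", "prep"));   // preprocessed metadata: false
        System.out.println(countsAsRealDownload("format", "RData"));  // format conversion: true
    }
}
```

Note how `String.equals` is called on the constant, which makes the null checks in the real helpers unnecessary in this sketch.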
diff --git a/src/main/java/edu/harvard/iq/dataverse/api/Groups.java b/src/main/java/edu/harvard/iq/dataverse/api/Groups.java
index ec6815405d9..2641e36939f 100644
--- a/src/main/java/edu/harvard/iq/dataverse/api/Groups.java
+++ b/src/main/java/edu/harvard/iq/dataverse/api/Groups.java
@@ -61,11 +61,11 @@ public Response postIpGroup( JsonObject dto ){
grp.getPersistedGroupAlias()==null ? "ipGroup" : grp.getPersistedGroupAlias()));
grp = ipGroupPrv.store(grp);
- return createdResponse("/groups/ip/" + grp.getPersistedGroupAlias(), json(grp) );
+ return created("/groups/ip/" + grp.getPersistedGroupAlias(), json(grp) );
} catch ( Exception e ) {
logger.log( Level.WARNING, "Error while storing a new IP group: " + e.getMessage(), e);
- return errorResponse(Response.Status.INTERNAL_SERVER_ERROR, "Error: " + e.getMessage() );
+ return error(Response.Status.INTERNAL_SERVER_ERROR, "Error: " + e.getMessage() );
}
}
@@ -91,11 +91,11 @@ public Response putIpGroups( @PathParam("groupName") String groupName, JsonObjec
grp.setGroupProvider( ipGroupPrv );
grp.setPersistedGroupAlias( groupName );
grp = ipGroupPrv.store(grp);
- return createdResponse("/groups/ip/" + grp.getPersistedGroupAlias(), json(grp) );
+ return created("/groups/ip/" + grp.getPersistedGroupAlias(), json(grp) );
} catch ( Exception e ) {
logger.log( Level.WARNING, "Error while storing a new IP group: " + e.getMessage(), e);
- return errorResponse(Response.Status.INTERNAL_SERVER_ERROR, "Error: " + e.getMessage() );
+ return error(Response.Status.INTERNAL_SERVER_ERROR, "Error: " + e.getMessage() );
}
}
@@ -103,12 +103,8 @@ public Response putIpGroups( @PathParam("groupName") String groupName, JsonObjec
@GET
@Path("ip")
public Response listIpGroups() {
-
- JsonArrayBuilder arrBld = Json.createArrayBuilder();
- for ( IpGroup g : ipGroupPrv.findGlobalGroups() ) {
- arrBld.add( json(g) );
- }
- return okResponse( arrBld );
+ return ok( ipGroupPrv.findGlobalGroups()
+ .stream().map(g->json(g)).collect(toJsonArray()) );
}
@GET
@@ -121,7 +117,7 @@ public Response getIpGroup( @PathParam("groupIdtf") String groupIdtf ) {
grp = ipGroupPrv.get(groupIdtf);
}
- return (grp == null) ? notFound( "Group " + groupIdtf + " not found") : okResponse(json(grp));
+ return (grp == null) ? notFound( "Group " + groupIdtf + " not found") : ok(json(grp));
}
@DELETE
@@ -138,7 +134,7 @@ public Response deleteIpGroup( @PathParam("groupIdtf") String groupIdtf ) {
try {
ipGroupPrv.deleteGroup(grp);
- return okResponse("Group " + grp.getAlias() + " deleted.");
+ return ok("Group " + grp.getAlias() + " deleted.");
} catch ( Exception topExp ) {
// get to the cause (unwraps EJB exception wrappers).
Throwable e = topExp;
@@ -147,7 +143,7 @@ public Response deleteIpGroup( @PathParam("groupIdtf") String groupIdtf ) {
}
if ( e instanceof IllegalArgumentException ) {
- return errorResponse(Response.Status.BAD_REQUEST, e.getMessage());
+ return error(Response.Status.BAD_REQUEST, e.getMessage());
} else {
throw topExp;
}
@@ -161,7 +157,7 @@ public Response listShibGroups() {
for (ShibGroup g : shibGroupPrv.findGlobalGroups()) {
arrBld.add(json(g));
}
- return okResponse(arrBld);
+ return ok(arrBld);
}
@POST
@@ -170,24 +166,24 @@ public Response createShibGroup(JsonObject shibGroupInput) {
String expectedNameKey = "name";
JsonString name = shibGroupInput.getJsonString(expectedNameKey);
if (name == null) {
- return errorResponse(Response.Status.BAD_REQUEST, "required field missing: " + expectedNameKey);
+ return error(Response.Status.BAD_REQUEST, "required field missing: " + expectedNameKey);
}
String expectedAttributeKey = "attribute";
JsonString attribute = shibGroupInput.getJsonString(expectedAttributeKey);
if (attribute == null) {
- return errorResponse(Response.Status.BAD_REQUEST, "required field missing: " + expectedAttributeKey);
+ return error(Response.Status.BAD_REQUEST, "required field missing: " + expectedAttributeKey);
}
String expectedPatternKey = "pattern";
JsonString pattern = shibGroupInput.getJsonString(expectedPatternKey);
if (pattern == null) {
- return errorResponse(Response.Status.BAD_REQUEST, "required field missing: " + expectedPatternKey);
+ return error(Response.Status.BAD_REQUEST, "required field missing: " + expectedPatternKey);
}
ShibGroup shibGroupToPersist = new ShibGroup(name.getString(), attribute.getString(), pattern.getString(), shibGroupPrv);
ShibGroup persitedShibGroup = shibGroupPrv.persist(shibGroupToPersist);
if (persitedShibGroup != null) {
- return okResponse("Shibboleth group persisted: " + persitedShibGroup);
+ return ok("Shibboleth group persisted: " + persitedShibGroup);
} else {
- return errorResponse(Response.Status.BAD_REQUEST, "Could not persist Shibboleth group");
+ return error(Response.Status.BAD_REQUEST, "Could not persist Shibboleth group");
}
}
@@ -200,15 +196,15 @@ public Response deleteShibGroup( @PathParam("primaryKey") String id ) {
try {
deleted = shibGroupPrv.delete(doomed);
} catch (Exception ex) {
- return errorResponse(Response.Status.BAD_REQUEST, ex.getMessage());
+ return error(Response.Status.BAD_REQUEST, ex.getMessage());
}
if (deleted) {
- return okResponse("Shibboleth group " + id + " deleted");
+ return ok("Shibboleth group " + id + " deleted");
} else {
- return errorResponse(Response.Status.BAD_REQUEST, "Could not delete Shibboleth group with an id of " + id);
+ return error(Response.Status.BAD_REQUEST, "Could not delete Shibboleth group with an id of " + id);
}
} else {
- return errorResponse(Response.Status.BAD_REQUEST, "Could not find Shibboleth group with an id of " + id);
+ return error(Response.Status.BAD_REQUEST, "Could not find Shibboleth group with an id of " + id);
}
}
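The `listIpGroups()` rewrite above swaps an explicit `JsonArrayBuilder` loop for a `stream().map(…).collect(…)` pipeline. A dependency-free sketch of the same shape, assuming a toy `Group` type and collecting into a JSON-ish string instead of the project's real `toJsonArray()` collector:

```java
import java.util.List;
import java.util.stream.Collectors;

public class StreamCollectSketch {

    // Toy stand-ins for IpGroup and the json(...) serializer.
    record Group(String alias) {}
    static String json(Group g) { return "{\"alias\":\"" + g.alias() + "\"}"; }

    // Loop-free listing: map each group to its JSON form, then collect,
    // just as the patch does with json(g) and toJsonArray().
    static String listGroups(List<Group> groups) {
        return groups.stream()
                     .map(StreamCollectSketch::json)
                     .collect(Collectors.joining(",", "[", "]"));
    }

    public static void main(String[] args) {
        System.out.println(listGroups(List.of(new Group("a"), new Group("b"))));
        // [{"alias":"a"},{"alias":"b"}]
    }
}
```

A custom `Collector` targeting `JsonArrayBuilder` (as `toJsonArray()` presumably is) follows the same pattern, with `Json::createArrayBuilder` as the supplier and `JsonArrayBuilder::add` as the accumulator.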
diff --git a/src/main/java/edu/harvard/iq/dataverse/api/HarvestingClients.java b/src/main/java/edu/harvard/iq/dataverse/api/HarvestingClients.java
index 8df72e4f9ca..08336869f49 100644
--- a/src/main/java/edu/harvard/iq/dataverse/api/HarvestingClients.java
+++ b/src/main/java/edu/harvard/iq/dataverse/api/HarvestingClients.java
@@ -62,12 +62,12 @@ public Response harvestingClients(@QueryParam("key") String apiKey) throws IOExc
try {
harvestingClients = harvestingClientService.getAllHarvestingClients();
} catch (Exception ex) {
- return errorResponse( Response.Status.INTERNAL_SERVER_ERROR, "Caught an exception looking up configured harvesting clients; " + ex.getMessage() );
+ return error( Response.Status.INTERNAL_SERVER_ERROR, "Caught an exception looking up configured harvesting clients; " + ex.getMessage() );
}
if (harvestingClients == null) {
// returning an empty list:
- return okResponse(jsonObjectBuilder().add("harvestingClients",""));
+ return ok(jsonObjectBuilder().add("harvestingClients",""));
}
JsonArrayBuilder hcArr = Json.createArrayBuilder();
@@ -93,7 +93,7 @@ public Response harvestingClients(@QueryParam("key") String apiKey) throws IOExc
}
}
- return okResponse(jsonObjectBuilder().add("harvestingClients", hcArr));
+ return ok(jsonObjectBuilder().add("harvestingClients", hcArr));
}
@GET
@@ -105,11 +105,11 @@ public Response harvestingClient(@PathParam("nickName") String nickName, @QueryP
harvestingClient = harvestingClientService.findByNickname(nickName);
} catch (Exception ex) {
logger.warning("Exception caught looking up harvesting client " + nickName + ": " + ex.getMessage());
- return errorResponse( Response.Status.BAD_REQUEST, "Internal error: failed to look up harvesting client " + nickName + ".");
+ return error( Response.Status.BAD_REQUEST, "Internal error: failed to look up harvesting client " + nickName + ".");
}
if (harvestingClient == null) {
- return errorResponse(Response.Status.NOT_FOUND, "Harvesting client " + nickName + " not found.");
+ return error(Response.Status.NOT_FOUND, "Harvesting client " + nickName + " not found.");
}
HarvestingClient retrievedHarvestingClient = null;
@@ -128,15 +128,15 @@ public Response harvestingClient(@PathParam("nickName") String nickName, @QueryP
}
if (retrievedHarvestingClient == null) {
- return errorResponse( Response.Status.BAD_REQUEST,
+ return error( Response.Status.BAD_REQUEST,
"Internal error: failed to retrieve harvesting client " + nickName + ".");
}
try {
- return okResponse(harvestingConfigAsJson(retrievedHarvestingClient));
+ return ok(harvestingConfigAsJson(retrievedHarvestingClient));
} catch (Exception ex) {
logger.warning("Unknown exception caught while trying to format harvesting client config as json: "+ex.getMessage());
- return errorResponse( Response.Status.BAD_REQUEST,
+ return error( Response.Status.BAD_REQUEST,
"Internal error: failed to produce output for harvesting client " + nickName + ".");
}
}
@@ -155,7 +155,7 @@ public Response createHarvestingClient(String jsonBody, @PathParam("nickName") S
Dataverse ownerDataverse = dataverseService.findByAlias(dataverseAlias);
if (ownerDataverse == null) {
- return errorResponse(Response.Status.BAD_REQUEST, "No such dataverse: " + dataverseAlias);
+ return error(Response.Status.BAD_REQUEST, "No such dataverse: " + dataverseAlias);
}
harvestingClient.setDataverse(ownerDataverse);
@@ -166,10 +166,10 @@ public Response createHarvestingClient(String jsonBody, @PathParam("nickName") S
DataverseRequest req = createDataverseRequest(findUserOrDie());
HarvestingClient managedHarvestingClient = execCommand( new CreateHarvestingClientCommand(req, harvestingClient));
- return createdResponse( "/harvest/clients/" + nickName, harvestingConfigAsJson(managedHarvestingClient));
+ return created( "/harvest/clients/" + nickName, harvestingConfigAsJson(managedHarvestingClient));
} catch (JsonParseException ex) {
- return errorResponse( Response.Status.BAD_REQUEST, "Error parsing harvesting client: " + ex.getMessage() );
+ return error( Response.Status.BAD_REQUEST, "Error parsing harvesting client: " + ex.getMessage() );
} catch (WrappedResponse ex) {
return ex.getResponse();
@@ -190,7 +190,7 @@ public Response modifyHarvestingClient(String jsonBody, @PathParam("nickName") S
}
if (harvestingClient == null) {
- return errorResponse( Response.Status.NOT_FOUND, "Harvesting client " + nickName + " not found.");
+ return error( Response.Status.NOT_FOUND, "Harvesting client " + nickName + " not found.");
}
String ownerDataverseAlias = harvestingClient.getDataverse().getAlias();
@@ -204,13 +204,13 @@ public Response modifyHarvestingClient(String jsonBody, @PathParam("nickName") S
if (newDataverseAlias != null
&& !newDataverseAlias.equals("")
&& !newDataverseAlias.equals(ownerDataverseAlias)) {
- return errorResponse(Response.Status.BAD_REQUEST, "Bad \"dataverseAlias\" supplied. Harvesting client "+nickName+" belongs to the dataverse "+ownerDataverseAlias);
+ return error(Response.Status.BAD_REQUEST, "Bad \"dataverseAlias\" supplied. Harvesting client "+nickName+" belongs to the dataverse "+ownerDataverseAlias);
}
HarvestingClient managedHarvestingClient = execCommand( new UpdateHarvestingClientCommand(req, harvestingClient));
- return createdResponse( "/datasets/" + nickName, harvestingConfigAsJson(managedHarvestingClient));
+ return created( "/harvest/clients/" + nickName, harvestingConfigAsJson(managedHarvestingClient));
} catch (JsonParseException ex) {
- return errorResponse( Response.Status.BAD_REQUEST, "Error parsing harvesting client: " + ex.getMessage() );
+ return error( Response.Status.BAD_REQUEST, "Error parsing harvesting client: " + ex.getMessage() );
} catch (WrappedResponse ex) {
return ex.getResponse();
@@ -237,24 +237,24 @@ public Response startHarvestingJob(@PathParam("nickName") String clientNickname,
try {
authenticatedUser = findAuthenticatedUserOrDie();
} catch (WrappedResponse wr) {
- return errorResponse(Response.Status.UNAUTHORIZED, "Authentication required to use this API method");
+ return error(Response.Status.UNAUTHORIZED, "Authentication required to use this API method");
}
if (authenticatedUser == null || !authenticatedUser.isSuperuser()) {
- return errorResponse(Response.Status.FORBIDDEN, "Only the Dataverse Admin user can run harvesting jobs");
+ return error(Response.Status.FORBIDDEN, "Only the Dataverse Admin user can run harvesting jobs");
}
HarvestingClient harvestingClient = harvestingClientService.findByNickname(clientNickname);
if (harvestingClient == null) {
- return errorResponse(Response.Status.NOT_FOUND, "No such dataverse: "+clientNickname);
+ return error(Response.Status.NOT_FOUND, "No such harvesting client: "+clientNickname);
}
DataverseRequest dataverseRequest = createDataverseRequest(authenticatedUser);
harvesterService.doAsyncHarvest(dataverseRequest, harvestingClient);
} catch (Exception e) {
- return this.errorResponse(Response.Status.BAD_REQUEST, "Exception thrown when running harvesting client\""+clientNickname+"\" via REST API; " + e.getMessage());
+ return this.error(Response.Status.BAD_REQUEST, "Exception thrown when running harvesting client \""+clientNickname+"\" via REST API; " + e.getMessage());
}
return this.accepted();
}
diff --git a/src/main/java/edu/harvard/iq/dataverse/api/HarvestingServer.java b/src/main/java/edu/harvard/iq/dataverse/api/HarvestingServer.java
index f54190597b2..6d7cf218e35 100644
--- a/src/main/java/edu/harvard/iq/dataverse/api/HarvestingServer.java
+++ b/src/main/java/edu/harvard/iq/dataverse/api/HarvestingServer.java
@@ -63,12 +63,12 @@ public Response oaiSets(@QueryParam("key") String apiKey) throws IOException {
try {
oaiSets = oaiSetService.findAll();
} catch (Exception ex) {
- return errorResponse(Response.Status.INTERNAL_SERVER_ERROR, "Caught an exception looking up available OAI sets; " + ex.getMessage());
+ return error(Response.Status.INTERNAL_SERVER_ERROR, "Caught an exception looking up available OAI sets; " + ex.getMessage());
}
if (oaiSets == null) {
// returning an empty list:
- return okResponse(jsonObjectBuilder().add("oaisets", ""));
+ return ok(jsonObjectBuilder().add("oaisets", ""));
}
JsonArrayBuilder hcArr = Json.createArrayBuilder();
@@ -77,7 +77,7 @@ public Response oaiSets(@QueryParam("key") String apiKey) throws IOException {
hcArr.add(oaiSetAsJson(set));
}
- return okResponse(jsonObjectBuilder().add("oaisets", hcArr));
+ return ok(jsonObjectBuilder().add("oaisets", hcArr));
}
@GET
@@ -89,18 +89,18 @@ public Response oaiSet(@PathParam("specname") String spec, @QueryParam("key") St
set = oaiSetService.findBySpec(spec);
} catch (Exception ex) {
logger.warning("Exception caught looking up OAI set " + spec + ": " + ex.getMessage());
- return errorResponse( Response.Status.BAD_REQUEST, "Internal error: failed to look up OAI set " + spec + ".");
+ return error( Response.Status.BAD_REQUEST, "Internal error: failed to look up OAI set " + spec + ".");
}
if (set == null) {
- return errorResponse(Response.Status.NOT_FOUND, "OAI set " + spec + " not found.");
+ return error(Response.Status.NOT_FOUND, "OAI set " + spec + " not found.");
}
try {
- return okResponse(oaiSetAsJson(set));
+ return ok(oaiSetAsJson(set));
} catch (Exception ex) {
logger.warning("Unknown exception caught while trying to format OAI set " + spec + " as json: "+ex.getMessage());
- return errorResponse( Response.Status.BAD_REQUEST,
+ return error( Response.Status.BAD_REQUEST,
"Internal error: failed to produce output for OAI set " + spec + ".");
}
}
@@ -120,7 +120,7 @@ public Response createOaiSet(String jsonBody, @PathParam("specname") String spec
oaiSetService.save(set);
- return createdResponse( "/harvest/server/oaisets" + spec, oaiSetAsJson(set));
+ return created( "/harvest/server/oaisets/" + spec, oaiSetAsJson(set));
//} catch (JsonParseException ex) {
// return errorResponse( Response.Status.BAD_REQUEST, "Error parsing OAI set: " + ex.getMessage() );
@@ -135,7 +135,7 @@ public Response createOaiSet(String jsonBody, @PathParam("specname") String spec
public Response modifyOaiSet(String jsonBody, @PathParam("specname") String spec, @QueryParam("key") String apiKey) throws IOException, JsonParseException {
// TODO:
// ...
- return createdResponse("/harvest/server/oaisets" + spec, null);
+ return created("/harvest/server/oaisets/" + spec, null);
}
@DELETE
@@ -146,21 +146,21 @@ public Response deleteOaiSet(@PathParam("specname") String spec, @QueryParam("ke
set = oaiSetService.findBySpec(spec);
} catch (Exception ex) {
logger.warning("Exception caught looking up OAI set " + spec + ": " + ex.getMessage());
- return errorResponse( Response.Status.BAD_REQUEST, "Internal error: failed to look up OAI set " + spec + ".");
+ return error( Response.Status.BAD_REQUEST, "Internal error: failed to look up OAI set " + spec + ".");
}
if (set == null) {
- return errorResponse(Response.Status.NOT_FOUND, "OAI set " + spec + " not found.");
+ return error(Response.Status.NOT_FOUND, "OAI set " + spec + " not found.");
}
try {
oaiSetService.setDeleteInProgress(set.getId());
oaiSetService.remove(set.getId());
} catch (Exception ex) {
- return errorResponse( Response.Status.BAD_REQUEST, "Internal error: failed to delete OAI set " + spec + "; " + ex.getMessage());
+ return error( Response.Status.BAD_REQUEST, "Internal error: failed to delete OAI set " + spec + "; " + ex.getMessage());
}
- return okResponse("OAI Set " + spec + " deleted");
+ return ok("OAI Set " + spec + " deleted");
}
@@ -172,10 +172,10 @@ public Response oaiSetListDatasets(@PathParam("specname") String spec, @QueryPar
set = oaiSetService.findBySpec(spec);
} catch (Exception ex) {
logger.warning("Exception caught looking up OAI set " + spec + ": " + ex.getMessage());
- return errorResponse( Response.Status.BAD_REQUEST, "Internal error: failed to look up OAI set " + spec + ".");
+ return error( Response.Status.BAD_REQUEST, "Internal error: failed to look up OAI set " + spec + ".");
}
- return okResponse("");
+ return ok("");
}
diff --git a/src/main/java/edu/harvard/iq/dataverse/api/Index.java b/src/main/java/edu/harvard/iq/dataverse/api/Index.java
index e217b146dd6..b9724a81ef3 100644
--- a/src/main/java/edu/harvard/iq/dataverse/api/Index.java
+++ b/src/main/java/edu/harvard/iq/dataverse/api/Index.java
@@ -112,7 +112,7 @@ private Response indexAllOrSubset(Long numPartitionsSelected, Long partitionIdTo
long numPartitions = 1;
if (numPartitionsSelected != null) {
if (numPartitionsSelected < 1) {
- return errorResponse(Status.BAD_REQUEST, "numPartitions must be 1 or higher but was " + numPartitionsSelected);
+ return error(Status.BAD_REQUEST, "numPartitions must be 1 or higher but was " + numPartitionsSelected);
} else {
numPartitions = numPartitionsSelected;
}
@@ -122,7 +122,7 @@ private Response indexAllOrSubset(Long numPartitionsSelected, Long partitionIdTo
availablePartitionIds.add(i);
}
- Response invalidParitionIdSelection = errorResponse(Status.BAD_REQUEST, "You specified " + numPartitions + " partition(s) and your selected partitionId was " + partitionIdToProcess + " but you must select from these availableParitionIds: " + availablePartitionIds);
+ Response invalidParitionIdSelection = error(Status.BAD_REQUEST, "You specified " + numPartitions + " partition(s) and your selected partitionId was " + partitionIdToProcess + " but you must select from these availablePartitionIds: " + availablePartitionIds);
if (partitionIdToProcess != null) {
long selected = partitionIdToProcess;
if (!availablePartitionIds.contains(selected)) {
@@ -151,7 +151,7 @@ private Response indexAllOrSubset(Long numPartitionsSelected, Long partitionIdTo
if (previewOnly) {
preview.add("args", args);
preview.add("availablePartitionIds", availablePartitionIdsBuilder);
- return okResponse(preview);
+ return ok(preview);
}
JsonObjectBuilder response = Json.createObjectBuilder();
@@ -167,7 +167,7 @@ private Response indexAllOrSubset(Long numPartitionsSelected, Long partitionIdTo
int datasetCount = workloadPreview.getInt("datasetCount");
String status = "indexAllOrSubset has begun of " + dataverseCount + " dataverses and " + datasetCount + " datasets.";
response.add("message", status);
- return okResponse(response);
+ return ok(response);
} catch (EJBException ex) {
Throwable cause = ex;
StringBuilder sb = new StringBuilder();
@@ -195,9 +195,9 @@ private Response indexAllOrSubset(Long numPartitionsSelected, Long partitionIdTo
}
}
if (sb.toString().equals("javax.ejb.EJBException: Transaction aborted javax.transaction.RollbackException java.lang.IllegalStateException ")) {
- return okResponse("indexing went as well as can be expected... got java.lang.IllegalStateException but some indexing may have happened anyway");
+ return ok("indexing went as well as can be expected... got java.lang.IllegalStateException but some indexing may have happened anyway");
} else {
- return errorResponse(Status.INTERNAL_SERVER_ERROR, sb.toString());
+ return error(Status.INTERNAL_SERVER_ERROR, sb.toString());
}
}
}
@@ -207,9 +207,9 @@ private Response indexAllOrSubset(Long numPartitionsSelected, Long partitionIdTo
public Response clearSolrIndex() {
try {
JsonObjectBuilder response = SolrIndexService.deleteAllFromSolrAndResetIndexTimes();
- return okResponse(response);
+ return ok(response);
} catch (SolrServerException | IOException ex) {
- return errorResponse(Status.INTERNAL_SERVER_ERROR, ex.getLocalizedMessage());
+ return error(Status.INTERNAL_SERVER_ERROR, ex.getLocalizedMessage());
}
}
@@ -224,7 +224,7 @@ public Response indexTypeById(@PathParam("type") String type, @PathParam("id") L
* @todo Can we display the result of indexing to the user?
*/
Future indexDataverseFuture = indexService.indexDataverse(dataverse);
- return okResponse("starting reindex of dataverse " + id);
+ return ok("starting reindex of dataverse " + id);
} else {
String response = indexService.removeSolrDocFromIndex(IndexServiceBean.solrDocIdentifierDataverse + id);
return notFound("Could not find dataverse with id of " + id + ". Result from deletion attempt: " + response);
@@ -234,7 +234,7 @@ public Response indexTypeById(@PathParam("type") String type, @PathParam("id") L
if (dataset != null) {
boolean doNormalSolrDocCleanUp = true;
Future indexDatasetFuture = indexService.indexDataset(dataset, doNormalSolrDocCleanUp);
- return okResponse("starting reindex of dataset " + id);
+ return ok("starting reindex of dataset " + id);
} else {
/**
* @todo what about published, deaccessioned, etc.? Need
@@ -251,9 +251,9 @@ public Response indexTypeById(@PathParam("type") String type, @PathParam("id") L
*/
boolean doNormalSolrDocCleanUp = true;
Future indexDatasetFuture = indexService.indexDataset(datasetThatOwnsTheFile, doNormalSolrDocCleanUp);
- return okResponse("started reindexing " + type + "/" + id);
+ return ok("started reindexing " + type + "/" + id);
} else {
- return errorResponse(Status.BAD_REQUEST, "illegal type: " + type);
+ return error(Status.BAD_REQUEST, "illegal type: " + type);
}
} catch (EJBException ex) {
Throwable cause = ex;
@@ -282,7 +282,7 @@ public Response indexTypeById(@PathParam("type") String type, @PathParam("id") L
}
}
}
- return errorResponse(Status.INTERNAL_SERVER_ERROR, sb.toString());
+ return error(Status.INTERNAL_SERVER_ERROR, sb.toString());
}
}
@@ -290,13 +290,13 @@ public Response indexTypeById(@PathParam("type") String type, @PathParam("id") L
@Path("dataset")
public Response indexDatasetByPersistentId(@QueryParam("persistentId") String persistentId) {
if (persistentId == null) {
- return errorResponse(Status.BAD_REQUEST, "No persistent id given.");
+ return error(Status.BAD_REQUEST, "No persistent id given.");
}
Dataset dataset = null;
try {
dataset = datasetService.findByGlobalId(persistentId);
} catch (Exception ex) {
- return errorResponse(Status.BAD_REQUEST, "Problem looking up dataset with persistent id \"" + persistentId + "\". Error: " + ex.getMessage());
+ return error(Status.BAD_REQUEST, "Problem looking up dataset with persistent id \"" + persistentId + "\". Error: " + ex.getMessage());
}
if (dataset != null) {
boolean doNormalSolrDocCleanUp = true;
@@ -313,9 +313,9 @@ public Response indexDatasetByPersistentId(@QueryParam("persistentId") String pe
versions.add(versionObject);
}
data.add("versions", versions);
- return okResponse(data);
+ return ok(data);
} else {
- return errorResponse(Status.BAD_REQUEST, "Could not find dataset with persistent id " + persistentId);
+ return error(Status.BAD_REQUEST, "Could not find dataset with persistent id " + persistentId);
}
}
@@ -335,14 +335,14 @@ public Response indexMod(@QueryParam("partitions") long partitions, @QueryParam(
response.add("partitions", partitions);
response.add("which", which);
response.add("mine", mine.toString());
- return okResponse(response);
+ return ok(response);
}
@GET
@Path("perms")
public Response indexAllPermissions() {
IndexResponse indexResponse = solrIndexService.indexAllPermissions();
- return okResponse(indexResponse.getMessage());
+ return ok(indexResponse.getMessage());
}
@GET
@@ -350,10 +350,10 @@ public Response indexAllPermissions() {
public Response indexPermissions(@PathParam("id") Long id) {
DvObject dvObject = dvObjectService.findDvObject(id);
if (dvObject == null) {
- return errorResponse(Status.BAD_REQUEST, "Could not find DvObject based on id " + id);
+ return error(Status.BAD_REQUEST, "Could not find DvObject based on id " + id);
} else {
IndexResponse indexResponse = solrIndexService.indexPermissionsForOneDvObject(dvObject);
- return okResponse(indexResponse.getMessage());
+ return ok(indexResponse.getMessage());
}
}
@@ -366,7 +366,7 @@ public Response indexStatus() {
try {
contentInSolrButNotDatabase = getContentInSolrButNotDatabase();
} catch (SearchException ex) {
- return errorResponse(Response.Status.INTERNAL_SERVER_ERROR, "Can not determine index status. " + ex.getLocalizedMessage() + ". Is Solr down? Exception: " + ex.getCause().getLocalizedMessage());
+ return error(Response.Status.INTERNAL_SERVER_ERROR, "Can not determine index status. " + ex.getLocalizedMessage() + ". Is Solr down? Exception: " + ex.getCause().getLocalizedMessage());
}
JsonObjectBuilder permissionsInDatabaseButStaleInOrMissingFromSolr = getPermissionsInDatabaseButStaleInOrMissingFromSolr();
@@ -378,7 +378,7 @@ public Response indexStatus() {
.add("permissionsInDatabaseButStaleInOrMissingFromIndex", permissionsInDatabaseButStaleInOrMissingFromSolr)
.add("permissionsInIndexButNotDatabase", permissionsInSolrButNotDatabase);
- return okResponse(data);
+ return ok(data);
}
private JsonObjectBuilder getContentInDatabaseButStaleInOrMissingFromSolr() {
@@ -560,7 +560,7 @@ public Response searchDebug(
User user = findUserByApiToken(apiToken);
if (user == null) {
- return errorResponse(Response.Status.UNAUTHORIZED, "Invalid apikey '" + apiToken + "'");
+ return error(Response.Status.UNAUTHORIZED, "Invalid apikey '" + apiToken + "'");
}
Dataverse subtreeScope = dataverseService.findRootDataverse();
@@ -574,7 +574,7 @@ public Response searchDebug(
try {
solrQueryResponse = searchService.search(createDataverseRequest(user), subtreeScope, query, filterQueries, sortField, sortOrder, paginationStart, dataRelatedToMe, numResultsPerPage);
} catch (SearchException ex) {
- return errorResponse(Response.Status.INTERNAL_SERVER_ERROR, ex.getLocalizedMessage() + ": " + ex.getCause().getLocalizedMessage());
+ return error(Response.Status.INTERNAL_SERVER_ERROR, ex.getLocalizedMessage() + ": " + ex.getCause().getLocalizedMessage());
}
JsonArrayBuilder itemsArrayBuilder = Json.createArrayBuilder();
@@ -583,7 +583,7 @@ public Response searchDebug(
itemsArrayBuilder.add(solrSearchResult.getType() + ":" + solrSearchResult.getNameSort());
}
- return okResponse(itemsArrayBuilder);
+ return ok(itemsArrayBuilder);
}
/**
@@ -597,12 +597,12 @@ public Response searchPermsDebug(
User user = findUserByApiToken(apiToken);
if (user == null) {
- return errorResponse(Response.Status.UNAUTHORIZED, "Invalid apikey '" + apiToken + "'");
+ return error(Response.Status.UNAUTHORIZED, "Invalid apikey '" + apiToken + "'");
}
DvObject dvObjectToLookUp = dvObjectService.findDvObject(dvObjectId);
if (dvObjectToLookUp == null) {
- return errorResponse(Status.BAD_REQUEST, "Could not find DvObject based on id " + dvObjectId);
+ return error(Status.BAD_REQUEST, "Could not find DvObject based on id " + dvObjectId);
}
List solrDocs = SolrIndexService.determineSolrDocs(dvObjectToLookUp);
@@ -637,21 +637,21 @@ public Response searchPermsDebug(
data.add("timestamps", timestamps);
data.add("roleAssignments", roleAssignmentsData);
- return okResponse(data);
+ return ok(data);
}
@DELETE
@Path("timestamps")
public Response deleteAllTimestamps() {
int numItemsCleared = dvObjectService.clearAllIndexTimes();
- return okResponse("cleared: " + numItemsCleared);
+ return ok("cleared: " + numItemsCleared);
}
@DELETE
@Path("timestamps/{dvObjectId}")
public Response deleteTimestamp(@PathParam("dvObjectId") long dvObjectId) {
int numItemsCleared = dvObjectService.clearIndexTimes(dvObjectId);
- return okResponse("cleared: " + numItemsCleared);
+ return ok("cleared: " + numItemsCleared);
}
@GET
@@ -659,7 +659,7 @@ public Response deleteTimestamp(@PathParam("dvObjectId") long dvObjectId) {
public Response filesearch(@QueryParam("persistentId") String persistentId, @QueryParam("semanticVersion") String semanticVersion, @QueryParam("q") String userSuppliedQuery) {
Dataset dataset = datasetService.findByGlobalId(persistentId);
if (dataset == null) {
- return errorResponse(Status.BAD_REQUEST, "Could not find dataset with persistent id " + persistentId);
+ return error(Status.BAD_REQUEST, "Could not find dataset with persistent id " + persistentId);
}
User user = GuestUser.get();
try {
@@ -671,12 +671,12 @@ public Response filesearch(@QueryParam("persistentId") String persistentId, @Que
}
RetrieveDatasetVersionResponse datasetVersionResponse = datasetVersionService.retrieveDatasetVersionByPersistentId(persistentId, semanticVersion);
if (datasetVersionResponse == null) {
- return errorResponse(Status.BAD_REQUEST, "Problem searching for files. Could not find dataset version based on " + persistentId + " and " + semanticVersion);
+ return error(Status.BAD_REQUEST, "Problem searching for files. Could not find dataset version based on " + persistentId + " and " + semanticVersion);
}
DatasetVersion datasetVersion = datasetVersionResponse.getDatasetVersion();
FileView fileView = searchFilesService.getFileView(datasetVersion, user, userSuppliedQuery);
if (fileView == null) {
- return errorResponse(Status.BAD_REQUEST, "Problem searching for files. Null returned from getFileView.");
+ return error(Status.BAD_REQUEST, "Problem searching for files. Null returned from getFileView.");
}
JsonArrayBuilder filesFound = Json.createArrayBuilder();
JsonArrayBuilder cards = Json.createArrayBuilder();
@@ -714,7 +714,7 @@ public Response filesearch(@QueryParam("persistentId") String persistentId, @Que
data.add("filterQueries", filterQueries);
data.add("allDataverVersionIds", allDatasetVersionIds);
data.add("semanticVersion", datasetVersion.getSemanticVersion());
- return okResponse(data);
+ return ok(data);
}
@GET
@@ -730,12 +730,12 @@ public Response getFileMetadataByDatasetId(
try {
fileMetadatasFound = dataFileService.findFileMetadataByDatasetVersionId(datasetIdToLookUp, maxResults, sortField, sortOrder);
} catch (Exception ex) {
- return errorResponse(Status.BAD_REQUEST, "error: " + ex.getCause().getMessage() + ex);
+ return error(Status.BAD_REQUEST, "error: " + ex.getCause().getMessage() + ex);
}
for (FileMetadata fileMetadata : fileMetadatasFound) {
data.add(fileMetadata.getLabel());
}
- return okResponse(data);
+ return ok(data);
}
}
diff --git a/src/main/java/edu/harvard/iq/dataverse/api/Info.java b/src/main/java/edu/harvard/iq/dataverse/api/Info.java
index 24122b7c28d..044860032f7 100644
--- a/src/main/java/edu/harvard/iq/dataverse/api/Info.java
+++ b/src/main/java/edu/harvard/iq/dataverse/api/Info.java
@@ -1,8 +1,10 @@
package edu.harvard.iq.dataverse.api;
import edu.harvard.iq.dataverse.settings.SettingsServiceBean;
+import edu.harvard.iq.dataverse.util.SystemConfig;
import javax.ejb.EJB;
import javax.json.Json;
+import javax.json.JsonValue;
import javax.ws.rs.GET;
import javax.ws.rs.Path;
import javax.ws.rs.core.Response;
@@ -12,15 +14,36 @@ public class Info extends AbstractApiBean {
@EJB
SettingsServiceBean settingsService;
+
+ @EJB
+ SystemConfig systemConfig;
@GET
@Path("settings/:DatasetPublishPopupCustomText")
public Response getDatasetPublishPopupCustomText() {
String setting = settingsService.getValueForKey(SettingsServiceBean.Key.DatasetPublishPopupCustomText);
if (setting != null) {
- return okResponse(Json.createObjectBuilder().add("message", setting));
+ return ok(Json.createObjectBuilder().add("message", setting));
} else {
return notFound("Setting " + SettingsServiceBean.Key.DatasetPublishPopupCustomText + " not found");
}
}
+
+ @GET
+ @Path("version")
+ public Response getInfo() {
+ String versionStr = systemConfig.getVersion(true);
+ String[] comps = versionStr.split("build",2);
+ String version = comps[0].trim();
+ JsonValue build = comps.length > 1 ? Json.createArrayBuilder().add(comps[1].trim()).build().get(0) : JsonValue.NULL;
+
+ return response( req -> ok( Json.createObjectBuilder().add("version", version)
+ .add("build", build)));
+ }
+
+ @GET
+ @Path("server")
+ public Response getServer() {
+ return response( req -> ok(systemConfig.getDataverseServer()));
+ }
}
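A hedged sketch of the version-string parsing the new `/api/info/version` endpoint performs above: `SystemConfig.getVersion(true)` is assumed to return a string like `"4.6.1 build develop-abc123"` (the exact format is not shown in this diff); the endpoint splits on the literal `"build"` and trims both halves, with a null build when no build suffix is present.

```java
// Illustrative stand-in for the split logic in Info.getInfo(); the input
// format "X.Y.Z build <tag>" is an assumption, not confirmed by this diff.
public class VersionSplitSketch {
    static String[] split(String versionStr) {
        String[] comps = versionStr.split("build", 2);
        String version = comps[0].trim();
        // Mirrors the JsonValue.NULL branch: no "build" token means no build info.
        String build = comps.length > 1 ? comps[1].trim() : null;
        return new String[] { version, build };
    }

    public static void main(String[] args) {
        String[] withBuild = split("4.6.1 build develop-abc123");
        System.out.println(withBuild[0] + " / " + withBuild[1]);
        String[] noBuild = split("4.6.1");
        System.out.println(noBuild[0] + " / " + noBuild[1]);
    }
}
```

The endpoint itself wraps these two values in a JSON object via `Json.createObjectBuilder()`; that envelope is omitted here to keep the sketch dependency-free.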
diff --git a/src/main/java/edu/harvard/iq/dataverse/api/Mail.java b/src/main/java/edu/harvard/iq/dataverse/api/Mail.java
index 4f135af1054..3b5050b480b 100644
--- a/src/main/java/edu/harvard/iq/dataverse/api/Mail.java
+++ b/src/main/java/edu/harvard/iq/dataverse/api/Mail.java
@@ -24,7 +24,7 @@ public Response sendMail() {
ActionLogRecord alr = new ActionLogRecord(ActionLogRecord.ActionType.Admin, "sendMail");
// mailService.bulkSendNotifications();
actionLogSvc.log(alr);
- return okResponse("bulk send notification is deprecated");
+ return ok("bulk send notification is deprecated");
}
}
diff --git a/src/main/java/edu/harvard/iq/dataverse/api/MetadataBlocks.java b/src/main/java/edu/harvard/iq/dataverse/api/MetadataBlocks.java
index e0376f71412..26564eaef56 100644
--- a/src/main/java/edu/harvard/iq/dataverse/api/MetadataBlocks.java
+++ b/src/main/java/edu/harvard/iq/dataverse/api/MetadataBlocks.java
@@ -5,14 +5,10 @@
import javax.ws.rs.Path;
import javax.ws.rs.Produces;
import javax.ws.rs.core.Response;
-import javax.json.Json;
-import javax.json.JsonArrayBuilder;
import static edu.harvard.iq.dataverse.util.json.JsonPrinter.brief;
-import static edu.harvard.iq.dataverse.util.json.JsonPrinter.json;
import javax.ws.rs.PathParam;
import static edu.harvard.iq.dataverse.util.json.JsonPrinter.json;
-import static edu.harvard.iq.dataverse.util.json.JsonPrinter.json;
-import static edu.harvard.iq.dataverse.util.json.JsonPrinter.json;
+import static edu.harvard.iq.dataverse.util.json.JsonPrinter.toJsonArray;
/**
* Api bean for managing metadata blocks.
@@ -24,12 +20,7 @@ public class MetadataBlocks extends AbstractApiBean {
@GET
public Response list() {
- JsonArrayBuilder bld = Json.createArrayBuilder();
- for ( MetadataBlock block : metadataBlockSvc.listMetadataBlocks() ) {
- bld.add( brief.json(block) );
- }
-
- return okResponse(bld);
+ return ok(metadataBlockSvc.listMetadataBlocks().stream().map(brief::json).collect(toJsonArray()));
}
@Path("{identifier}")
@@ -37,7 +28,7 @@ public Response list() {
public Response getBlock( @PathParam("identifier") String idtf ) {
MetadataBlock b = findMetadataBlock(idtf);
- return (b != null ) ? okResponse(json(b)) : notFound("Can't find metadata block '" + idtf + "'");
+ return (b != null ) ? ok(json(b)) : notFound("Can't find metadata block '" + idtf + "'");
}
}
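The `MetadataBlocks.list()` rewrite above replaces a manual builder loop with a one-liner: stream the blocks, map each to its brief JSON form, and collect into one array with `JsonPrinter.toJsonArray()`. A minimal stdlib analog of that custom collector, with `List<String>` standing in for `JsonArrayBuilder` (which would pull in `javax.json`):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.stream.Collector;
import java.util.stream.Stream;

// Stdlib sketch of the toJsonArray() collector pattern: Collector.of takes
// a container supplier, a per-element accumulator, and a merge function for
// parallel streams. The block names and JSON shape below are illustrative.
public class ToArraySketch {
    static <T> Collector<T, List<T>, List<T>> toArray() {
        return Collector.of(ArrayList::new, List::add,
                (left, right) -> { left.addAll(right); return left; });
    }

    public static void main(String[] args) {
        List<String> blocks = Stream.of("citation", "geospatial", "socialscience")
                .map(name -> "{\"name\":\"" + name + "\"}") // stand-in for brief::json
                .collect(toArray());
        System.out.println(blocks.size());
        System.out.println(blocks.get(0));
    }
}
```

The real collector accumulates into a `JsonArrayBuilder` via `add(...)` instead of a `List`, but the `Collector.of` shape is the same.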
diff --git a/src/main/java/edu/harvard/iq/dataverse/api/Roles.java b/src/main/java/edu/harvard/iq/dataverse/api/Roles.java
index d573274998b..b3f75e00c5a 100644
--- a/src/main/java/edu/harvard/iq/dataverse/api/Roles.java
+++ b/src/main/java/edu/harvard/iq/dataverse/api/Roles.java
@@ -1,17 +1,13 @@
package edu.harvard.iq.dataverse.api;
import edu.harvard.iq.dataverse.api.dto.RoleDTO;
-import edu.harvard.iq.dataverse.Dataverse;
import edu.harvard.iq.dataverse.authorization.DataverseRole;
-import edu.harvard.iq.dataverse.DataverseServiceBean;
import edu.harvard.iq.dataverse.authorization.Permission;
-import edu.harvard.iq.dataverse.authorization.providers.builtin.BuiltinUserServiceBean;
-import javax.ejb.EJB;
+import edu.harvard.iq.dataverse.authorization.users.User;
import javax.ws.rs.GET;
import javax.ws.rs.POST;
import javax.ws.rs.Path;
import javax.ws.rs.PathParam;
-
import static edu.harvard.iq.dataverse.util.json.JsonPrinter.*;
import edu.harvard.iq.dataverse.engine.command.impl.CreateRoleCommand;
import edu.harvard.iq.dataverse.engine.command.impl.DeleteRoleCommand;
@@ -19,7 +15,6 @@
import javax.ws.rs.DELETE;
import javax.ws.rs.QueryParam;
import javax.ws.rs.core.Response;
-import javax.ws.rs.core.Response.Status;
/**
* Util API for managing roles. Might not make it to the production version.
@@ -29,58 +24,39 @@
@Path("roles")
public class Roles extends AbstractApiBean {
- @EJB
- BuiltinUserServiceBean usersSvc;
-
- @EJB
- DataverseServiceBean dvSvc;
-
@GET
@Path("{id}")
public Response viewRole( @PathParam("id") Long id) {
- try {
- DataverseRole role = rolesSvc.find(id);
- if ( role == null ) {
- return notFound("role with id " + id + " not found");
- } else {
- return ( permissionSvc.userOn(findUserOrDie(), role.getOwner()).has(Permission.ManageDataversePermissions) )
- ? okResponse( json(role) )
- : errorResponse(Status.UNAUTHORIZED, "");
- }
- } catch (WrappedResponse ex) {
- return ex.getResponse();
- }
+ return response( ()-> {
+ final User user = findUserOrDie();
+ final DataverseRole role = findRoleOrDie(id);
+ return ( permissionSvc.userOn(user, role.getOwner()).has(Permission.ManageDataversePermissions) )
+ ? ok( json(role) ) : permissionError("Permission required to view roles.");
+ });
}
@DELETE
@Path("{id}")
public Response deleteRole( @PathParam("id") Long id ) {
- DataverseRole role = rolesSvc.find(id);
- if ( role == null ) {
- return notFound( "role with id " + id + " not found");
- } else {
- try {
- execCommand( new DeleteRoleCommand(createDataverseRequest(findUserOrDie()), role) );
- return okResponse("role " + id + " deleted.");
-
- } catch (WrappedResponse ex) {
- return ex.refineResponse( "Cannot delete role " + id + "." );
- }
- }
+ return response( req -> {
+ execCommand( new DeleteRoleCommand(req, findRoleOrDie(id)) );
+ return ok("role " + id + " deleted.");
+ });
}
@POST
public Response createNewRole( RoleDTO roleDto,
- @QueryParam("dvo") String dvoIdtf ) {
-
- Dataverse d = findDataverse(dvoIdtf);
- if ( d == null ) return errorResponse( Status.BAD_REQUEST, "no dataverse with id " + dvoIdtf );
-
- try {
- return okResponse(json(execCommand(new CreateRoleCommand(roleDto.asRole(), createDataverseRequest(findUserOrDie()), d))));
- } catch ( WrappedResponse ce ) {
- return ce.refineResponse("Role creation failed.");
- }
+ @QueryParam("dvo") String dvoIdtf ) {
+ return response( req -> ok(json(execCommand(
+ new CreateRoleCommand(roleDto.asRole(),
+ req, findDataverseOrDie(dvoIdtf))))));
}
-
+
+ private DataverseRole findRoleOrDie( long id ) throws WrappedResponse {
+ DataverseRole role = rolesSvc.find(id);
+ if ( role != null ) {
+ return role;
+ }
+ throw new WrappedResponse(notFound( "role with id " + id + " not found"));
+ }
}
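The Roles rewrite above collapses per-endpoint `try/catch (WrappedResponse)` blocks into a shared `response(lambda)` helper: the endpoint body may throw a wrapped reply (as `findRoleOrDie` does for a missing role), and the helper converts it into the HTTP response. A self-contained sketch of that pattern, using stand-in types (`Wrapped`, `Handler`) rather than the actual Dataverse classes:

```java
// Illustrative only: String stands in for javax.ws.rs.core.Response, and
// the helper/exception names are hypothetical, not the AbstractApiBean API.
public class ResponseHelperSketch {
    static class Wrapped extends Exception {
        final String reply;
        Wrapped(String reply) { this.reply = reply; }
    }

    interface Handler { String handle() throws Wrapped; }

    // One catch site for every endpoint, instead of try/catch per method.
    static String response(Handler h) {
        try { return h.handle(); }
        catch (Wrapped w) { return w.reply; }
    }

    static String findRoleOrDie(long id) throws Wrapped {
        if (id == 42) return "admin";  // pretend only role 42 exists
        throw new Wrapped("404 role with id " + id + " not found");
    }

    public static void main(String[] args) {
        System.out.println(response(() -> "200 " + findRoleOrDie(42)));
        System.out.println(response(() -> "200 " + findRoleOrDie(9)));
    }
}
```

The payoff is visible in `deleteRole` and `createNewRole` above: each shrinks to a single expression because the lookup failures escape as exceptions rather than early returns.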
diff --git a/src/main/java/edu/harvard/iq/dataverse/api/SavedSearches.java b/src/main/java/edu/harvard/iq/dataverse/api/SavedSearches.java
index 2f093392f61..7ead0d23711 100644
--- a/src/main/java/edu/harvard/iq/dataverse/api/SavedSearches.java
+++ b/src/main/java/edu/harvard/iq/dataverse/api/SavedSearches.java
@@ -42,7 +42,7 @@ public Response meta() {
endpoints.add("GET /id");
endpoints.add("POST");
endpoints.add("DELETE /id");
- return okResponse(endpoints);
+ return ok(endpoints);
}
@GET
@@ -56,7 +56,7 @@ public Response list() {
}
JsonObjectBuilder response = Json.createObjectBuilder();
response.add("saved searches", savedSearchesBuilder);
- return okResponse(response);
+ return ok(response);
}
@GET
@@ -65,9 +65,9 @@ public Response show(@PathParam("id") long id) {
SavedSearch savedSearch = savedSearchSvc.find(id);
if (savedSearch != null) {
JsonObjectBuilder response = toJson(savedSearch);
- return okResponse(response);
+ return ok(response);
} else {
- return errorResponse(NOT_FOUND, "Could not find saved search id " + id);
+ return error(NOT_FOUND, "Could not find saved search id " + id);
}
}
@@ -92,7 +92,7 @@ private JsonObjectBuilder toJson(SavedSearch savedSearch) {
public Response add(JsonObject body) {
if (body == null) {
- return errorResponse(BAD_REQUEST, "JSON is expected.");
+ return error(BAD_REQUEST, "JSON is expected.");
}
String keyForAuthenticatedUserId = "creatorId";
@@ -100,16 +100,16 @@ public Response add(JsonObject body) {
try {
creatorIdToLookUp = body.getInt(keyForAuthenticatedUserId);
} catch (NullPointerException ex) {
- return errorResponse(BAD_REQUEST, "Required field missing: " + keyForAuthenticatedUserId);
+ return error(BAD_REQUEST, "Required field missing: " + keyForAuthenticatedUserId);
} catch (ClassCastException ex) {
- return errorResponse(BAD_REQUEST, "A number is required for " + keyForAuthenticatedUserId);
+ return error(BAD_REQUEST, "A number is required for " + keyForAuthenticatedUserId);
} catch (Exception ex) {
- return errorResponse(BAD_REQUEST, "Problem with " + keyForAuthenticatedUserId + ": " + ex);
+ return error(BAD_REQUEST, "Problem with " + keyForAuthenticatedUserId + ": " + ex);
}
AuthenticatedUser creator = authSvc.findByID(creatorIdToLookUp);
if (creator == null) {
- return errorResponse(Response.Status.NOT_FOUND, "Could not find user based on " + keyForAuthenticatedUserId + ": " + creatorIdToLookUp);
+ return error(Response.Status.NOT_FOUND, "Could not find user based on " + keyForAuthenticatedUserId + ": " + creatorIdToLookUp);
}
String keyForQuery = "query";
@@ -117,7 +117,7 @@ public Response add(JsonObject body) {
try {
query = body.getString(keyForQuery);
} catch (NullPointerException ex) {
- return errorResponse(BAD_REQUEST, "Required field missing: " + keyForQuery);
+ return error(BAD_REQUEST, "Required field missing: " + keyForQuery);
}
String keyForDefinitionPointId = "definitionPointId";
@@ -125,15 +125,15 @@ public Response add(JsonObject body) {
try {
dataverseIdToLookup = body.getInt(keyForDefinitionPointId);
} catch (NullPointerException ex) {
- return errorResponse(BAD_REQUEST, "Required field missing: " + keyForDefinitionPointId);
+ return error(BAD_REQUEST, "Required field missing: " + keyForDefinitionPointId);
} catch (ClassCastException ex) {
- return errorResponse(BAD_REQUEST, "A number is required for " + keyForDefinitionPointId);
+ return error(BAD_REQUEST, "A number is required for " + keyForDefinitionPointId);
} catch (Exception ex) {
- return errorResponse(BAD_REQUEST, "Problem with " + keyForDefinitionPointId + ": " + ex);
+ return error(BAD_REQUEST, "Problem with " + keyForDefinitionPointId + ": " + ex);
}
Dataverse definitionPoint = dataverseSvc.find(dataverseIdToLookup);
if (definitionPoint == null) {
- return errorResponse(NOT_FOUND, "Could not find a dataverse based on id " + dataverseIdToLookup);
+ return error(NOT_FOUND, "Could not find a dataverse based on id " + dataverseIdToLookup);
}
SavedSearch toPersist = new SavedSearch(query, definitionPoint, creator);
@@ -150,7 +150,7 @@ public Response add(JsonObject body) {
} catch (NullPointerException ex) {
// filter queries are not required, keep going
} catch (Exception ex) {
- return errorResponse(BAD_REQUEST, "Problem getting filter queries: " + ex);
+ return error(BAD_REQUEST, "Problem getting filter queries: " + ex);
}
if (!savedSearchFilterQuerys.isEmpty()) {
@@ -159,7 +159,7 @@ public Response add(JsonObject body) {
try {
SavedSearch persistedSavedSearch = savedSearchSvc.add(toPersist);
- return okResponse("Added: " + persistedSavedSearch);
+ return ok("Added: " + persistedSavedSearch);
} catch (EJBException ex) {
StringBuilder errors = new StringBuilder();
Throwable throwable = ex.getCause();
@@ -167,7 +167,7 @@ public Response add(JsonObject body) {
errors.append(throwable).append(" ");
throwable = throwable.getCause();
}
- return errorResponse(BAD_REQUEST, "Problem adding saved search: " + errors);
+ return error(BAD_REQUEST, "Problem adding saved search: " + errors);
}
}
@@ -176,17 +176,17 @@ public Response add(JsonObject body) {
public Response delete(@PathParam("id") long doomedId) {
boolean disabled = true;
if (disabled) {
- return errorResponse(BAD_REQUEST, "Saved Searches can not safely be deleted because links can not safely be deleted. See https://github.com/IQSS/dataverse/issues/1364 for details.");
+ return error(BAD_REQUEST, "Saved Searches cannot safely be deleted because links cannot safely be deleted. See https://github.com/IQSS/dataverse/issues/1364 for details.");
}
SavedSearch doomed = savedSearchSvc.find(doomedId);
if (doomed == null) {
- return errorResponse(NOT_FOUND, "Could not find saved search id " + doomedId);
+ return error(NOT_FOUND, "Could not find saved search id " + doomedId);
}
boolean wasDeleted = savedSearchSvc.delete(doomedId);
if (wasDeleted) {
- return okResponse(Json.createObjectBuilder().add("Deleted", doomedId));
+ return ok(Json.createObjectBuilder().add("Deleted", doomedId));
} else {
- return errorResponse(INTERNAL_SERVER_ERROR, "Problem deleting id " + doomedId);
+ return error(INTERNAL_SERVER_ERROR, "Problem deleting id " + doomedId);
}
}
@@ -196,11 +196,11 @@ public Response makeLinksForAllSavedSearches(@QueryParam("debug") boolean debug)
JsonObjectBuilder makeLinksResponse;
try {
makeLinksResponse = savedSearchSvc.makeLinksForAllSavedSearches(debug);
- return okResponse(makeLinksResponse);
+ return ok(makeLinksResponse);
} catch (CommandException ex) {
- return errorResponse(BAD_REQUEST, ex.getLocalizedMessage());
+ return error(BAD_REQUEST, ex.getLocalizedMessage());
} catch (SearchException ex) {
- return errorResponse(INTERNAL_SERVER_ERROR, ex.getLocalizedMessage());
+ return error(INTERNAL_SERVER_ERROR, ex.getLocalizedMessage());
}
}
@@ -209,16 +209,16 @@ public Response makeLinksForAllSavedSearches(@QueryParam("debug") boolean debug)
public Response makeLinksForSingleSavedSearch(@PathParam("id") long savedSearchIdToLookUp, @QueryParam("debug") boolean debug) {
SavedSearch savedSearchToMakeLinksFor = savedSearchSvc.find(savedSearchIdToLookUp);
if (savedSearchToMakeLinksFor == null) {
- return errorResponse(BAD_REQUEST, "Count not find saved search id " + savedSearchIdToLookUp);
+ return error(BAD_REQUEST, "Could not find saved search id " + savedSearchIdToLookUp);
}
try {
DataverseRequest dataverseRequest = new DataverseRequest(savedSearchToMakeLinksFor.getCreator(), SavedSearchServiceBean.getHttpServletRequest());
JsonObjectBuilder response = savedSearchSvc.makeLinksForSingleSavedSearch(dataverseRequest, savedSearchToMakeLinksFor, debug);
- return okResponse(response);
+ return ok(response);
} catch (CommandException ex) {
- return errorResponse(BAD_REQUEST, ex.getLocalizedMessage());
+ return error(BAD_REQUEST, ex.getLocalizedMessage());
} catch (SearchException ex) {
- return errorResponse(INTERNAL_SERVER_ERROR, ex.getLocalizedMessage());
+ return error(INTERNAL_SERVER_ERROR, ex.getLocalizedMessage());
}
}
diff --git a/src/main/java/edu/harvard/iq/dataverse/api/Search.java b/src/main/java/edu/harvard/iq/dataverse/api/Search.java
index 9b8dfeb2045..f7bd6672035 100644
--- a/src/main/java/edu/harvard/iq/dataverse/api/Search.java
+++ b/src/main/java/edu/harvard/iq/dataverse/api/Search.java
@@ -101,7 +101,7 @@ public Response search(
filterQueries.add(filterDownToSubtree);
}
} catch (Exception ex) {
- return errorResponse(Response.Status.BAD_REQUEST, ex.getLocalizedMessage());
+ return error(Response.Status.BAD_REQUEST, ex.getLocalizedMessage());
}
// users can't change these (yet anyway)
@@ -132,7 +132,7 @@ public Response search(
}
String message = "Exception running search for [" + query + "] with filterQueries " + filterQueries + " and paginationStart [" + paginationStart + "]: " + sb.toString();
logger.info(message);
- return errorResponse(Response.Status.INTERNAL_SERVER_ERROR, message);
+ return error(Response.Status.INTERNAL_SERVER_ERROR, message);
}
JsonArrayBuilder itemsArrayBuilder = Json.createArrayBuilder();
@@ -183,12 +183,12 @@ public Response search(
* @todo You get here if you pass only ":" as a query, for
* example. Should we return more or better information?
*/
- return errorResponse(Response.Status.BAD_REQUEST, solrQueryResponse.getError());
+ return error(Response.Status.BAD_REQUEST, solrQueryResponse.getError());
}
response.setHeader("Access-Control-Allow-Origin", "*");
- return okResponse(value);
+ return ok(value);
} else {
- return errorResponse(Response.Status.BAD_REQUEST, "q parameter is missing");
+ return error(Response.Status.BAD_REQUEST, "q parameter is missing");
}
}
diff --git a/src/main/java/edu/harvard/iq/dataverse/api/TestApi.java b/src/main/java/edu/harvard/iq/dataverse/api/TestApi.java
deleted file mode 100644
index 0cebd7dd716..00000000000
--- a/src/main/java/edu/harvard/iq/dataverse/api/TestApi.java
+++ /dev/null
@@ -1,147 +0,0 @@
-package edu.harvard.iq.dataverse.api;
-
-import edu.harvard.iq.dataverse.DvObject;
-import edu.harvard.iq.dataverse.authorization.RoleAssignee;
-import edu.harvard.iq.dataverse.authorization.groups.impl.explicit.ExplicitGroup;
-import edu.harvard.iq.dataverse.authorization.groups.impl.explicit.ExplicitGroupServiceBean;
-import edu.harvard.iq.dataverse.authorization.groups.impl.ipaddress.IpGroup;
-import edu.harvard.iq.dataverse.authorization.groups.impl.ipaddress.IpGroupsServiceBean;
-import edu.harvard.iq.dataverse.authorization.groups.impl.ipaddress.ip.IPv4Address;
-import edu.harvard.iq.dataverse.authorization.groups.impl.ipaddress.ip.IpAddress;
-import edu.harvard.iq.dataverse.authorization.providers.builtin.PasswordEncryption;
-import edu.harvard.iq.dataverse.authorization.users.User;
-import javax.ejb.Stateless;
-import javax.ws.rs.GET;
-import javax.ws.rs.Path;
-import javax.ws.rs.PathParam;
-import javax.ws.rs.core.Response;
-import static edu.harvard.iq.dataverse.util.json.JsonPrinter.*;
-import edu.harvard.iq.dataverse.util.json.NullSafeJsonBuilder;
-import java.util.Set;
-import java.util.logging.Level;
-import java.util.logging.Logger;
-import javax.ejb.EJB;
-import javax.json.Json;
-import javax.json.JsonObjectBuilder;
-import javax.ws.rs.QueryParam;
-import org.mindrot.jbcrypt.BCrypt;
-
-/**
- * An API to test internal models without the need to deal with UI etc.
- *
- * @todo Can this entire class be removed and its methods be moved to Admin.java
- * if they are still needed? Once this is done we can remove this warning:
- * "There is a “test” API endpoint used for development and troubleshooting that
- * has some potentially dangerous methods."
- * http://guides.dataverse.org/en/4.2.4/installation/config.html#blocking-api-endpoints
- *
- * @author michael
- */
-@Stateless
-@Path("test")
-public class TestApi extends AbstractApiBean {
- private static final Logger logger = Logger.getLogger(TestApi.class.getName());
-
- @EJB
- ExplicitGroupServiceBean explicitGroups;
-
- @EJB
- IpGroupsServiceBean ipGroupsSvc;
-
- @Path("echo/{whatever}")
- @GET
- public Response echo( @PathParam("whatever") String body ) {
- return okResponse(body);
- }
-
- @Path("permissions/{dvo}")
- @GET
- public Response findPermissonsOn(@PathParam("dvo") String dvo) {
- try {
- DvObject dvObj = findDvo(dvo);
- if (dvObj == null) {
- return notFound("DvObject " + dvo + " not found");
- }
- try {
- User aUser = findUserOrDie();
- JsonObjectBuilder bld = Json.createObjectBuilder();
- bld.add("user", aUser.getIdentifier() );
- bld.add("permissions", json(permissionSvc.permissionsFor(createDataverseRequest(aUser), dvObj)) );
- return okResponse(bld);
-
- } catch (WrappedResponse wr) {
- return wr.getResponse();
- }
- } catch (Exception e) {
- logger.log( Level.SEVERE, "Error while testing permissions", e );
- return errorResponse(Response.Status.INTERNAL_SERVER_ERROR, e.getMessage());
- }
- }
-
- @Path("assignee/{idtf}")
- @GET
- public Response findRoleAssignee(@PathParam("idtf") String idtf) {
- RoleAssignee ra = roleAssigneeSvc.getRoleAssignee(idtf);
- return (ra == null) ? notFound("Role Assignee '" + idtf + "' not found.")
- : okResponse(json(ra.getDisplayInfo()));
- }
-
- @Path("bcrypt/encrypt/{word}")
- @GET
- public String encrypt( @PathParam("word")String word, @QueryParam("len") String len ) {
- int saltLen = (len==null || len.trim().isEmpty() ) ? 10 : Integer.parseInt(len);
- return BCrypt.hashpw(word, BCrypt.gensalt(saltLen)) + "\n";
- }
-
- @Path("password/{w1}")
- @GET
- public String test( @PathParam("w1") String w1 ) {
- StringBuilder sb = new StringBuilder();
- sb.append("[0] ").append( PasswordEncryption.getVersion(0).encrypt(w1)).append("\n");
- sb.append("[1] ").append( PasswordEncryption.getVersion(1).encrypt(w1)).append("\n");
-
- return sb.toString();
- }
-
- @Path("apikey")
- @GET
- public Response testUserLookup() {
- try {
- return okResponse( json(findAuthenticatedUserOrDie()) );
- } catch (WrappedResponse ex) {
- return ex.getResponse();
- }
- }
-
- @Path("explicitGroups/{identifier: .*}")
- @GET
- public Response explicitGroupMembership( @PathParam("identifier") String idtf) {
- final RoleAssignee roleAssignee = roleAssigneeSvc.getRoleAssignee(idtf);
- if (roleAssignee==null ) {
- return notFound("Can't find a role assignee with identifier " + idtf);
- }
- Set groups = explicitGroups.findGroups(roleAssignee);
- logger.log(Level.INFO, "Groups for {0}: {1}", new Object[]{roleAssignee, groups});
- return okResponse( groups.stream().map( g->json(g).build()).collect(toJsonArray()) );
- }
-
- @Path("ipGroups/containing/{address}")
- @GET
- public Response getIpGroupsContaining( @PathParam("address") String addrStr ) {
- try {
- IpAddress addr = IpAddress.valueOf(addrStr);
-
- JsonObjectBuilder r = NullSafeJsonBuilder.jsonObjectBuilder();
- r.add( "address", addr.toString() );
- r.add( "addressRaw", (addr instanceof IPv4Address) ? ((IPv4Address)addr).toBigInteger().toString(): null);
- r.add("groups", ipGroupsSvc.findAllIncludingIp(addr).stream()
- .map( IpGroup::toString )
- .collect(stringsToJsonArray()));
- return okResponse( r );
-
- } catch ( IllegalArgumentException iae ) {
- return badRequest(addrStr + " is not a valid address: " + iae.getMessage());
- }
- }
-
-}
diff --git a/src/main/java/edu/harvard/iq/dataverse/api/WorldMapRelatedData.java b/src/main/java/edu/harvard/iq/dataverse/api/WorldMapRelatedData.java
index 616d5c6ccce..6382f7c72a1 100755
--- a/src/main/java/edu/harvard/iq/dataverse/api/WorldMapRelatedData.java
+++ b/src/main/java/edu/harvard/iq/dataverse/api/WorldMapRelatedData.java
@@ -149,7 +149,7 @@ public Response checkWorldMapAPI(@Context HttpServletRequest request
logger.info("Failed to retrieve image. Error:" + e2);
}
- return okResponse( "Looks good " + identifier + " " + mapLayerMetadata.getLayerName());
+ return ok( "Looks good " + identifier + " " + mapLayerMetadata.getLayerName());
}
@@ -201,7 +201,7 @@ private Response mapDataFileTokenOnlyOption(@Context HttpServletRequest request
}
}
if (user==null){
- return errorResponse(Response.Status.FORBIDDEN, "Not logged in");
+ return error(Response.Status.FORBIDDEN, "Not logged in");
}
@@ -214,27 +214,27 @@ private Response mapDataFileTokenOnlyOption(@Context HttpServletRequest request
// Check if the user exists
AuthenticatedUser dvUser = dataverseUserService.findByID(dvuser_id);
if ( dvUser == null ){
- return errorResponse(Response.Status.FORBIDDEN, "Invalid user");
+ return error(Response.Status.FORBIDDEN, "Invalid user");
}
// Check if this file exists
DataFile dfile = dataFileService.find(datafile_id);
if (dfile==null){
- return errorResponse(Response.Status.NOT_FOUND, "DataFile not found for id: " + datafile_id);
+ return error(Response.Status.NOT_FOUND, "DataFile not found for id: " + datafile_id);
}
/*
Is the dataset public?
*/
if (!dfile.getOwner().isReleased()){
- return errorResponse(Response.Status.FORBIDDEN, "Mapping is only permitted for public datasets/files");
+ return error(Response.Status.FORBIDDEN, "Mapping is only permitted for public datasets/files");
}
// Does this user have permission to edit metadata for this file?
if (!permissionService.request(createDataverseRequest(dvUser)).on(dfile.getOwner()).has(Permission.EditDataset)){
String errMsg = "The user does not have permission to edit metadata for this file.";
- return errorResponse(Response.Status.FORBIDDEN, errMsg);
+ return error(Response.Status.FORBIDDEN, errMsg);
}
WorldMapToken token = tokenServiceBean.getNewToken(dfile, dvUser);
@@ -243,7 +243,7 @@ private Response mapDataFileTokenOnlyOption(@Context HttpServletRequest request
// Return only the token in a JSON object
final JsonObjectBuilder jsonInfo = Json.createObjectBuilder();
jsonInfo.add(WorldMapToken.GEOCONNECT_TOKEN_KEY, token.getToken());
- return okResponse(jsonInfo);
+ return ok(jsonInfo);
}
// Redirect to geoconnect url
@@ -256,7 +256,7 @@ private Response mapDataFileTokenOnlyOption(@Context HttpServletRequest request
try {
redirect_uri = new URI(redirect_url_str);
} catch (URISyntaxException ex) {
- return errorResponse(Response.Status.NOT_FOUND, "Faile to create URI from: " + redirect_url_str);
+ return error(Response.Status.NOT_FOUND, "Failed to create URI from: " + redirect_url_str);
}
// Response.
return Response.seeOther(redirect_uri).build();
@@ -354,7 +354,7 @@ public Response getWorldMapDatafileInfo(String jsonTokenData, @Context HttpServl
jsonTokenInfo = Json.createReader(rdr).readObject();
} catch ( JsonParsingException jpe ) {
logger.log(Level.SEVERE, "Json: " + jsonTokenData);
- return errorResponse( Response.Status.BAD_REQUEST, "Error parsing Json: " + jpe.getMessage() );
+ return error( Response.Status.BAD_REQUEST, "Error parsing Json: " + jpe.getMessage() );
}
logger.info("(1a) jsonTokenInfo: " + jsonTokenInfo);
@@ -362,7 +362,7 @@ public Response getWorldMapDatafileInfo(String jsonTokenData, @Context HttpServl
String worldmapTokenParam = this.retrieveTokenValueFromJson(jsonTokenInfo);
logger.info("(1b) token from JSON: " + worldmapTokenParam);
if (worldmapTokenParam==null){
- return errorResponse(Response.Status.BAD_REQUEST, "Token not found in JSON request.");
+ return error(Response.Status.BAD_REQUEST, "Token not found in JSON request.");
}
// Retrieve WorldMapToken and make sure it is valid
@@ -371,7 +371,7 @@ public Response getWorldMapDatafileInfo(String jsonTokenData, @Context HttpServl
logger.info("(2) token retrieved from db: " + wmToken);
if (wmToken==null){
- return errorResponse(Response.Status.UNAUTHORIZED, "No access. Invalid token.");
+ return error(Response.Status.UNAUTHORIZED, "No access. Invalid token.");
}
// Make sure the token's User still has permissions to access the file
@@ -379,7 +379,7 @@ public Response getWorldMapDatafileInfo(String jsonTokenData, @Context HttpServl
logger.info("(3) check permissions");
if (!(tokenServiceBean.canTokenUserEditFile(wmToken))){
tokenServiceBean.expireToken(wmToken);
- return errorResponse(Response.Status.UNAUTHORIZED, "No access. Invalid token.");
+ return error(Response.Status.UNAUTHORIZED, "No access. Invalid token.");
}
@@ -389,24 +389,24 @@ public Response getWorldMapDatafileInfo(String jsonTokenData, @Context HttpServl
//
AuthenticatedUser dvUser = wmToken.getDataverseUser();
if (dvUser == null) {
- return errorResponse(Response.Status.NOT_FOUND, "DataverseUser not found for token");
+ return error(Response.Status.NOT_FOUND, "DataverseUser not found for token");
}
DataFile dfile = wmToken.getDatafile();
if (dfile == null) {
- return errorResponse(Response.Status.NOT_FOUND, "DataFile not found for token");
+ return error(Response.Status.NOT_FOUND, "DataFile not found for token");
}
// (1a) Retrieve FileMetadata
FileMetadata dfile_meta = dfile.getFileMetadata();
if (dfile_meta==null){
- return errorResponse(Response.Status.NOT_FOUND, "FileMetadata not found");
+ return error(Response.Status.NOT_FOUND, "FileMetadata not found");
}
// (2) Now get the dataset and the latest DatasetVersion
Dataset dset = dfile.getOwner();
if (dset==null){
- return errorResponse(Response.Status.NOT_FOUND, "Owning Dataset for this DataFile not found");
+ return error(Response.Status.NOT_FOUND, "Owning Dataset for this DataFile not found");
}
// (2a) latest DatasetVersion
@@ -414,13 +414,13 @@ public Response getWorldMapDatafileInfo(String jsonTokenData, @Context HttpServl
//
DatasetVersion dset_version = dset.getLatestVersion();
if (dset_version==null){
- return errorResponse(Response.Status.NOT_FOUND, "Latest DatasetVersion for this DataFile not found");
+ return error(Response.Status.NOT_FOUND, "Latest DatasetVersion for this DataFile not found");
}
// (3) get Dataverse
Dataverse dverse = dset.getOwner();
if (dverse==null){
- return errorResponse(Response.Status.NOT_FOUND, "Dataverse for this DataFile's Dataset not found");
+ return error(Response.Status.NOT_FOUND, "Dataverse for this DataFile's Dataset not found");
}
// (4) Roll it all up in a JSON response
@@ -437,7 +437,7 @@ public Response getWorldMapDatafileInfo(String jsonTokenData, @Context HttpServl
jsonData.add("mapping_type", "tabular");
}else{
logger.log(Level.SEVERE, "This was neither a Shapefile nor a Tabular data file. DataFile id: " + dfile.getId());
- return errorResponse( Response.Status.BAD_REQUEST, "Sorry! This file does not have mapping data. Please contact the Dataverse administrator. DataFile id: " + dfile.getId());
+ return error( Response.Status.BAD_REQUEST, "Sorry! This file does not have mapping data. Please contact the Dataverse administrator. DataFile id: " + dfile.getId());
}
@@ -494,7 +494,7 @@ public Response getWorldMapDatafileInfo(String jsonTokenData, @Context HttpServl
jsonData.add("datafile_id", dfile.getId());
jsonData.add("datafile_label", dfile_meta.getLabel());
//jsonData.add("filename", dfile_meta.getLabel());
- jsonData.add("datafile_expected_md5_checksum", dfile.getmd5());
+ jsonData.add("datafile_expected_md5_checksum", dfile.getChecksumValue());
Long fsize = dfile.getFilesize();
if (fsize == null){
fsize= new Long(-1);
@@ -504,7 +504,7 @@ public Response getWorldMapDatafileInfo(String jsonTokenData, @Context HttpServl
jsonData.add("datafile_content_type", dfile.getContentType());
jsonData.add("datafile_create_datetime", dfile.getCreateDate().toString());
- return okResponse(jsonData);
+ return ok(jsonData);
}
@@ -532,7 +532,7 @@ public Response updateWorldMapLayerData(String jsonLayerData){
//----------------------------------
if (jsonLayerData==null){
logger.log(Level.SEVERE, "jsonLayerData is null");
- return errorResponse( Response.Status.BAD_REQUEST, "No JSON data");
+ return error( Response.Status.BAD_REQUEST, "No JSON data");
}
// (1) Parse JSON
@@ -542,27 +542,27 @@ public Response updateWorldMapLayerData(String jsonLayerData){
jsonInfo = Json.createReader(rdr).readObject();
} catch ( JsonParsingException jpe ) {
logger.log(Level.SEVERE, "Json: " + jsonLayerData);
- return errorResponse( Response.Status.BAD_REQUEST, "Error parsing Json: " + jpe.getMessage() );
+ return error( Response.Status.BAD_REQUEST, "Error parsing Json: " + jpe.getMessage() );
}
// Retrieve token string
String worldmapTokenParam = this.retrieveTokenValueFromJson(jsonInfo);
if (worldmapTokenParam==null){
- return errorResponse(Response.Status.BAD_REQUEST, "Token not found in JSON request.");
+ return error(Response.Status.BAD_REQUEST, "Token not found in JSON request.");
}
// Retrieve WorldMapToken and make sure it is valid
//
WorldMapToken wmToken = this.tokenServiceBean.retrieveAndRefreshValidToken(worldmapTokenParam);
if (wmToken==null){
- return errorResponse(Response.Status.UNAUTHORIZED, "No access. Invalid token.");
+ return error(Response.Status.UNAUTHORIZED, "No access. Invalid token.");
}
// Make sure the token's User still has permissions to access the file
//
if (!(tokenServiceBean.canTokenUserEditFile(wmToken))){
tokenServiceBean.expireToken(wmToken);
- return errorResponse(Response.Status.UNAUTHORIZED, "No access. Invalid token.");
+ return error(Response.Status.UNAUTHORIZED, "No access. Invalid token.");
}
@@ -570,26 +570,26 @@ public Response updateWorldMapLayerData(String jsonLayerData){
//
for (String attr : MapLayerMetadata.MANDATORY_JSON_FIELDS ){
if (!jsonInfo.containsKey(attr)){
- return errorResponse( Response.Status.BAD_REQUEST, "Error parsing Json. Key not found [" + attr + "]\nRequired keys are: " + MapLayerMetadata.MANDATORY_JSON_FIELDS );
+ return error( Response.Status.BAD_REQUEST, "Error parsing Json. Key not found [" + attr + "]\nRequired keys are: " + MapLayerMetadata.MANDATORY_JSON_FIELDS );
}
}
// (3) Attempt to retrieve DataverseUser
AuthenticatedUser dvUser = wmToken.getDataverseUser();
if (dvUser == null) {
- return errorResponse(Response.Status.NOT_FOUND, "DataverseUser not found for token");
+ return error(Response.Status.NOT_FOUND, "DataverseUser not found for token");
}
// (4) Attempt to retrieve DataFile
DataFile dfile = wmToken.getDatafile();
if (dfile==null){
- return errorResponse(Response.Status.NOT_FOUND, "DataFile not found for token");
+ return error(Response.Status.NOT_FOUND, "DataFile not found for token");
}
// check permissions!
if (!permissionService.request( createDataverseRequest(dvUser) ).on(dfile.getOwner()).has(Permission.EditDataset)){
String errMsg = "The user does not have permission to edit metadata for this file. (MapLayerMetadata)";
- return errorResponse(Response.Status.FORBIDDEN, errMsg);
+ return error(Response.Status.FORBIDDEN, errMsg);
}
@@ -639,7 +639,7 @@ public Response updateWorldMapLayerData(String jsonLayerData){
MapLayerMetadata savedMapLayerMetadata = mapLayerMetadataService.save(mapLayerMetadata);
if (savedMapLayerMetadata==null){
logger.log(Level.SEVERE, "Json: " + jsonLayerData);
- return errorResponse( Response.Status.BAD_REQUEST, "Failed to save map layer! Original JSON: ");
+ return error( Response.Status.BAD_REQUEST, "Failed to save map layer! Original JSON: " + jsonLayerData);
}
@@ -658,7 +658,7 @@ public Response updateWorldMapLayerData(String jsonLayerData){
Logger.getLogger(WorldMapRelatedData.class.getName()).log(Level.SEVERE, null, ex);
}
- return okResponse("map layer object saved!");
+ return ok("map layer object saved!");
} // end updateWorldMapLayerData
@@ -684,7 +684,7 @@ public Response deleteWorldMapLayerData(String jsonData){
//----------------------------------*/
if (jsonData==null){
logger.log(Level.SEVERE, "jsonData is null");
- return errorResponse( Response.Status.BAD_REQUEST, "No JSON data");
+ return error( Response.Status.BAD_REQUEST, "No JSON data");
}
// (1) Parse JSON
//
@@ -693,44 +693,44 @@ public Response deleteWorldMapLayerData(String jsonData){
jsonInfo = Json.createReader(rdr).readObject();
} catch ( JsonParsingException jpe ) {
logger.log(Level.SEVERE, "Json: " + jsonData);
- return errorResponse( Response.Status.BAD_REQUEST, "Error parsing Json: " + jpe.getMessage() );
+ return error( Response.Status.BAD_REQUEST, "Error parsing Json: " + jpe.getMessage() );
}
// (2) Retrieve token string
String worldmapTokenParam = this.retrieveTokenValueFromJson(jsonInfo);
if (worldmapTokenParam==null){
- return errorResponse(Response.Status.BAD_REQUEST, "Token not found in JSON request.");
+ return error(Response.Status.BAD_REQUEST, "Token not found in JSON request.");
}
// (3) Retrieve WorldMapToken and make sure it is valid
//
WorldMapToken wmToken = this.tokenServiceBean.retrieveAndRefreshValidToken(worldmapTokenParam);
if (wmToken==null){
- return errorResponse(Response.Status.UNAUTHORIZED, "No access. Invalid token.");
+ return error(Response.Status.UNAUTHORIZED, "No access. Invalid token.");
}
// (4) Make sure the token's User still has permissions to access the file
//
if (!(tokenServiceBean.canTokenUserEditFile(wmToken))){
tokenServiceBean.expireToken(wmToken);
- return errorResponse(Response.Status.UNAUTHORIZED, "No access. Invalid token.");
+ return error(Response.Status.UNAUTHORIZED, "No access. Invalid token.");
}
// (5) Attempt to retrieve DataFile and mapLayerMetadata
DataFile dfile = wmToken.getDatafile();
MapLayerMetadata mapLayerMetadata = this.mapLayerMetadataService.findMetadataByDatafile(dfile);
if (mapLayerMetadata==null){
- return errorResponse(Response.Status.EXPECTATION_FAILED, "No map layer metadata found.");
+ return error(Response.Status.EXPECTATION_FAILED, "No map layer metadata found.");
}
// (6) Delete the mapLayerMetadata
// (note: permissions checked here for a second time by the mapLayerMetadataService call)
//
if (!(this.mapLayerMetadataService.deleteMapLayerMetadataObject(mapLayerMetadata, wmToken.getDataverseUser()))){
- return errorResponse(Response.Status.PRECONDITION_FAILED, "Failed to delete layer");
+ return error(Response.Status.PRECONDITION_FAILED, "Failed to delete layer");
};
- return okResponse("Map layer metadata deleted.");
+ return ok("Map layer metadata deleted.");
} // end deleteWorldMapLayerData
@@ -753,7 +753,7 @@ public Response deleteWorldMapToken(String jsonData){
//----------------------------------*/
if (jsonData==null){
logger.log(Level.SEVERE, "jsonData is null");
- return errorResponse( Response.Status.BAD_REQUEST, "No JSON data");
+ return error( Response.Status.BAD_REQUEST, "No JSON data");
}
// (1) Parse JSON
//
@@ -762,26 +762,26 @@ public Response deleteWorldMapToken(String jsonData){
jsonInfo = Json.createReader(rdr).readObject();
} catch ( JsonParsingException jpe ) {
logger.log(Level.SEVERE, "Json: " + jsonData);
- return errorResponse( Response.Status.BAD_REQUEST, "Error parsing Json: " + jpe.getMessage() );
+ return error( Response.Status.BAD_REQUEST, "Error parsing Json: " + jpe.getMessage() );
}
// (2) Retrieve token string
String worldmapTokenParam = this.retrieveTokenValueFromJson(jsonInfo);
if (worldmapTokenParam==null){
- return errorResponse(Response.Status.BAD_REQUEST, "Token not found in JSON request.");
+ return error(Response.Status.BAD_REQUEST, "Token not found in JSON request.");
}
// (3) Retrieve WorldMapToken
//
WorldMapToken wmToken = this.tokenServiceBean.findByName(worldmapTokenParam);
if (wmToken==null){
- return errorResponse(Response.Status.NOT_FOUND, "Token not found.");
+ return error(Response.Status.NOT_FOUND, "Token not found.");
}
// (4) Delete the token
//
tokenServiceBean.deleteToken(wmToken);
- return okResponse("Token has been deleted.");
+ return ok("Token has been deleted.");
} // end deleteWorldMapToken
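The hunks above are a mechanical rename (okResponse → ok, errorResponse → error); the JSON envelopes the helpers produce are unchanged. A minimal stand-alone sketch of those presumed envelopes, using plain strings instead of the real javax.json builders and JAX-RS Response (class and method shapes here are illustrative, not the actual AbstractApiBean code):

```java
// Hypothetical sketch of the renamed response helpers. The real versions
// wrap javax.ws.rs.core.Response and javax.json builders; this models only
// the JSON envelope shape as plain strings.
public class ResponseHelpersSketch {
    static String ok(String dataJson) {
        // success envelope: {"status":"OK","data":...}
        return "{\"status\":\"OK\",\"data\":" + dataJson + "}";
    }

    static String error(int code, String message) {
        // failure envelope: {"status":"ERROR","code":...,"message":"..."}
        return "{\"status\":\"ERROR\",\"code\":" + code + ",\"message\":\"" + message + "\"}";
    }

    public static void main(String[] args) {
        System.out.println(ok("{\"token\":\"abc\"}"));
        System.out.println(error(404, "Token not found."));
    }
}
```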
diff --git a/src/main/java/edu/harvard/iq/dataverse/api/datadeposit/MediaResourceManagerImpl.java b/src/main/java/edu/harvard/iq/dataverse/api/datadeposit/MediaResourceManagerImpl.java
index 6537ec488da..b08a576f6d9 100644
--- a/src/main/java/edu/harvard/iq/dataverse/api/datadeposit/MediaResourceManagerImpl.java
+++ b/src/main/java/edu/harvard/iq/dataverse/api/datadeposit/MediaResourceManagerImpl.java
@@ -13,12 +13,15 @@
import edu.harvard.iq.dataverse.engine.command.exception.CommandException;
import edu.harvard.iq.dataverse.engine.command.impl.UpdateDatasetCommand;
import edu.harvard.iq.dataverse.ingest.IngestServiceBean;
+import edu.harvard.iq.dataverse.util.FileUtil;
+import edu.harvard.iq.dataverse.util.SystemConfig;
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
+import java.util.Set;
import java.util.logging.Logger;
import javax.ejb.EJB;
import javax.ejb.EJBException;
@@ -50,6 +53,8 @@ public class MediaResourceManagerImpl implements MediaResourceManager {
IngestServiceBean ingestService;
@EJB
PermissionServiceBean permissionService;
+ @EJB
+ SystemConfig systemConfig;
@Inject
SwordAuth swordAuth;
@Inject
@@ -256,7 +261,7 @@ DepositReceipt replaceOrAddFiles(String uri, Deposit deposit, AuthCredentials au
List<DataFile> dataFiles = new ArrayList<>();
try {
try {
- dataFiles = ingestService.createDataFiles(editVersion, deposit.getInputStream(), uploadedZipFilename, guessContentTypeForMe);
+ dataFiles = FileUtil.createDataFiles(editVersion, deposit.getInputStream(), uploadedZipFilename, guessContentTypeForMe, systemConfig);
} catch (EJBException ex) {
Throwable cause = ex.getCause();
if (cause != null) {
@@ -282,7 +287,13 @@ DepositReceipt replaceOrAddFiles(String uri, Deposit deposit, AuthCredentials au
throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Unable to add file(s) to dataset: " + ex.getMessage());
}
if (!dataFiles.isEmpty()) {
- ingestService.addFiles(editVersion, dataFiles);
+ Set<ConstraintViolation> constraintViolations = editVersion.validate();
+ if (constraintViolations.size() > 0) {
+ ConstraintViolation violation = constraintViolations.iterator().next();
+ throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "Unable to add file(s) to dataset: " + violation.getMessage() + " The invalid value was \"" + violation.getInvalidValue() + "\".");
+ } else {
+ ingestService.addFiles(editVersion, dataFiles);
+ }
} else {
throw new SwordError(UriRegistry.ERROR_BAD_REQUEST, "No files to add to dataset. Perhaps the zip file was empty.");
}
diff --git a/src/main/java/edu/harvard/iq/dataverse/api/errorhandlers/ArrayOutOfBoundsExceptionHandler.java b/src/main/java/edu/harvard/iq/dataverse/api/errorhandlers/ArrayOutOfBoundsExceptionHandler.java
new file mode 100644
index 00000000000..112f50eb3ed
--- /dev/null
+++ b/src/main/java/edu/harvard/iq/dataverse/api/errorhandlers/ArrayOutOfBoundsExceptionHandler.java
@@ -0,0 +1,38 @@
+package edu.harvard.iq.dataverse.api.errorhandlers;
+
+import java.util.UUID;
+import java.util.logging.Level;
+import java.util.logging.Logger;
+import javax.json.Json;
+import javax.servlet.http.HttpServletRequest;
+import javax.ws.rs.core.Context;
+import javax.ws.rs.core.Response;
+import javax.ws.rs.ext.ExceptionMapper;
+import javax.ws.rs.ext.Provider;
+
+/**
+ * Produces custom 500 messages for the API.
+ * @author michael
+ */
+@Provider
+public class ArrayOutOfBoundsExceptionHandler implements ExceptionMapper<java.lang.ArrayIndexOutOfBoundsException>{
+
+ private static final Logger logger = Logger.getLogger(ServeletExceptionHandler.class.getName());
+
+ @Context
+ HttpServletRequest request;
+
+ @Override
+ public Response toResponse(java.lang.ArrayIndexOutOfBoundsException ex){
+ String incidentId = UUID.randomUUID().toString();
+ logger.log(Level.SEVERE, "API internal error " + incidentId +": ArrayOutOfBounds:" + ex.getMessage(), ex);
+ return Response.status(500)
+ .entity( Json.createObjectBuilder()
+ .add("status", "ERROR")
+ .add("code", 500)
+ .add("message", "Internal server error. More details available at the server logs.")
+ .add("incidentId", incidentId)
+ .build())
+ .type("application/json").build();
+ }
+}
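The 500-series handlers above all follow the same incident-id pattern: log the full exception server-side under a random UUID, and return only a generic message plus that id to the client. A stripped-down, dependency-free sketch of that pattern (the JSON body is built by hand here; the real handlers use javax.json and wrap the result in a JAX-RS Response):

```java
import java.util.UUID;
import java.util.logging.Level;
import java.util.logging.Logger;

// Minimal sketch of the incident-id pattern: full details stay in the
// server log keyed by a random id; the client sees only the id.
public class IncidentSketch {
    private static final Logger logger = Logger.getLogger(IncidentSketch.class.getName());

    static String reportInternalError(Throwable ex) {
        String incidentId = UUID.randomUUID().toString();
        // Stack trace and message are logged server-side under the incident id...
        logger.log(Level.SEVERE, "API internal error " + incidentId + ": " + ex.getMessage(), ex);
        // ...while the response body carries only the generic envelope plus the id.
        return "{\"status\":\"ERROR\",\"code\":500,"
             + "\"message\":\"Internal server error. More details available at the server logs.\","
             + "\"incidentId\":\"" + incidentId + "\"}";
    }

    public static void main(String[] args) {
        System.out.println(reportInternalError(new ArrayIndexOutOfBoundsException("3")));
    }
}
```

The design point is that an operator can grep the log for the incidentId a user reports, without the API ever leaking stack traces.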
diff --git a/src/main/java/edu/harvard/iq/dataverse/api/errorhandlers/NotAllowedExceptionHandler.java b/src/main/java/edu/harvard/iq/dataverse/api/errorhandlers/NotAllowedExceptionHandler.java
new file mode 100644
index 00000000000..5df16c9596d
--- /dev/null
+++ b/src/main/java/edu/harvard/iq/dataverse/api/errorhandlers/NotAllowedExceptionHandler.java
@@ -0,0 +1,31 @@
+package edu.harvard.iq.dataverse.api.errorhandlers;
+
+import javax.json.Json;
+import javax.servlet.http.HttpServletRequest;
+import javax.ws.rs.NotAllowedException;
+import javax.ws.rs.core.Context;
+import javax.ws.rs.core.Response;
+import javax.ws.rs.ext.ExceptionMapper;
+import javax.ws.rs.ext.Provider;
+
+@Provider
+public class NotAllowedExceptionHandler implements ExceptionMapper<NotAllowedException>{
+
+ @Context
+ HttpServletRequest request;
+
+ @Override
+ public Response toResponse(NotAllowedException ex){
+ String uri = request.getRequestURI();
+ return Response.status(405)
+ .entity( Json.createObjectBuilder()
+ .add("status", "ERROR")
+ .add("code", 405)
+ .add("message", "'" + uri + "' endpoint does not support method '"+request.getMethod()+"'. Consult our API guide at http://guides.dataverse.org.")
+ .build())
+ .type("application/json").build();
+
+
+ }
+
+}
diff --git a/src/main/java/edu/harvard/iq/dataverse/api/errorhandlers/NotFoundExceptionHandler.java b/src/main/java/edu/harvard/iq/dataverse/api/errorhandlers/NotFoundExceptionHandler.java
new file mode 100644
index 00000000000..8d916a59b69
--- /dev/null
+++ b/src/main/java/edu/harvard/iq/dataverse/api/errorhandlers/NotFoundExceptionHandler.java
@@ -0,0 +1,36 @@
+package edu.harvard.iq.dataverse.api.errorhandlers;
+
+import javax.json.Json;
+import javax.servlet.http.HttpServletRequest;
+import javax.ws.rs.NotFoundException;
+import javax.ws.rs.core.Context;
+import javax.ws.rs.core.Response;
+import javax.ws.rs.ext.ExceptionMapper;
+import javax.ws.rs.ext.Provider;
+
+/**
+ * Produces custom 404 messages for the API.
+ * @author michael
+ */
+@Provider
+public class NotFoundExceptionHandler implements ExceptionMapper<NotFoundException>{
+
+ @Context
+ HttpServletRequest request;
+
+ @Override
+ public Response toResponse(NotFoundException ex){
+ String uri = request.getRequestURI();
+ return Response.status(404)
+ .entity( Json.createObjectBuilder()
+ .add("status", "ERROR")
+ .add("code", 404)
+ .add("message", "'" + uri + "' endpoint does not exist on this server. Please check your code for typos, or consult our API guide at http://guides.dataverse.org.")
+ .build())
+ .type("application/json").build();
+
+
+ }
+
+}
+
diff --git a/src/main/java/edu/harvard/iq/dataverse/api/errorhandlers/NullPointerExceptionHandler.java b/src/main/java/edu/harvard/iq/dataverse/api/errorhandlers/NullPointerExceptionHandler.java
new file mode 100644
index 00000000000..1e9c2a28690
--- /dev/null
+++ b/src/main/java/edu/harvard/iq/dataverse/api/errorhandlers/NullPointerExceptionHandler.java
@@ -0,0 +1,38 @@
+package edu.harvard.iq.dataverse.api.errorhandlers;
+
+import java.util.UUID;
+import java.util.logging.Level;
+import java.util.logging.Logger;
+import javax.json.Json;
+import javax.servlet.http.HttpServletRequest;
+import javax.ws.rs.core.Context;
+import javax.ws.rs.core.Response;
+import javax.ws.rs.ext.ExceptionMapper;
+import javax.ws.rs.ext.Provider;
+
+/**
+ * Produces custom 500 messages for the API.
+ * @author michael
+ */
+@Provider
+public class NullPointerExceptionHandler implements ExceptionMapper<java.lang.NullPointerException>{
+
+ private static final Logger logger = Logger.getLogger(ServeletExceptionHandler.class.getName());
+
+ @Context
+ HttpServletRequest request;
+
+ @Override
+ public Response toResponse(java.lang.NullPointerException ex){
+ String incidentId = UUID.randomUUID().toString();
+ logger.log(Level.SEVERE, "API internal error " + incidentId +": Null Pointer", ex);
+ return Response.status(500)
+ .entity( Json.createObjectBuilder()
+ .add("status", "ERROR")
+ .add("code", 500)
+ .add("message", "Internal server error. More details available at the server logs.")
+ .add("incidentId", incidentId)
+ .build())
+ .type("application/json").build();
+ }
+}
diff --git a/src/main/java/edu/harvard/iq/dataverse/api/errorhandlers/ServeletExceptionHandler.java b/src/main/java/edu/harvard/iq/dataverse/api/errorhandlers/ServeletExceptionHandler.java
new file mode 100644
index 00000000000..fa6bff57c03
--- /dev/null
+++ b/src/main/java/edu/harvard/iq/dataverse/api/errorhandlers/ServeletExceptionHandler.java
@@ -0,0 +1,38 @@
+package edu.harvard.iq.dataverse.api.errorhandlers;
+
+import java.util.UUID;
+import java.util.logging.Level;
+import java.util.logging.Logger;
+import javax.json.Json;
+import javax.servlet.http.HttpServletRequest;
+import javax.ws.rs.core.Context;
+import javax.ws.rs.core.Response;
+import javax.ws.rs.ext.ExceptionMapper;
+import javax.ws.rs.ext.Provider;
+
+/**
+ * Produces custom 500 messages for the API.
+ * @author michael
+ */
+@Provider
+public class ServeletExceptionHandler implements ExceptionMapper<javax.servlet.ServletException>{
+
+ private static final Logger logger = Logger.getLogger(ServeletExceptionHandler.class.getName());
+
+ @Context
+ HttpServletRequest request;
+
+ @Override
+ public Response toResponse(javax.servlet.ServletException ex){
+ String incidentId = UUID.randomUUID().toString();
+ logger.log(Level.SEVERE, "API internal error " + incidentId +": " + ex.getMessage(), ex);
+ return Response.status(500)
+ .entity( Json.createObjectBuilder()
+ .add("status", "ERROR")
+ .add("code", 500)
+ .add("message", "Internal server error. More details available at the server logs.")
+ .add("incidentId", incidentId)
+ .build())
+ .type("application/json").build();
+ }
+}
diff --git a/src/main/java/edu/harvard/iq/dataverse/api/imports/ImportDDIServiceBean.java b/src/main/java/edu/harvard/iq/dataverse/api/imports/ImportDDIServiceBean.java
index b96acd8fc7e..ecb099cf26c 100644
--- a/src/main/java/edu/harvard/iq/dataverse/api/imports/ImportDDIServiceBean.java
+++ b/src/main/java/edu/harvard/iq/dataverse/api/imports/ImportDDIServiceBean.java
@@ -47,6 +47,7 @@ public class ImportDDIServiceBean {
public static final String NAMING_PROTOCOL_DOI = "doi";
public static final String AGENCY_HANDLE = "handle";
public static final String AGENCY_DOI = "DOI";
+ public static final String AGENCY_DARA = "dara"; // da|ra - http://www.da-ra.de/en/home/
public static final String REPLICATION_FOR_TYPE = "replicationFor";
public static final String VAR_WEIGHTED = "wgtd";
public static final String VAR_INTERVAL_CONTIN = "contin";
@@ -91,6 +92,7 @@ public class ImportDDIServiceBean {
public static final String NOTE_SUBJECT_LOCKSS_PERM = "LOCKSS Permission";
public static final String NOTE_TYPE_REPLICATION_FOR = "DVN:REPLICATION_FOR";
+ private static final String HARVESTED_FILE_STORAGE_PREFIX = "http://";
private XMLInputFactory xmlInputFactory = null;
@EJB CustomFieldServiceBean customFieldService;
@@ -241,18 +243,28 @@ private void processCodeBook(ImportType importType, XMLStreamReader xmlr, Datase
if (event == XMLStreamConstants.START_ELEMENT) {
if (xmlr.getLocalName().equals("docDscr")) {
processDocDscr(xmlr, datasetDTO);
- }
- else if (xmlr.getLocalName().equals("stdyDscr")) {
+ } else if (xmlr.getLocalName().equals("stdyDscr")) {
processStdyDscr(importType, xmlr, datasetDTO);
- }
- else if (xmlr.getLocalName().equals("fileDscr") && !isMigrationImport(importType)) {
+ } else if (xmlr.getLocalName().equals("otherMat") && (isNewImport(importType) || isHarvestWithFilesImport(importType)) ) {
+ processOtherMat(xmlr, datasetDTO, filesMap);
+ } else if (xmlr.getLocalName().equals("fileDscr") && isHarvestWithFilesImport(importType)) {
+ // If this is a harvesting import, we'll attempt to extract some minimal
+ // file-level metadata information from the fileDscr sections as well.
+ // TODO: add more info here... -- 4.6
+ processFileDscrMinimal(xmlr, datasetDTO, filesMap);
+ } else if (xmlr.getLocalName().equals("fileDscr") && isNewImport(importType)) {
+ // this is a "full" fileDscr section - Dataverses use it
+ // to encode *tabular* files only. It will contain the information
+ // about variables, observations, etc. It will be complemented
+ // by a number of entries in the dataDscr section.
+ // Dataverses do not use this section for harvesting exports, since
+ // we don't harvest tabular metadata. And all the "regular"
+ // file-level metadata is encoded in otherMat sections.
+ // The goal is to one day be able to import such tabular
+ // metadata using the direct (non-harvesting) import API.
// EMK TODO: add this back in for ImportType.NEW
//processFileDscr(xmlr, datasetDTO, filesMap);
-
- }
- else if (xmlr.getLocalName().equals("otherMat") && (isNewImport(importType) || isHarvestWithFilesImport(importType)) ) {
- processOtherMat(xmlr, datasetDTO, filesMap);
- }
+ }
} else if (event == XMLStreamConstants.END_ELEMENT) {
if (xmlr.getLocalName().equals("codeBook")) return;
@@ -432,12 +444,23 @@ else if (xmlr.getLocalName().equals("relStdy")) {
private void processCitation(ImportType importType, XMLStreamReader xmlr, DatasetDTO datasetDTO) throws XMLStreamException, ImportException {
DatasetVersionDTO dvDTO = datasetDTO.getDatasetVersion();
MetadataBlockDTO citation=datasetDTO.getDatasetVersion().getMetadataBlocks().get("citation");
+ boolean distStatementProcessed = false;
for (int event = xmlr.next(); event != XMLStreamConstants.END_DOCUMENT; event = xmlr.next()) {
if (event == XMLStreamConstants.START_ELEMENT) {
if (xmlr.getLocalName().equals("titlStmt")) processTitlStmt(xmlr, datasetDTO);
else if (xmlr.getLocalName().equals("rspStmt")) processRspStmt(xmlr,citation);
else if (xmlr.getLocalName().equals("prodStmt")) processProdStmt(xmlr,citation);
- else if (xmlr.getLocalName().equals("distStmt")) processDistStmt(xmlr,citation);
+ else if (xmlr.getLocalName().equals("distStmt")) {
+ if (distStatementProcessed) {
+ // We've already encountered one Distribution Statement in
+ // this citation, we'll just skip any consecutive ones.
+ // This is a defensive check against duplicate distStmt
+ // in some DDIs (notably, from ICPSR)
+ } else {
+ processDistStmt(xmlr,citation);
+ distStatementProcessed = true;
+ }
+ }
else if (xmlr.getLocalName().equals("serStmt")) processSerStmt(xmlr,citation);
else if (xmlr.getLocalName().equals("verStmt")) processVerStmt(importType, xmlr,dvDTO);
else if (xmlr.getLocalName().equals("notes")) {
@@ -882,11 +905,23 @@ private void processAnlyInfo(XMLStreamReader xmlr, MetadataBlockDTO socialScienc
private void processDataColl(XMLStreamReader xmlr, DatasetVersionDTO dvDTO) throws XMLStreamException {
MetadataBlockDTO socialScience =getSocialScience(dvDTO);
+
+ String collMode = "";
+ String timeMeth = "";
+ String weight = "";
+
for (int event = xmlr.next(); event != XMLStreamConstants.END_DOCUMENT; event = xmlr.next()) {
if (event == XMLStreamConstants.START_ELEMENT) {
//timeMethod
if (xmlr.getLocalName().equals("timeMeth")) {
- socialScience.getFields().add(FieldDTO.createPrimitiveFieldDTO("timeMethod", parseText( xmlr, "timeMeth" )));
+ String thisValue = parseText( xmlr, "timeMeth" );
+ if (!StringUtil.isEmpty(thisValue)) {
+ if (!"".equals(timeMeth)) {
+ timeMeth = timeMeth.concat(", ");
+ }
+ timeMeth = timeMeth.concat(thisValue);
+ }
+ //socialScience.getFields().add(FieldDTO.createPrimitiveFieldDTO("timeMethod", parseText( xmlr, "timeMeth" )));
} else if (xmlr.getLocalName().equals("dataCollector")) {
socialScience.getFields().add(FieldDTO.createPrimitiveFieldDTO("dataCollector", parseText( xmlr, "dataCollector" )));
// frequencyOfDataCollection
@@ -903,7 +938,14 @@ private void processDataColl(XMLStreamReader xmlr, DatasetVersionDTO dvDTO) thro
socialScience.getFields().add(FieldDTO.createPrimitiveFieldDTO("deviationsFromSampleDesign", parseText( xmlr, "deviat" )));
// collectionMode
} else if (xmlr.getLocalName().equals("collMode")) {
- socialScience.getFields().add(FieldDTO.createPrimitiveFieldDTO("collectionMode", parseText( xmlr, "collMode" )));
+ String thisValue = parseText( xmlr, "collMode" );
+ if (!StringUtil.isEmpty(thisValue)) {
+ if (!"".equals(collMode)) {
+ collMode = collMode.concat(", ");
+ }
+ collMode = collMode.concat(thisValue);
+ }
+ //socialScience.getFields().add(FieldDTO.createPrimitiveFieldDTO("collectionMode", parseText( xmlr, "collMode" )));
//researchInstrument
} else if (xmlr.getLocalName().equals("resInstru")) {
socialScience.getFields().add(FieldDTO.createPrimitiveFieldDTO("researchInstrument", parseText( xmlr, "resInstru" )));
@@ -916,12 +958,30 @@ private void processDataColl(XMLStreamReader xmlr, DatasetVersionDTO dvDTO) thro
} else if (xmlr.getLocalName().equals("ConOps")) {
socialScience.getFields().add(FieldDTO.createPrimitiveFieldDTO("controlOperations", parseText( xmlr, "ConOps" )));
} else if (xmlr.getLocalName().equals("weight")) {
- socialScience.getFields().add(FieldDTO.createPrimitiveFieldDTO("weighting", parseText( xmlr, "weight" )));
+ String thisValue = parseText( xmlr, "weight" );
+ if (!StringUtil.isEmpty(thisValue)) {
+ if (!"".equals(weight)) {
+ weight = weight.concat(", ");
+ }
+ weight = weight.concat(thisValue);
+ }
+ //socialScience.getFields().add(FieldDTO.createPrimitiveFieldDTO("weighting", parseText( xmlr, "weight" )));
} else if (xmlr.getLocalName().equals("cleanOps")) {
socialScience.getFields().add(FieldDTO.createPrimitiveFieldDTO("cleaningOperations", parseText( xmlr, "cleanOps" )));
- }
+ }
} else if (event == XMLStreamConstants.END_ELEMENT) {
- if (xmlr.getLocalName().equals("dataColl")) return;
+ if (xmlr.getLocalName().equals("dataColl")) {
+ if (!StringUtil.isEmpty(timeMeth)) {
+ socialScience.getFields().add(FieldDTO.createPrimitiveFieldDTO("timeMethod", timeMeth));
+ }
+ if (!StringUtil.isEmpty(collMode)) {
+ socialScience.getFields().add(FieldDTO.createPrimitiveFieldDTO("collectionMode", collMode));
+ }
+ if (!StringUtil.isEmpty(weight)) {
+ socialScience.getFields().add(FieldDTO.createPrimitiveFieldDTO("weighting", weight));
+ }
+ return;
+ }
}
}
}
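The timeMeth, collMode, and weight changes above all apply the same accumulate-then-emit pattern: collect the values of repeated DDI elements, join them with ", ", and add a single field only when the enclosing dataColl element closes. The joining logic in isolation (hypothetical helper; plain java.util stands in for the StringUtil and XML-parser plumbing):

```java
import java.util.List;

// Standalone sketch of the accumulation pattern from processDataColl:
// repeated element values are joined with ", " into one field value,
// skipping empty entries (mirroring the StringUtil.isEmpty checks).
public class AccumulateSketch {
    static String joinNonEmpty(List<String> values) {
        String acc = "";
        for (String v : values) {
            if (v == null || v.isEmpty()) continue; // skip blank elements
            if (!acc.isEmpty()) {
                acc = acc.concat(", ");
            }
            acc = acc.concat(v);
        }
        return acc;
    }

    public static void main(String[] args) {
        // Two non-empty <collMode> values and one empty one collapse to a single field.
        System.out.println(joinNonEmpty(List.of("CAPI", "", "Mail survey")));
    }
}
```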
@@ -1242,6 +1302,16 @@ private void processTitlStmt(XMLStreamReader xmlr, DatasetDTO datasetDTO) throws
parseStudyIdHandle( parseText(xmlr), datasetDTO );
} else if ( AGENCY_DOI.equals( xmlr.getAttributeValue(null, "agency") ) ) {
parseStudyIdDOI( parseText(xmlr), datasetDTO );
+ } else if ( AGENCY_DARA.equals( xmlr.getAttributeValue(null, "agency"))) {
+ /*
+ da|ra - "Registration agency for social and economic data"
+ (http://www.da-ra.de/en/home/)
+ ICPSR uses da|ra to register their DOIs; so they have agency="dara"
+ in their IDNo entries.
+ Also, their DOIs are formatted differently, without the
+ hdl: prefix.
+ */
+ parseStudyIdDoiICPSRdara( parseText(xmlr), datasetDTO );
} else {
HashSet<FieldDTO> set = new HashSet<>();
addToSet(set,"otherIdAgency", xmlr.getAttributeValue(null, "agency"));
@@ -1325,16 +1395,23 @@ private Object parseTextNew(XMLStreamReader xmlr, String endTag) throws XMLStrea
if (event == XMLStreamConstants.CHARACTERS) {
returnString += xmlr.getText().trim().replace('\n',' ');
} else if (event == XMLStreamConstants.START_ELEMENT) {
- if (xmlr.getLocalName().equals("p")) {
- returnString += "<p>" + parseText(xmlr, "p") + "</p>";
- } else if (xmlr.getLocalName().equals("emph")) {
- returnString += "<em>" + parseText(xmlr, "emph") + "</em>";
- } else if (xmlr.getLocalName().equals("hi")) {
- returnString += "<strong>" + parseText(xmlr, "hi") + "</strong>";
+ if (xmlr.getLocalName().equals("p") || xmlr.getLocalName().equals("br") || xmlr.getLocalName().equals("head")) {
+ returnString += "<p>" + parseText(xmlr, xmlr.getLocalName()) + "</p>";
+ } else if (xmlr.getLocalName().equals("emph") || xmlr.getLocalName().equals("em") || xmlr.getLocalName().equals("i")) {
+ returnString += "<em>" + parseText(xmlr, xmlr.getLocalName()) + "</em>";
+ } else if (xmlr.getLocalName().equals("hi") || xmlr.getLocalName().equals("b")) {
+ returnString += "<strong>" + parseText(xmlr, xmlr.getLocalName()) + "</strong>";
} else if (xmlr.getLocalName().equals("ExtLink")) {
String uri = xmlr.getAttributeValue(null, "URI");
String text = parseText(xmlr, "ExtLink").trim();
returnString += "<a href=\"" + uri + "\">" + ( StringUtil.isEmpty(text) ? uri : text) + "</a>";
+ } else if (xmlr.getLocalName().equals("a") || xmlr.getLocalName().equals("A")) {
+ String uri = xmlr.getAttributeValue(null, "URI");
+ if (StringUtil.isEmpty(uri)) {
+ uri = xmlr.getAttributeValue(null, "HREF");
+ }
+ String text = parseText(xmlr, xmlr.getLocalName()).trim();
+ returnString += "<a href=\"" + uri + "\">" + ( StringUtil.isEmpty(text) ? uri : text) + "</a>";
} else if (xmlr.getLocalName().equals("list")) {
returnString += parseText_list(xmlr);
} else if (xmlr.getLocalName().equals("citation")) {
@@ -1343,6 +1420,8 @@ private Object parseTextNew(XMLStreamReader xmlr, String endTag) throws XMLStrea
} else {
returnString += parseText_citation(xmlr);
}
+ } else if (xmlr.getLocalName().equals("txt")) {
+ returnString += parseText(xmlr);
} else {
throw new EJBException("ERROR occurred in mapDDI (parseText): tag not yet supported: <" + xmlr.getLocalName() + ">" );
}
@@ -1373,7 +1452,7 @@ private String parseText_list (XMLStreamReader xmlr) throws XMLStreamException {
// check type
String listType = xmlr.getAttributeValue(null, "type");
- if ("bulleted".equals(listType) ){
+ if ("bulleted".equals(listType) || listType == null){
listString = "<ul>";
} else if ("ordered".equals(listType) ) {
@@ -1524,6 +1603,31 @@ private void parseStudyIdDOI(String _id, DatasetDTO datasetDTO) throws ImportExc
datasetDTO.setIdentifier(_id.substring(index2+1));
}
+
+ private void parseStudyIdDoiICPSRdara(String _id, DatasetDTO datasetDTO) throws ImportException{
+ /*
+ dara/ICPSR DOIs are formatted without the hdl: prefix; for example -
+ 10.3886/ICPSR06635.v1
+ so we assume that everything before the last "/" is the authority,
+ and everything past it - the identifier:
+ */
+
+ int index = _id.lastIndexOf('/');
+
+ if (index == -1) {
+ throw new ImportException("Error parsing ICPSR/dara DOI IdNo: "+_id+". '/' not found in string");
+ }
+
+ if (index == _id.length() - 1) {
+ throw new ImportException("Error parsing ICPSR/dara DOI IdNo: "+_id+" ends with '/'");
+ }
+
+ datasetDTO.setAuthority(_id.substring(0, index));
+ datasetDTO.setProtocol("doi");
+ datasetDTO.setDoiSeparator("/");
+
+ datasetDTO.setIdentifier(_id.substring(index+1));
+ }
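The split rule implemented by `parseStudyIdDoiICPSRdara` above (everything before the last "/" is the authority, everything after it is the identifier) can be sketched in isolation. This is an illustrative rewrite, not the Dataverse class: the hypothetical `DaraDoiParser` returns the two parts instead of populating a `DatasetDTO`, and uses `IllegalArgumentException` in place of `ImportException`.

```java
// Standalone sketch of the ICPSR/da|ra DOI split rule: the substring
// before the last '/' is the authority, the substring after it is the
// identifier. Malformed inputs (no '/', or trailing '/') are rejected,
// mirroring the two ImportException cases in the method above.
public class DaraDoiParser {

    public static String[] parse(String id) {
        int index = id.lastIndexOf('/');
        if (index == -1) {
            throw new IllegalArgumentException("'/' not found in: " + id);
        }
        if (index == id.length() - 1) {
            throw new IllegalArgumentException("id ends with '/': " + id);
        }
        // {authority, identifier}
        return new String[] { id.substring(0, index), id.substring(index + 1) };
    }

    public static void main(String[] args) {
        String[] parts = parse("10.3886/ICPSR06635.v1");
        System.out.println(parts[0] + " | " + parts[1]); // 10.3886 | ICPSR06635.v1
    }
}
```

Note that for a DOI such as `10.3886/ICPSR06635.v1` the authority is `10.3886` and the identifier `ICPSR06635.v1`, with no `hdl:` or `doi:` prefix to strip.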
// Helper methods
private MetadataBlockDTO getCitation(DatasetVersionDTO dvDTO) {
return dvDTO.getMetadataBlocks().get("citation");
@@ -1609,6 +1713,58 @@ private void processOtherMat(XMLStreamReader xmlr, DatasetDTO datasetDTO, Map fi
}
}
+ // this method is for attempting to extract the minimal amount of file-level
+ // metadata from an ICPSR-supplied DDI. (They use "fileDscr" instead of
+ // "otherMat" for general file metadata; the only field they populate is
+ // "fileName".) -- 4.6
+
+ private void processFileDscrMinimal(XMLStreamReader xmlr, DatasetDTO datasetDTO, Map filesMap) throws XMLStreamException {
+ FileMetadataDTO fmdDTO = new FileMetadataDTO();
+
+ if (datasetDTO.getDatasetVersion().getFileMetadatas() == null) {
+ datasetDTO.getDatasetVersion().setFileMetadatas(new ArrayList<>());
+ }
+ datasetDTO.getDatasetVersion().getFileMetadatas().add(fmdDTO);
+
+ DataFileDTO dfDTO = new DataFileDTO();
+ dfDTO.setContentType("data/various-formats"); // reserved ICPSR content type identifier
+ fmdDTO.setDataFile(dfDTO);
+
+ for (int event = xmlr.next(); event != XMLStreamConstants.END_DOCUMENT; event = xmlr.next()) {
+ if (event == XMLStreamConstants.START_ELEMENT) {
+ if (xmlr.getLocalName().equals("fileName")) {
+ // this is the file name:
+ String label = parseText(xmlr);
+ // do some cleanup:
+ int col = label.lastIndexOf(':');
+ if ( col > -1) {
+ if (col < label.length() - 1) {
+ label = label.substring(col+1);
+ } else {
+ label = label.replaceAll(":", "");
+ }
+ }
+ label = label.replaceAll("[#;<>\\?\\|\\*\"]", "");
+ label = label.replaceAll("/", "-");
+ // strip leading blanks:
+ label = label.replaceFirst("^[ \t]*", "");
+ fmdDTO.setLabel(label);
+ }
+ } else if (event == XMLStreamConstants.END_ELEMENT) {
+ if (xmlr.getLocalName().equals("fileDscr")) {
+ if (fmdDTO.getLabel() == null || fmdDTO.getLabel().trim().equals("") ) {
+ fmdDTO.setLabel("harvested file");
+ }
+ if (StringUtil.isEmpty(fmdDTO.getDataFile().getStorageIdentifier())) {
+ fmdDTO.getDataFile().setStorageIdentifier(HARVESTED_FILE_STORAGE_PREFIX);
+ }
+
+ return;
+ }
+ }
+ }
+ }
+
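The filename cleanup steps in `processFileDscrMinimal` (drop everything up to the last ":", strip characters unsafe in file labels, map "/" to "-", trim leading blanks) can be exercised on their own. The `IcpsrLabelCleaner` class below is a hypothetical standalone extraction of just that logic, not part of the patch:

```java
// Standalone sketch of the fileName cleanup above: keep only the text
// after the last ':' (or drop a trailing ':'), remove # ; < > ? | * "
// characters, replace '/' with '-', and strip leading whitespace.
public class IcpsrLabelCleaner {

    public static String clean(String label) {
        int col = label.lastIndexOf(':');
        if (col > -1) {
            if (col < label.length() - 1) {
                label = label.substring(col + 1);
            } else {
                label = label.replaceAll(":", "");
            }
        }
        label = label.replaceAll("[#;<>\\?\\|\\*\"]", "");
        label = label.replaceAll("/", "-");
        label = label.replaceFirst("^[ \t]*", "");
        return label;
    }

    public static void main(String[] args) {
        System.out.println(clean("DS1: study|data?.tsv")); // studydata.tsv
    }
}
```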
private void processFileDscr(XMLStreamReader xmlr, DatasetDTO datasetDTO, Map filesMap) throws XMLStreamException {
FileMetadataDTO fmdDTO = new FileMetadataDTO();
diff --git a/src/main/java/edu/harvard/iq/dataverse/api/imports/ImportServiceBean.java b/src/main/java/edu/harvard/iq/dataverse/api/imports/ImportServiceBean.java
index 4c5865d560e..2b4e30e25c1 100644
--- a/src/main/java/edu/harvard/iq/dataverse/api/imports/ImportServiceBean.java
+++ b/src/main/java/edu/harvard/iq/dataverse/api/imports/ImportServiceBean.java
@@ -38,6 +38,7 @@
import edu.harvard.iq.dataverse.util.json.JsonParseException;
import edu.harvard.iq.dataverse.util.json.JsonParser;
import java.io.File;
+import java.io.FileOutputStream;
import java.io.IOException;
import java.io.PrintWriter;
import java.io.StringReader;
@@ -221,7 +222,8 @@ public Dataset doImportHarvestedDataset(DataverseRequest dataverseRequest, Harve
// Kraffmiller's export modules; replace the logic below with clean
// programmatic lookup of the import plugin needed.
- if ("ddi".equalsIgnoreCase(metadataFormat) || "oai_ddi".equals(metadataFormat)) {
+ if ("ddi".equalsIgnoreCase(metadataFormat) || "oai_ddi".equals(metadataFormat)
+ || metadataFormat.toLowerCase().matches("^oai_ddi.*")) {
try {
String xmlToParse = new String(Files.readAllBytes(metadataFile.toPath()));
// TODO:
@@ -230,16 +232,16 @@ public Dataset doImportHarvestedDataset(DataverseRequest dataverseRequest, Harve
// ImportType.HARVEST vs. ImportType.HARVEST_WITH_FILES
logger.fine("importing DDI "+metadataFile.getAbsolutePath());
dsDTO = importDDIService.doImport(ImportType.HARVEST_WITH_FILES, xmlToParse);
- } catch (XMLStreamException e) {
- throw new ImportException("XMLStreamException" + e);
+ } catch (Exception e) {
+ throw new ImportException("Failed to process DDI XML record: "+ e.getClass() + " (" + e.getMessage() + ")");
}
} else if ("dc".equalsIgnoreCase(metadataFormat) || "oai_dc".equals(metadataFormat)) {
logger.fine("importing DC "+metadataFile.getAbsolutePath());
try {
String xmlToParse = new String(Files.readAllBytes(metadataFile.toPath()));
dsDTO = importGenericService.processOAIDCxml(xmlToParse);
- } catch (XMLStreamException e) {
- throw new ImportException("XMLStreamException processing Dublin Core XML record: "+e.getMessage());
+ } catch (Exception e) {
+ throw new ImportException("Failed to process Dublin Core XML record: "+ e.getClass() + " (" + e.getMessage() + ")");
}
} else if ("dataverse_json".equals(metadataFormat)) {
// This is Dataverse metadata already formatted in JSON.
@@ -371,12 +373,20 @@ public Dataset doImportHarvestedDataset(DataverseRequest dataverseRequest, Harve
importedDataset = engineSvc.submit(new CreateDatasetCommand(ds, dataverseRequest, false, ImportType.HARVEST));
}
- } catch (JsonParseException ex) {
- logger.log(Level.INFO, "Error parsing datasetVersion: {0}", ex.getMessage());
- throw new ImportException("Error parsing datasetVersion: " + ex.getMessage(), ex);
- } catch (CommandException ex) {
- logger.log(Level.INFO, "Error excuting Create dataset command: {0}", ex.getMessage());
- throw new ImportException("Error excuting dataverse command: " + ex.getMessage(), ex);
+ } catch (Exception ex) {
+ logger.fine("Failed to import harvested dataset: " + ex.getClass() + ": " + ex.getMessage());
+ FileOutputStream savedJsonFileStream = new FileOutputStream(new File(metadataFile.getAbsolutePath() + ".json"));
+ byte[] jsonBytes = json.getBytes();
+ int i = 0;
+ while (i < jsonBytes.length) {
+ int chunkSize = i + 8192 <= jsonBytes.length ? 8192 : jsonBytes.length - i;
+ savedJsonFileStream.write(jsonBytes, i, chunkSize);
+ i += chunkSize;
+ savedJsonFileStream.flush();
+ }
+ savedJsonFileStream.close();
+ logger.info("JSON produced saved in " + metadataFile.getAbsolutePath() + ".json");
+ throw new ImportException("Failed to import harvested dataset: " + ex.getClass() + " (" + ex.getMessage() + ")", ex);
}
return importedDataset;
}
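The chunked-write loop in the new catch block above can be checked in isolation. This sketch (hypothetical `ChunkedWriteDemo` class, with a `ByteArrayOutputStream` standing in for the `FileOutputStream`) verifies that the `chunkSize` arithmetic covers every byte, including the final partial chunk:

```java
import java.io.ByteArrayOutputStream;

// Standalone sketch of the 8192-byte chunked-write loop used to save
// the failed-import JSON: write full 8192-byte chunks until fewer than
// 8192 bytes remain, then write the remainder.
public class ChunkedWriteDemo {

    public static byte[] writeInChunks(byte[] bytes) {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        int i = 0;
        while (i < bytes.length) {
            int chunkSize = i + 8192 <= bytes.length ? 8192 : bytes.length - i;
            out.write(bytes, i, chunkSize);
            i += chunkSize;
        }
        return out.toByteArray();
    }

    public static void main(String[] args) {
        byte[] data = new byte[20000]; // two full chunks plus a 3616-byte tail
        System.out.println(writeInChunks(data).length); // 20000
    }
}
```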
diff --git a/src/main/java/edu/harvard/iq/dataverse/authorization/providers/builtin/BuiltinUserServiceBean.java b/src/main/java/edu/harvard/iq/dataverse/authorization/providers/builtin/BuiltinUserServiceBean.java
index 760b7a2986d..d5459a2fb75 100644
--- a/src/main/java/edu/harvard/iq/dataverse/authorization/providers/builtin/BuiltinUserServiceBean.java
+++ b/src/main/java/edu/harvard/iq/dataverse/authorization/providers/builtin/BuiltinUserServiceBean.java
@@ -46,16 +46,14 @@ public String encryptPassword(String plainText) {
}
public BuiltinUser save(BuiltinUser dataverseUser) {
- /**
- * Trim the email address no matter what the user entered or is entered
+ /* Trim the email address no matter what the user entered or is entered
* on their behalf in the case of Shibboleth assertions.
*
* @todo Why doesn't Bean Validation report that leading and trailing
* whitespace in an email address is a problem?
*/
dataverseUser.setEmail(dataverseUser.getEmail().trim());
- /**
- * We throw a proper IllegalArgumentException here because otherwise
+ /* We throw a proper IllegalArgumentException here because otherwise
* from the API you get a 500 response and "Can't save user: null".
*/
ValidatorFactory factory = Validation.buildDefaultValidatorFactory();
diff --git a/src/main/java/edu/harvard/iq/dataverse/dataaccess/ImageThumbConverter.java b/src/main/java/edu/harvard/iq/dataverse/dataaccess/ImageThumbConverter.java
index b2aa4223881..6915d04754e 100644
--- a/src/main/java/edu/harvard/iq/dataverse/dataaccess/ImageThumbConverter.java
+++ b/src/main/java/edu/harvard/iq/dataverse/dataaccess/ImageThumbConverter.java
@@ -233,7 +233,7 @@ public static File getImageThumbAsFile(FileAccessIO fileAccess, int size ) {
logger.fine("obtained non-null imageThumbFileName: "+imageThumbFileName);
File imageThumbFile = new File(imageThumbFileName);
- if (imageThumbFile != null && imageThumbFile.exists()) {
+ if (imageThumbFile.exists()) {
return imageThumbFile;
}
@@ -280,7 +280,7 @@ public static String generateImageThumb(String fileLocation, int size) {
} catch (Exception ex) {
//
}
-
+
if (fileSize == 0 || fileSize > sizeLimit) {
// this file is too large, exiting.
return null;
@@ -292,7 +292,7 @@ public static String generateImageThumb(String fileLocation, int size) {
BufferedImage fullSizeImage = ImageIO.read(new File(fileLocation));
if (fullSizeImage == null) {
- logger.fine("could not read image with ImageIO.read()");
+ logger.warning("could not read image with ImageIO.read()");
return null;
}
@@ -488,39 +488,49 @@ public static String generatePDFThumb(String fileLocation, int size) {
// belongs. :)
// -- L.A. June 2014
- String ImageMagickCmd = null;
String previewFileLocation = null;
-
- if (size != DEFAULT_PREVIEW_SIZE) {
- // check if the "preview size" image is already available - and
- // if not, generate it. this 400 pixel image will be used to
- // generate smaller-size thumbnails.
-
- previewFileLocation = fileLocation + ".thumb" + DEFAULT_PREVIEW_SIZE;
- if (new File(previewFileLocation).exists()) {
- fileLocation = previewFileLocation;
- } else {
- previewFileLocation = runImageMagick(imageMagickExec, fileLocation, DEFAULT_PREVIEW_SIZE, "pdf");
- if (previewFileLocation != null) {
- fileLocation = previewFileLocation;
- }
- }
+ // check if the "preview size" image is already available - and
+ // if not, generate it. this 400 pixel image will be used to
+ // generate smaller-size thumbnails.
+ previewFileLocation = fileLocation + ".thumb" + DEFAULT_PREVIEW_SIZE;
+
+ if (!((new File(previewFileLocation)).exists())) {
+ previewFileLocation = runImageMagick(imageMagickExec, fileLocation, DEFAULT_PREVIEW_SIZE, "pdf");
}
- thumbFileLocation = runImageMagick(imageMagickExec, fileLocation, size, "pdf");
- if (thumbFileLocation != null) {
- // While we are at it, let's make sure both thumbnail sizes are generated:
- if (size == DEFAULT_PREVIEW_SIZE || fileLocation.equals(previewFileLocation)) {
+ if (previewFileLocation == null) {
+ return null;
+ }
+
+ if (size == DEFAULT_PREVIEW_SIZE) {
+ return previewFileLocation;
+ }
+
+ // generate the thumbnail for the requested size, *using the already scaled-down
+ // 400x400 png version, above*:
+
+ if (!((new File(thumbFileLocation)).exists())) {
+ thumbFileLocation = runImageMagick(imageMagickExec, previewFileLocation, thumbFileLocation, size, "png");
+ }
- for (int s : (new int[] {DEFAULT_THUMBNAIL_SIZE, DEFAULT_CARDIMAGE_SIZE})) {
- if (size != s && !thumbnailFileExists(fileLocation, s)) {
- runImageMagick(imageMagickExec, fileLocation, s, "pdf");
- }
- }
+ return thumbFileLocation;
+
+
+ /*
+ An alternative way of handling it:
+ while we are at it, let's generate *all* the smaller thumbnail sizes:
+ for (int s : (new int[]{DEFAULT_THUMBNAIL_SIZE, DEFAULT_CARDIMAGE_SIZE})) {
+ String thisThumbLocation = fileLocation + ".thumb" + s;
+ if (!(new File(thisThumbLocation).exists())) {
+ thisThumbLocation = runImageMagick(imageMagickExec, previewFileLocation, thisThumbLocation, s, "png");
}
- return thumbFileLocation;
}
+
+ // return the location of the thumbnail for the requested size:
+ if (new File(thumbFileLocation).exists()) {
+ return thumbFileLocation;
+ }*/
}
logger.fine("returning null");
@@ -534,8 +544,12 @@ private static boolean thumbnailFileExists(String fileLocation, int size) {
}
private static String runImageMagick(String imageMagickExec, String fileLocation, int size, String format) {
- String imageMagickCmd = null;
String thumbFileLocation = fileLocation + ".thumb" + size;
+ return runImageMagick(imageMagickExec, fileLocation, thumbFileLocation, size, format);
+ }
+
+ private static String runImageMagick(String imageMagickExec, String fileLocation, String thumbFileLocation, int size, String format) {
+ String imageMagickCmd = null;
if ("pdf".equals(format)) {
imageMagickCmd = imageMagickExec + " pdf:" + fileLocation + "[0] -thumbnail "+ size + "x" + size + " -flatten -strip png:" + thumbFileLocation;
diff --git a/src/main/java/edu/harvard/iq/dataverse/datasetutility/TwoRavensHelper.java b/src/main/java/edu/harvard/iq/dataverse/datasetutility/TwoRavensHelper.java
new file mode 100644
index 00000000000..e352a2995fc
--- /dev/null
+++ b/src/main/java/edu/harvard/iq/dataverse/datasetutility/TwoRavensHelper.java
@@ -0,0 +1,322 @@
+/*
+ * To change this license header, choose License Headers in Project Properties.
+ * To change this template file, choose Tools | Templates
+ * and open the template in the editor.
+ */
+package edu.harvard.iq.dataverse.datasetutility;
+
+import edu.harvard.iq.dataverse.Dataset;
+import edu.harvard.iq.dataverse.DataverseSession;
+import edu.harvard.iq.dataverse.DvObject;
+import edu.harvard.iq.dataverse.FileMetadata;
+import edu.harvard.iq.dataverse.PermissionServiceBean;
+import edu.harvard.iq.dataverse.authorization.AuthenticationServiceBean;
+import edu.harvard.iq.dataverse.authorization.Permission;
+import edu.harvard.iq.dataverse.authorization.users.ApiToken;
+import edu.harvard.iq.dataverse.authorization.users.AuthenticatedUser;
+import edu.harvard.iq.dataverse.authorization.users.GuestUser;
+import edu.harvard.iq.dataverse.authorization.users.User;
+import edu.harvard.iq.dataverse.settings.SettingsServiceBean;
+import java.util.HashMap;
+import java.util.Map;
+import javax.faces.view.ViewScoped;
+import javax.inject.Inject;
+import javax.inject.Named;
+
+/**
+ *
+ * @author rmp553
+ */
+@ViewScoped
+@Named
+public class TwoRavensHelper implements java.io.Serializable {
+
+ @Inject SettingsServiceBean settingsService;
+ @Inject PermissionServiceBean permissionService;
+ @Inject AuthenticationServiceBean authService;
+
+ @Inject
+ DataverseSession session;
+
+ private final Map<Long, Boolean> fileMetadataTwoRavensExploreMap = new HashMap<>(); // { FileMetadata.id : Boolean }
+
+ public TwoRavensHelper(){
+
+ }
+
+
+ /**
+ * Call this from a Dataset or File page
+ * - calls private method canSeeTwoRavensExploreButton
+ *
+ * WARNING: Before calling this, make sure the user has download
+ * permission for the file!! (See DatasetPage.canDownloadFile())
+ *
+ * @param fm
+ * @return
+ */
+ public boolean canSeeTwoRavensExploreButtonFromAPI(FileMetadata fm, User user){
+
+ if (fm == null){
+ return false;
+ }
+
+ if (user == null){
+ return false;
+ }
+
+ if (!this.permissionService.userOn(user, fm.getDataFile()).has(Permission.DownloadFile)){
+ return false;
+ }
+
+ return this.canSeeTwoRavensExploreButton(fm, true);
+ }
+
+ /**
+ * Call this from a Dataset or File page
+ * - calls private method canSeeTwoRavensExploreButton
+ *
+ * WARNING: Before calling this, make sure the user has download
+ * permission for the file!! (See DatasetPage.canDownloadFile())
+ *
+ * @param fm
+ * @return
+ */
+ public boolean canSeeTwoRavensExploreButtonFromPage(FileMetadata fm){
+
+ if (fm == null){
+ return false;
+ }
+
+ return this.canSeeTwoRavensExploreButton(fm, true);
+ }
+
+ /**
+ * Used to check whether a tabular file
+ * may be viewed via TwoRavens
+ *
+ * @param fm
+ * @return
+ */
+ public boolean canSeeTwoRavensExploreButton(FileMetadata fm, boolean permissionsChecked){
+ if (fm == null){
+ return false;
+ }
+
+ // This is only here as a reminder to the public method users
+ if (!permissionsChecked){
+ return false;
+ }
+
+ if (!fm.getDataFile().isTabularData()){
+ this.fileMetadataTwoRavensExploreMap.put(fm.getId(), false);
+ return false;
+ }
+
+ // Has this already been checked?
+ if (this.fileMetadataTwoRavensExploreMap.containsKey(fm.getId())){
+ // Yes, return previous answer
+ //logger.info("using cached result for candownloadfile on filemetadata "+fid);
+ return this.fileMetadataTwoRavensExploreMap.get(fm.getId());
+ }
+
+
+ // (1) Is TwoRavens active via the "setting" table?
+ // Nope: get out
+ //
+ if (!settingsService.isTrueForKey(SettingsServiceBean.Key.TwoRavensTabularView, false)){
+ this.fileMetadataTwoRavensExploreMap.put(fm.getId(), false);
+ return false;
+ }
+
+ //----------------------------------------------------------------------
+ // (1a) If the version is deaccessioned, only users with the
+ // EditDataset permission may view the file (checked below)
+ //---
+
+ // (2) Is the DataFile object there and persisted?
+ // Nope: scat
+ //
+ if ((fm.getDataFile() == null)||(fm.getDataFile().getId()==null)){
+ this.fileMetadataTwoRavensExploreMap.put(fm.getId(), false);
+ return false;
+ }
+
+ if (fm.getDatasetVersion().isDeaccessioned()) {
+ if (this.doesSessionUserHavePermission( Permission.EditDataset, fm)) {
+ // Yes, save answer and return true
+ this.fileMetadataTwoRavensExploreMap.put(fm.getId(), true);
+ return true;
+ } else {
+ this.fileMetadataTwoRavensExploreMap.put(fm.getId(), false);
+ return false;
+ }
+ }
+
+
+
+
+ //Check for restrictions
+
+ boolean isRestrictedFile = fm.isRestricted();
+
+
+ // --------------------------------------------------------------------
+ // Conditions (2) through (4) are for Restricted files
+ // --------------------------------------------------------------------
+
+
+ if (isRestrictedFile && session.getUser() instanceof GuestUser){
+ this.fileMetadataTwoRavensExploreMap.put(fm.getId(), false);
+ return false;
+ }
+
+
+ // --------------------------------------------------------------------
+ // (3) Does the User have DownloadFile Permission at the **Dataset** level
+ // --------------------------------------------------------------------
+
+
+ if (isRestrictedFile && !this.doesSessionUserHavePermission(Permission.DownloadFile, fm)){
+ // Yes, save answer and return true
+ this.fileMetadataTwoRavensExploreMap.put(fm.getId(), false);
+ return false;
+ }
+
+ // (3) Is there tabular data or is the ingest in progress?
+ // Yes: great
+ //
+ if ((fm.getDataFile().isTabularData())||(fm.getDataFile().isIngestInProgress())){
+ this.fileMetadataTwoRavensExploreMap.put(fm.getId(), true);
+ return true;
+ }
+
+ // Nope
+ this.fileMetadataTwoRavensExploreMap.put(fm.getId(), false);
+ return false;
+
+ // (empty fileMetadata.dataFile.id) and (fileMetadata.dataFile.tabularData or fileMetadata.dataFile.ingestInProgress)
+ // and DatasetPage.canDownloadFile(fileMetadata)
+ }
+
+
+ /**
+ * Copied over from the dataset page - 9/21/2016
+ *
+ * @return
+ */
+ public String getDataExploreURL() {
+ String TwoRavensUrl = settingsService.getValueForKey(SettingsServiceBean.Key.TwoRavensUrl);
+
+ if (TwoRavensUrl != null && !TwoRavensUrl.equals("")) {
+ return TwoRavensUrl;
+ }
+
+ return "";
+ }
+
+
+ /**
+ * Copied over from the dataset page - 9/21/2016
+ *
+ * @param fileid
+ * @return
+ */
+ public String getDataExploreURLComplete(Long fileid) {
+ if (fileid == null){
+ throw new NullPointerException("fileid cannot be null");
+ }
+
+
+ String TwoRavensUrl = settingsService.getValueForKey(SettingsServiceBean.Key.TwoRavensUrl);
+ String TwoRavensDefaultLocal = "/dataexplore/gui.html?dfId=";
+
+ if (TwoRavensUrl != null && !TwoRavensUrl.equals("")) {
+ // If we have TwoRavensUrl set up as, as an optional
+ // configuration service, it must mean that TwoRavens is sitting
+ // on some remote server. And that in turn means that we must use
+ // full URLs to pass data and metadata to it.
+ // update: actually, no we don't want to use this "dataurl" notation.
+ // switching back to the dfId=:
+ // -- L.A. 4.1
+ /*
+ String tabularDataURL = getTabularDataFileURL(fileid);
+ String tabularMetaURL = getVariableMetadataURL(fileid);
+ return TwoRavensUrl + "?ddiurl=" + tabularMetaURL + "&dataurl=" + tabularDataURL + "&" + getApiTokenKey();
+ */
+ System.out.print("TwoRavensUrl Set up " + TwoRavensUrl + "?dfId=" + fileid + "&" + getApiTokenKey());
+
+ return TwoRavensUrl + "?dfId=" + fileid + "&" + getApiTokenKey();
+ }
+
+ // For a local TwoRavens setup it's enough to call it with just
+ // the file id:
+ return TwoRavensDefaultLocal + fileid + "&" + getApiTokenKey();
+ }
+
+ private String getApiTokenKey() {
+ ApiToken apiToken;
+ if (session.getUser() == null) {
+ return "";
+ }
+ if (isSessionUserAuthenticated()) {
+ AuthenticatedUser au = (AuthenticatedUser) session.getUser();
+ apiToken = authService.findApiTokenByUser(au);
+ if (apiToken != null) {
+ return "key=" + apiToken.getTokenString();
+ }
+ // Generate if not available?
+ // Or should it just be generated inside the authService
+ // automatically?
+ apiToken = authService.generateApiTokenForUser(au);
+ if (apiToken != null) {
+ return "key=" + apiToken.getTokenString();
+ }
+ }
+ return "";
+
+ }
+
+ public boolean isSessionUserAuthenticated() {
+
+ if (session == null) {
+ return false;
+ }
+
+ if (session.getUser() == null) {
+ return false;
+ }
+
+ return session.getUser().isAuthenticated();
+
+ }
+
+ public boolean doesSessionUserHavePermission(Permission permissionToCheck, FileMetadata fileMetadata){
+ if (permissionToCheck == null){
+ return false;
+ }
+
+ DvObject objectToCheck = null;
+
+ if (permissionToCheck.equals(Permission.EditDataset)){
+ objectToCheck = fileMetadata.getDatasetVersion().getDataset();
+ } else if (permissionToCheck.equals(Permission.DownloadFile)){
+ objectToCheck = fileMetadata.getDataFile();
+ }
+
+ if (objectToCheck == null){
+ return false;
+ }
+
+
+ // Check the permission
+ //
+ boolean hasPermission = this.permissionService.userOn(this.session.getUser(), objectToCheck).has(permissionToCheck);
+
+
+ // return true/false
+ return hasPermission;
+ }
+}
diff --git a/src/main/java/edu/harvard/iq/dataverse/datasetutility/WorldMapPermissionHelper.java b/src/main/java/edu/harvard/iq/dataverse/datasetutility/WorldMapPermissionHelper.java
new file mode 100644
index 00000000000..2bf79b7cfb9
--- /dev/null
+++ b/src/main/java/edu/harvard/iq/dataverse/datasetutility/WorldMapPermissionHelper.java
@@ -0,0 +1,629 @@
+/*
+ * To change this license header, choose License Headers in Project Properties.
+ * To change this template file, choose Tools | Templates
+ * and open the template in the editor.
+ */
+package edu.harvard.iq.dataverse.datasetutility;
+
+import edu.harvard.iq.dataverse.DataFile;
+import edu.harvard.iq.dataverse.Dataset;
+import edu.harvard.iq.dataverse.DataverseSession;
+import edu.harvard.iq.dataverse.DvObject;
+import edu.harvard.iq.dataverse.FileMetadata;
+import edu.harvard.iq.dataverse.MapLayerMetadata;
+import edu.harvard.iq.dataverse.MapLayerMetadataServiceBean;
+import edu.harvard.iq.dataverse.PermissionServiceBean;
+import edu.harvard.iq.dataverse.authorization.Permission;
+import edu.harvard.iq.dataverse.authorization.users.GuestUser;
+import edu.harvard.iq.dataverse.authorization.users.User;
+import edu.harvard.iq.dataverse.settings.SettingsServiceBean;
+import java.util.HashMap;
+import java.util.List;
+import java.util.Map;
+import javax.faces.view.ViewScoped;
+import javax.inject.Inject;
+import javax.inject.Named;
+
+/**
+ * This class originally encapsulated display logic for the DatasetPage
+ *
+ * It allows the following checks without redundantly querying the db to
+ * check permissions or if MapLayerMetadata exists
+ *
+ * - canUserSeeMapDataButton (private)
+ * - canUserSeeMapDataButtonFromPage (public)
+ * - canUserSeeMapDataButtonFromAPI (public)
+ *
+ * - canSeeMapButtonReminderToPublish (private)
+ * - canSeeMapButtonReminderToPublishFromPage (public)
+ * - canSeeMapButtonReminderToPublishFromAPI (public)
+ *
+ * - canUserSeeExploreWorldMapButton (private)
+ * - canUserSeeExploreWorldMapButtonFromPage (public)
+ * - canUserSeeExploreWorldMapButtonFromAPI (public)
+ *
+ * @author rmp553
+ */
+@ViewScoped
+@Named
+public class WorldMapPermissionHelper implements java.io.Serializable {
+
+ @Inject SettingsServiceBean settingsService;
+ @Inject MapLayerMetadataServiceBean mapLayerMetadataService;
+ @Inject PermissionServiceBean permissionService;
+ @Inject DataverseSession session;
+
+
+ private final Map<Long, Boolean> fileMetadataWorldMapExplore = new HashMap<>(); // { FileMetadata.id : Boolean }
+ private Map<Long, MapLayerMetadata> mapLayerMetadataLookup = null;
+ private final Map<String, Boolean> datasetPermissionMap = new HashMap<>(); // { Permission human_name : Boolean }
+
+
+ public WorldMapPermissionHelper( ){
+
+ }
+
+
+ /**
+ * Using a DataFile id, retrieve an associated MapLayerMetadata object
+ *
+ * The MapLayerMetadata objects have been fetched at page inception by
+ * "loadMapLayerMetadataLookup()"
+ */
+ public MapLayerMetadata getMapLayerMetadata(DataFile df) {
+ if (df == null) {
+ return null;
+ }
+ if (mapLayerMetadataLookup == null){
+ loadMapLayerMetadataLookup(df.getOwner());
+ }
+ return this.mapLayerMetadataLookup.get(df.getId());
+ }
+
+
+ /*
+ * Call this when using the API
+ * - calls private method canUserSeeExploreWorldMapButton
+ */
+ public boolean canUserSeeExploreWorldMapButtonFromAPI(FileMetadata fm, User user){
+
+ if (fm == null){
+ return false;
+ }
+ if (user==null){
+ return false;
+ }
+ if (!this.permissionService.userOn(user, fm.getDataFile()).has(Permission.DownloadFile)){
+ return false;
+ }
+
+ return this.canUserSeeExploreWorldMapButton(fm, true);
+ }
+
+ /**
+ * Call this from a Dataset or File page
+ * - calls private method canUserSeeExploreWorldMapButton
+ *
+ * WARNING: Before calling this, make sure the user has download
+ * permission for the file!! (See DatasetPage.canDownloadFile())
+ *
+ * @param fm FileMetadata
+ * @return boolean
+ */
+ public boolean canUserSeeExploreWorldMapButtonFromPage(FileMetadata fm){
+
+ if (fm==null){
+ return false;
+ }
+ return this.canUserSeeExploreWorldMapButton(fm, true);
+ }
+
+ /**
+ * WARNING: Before calling this, make sure the user has download
+ * permission for the file!! (See DatasetPage.canDownloadFile())
+ *
+ * Should there be a Explore WorldMap Button for this file?
+ * See table in: https://github.com/IQSS/dataverse/issues/1618
+ *
+ * (1) Does the file have MapLayerMetadata?
+ * (2) Are the proper settings in place
+ *
+ * @param fm FileMetadata
+ * @return boolean
+ */
+ private boolean canUserSeeExploreWorldMapButton(FileMetadata fm, boolean permissionsChecked){
+ if (fm==null){
+ return false;
+ }
+ // This is only here to make the public method users think...
+ if (!permissionsChecked){
+ return false;
+ }
+ if (this.fileMetadataWorldMapExplore.containsKey(fm.getId())){
+ // Yes, return previous answer
+ //logger.info("using cached result for candownloadfile on filemetadata "+fid);
+ return this.fileMetadataWorldMapExplore.get(fm.getId());
+ }
+
+ /* -----------------------------------------------------
+ Does a Map Exist?
+ ----------------------------------------------------- */
+ if (!(this.hasMapLayerMetadata(fm))) {
+ //See if it does
+ MapLayerMetadata layer_metadata = mapLayerMetadataService.findMetadataByDatafile(fm.getDataFile());
+ if (layer_metadata != null) {
+ if (mapLayerMetadataLookup == null) {
+ loadMapLayerMetadataLookup(fm.getDataFile().getOwner());
+ }
+ // yes: keep going...
+ mapLayerMetadataLookup.put(layer_metadata.getDataFile().getId(), layer_metadata);
+ } else {
+ // Nope: no button
+ this.fileMetadataWorldMapExplore.put(fm.getId(), false);
+ return false;
+ }
+ }
+
+ /*
+ Is setting for GeoconnectViewMaps true?
+ Nope? no button
+ */
+ if (!settingsService.isTrueForKey(SettingsServiceBean.Key.GeoconnectViewMaps, false)){
+ this.fileMetadataWorldMapExplore.put(fm.getId(), false);
+ return false;
+ }
+ //----------------------------------------------------------------------
+ // (0) If the version is deaccessioned, only users with the
+ // EditDataset permission may view the map (checked below)
+ //----------------------------------------------------------------------
+
+ if (fm.getDatasetVersion().isDeaccessioned()) {
+ if (this.doesSessionUserHavePermission( Permission.EditDataset, fm)) {
+ // Yes, save answer and return true
+ this.fileMetadataWorldMapExplore.put(fm.getId(), true);
+ return true;
+ } else {
+ this.fileMetadataWorldMapExplore.put(fm.getId(), false);
+ return false;
+ }
+ }
+ //Check for restrictions
+
+ boolean isRestrictedFile = fm.isRestricted();
+
+ // --------------------------------------------------------------------
+ // Is the file Unrestricted ?
+ // --------------------------------------------------------------------
+ if (!isRestrictedFile){
+ // Yes, save answer and return true
+ this.fileMetadataWorldMapExplore.put(fm.getId(), true);
+ return true;
+ }
+
+ // --------------------------------------------------------------------
+ // Conditions (2) through (4) are for Restricted files
+ // --------------------------------------------------------------------
+
+
+ if (session.getUser() instanceof GuestUser){
+ this.fileMetadataWorldMapExplore.put(fm.getId(), false);
+ return false;
+ }
+
+
+ // --------------------------------------------------------------------
+ // (3) Does the User have DownloadFile Permission at the **Dataset** level
+ // --------------------------------------------------------------------
+
+
+ if (!this.doesSessionUserHavePermission(Permission.DownloadFile, fm)){
+ // Yes, save answer and return true
+ this.fileMetadataWorldMapExplore.put(fm.getId(), false);
+ return false;
+ }
+
+ /* -----------------------------------------------------
+ Yes: User can view button!
+ ----------------------------------------------------- */
+ this.fileMetadataWorldMapExplore.put(fm.getId(), true);
+ return true;
+ }
+
+
+ /*
+ Check if the FileMetadata.dataFile has an associated MapLayerMetadata object
+
+ The MapLayerMetadata objects have been fetched at page inception by "loadMapLayerMetadataLookup()"
+ */
+ public boolean hasMapLayerMetadata(FileMetadata fm) {
+ if (fm == null) {
+ return false;
+ }
+ if (fm.getDataFile() == null) {
+ return false;
+ }
+ if (mapLayerMetadataLookup == null) {
+ loadMapLayerMetadataLookup(fm.getDataFile().getOwner());
+ }
+ return doesDataFileHaveMapLayerMetadata(fm.getDataFile());
+ }
+
+ /**
+ * Check if a DataFile has an associated MapLayerMetadata object
+ *
+ * The MapLayerMetadata objects have been fetched at page inception by
+ * "loadMapLayerMetadataLookup()"
+ */
+ private boolean doesDataFileHaveMapLayerMetadata(DataFile df) {
+ if (df == null) {
+ return false;
+ }
+ if (df.getId() == null) {
+ return false;
+ }
+ return this.mapLayerMetadataLookup.containsKey(df.getId());
+ }
+
+
+ /**
+ * Create a hashmap consisting of { DataFile.id : MapLayerMetadata object}
+ *
+ * Very few DataFiles will have associated MapLayerMetadata objects so only
+ * use 1 query to get them
+ */
+ private void loadMapLayerMetadataLookup(Dataset dataset) {
+ mapLayerMetadataLookup = new HashMap<>();
+ if (dataset == null) {
+ return;
+ }
+ if (dataset.getId() == null) {
+ return;
+ }
+ List<MapLayerMetadata> mapLayerMetadataList = mapLayerMetadataService.getMapLayerMetadataForDataset(dataset);
+ if (mapLayerMetadataList == null) {
+ return;
+ }
+ for (MapLayerMetadata layer_metadata : mapLayerMetadataList) {
+ mapLayerMetadataLookup.put(layer_metadata.getDataFile().getId(), layer_metadata);
+ }
+
+ }// A DataFile may have a related MapLayerMetadata object
+
+
+ /**
+ * Check if this is a mappable file type.
+ *
+ * Currently (2/2016)
+ * - Shapefile (zipped shapefile)
+ * - Tabular file with Geospatial Data tag
+ *
+ * @param fm
+ * @return
+ */
+ private boolean isPotentiallyMappableFileType(FileMetadata fm){
+ if (fm==null){
+ return false;
+ }
+
+ // Yes, it's a shapefile
+ //
+ if (this.isShapefileType(fm)){
+ return true;
+ }
+
+ // Yes, it's tabular with a geospatial tag
+ //
+ if (fm.getDataFile().isTabularData()){
+ if (fm.getDataFile().hasGeospatialTag()){
+ return true;
+ }
+ }
+ return false;
+ }
+
+
+
+ public boolean isShapefileType(FileMetadata fm) {
+ if (fm == null) {
+ return false;
+ }
+ if (fm.getDataFile() == null) {
+ return false;
+ }
+
+ return fm.getDataFile().isShapefileType();
+ }
+
+
+ /**
+ * Call this from a Dataset or File page
+ * - calls private method canSeeMapButtonReminderToPublish
+ *
+ * WARNING: Assumes user isAuthenticated AND has Permission.EditDataset
+ * - These checks should be made on the DatasetPage or FilePage which calls this method
+ *
+ *
+ * @param fm FileMetadata
+ * @return boolean
+ */
+ public boolean canSeeMapButtonReminderToPublishFromPage(FileMetadata fm){
+ if (fm == null){
+ return false;
+ }
+
+ if (mapLayerMetadataLookup == null){
+ loadMapLayerMetadataLookup(fm.getDatasetVersion().getDataset());
+ }
+
+ return this.canSeeMapButtonReminderToPublish(fm, true);
+
+ }
+
+
+ /**
+ * Call this when using the API
+ * - calls private method canSeeMapButtonReminderToPublish
+ *
+ * @param fm
+ * @param user
+ * @return
+ */
+ public boolean canSeeMapButtonReminderToPublishFromAPI(FileMetadata fm, User user){
+ if (fm == null){
+ return false;
+ }
+ if (user==null){
+ return false;
+ }
+
+ if (!this.permissionService.userOn(user, fm.getDataFile().getOwner()).has(Permission.EditDataset)){
+ return false;
+ }
+
+ return this.canSeeMapButtonReminderToPublish(fm, true);
+
+ }
+
+
+
+ /**
+ * Assumes permissions have been checked!!
+ *
+ * See table in: https://github.com/IQSS/dataverse/issues/1618
+ *
+ * Can the user see a reminder to publish button?
+ * (1) Is the GeoconnectCreateEditMaps setting enabled?
+ * (2) Is this file a Shapefile or a Tabular file tagged as Geospatial?
+ * (3) Is this DataFile released? Yes, don't need reminder
+ * (4) Does a map already exist? Yes, don't need reminder
+ */
+ private boolean canSeeMapButtonReminderToPublish(FileMetadata fm, boolean permissionsChecked){
+ if (fm==null){
+ return false;
+ }
+
+ // Is this user authenticated with EditDataset permission?
+ //
+ if (!(isUserAuthenticatedWithEditDatasetPermission(fm))){
+ return false;
+ }
+
+ // This is only here as a reminder to the public method users
+ if (!permissionsChecked){
+ return false;
+ }
+
+ // (1) Is the GeoconnectCreateEditMaps setting enabled?
+ if (!settingsService.isTrueForKey(SettingsServiceBean.Key.GeoconnectCreateEditMaps, false)){
+ return false;
+ }
+
+
+ // (2) Is this file a Shapefile or a Tabular file tagged as Geospatial?
+ //
+ if (!(this.isPotentiallyMappableFileType(fm))){
+ return false;
+ }
+
+ // (3) Is this DataFile released? Yes, don't need reminder
+ //
+ if (fm.getDataFile().isReleased()){
+ return false;
+ }
+
+ // (4) Does a map already exist? Yes, don't need reminder
+ //
+ if (this.hasMapLayerMetadata(fm)){
+ return false;
+ }
+
+ // Looks good
+ //
+ return true;
+ }
+
+ /**
+ *
+ * WARNING: Assumes user isAuthenticated AND has Permission.EditDataset
+ * - These checks are made on the DatasetPage which calls this method
+ *
+ */
+ public boolean canUserSeeMapDataButtonFromPage(FileMetadata fm){
+
+ if (fm==null){
+ return false;
+ }
+
+ // Is this user authenticated with EditDataset permission?
+ //
+ if (!(isUserAuthenticatedWithEditDatasetPermission(fm))){
+ return false;
+ }
+ if (mapLayerMetadataLookup == null){
+ loadMapLayerMetadataLookup(fm.getDatasetVersion().getDataset());
+ }
+ if (this.hasMapLayerMetadata(fm)){
+ return false;
+ }
+ return this.canUserSeeMapDataButton(fm, true);
+ }
+
+
+
+ /**
+ * Call this when using the API
+ * - calls private method canUserSeeMapDataButton
+ *
+ * @param fm
+ * @param user
+ * @return
+ */
+ public boolean canUserSeeMapDataButtonFromAPI(FileMetadata fm, User user){
+ if (fm == null){
+ return false;
+ }
+ if (user==null){
+ return false;
+ }
+
+ if (!this.permissionService.userOn(user, fm.getDataFile().getOwner()).has(Permission.EditDataset)){
+ return false;
+ }
+
+ return this.canUserSeeMapDataButton(fm, true);
+
+ }
+
+ /**
+ *
+ * WARNING: Assumes user isAuthenticated AND has Permission.EditDataset
+ * - These checks are made on the DatasetPage which calls this method
+ *
+ * Should there be a Map Data Button for this file?
+ * see table in: https://github.com/IQSS/dataverse/issues/1618
+ * (1) Is the user logged in?
+ * (2) Is this file a Shapefile or a Tabular file tagged as Geospatial?
+ * (3) Does the logged in user have permission to edit the Dataset to which this FileMetadata belongs?
+ * (4) Is the create Edit Maps flag set to true?
+ * (5) Any of these conditions:
+ * (a) File Published
+ * (b) Draft: File Previously published
+ * @param fm FileMetadata
+ * @return boolean
+ */
+ private boolean canUserSeeMapDataButton(FileMetadata fm, boolean permissionsChecked){
+ if (fm==null){
+ return false;
+ }
+
+ // This is only here as a reminder to the public method users
+ if (!permissionsChecked){
+
+ return false;
+ }
+
+ // (1) Is this file a Shapefile or a Tabular file tagged as Geospatial?
+ //
+ if (!(this.isPotentiallyMappableFileType(fm))){
+
+ return false;
+ }
+
+
+ // (2) Is the GeoconnectCreateEditMaps setting enabled?
+ if (!settingsService.isTrueForKey(SettingsServiceBean.Key.GeoconnectCreateEditMaps, false)){
+
+ return false;
+ }
+
+ // (3) Is File released?
+ //
+ if (fm.getDataFile().isReleased()){
+
+ return true;
+ }
+
+ // Nope
+ return false;
+ }
+
+ private boolean isUserAuthenticatedWithEditDatasetPermission( FileMetadata fm){
+
+ // Is the user authenticated?
+ //
+ if (!(isSessionUserAuthenticated())){
+ return false;
+ }
+
+ // If so, can the logged in user edit the Dataset to which this FileMetadata belongs?
+ //
+ if (!this.doesSessionUserHavePermission(Permission.EditDataset, fm)){
+ return false;
+ }
+
+ return true;
+ }
+
+ public boolean isSessionUserAuthenticated() {
+
+
+ if (session == null) {
+ return false;
+ }
+
+ if (session.getUser() == null) {
+ return false;
+ }
+
+ return session.getUser().isAuthenticated();
+
+ }
+
+ private boolean doesSessionUserHavePermission(Permission permissionToCheck, FileMetadata fileMetadata){
+ if (permissionToCheck == null){
+ return false;
+ }
+
+ DvObject objectToCheck = null;
+
+ if (permissionToCheck.equals(Permission.EditDataset)){
+ objectToCheck = fileMetadata.getDatasetVersion().getDataset();
+ } else if (permissionToCheck.equals(Permission.DownloadFile)){
+ objectToCheck = fileMetadata.getDataFile();
+ }
+
+ if (objectToCheck == null){
+ return false;
+ }
+
+ if (this.session.getUser() == null){
+ return false;
+ }
+
+ if (this.permissionService == null){
+ return false;
+ }
+
+ String permName = permissionToCheck.getHumanName();
+
+ // Has this check already been done?
+ //
+ if (this.datasetPermissionMap.containsKey(permName)){
+ // Yes, return previous answer
+ return this.datasetPermissionMap.get(permName);
+ }
+
+ // Check the permission
+ //
+
+ boolean hasPermission = this.permissionService.userOn(this.session.getUser(), objectToCheck).has(permissionToCheck);
+
+ // Save the permission
+ this.datasetPermissionMap.put(permName, hasPermission);
+
+ // return true/false
+ return hasPermission;
+ }
+
+
+}
\ No newline at end of file
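Review note on the file above: `doesSessionUserHavePermission` memoizes each permission check in a per-page map keyed by the permission's human-readable name, so repeated renders don't hit `permissionService` twice. A minimal standalone sketch of that caching pattern (the `PermissionChecker` interface and all names here are illustrative stand-ins, not Dataverse API):

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of the memoized permission check used by doesSessionUserHavePermission().
// PermissionChecker stands in for permissionService.userOn(user, object).has(perm).
public class PermissionCache {

    public interface PermissionChecker {
        boolean has(String permissionName);
    }

    private final Map<String, Boolean> cache = new HashMap<>();
    private final PermissionChecker checker;
    int lookups = 0; // number of real (non-cached) checks, for illustration

    public PermissionCache(PermissionChecker checker) {
        this.checker = checker;
    }

    public boolean hasPermission(String permissionName) {
        // Has this check already been done? If so, return the saved answer.
        Boolean cached = cache.get(permissionName);
        if (cached != null) {
            return cached;
        }
        // Otherwise run the real check once and remember the result.
        lookups++;
        boolean result = checker.has(permissionName);
        cache.put(permissionName, result);
        return result;
    }

    public static void main(String[] args) {
        PermissionCache pc = new PermissionCache(name -> name.equals("DownloadFile"));
        System.out.println(pc.hasPermission("DownloadFile")); // true (real check)
        System.out.println(pc.hasPermission("DownloadFile")); // true (from cache)
        System.out.println(pc.lookups);                       // 1
    }
}
```

One design caveat worth noting in review: keying the cache by permission name alone, as `datasetPermissionMap` does, assumes all checks on a page target the same dataset/file; a key of (permission, object id) would be needed if checks against multiple objects were cached.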
diff --git a/src/main/java/edu/harvard/iq/dataverse/engine/command/impl/CreateDatasetCommand.java b/src/main/java/edu/harvard/iq/dataverse/engine/command/impl/CreateDatasetCommand.java
index 25d2231708a..3a8973c7845 100644
--- a/src/main/java/edu/harvard/iq/dataverse/engine/command/impl/CreateDatasetCommand.java
+++ b/src/main/java/edu/harvard/iq/dataverse/engine/command/impl/CreateDatasetCommand.java
@@ -100,6 +100,7 @@ public Dataset execute(CommandContext ctxt) throws CommandException {
String validationFailedString = "Validation failed:";
for (ConstraintViolation constraintViolation : constraintViolations) {
validationFailedString += " " + constraintViolation.getMessage();
+ validationFailedString += " Invalid value: '" + constraintViolation.getInvalidValue() + "'.";
}
throw new IllegalCommandException(validationFailedString, this);
}
diff --git a/src/main/java/edu/harvard/iq/dataverse/engine/command/impl/DestroyDatasetCommand.java b/src/main/java/edu/harvard/iq/dataverse/engine/command/impl/DestroyDatasetCommand.java
index 1f27ba0a199..12ac09af935 100644
--- a/src/main/java/edu/harvard/iq/dataverse/engine/command/impl/DestroyDatasetCommand.java
+++ b/src/main/java/edu/harvard/iq/dataverse/engine/command/impl/DestroyDatasetCommand.java
@@ -8,7 +8,6 @@
import edu.harvard.iq.dataverse.RoleAssignment;
import edu.harvard.iq.dataverse.authorization.Permission;
import edu.harvard.iq.dataverse.authorization.users.AuthenticatedUser;
-import edu.harvard.iq.dataverse.authorization.users.User;
import edu.harvard.iq.dataverse.engine.command.AbstractVoidCommand;
import edu.harvard.iq.dataverse.engine.command.CommandContext;
import edu.harvard.iq.dataverse.engine.command.DataverseRequest;
@@ -25,7 +24,7 @@
import java.util.logging.Logger;
/**
- * Same as {@link DeleteDatasetCommand}, but does not stop it the dataset is
+ * Same as {@link DeleteDatasetCommand}, but does not stop if the dataset is
* published. This command is reserved for super-users, if at all.
*
* @author michael
diff --git a/src/main/java/edu/harvard/iq/dataverse/engine/command/impl/RequestAccessCommand.java b/src/main/java/edu/harvard/iq/dataverse/engine/command/impl/RequestAccessCommand.java
new file mode 100644
index 00000000000..84fbe138e6d
--- /dev/null
+++ b/src/main/java/edu/harvard/iq/dataverse/engine/command/impl/RequestAccessCommand.java
@@ -0,0 +1,43 @@
+/*
+ * To change this license header, choose License Headers in Project Properties.
+ * To change this template file, choose Tools | Templates
+ * and open the template in the editor.
+ */
+package edu.harvard.iq.dataverse.engine.command.impl;
+
+import edu.harvard.iq.dataverse.DataFile;
+import edu.harvard.iq.dataverse.authorization.users.AuthenticatedUser;
+import edu.harvard.iq.dataverse.engine.command.AbstractCommand;
+import edu.harvard.iq.dataverse.engine.command.CommandContext;
+import edu.harvard.iq.dataverse.engine.command.DataverseRequest;
+import edu.harvard.iq.dataverse.engine.command.RequiredPermissions;
+import edu.harvard.iq.dataverse.engine.command.exception.CommandException;
+
+/**
+ *
+ * @author gdurand
+ */
+@RequiredPermissions({})
+public class RequestAccessCommand extends AbstractCommand<DataFile> {
+
+ private final DataFile file;
+ private final AuthenticatedUser requester;
+
+
+ public RequestAccessCommand(DataverseRequest dvRequest, DataFile file) {
+ // for data file check permission on owning dataset
+ super(dvRequest, file);
+ this.file = file;
+ this.requester = (AuthenticatedUser) dvRequest.getUser();
+ }
+
+ @Override
+ public DataFile execute(CommandContext ctxt) throws CommandException {
+ file.getFileAccessRequesters().add(requester);
+ return ctxt.files().save(file);
+ }
+
+
+
+}
+
diff --git a/src/main/java/edu/harvard/iq/dataverse/export/dublincore/DublinCoreExportUtil.java b/src/main/java/edu/harvard/iq/dataverse/export/dublincore/DublinCoreExportUtil.java
index 2ace4655d07..4d587d085b3 100644
--- a/src/main/java/edu/harvard/iq/dataverse/export/dublincore/DublinCoreExportUtil.java
+++ b/src/main/java/edu/harvard/iq/dataverse/export/dublincore/DublinCoreExportUtil.java
@@ -70,25 +70,28 @@ private static void dto2dublincore(DatasetDTO datasetDto, OutputStream outputStr
xmlw.writeAttribute("xmlns:dcterms", DCTERMS_XML_NAMESPACE);
xmlw.writeDefaultNamespace(DCTERMS_DEFAULT_NAMESPACE);
//xmlw.writeAttribute("xsi:schemaLocation", DCTERMS_DEFAULT_NAMESPACE+" "+DCTERMS_XML_SCHEMALOCATION);
+ createDC(xmlw, datasetDto, dcFlavor);
} else if (DC_FLAVOR_OAI.equals(dcFlavor)) {
xmlw.writeStartElement("oai_dc:dc");
xmlw.writeAttribute("xmlns:xsi", "http://www.w3.org/2001/XMLSchema-instance");
xmlw.writeAttribute("xmlns:oai_dc", OAI_DC_XML_NAMESPACE);
xmlw.writeAttribute("xmlns:dc", DC_XML_NAMESPACE);
xmlw.writeAttribute("xsi:schemaLocation", OAI_DC_XML_NAMESPACE+" "+OAI_DC_XML_SCHEMALOCATION);
- writeAttribute(xmlw, "version", DEFAULT_XML_VERSION);
+ //writeAttribute(xmlw, "version", DEFAULT_XML_VERSION);
+ createOAIDC(xmlw, datasetDto, dcFlavor);
}
- createDC(xmlw, datasetDto, dcFlavor);
+
xmlw.writeEndElement(); // or
xmlw.flush();
}
- //TODO:
+ //UPDATED by rmo-cdsp:
// If the requested flavor is "OAI_DC" (the minimal, original 15 field format),
- // we shuld NOT be exporting the extended, DCTERMS fields
+ // we should NOT be exporting the extended, DCTERMS fields (aka not createDC)
// - such as, for example, "dateSubmitted" ... (4.5.1?)
// -- L.A.
+ // but use createOAIDC instead (the minimal, original 15 field format)
private static void createDC(XMLStreamWriter xmlw, DatasetDTO datasetDto, String dcFlavor) throws XMLStreamException {
DatasetVersionDTO version = datasetDto.getDatasetVersion();
@@ -139,6 +142,42 @@ private static void createDC(XMLStreamWriter xmlw, DatasetDTO datasetDto, String
}
+ private static void createOAIDC(XMLStreamWriter xmlw, DatasetDTO datasetDto, String dcFlavor) throws XMLStreamException {
+ DatasetVersionDTO version = datasetDto.getDatasetVersion();
+ String persistentAgency = datasetDto.getProtocol();
+ String persistentAuthority = datasetDto.getAuthority();
+ String persistentId = datasetDto.getIdentifier();
+
+ writeFullElement(xmlw, dcFlavor+":"+"title", dto2Primitive(version, DatasetFieldConstant.title));
+
+ xmlw.writeStartElement(dcFlavor+":"+"identifier");
+ xmlw.writeCharacters(persistentAgency + ":" + persistentAuthority + "/" + persistentId);
+ xmlw.writeEndElement(); // dcterms:identifier
+
+ writeAuthorsElement(xmlw, version, dcFlavor); //creator
+
+ writeFullElement(xmlw, dcFlavor+":"+"publisher", datasetDto.getPublisher());
+
+ writeAbstractElement(xmlw, version, dcFlavor); // Description
+ writeSubjectElement(xmlw, version, dcFlavor); //Subjects and Key Words
+
+ writeFullElementList(xmlw, dcFlavor+":"+"language", dto2PrimitiveList(version, DatasetFieldConstant.language));
+
+ writeFullElement(xmlw, dcFlavor+":"+"date", dto2Primitive(version, DatasetFieldConstant.productionDate));
+
+ writeFullElement(xmlw, dcFlavor+":"+"contributor", dto2Primitive(version, DatasetFieldConstant.depositor));
+
+ writeContributorElement(xmlw, version, dcFlavor);
+
+ writeFullElementList(xmlw, dcFlavor+":"+"relation", dto2PrimitiveList(version, DatasetFieldConstant.relatedDatasets));
+
+ writeFullElementList(xmlw, dcFlavor+":"+"type", dto2PrimitiveList(version, DatasetFieldConstant.kindOfData));
+
+ writeFullElementList(xmlw, dcFlavor+":"+"source", dto2PrimitiveList(version, DatasetFieldConstant.dataSources));
+
+
+ }
+
private static void writeAuthorsElement(XMLStreamWriter xmlw, DatasetVersionDTO datasetVersionDTO, String dcFlavor) throws XMLStreamException {
for (Map.Entry entry : datasetVersionDTO.getMetadataBlocks().entrySet()) {
diff --git a/src/main/java/edu/harvard/iq/dataverse/harvest/client/FastGetRecord.java b/src/main/java/edu/harvard/iq/dataverse/harvest/client/FastGetRecord.java
index 6acdaf06102..742771ef9a5 100644
--- a/src/main/java/edu/harvard/iq/dataverse/harvest/client/FastGetRecord.java
+++ b/src/main/java/edu/harvard/iq/dataverse/harvest/client/FastGetRecord.java
@@ -72,7 +72,16 @@
public class FastGetRecord {
- private static String DATAVERSE_EXTENDED_METADATA = "dataverse_json";
+ private static final String DATAVERSE_EXTENDED_METADATA = "dataverse_json";
+ private static final String XML_METADATA_TAG = "metadata";
+ private static final String XML_METADATA_TAG_OPEN = "<"+XML_METADATA_TAG+">";
+ private static final String XML_METADATA_TAG_CLOSE = "</"+XML_METADATA_TAG+">";
+ private static final String XML_OAI_PMH_CLOSING_TAGS = "</record></GetRecord></OAI-PMH>";
+ private static final String XML_XMLNS_XSI_ATTRIBUTE_TAG = "xmlns:xsi=";
+ private static final String XML_XMLNS_XSI_ATTRIBUTE = " "+XML_XMLNS_XSI_ATTRIBUTE_TAG+"\"http://www.w3.org/2001/XMLSchema-instance\">";
+ private static final String XML_COMMENT_START = "<!--";
+ private static final String XML_COMMENT_END = "-->";
+
/**
* Client-side GetRecord verb constructor
*
@@ -186,20 +195,34 @@ public void harvestRecord(String baseURL, String identifier, String metadataPref
while ( ( line = rd.readLine () ) != null) {
if (!metadataFlag) {
- if (line.matches(".*<metadata>.*")) {
+ if (line.matches(".*"+XML_METADATA_TAG_OPEN+".*")) {
String lineCopy = line;
- int i = line.indexOf("<metadata>");
- line = line.substring(i+10);
+ int i = line.indexOf(XML_METADATA_TAG_OPEN);
+ if (line.length() > i + XML_METADATA_TAG_OPEN.length()) {
+ line = line.substring(i + XML_METADATA_TAG_OPEN.length());
+ // TODO: check if there's anything useful (non-white space, etc.)
+ // in the remaining part of the line?
+ if ((i = line.indexOf('<')) > -1) {
+ if (i > 0) {
+ line = line.substring(i);
+ }
+ } else {
+ line = null;
+ }
+
+ } else {
+ line = null;
+ }
- oaiResponseHeader = oaiResponseHeader.concat(lineCopy.replaceAll("<metadata>.*", "<metadata></metadata></record></GetRecord></OAI-PMH>"));
+ oaiResponseHeader = oaiResponseHeader.concat(lineCopy.replaceAll(XML_METADATA_TAG_OPEN+".*", XML_METADATA_TAG_OPEN+XML_METADATA_TAG_CLOSE+XML_OAI_PMH_CLOSING_TAGS));
tempFileStream = new FileOutputStream(savedMetadataFile);
metadataOut = new PrintWriter (tempFileStream, true);
//metadataOut.println(""); /* ? */
metadataFlag = true;
- } else if (line.matches(".*<metadata [^>]*>.*")) {
+ } else if (line.matches(".*<"+XML_METADATA_TAG+" [^>]*>.*")) {
if (metadataPrefix.equals(DATAVERSE_EXTENDED_METADATA)) {
oaiResponseHeader = oaiResponseHeader.concat(line);
metadataWritten = true;
@@ -207,7 +230,10 @@ public void harvestRecord(String baseURL, String identifier, String metadataPref
}
}
}
+
+ //System.out.println(line);
+ if (line != null) {
if (metadataFlag) {
if (!metadataWritten) {
// Inside an OAI-PMH GetRecord response, the metadata
@@ -224,26 +250,26 @@ public void harvestRecord(String baseURL, String identifier, String metadataPref
// significant.
// -- L.A.
- while ((i = line.indexOf("<metadata", i)) > -1) {
- if (!line.substring(i).matches("^<metadata[^>]*/")) {
+ while ((i = line.indexOf("<"+XML_METADATA_TAG, i)) > -1) {
+ if (!line.substring(i).matches("^<"+XML_METADATA_TAG+"[^>]*/")) {
// don't count if it's a closed, empty tag:
//
mopen++;
}
- i+=10;
+ i+=XML_METADATA_TAG_OPEN.length();
}
}
- if (line.matches(".*</metadata>.*")) {
+ if (line.matches(".*"+XML_METADATA_TAG_CLOSE+".*")) {
int i = 0;
- while ((i = line.indexOf("</metadata>", i)) > -1) {
- i+=11;
+ while ((i = line.indexOf(XML_METADATA_TAG_CLOSE, i)) > -1) {
+ i+=XML_METADATA_TAG_CLOSE.length();
mclose++;
}
if ( mclose > mopen ) {
- line = line.substring(0, line.lastIndexOf("</metadata>"));
+ line = line.substring(0, line.lastIndexOf(XML_METADATA_TAG_CLOSE));
metadataWritten = true;
}
}
@@ -262,10 +288,13 @@ public void harvestRecord(String baseURL, String identifier, String metadataPref
// the first "real" XML element (of the form
// ). So we need to skip these!
- while ( (line.indexOf('<', offset) > -1)
- &&
- "",offset)) < 0)) {
+ ((offset = line.indexOf(XML_COMMENT_END,offset)) < 0)) {
line = line.replaceAll("[\n\r]", " ");
offset = line.length();
line = line.concat(rd.readLine());
}
- offset += 3;
+ offset += XML_COMMENT_END.length();
}
// if we have skipped some comments, is there another
@@ -319,10 +348,11 @@ public void harvestRecord(String baseURL, String identifier, String metadataPref
int i = firstElementStart;
- if (!line.substring(i).matches("^<[^>]*xmlns.*")) {
+ if (!line.substring(i).matches("^<[^>]*"+XML_XMLNS_XSI_ATTRIBUTE_TAG+".*")) {
String head = line.substring(0, i);
String tail = line.substring(i);
- tail = tail.replaceFirst(">", " xmlns=\"http://www.openarchives.org/OAI/2.0/\" xmlns:xsi=\"http://www.w3.org/2001/XMLSchema-instance\">");
+ //tail = tail.replaceFirst(">", " xmlns=\"http://www.openarchives.org/OAI/2.0/\" xmlns:xsi=\"http://www.w3.org/2001/XMLSchema-instance\">");
+ tail = tail.replaceFirst(">", XML_XMLNS_XSI_ATTRIBUTE);
line = head + tail;
}
@@ -340,6 +370,7 @@ public void harvestRecord(String baseURL, String identifier, String metadataPref
} else {
oaiResponseHeader = oaiResponseHeader.concat(line);
}
+ }
}
// parse the OAI Record header:
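Review context for the hunks above: the `mopen`/`mclose` bookkeeping counts `<metadata>` open and close tags on each line so `FastGetRecord` can tell, without a full XML parse, when the embedded metadata document ends. A standalone sketch of that per-line counting step (class and method names are illustrative, not part of the patch):

```java
// Sketch of the per-line <metadata> tag counting done in FastGetRecord.harvestRecord():
// count opening tags (skipping self-closing <metadata/>) and closing tags, so the
// caller can detect when more close tags than open tags appear on a line.
public class MetadataTagCounter {

    static final String OPEN = "<metadata";
    static final String CLOSE = "</metadata>";

    // Count <metadata ...> open tags, ignoring closed empty tags like <metadata/>.
    public static int countOpen(String line) {
        int count = 0;
        int i = 0;
        while ((i = line.indexOf(OPEN, i)) > -1) {
            if (!line.substring(i).matches("^<metadata[^>]*/.*")) {
                count++;
            }
            i += OPEN.length();
        }
        return count;
    }

    // Count </metadata> close tags.
    public static int countClose(String line) {
        int count = 0;
        int i = 0;
        while ((i = line.indexOf(CLOSE, i)) > -1) {
            count++;
            i += CLOSE.length();
        }
        return count;
    }

    public static void main(String[] args) {
        System.out.println(countOpen("<metadata xmlns=\"x\"><dc/>")); // 1
        System.out.println(countOpen("<metadata/>"));                 // 0
        System.out.println(countClose("a</metadata>b</metadata>"));   // 2
    }
}
```

Note that `String.matches` requires a whole-string match, which is why the sketch appends `.*` to the self-closing-tag pattern.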
diff --git a/src/main/java/edu/harvard/iq/dataverse/harvest/client/HarvesterServiceBean.java b/src/main/java/edu/harvard/iq/dataverse/harvest/client/HarvesterServiceBean.java
index f5e1a4ca976..4d546d57eea 100644
--- a/src/main/java/edu/harvard/iq/dataverse/harvest/client/HarvesterServiceBean.java
+++ b/src/main/java/edu/harvard/iq/dataverse/harvest/client/HarvesterServiceBean.java
@@ -50,6 +50,8 @@
import edu.harvard.iq.dataverse.harvest.client.oai.OaiHandler;
import edu.harvard.iq.dataverse.harvest.client.oai.OaiHandlerException;
import edu.harvard.iq.dataverse.search.IndexServiceBean;
+import java.io.FileWriter;
+import java.io.PrintWriter;
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;
@@ -151,10 +153,14 @@ public void doHarvest(DataverseRequest dataverseRequest, Long harvestingClientId
MutableBoolean harvestErrorOccurred = new MutableBoolean(false);
String logTimestamp = logFormatter.format(new Date());
Logger hdLogger = Logger.getLogger("edu.harvard.iq.dataverse.harvest.client.HarvesterServiceBean." + harvestingDataverse.getAlias() + logTimestamp);
- String logFileName = "../logs" + File.separator + "harvest_" + harvestingClientConfig.getName() + logTimestamp + ".log";
+ String logFileName = "../logs" + File.separator + "harvest_" + harvestingClientConfig.getName() + "_" + logTimestamp + ".log";
FileHandler fileHandler = new FileHandler(logFileName);
hdLogger.setUseParentHandlers(false);
hdLogger.addHandler(fileHandler);
+
+ PrintWriter importCleanupLog = new PrintWriter(new FileWriter( "../logs/harvest_cleanup_" + harvestingClientConfig.getName() + "_" + logTimestamp+".txt"));
+
+
List<Long> harvestedDatasetIds = null;
List<Long> harvestedDatasetIdsThisBatch = new ArrayList<Long>();
@@ -177,7 +183,7 @@ public void doHarvest(DataverseRequest dataverseRequest, Long harvestingClientId
if (harvestingClientConfig.isOai()) {
- harvestedDatasetIds = harvestOAI(dataverseRequest, harvestingClientConfig, hdLogger, harvestErrorOccurred, failedIdentifiers, deletedIdentifiers, harvestedDatasetIdsThisBatch);
+ harvestedDatasetIds = harvestOAI(dataverseRequest, harvestingClientConfig, hdLogger, importCleanupLog, harvestErrorOccurred, failedIdentifiers, deletedIdentifiers, harvestedDatasetIdsThisBatch);
} else {
throw new IOException("Unsupported harvest type");
@@ -221,6 +227,7 @@ public void doHarvest(DataverseRequest dataverseRequest, Long harvestingClientId
harvestingClientService.resetHarvestInProgress(harvestingClientId);
fileHandler.close();
hdLogger.removeHandler(fileHandler);
+ importCleanupLog.close();
}
}
@@ -231,7 +238,7 @@ public void doHarvest(DataverseRequest dataverseRequest, Long harvestingClientId
* @param harvestErrorOccurred have we encountered any errors during harvest?
* @param failedIdentifiers Study Identifiers for failed "GetRecord" requests
*/
- private List<Long> harvestOAI(DataverseRequest dataverseRequest, HarvestingClient harvestingClient, Logger hdLogger, MutableBoolean harvestErrorOccurred, List<String> failedIdentifiers, List<String> deletedIdentifiers, List<Long> harvestedDatasetIdsThisBatch)
+ private List<Long> harvestOAI(DataverseRequest dataverseRequest, HarvestingClient harvestingClient, Logger hdLogger, PrintWriter importCleanupLog, MutableBoolean harvestErrorOccurred, List<String> failedIdentifiers, List<String> deletedIdentifiers, List<Long> harvestedDatasetIdsThisBatch)
throws IOException, ParserConfigurationException, SAXException, TransformerException {
logBeginOaiHarvest(hdLogger, harvestingClient);
@@ -262,7 +269,7 @@ private List harvestOAI(DataverseRequest dataverseRequest, HarvestingClien
MutableBoolean getRecordErrorOccurred = new MutableBoolean(false);
// Retrieve and process this record with a separate GetRecord call:
- Long datasetId = processRecord(dataverseRequest, hdLogger, oaiHandler, identifier, getRecordErrorOccurred, processedSizeThisBatch, deletedIdentifiers);
+ Long datasetId = processRecord(dataverseRequest, hdLogger, importCleanupLog, oaiHandler, identifier, getRecordErrorOccurred, processedSizeThisBatch, deletedIdentifiers);
hdLogger.info("Total content processed in this batch so far: "+processedSizeThisBatch);
if (datasetId != null) {
@@ -278,6 +285,8 @@ private List harvestOAI(DataverseRequest dataverseRequest, HarvestingClien
if (getRecordErrorOccurred.booleanValue() == true) {
failedIdentifiers.add(identifier);
harvestErrorOccurred.setValue(true);
+ //temporary:
+ //throw new IOException("Exception occured, stopping harvest");
}
// reindexing in batches? - this is from DVN 3;
@@ -307,7 +316,7 @@ private List harvestOAI(DataverseRequest dataverseRequest, HarvestingClien
@TransactionAttribute(TransactionAttributeType.NOT_SUPPORTED)
- public Long processRecord(DataverseRequest dataverseRequest, Logger hdLogger, OaiHandler oaiHandler, String identifier, MutableBoolean recordErrorOccurred, MutableLong processedSizeThisBatch, List<String> deletedIdentifiers) {
+ public Long processRecord(DataverseRequest dataverseRequest, Logger hdLogger, PrintWriter importCleanupLog, OaiHandler oaiHandler, String identifier, MutableBoolean recordErrorOccurred, MutableLong processedSizeThisBatch, List<String> deletedIdentifiers) {
String errMessage = null;
Dataset harvestedDataset = null;
logGetRecord(hdLogger, oaiHandler, identifier);
@@ -334,15 +343,16 @@ public Long processRecord(DataverseRequest dataverseRequest, Logger hdLogger, Oa
}
} else {
- hdLogger.fine("Successfully retrieved GetRecord response.");
+ hdLogger.info("Successfully retrieved GetRecord response.");
tempFile = record.getMetadataFile();
+ PrintWriter cleanupLog;
harvestedDataset = importService.doImportHarvestedDataset(dataverseRequest,
oaiHandler.getHarvestingClient(),
identifier,
oaiHandler.getMetadataPrefix(),
record.getMetadataFile(),
- null);
+ importCleanupLog);
hdLogger.fine("Harvest Successful for identifier " + identifier);
hdLogger.fine("Size of this record: " + record.getMetadataFile().length());
@@ -355,7 +365,10 @@ public Long processRecord(DataverseRequest dataverseRequest, Logger hdLogger, Oa
} finally {
if (tempFile != null) {
- try{tempFile.delete();}catch(Throwable t){};
+ // temporary - let's not delete the temp metadata file if anything went wrong, for now:
+ if (errMessage == null) {
+ try{tempFile.delete();}catch(Throwable t){};
+ }
}
}
@@ -445,6 +458,9 @@ public void logGetRecordException(Logger hdLogger, OaiHandler oaiHandler, String
+e.getMessage();
hdLogger.log(Level.SEVERE, errMessage);
+
+ // temporary:
+ e.printStackTrace();
}
diff --git a/src/main/java/edu/harvard/iq/dataverse/harvest/client/HarvestingClient.java b/src/main/java/edu/harvard/iq/dataverse/harvest/client/HarvestingClient.java
index ee2ddb8bd12..12d3ebac6f3 100644
--- a/src/main/java/edu/harvard/iq/dataverse/harvest/client/HarvestingClient.java
+++ b/src/main/java/edu/harvard/iq/dataverse/harvest/client/HarvestingClient.java
@@ -93,8 +93,8 @@ public void setId(Long id) {
public static final String HARVEST_STYLE_DESCRIPTION_DEFAULT="Generic OAI resource (DC)";
- public static final List<String> HARVEST_STYLE_LIST = Arrays.asList(HARVEST_STYLE_DATAVERSE, HARVEST_STYLE_VDC, HARVEST_STYLE_NESSTAR, HARVEST_STYLE_ROPER, HARVEST_STYLE_HGL, HARVEST_STYLE_DEFAULT);
- public static final List<String> HARVEST_STYLE_DESCRIPTION_LIST = Arrays.asList(HARVEST_STYLE_DESCRIPTION_DATAVERSE, HARVEST_STYLE_DESCRIPTION_VDC, HARVEST_STYLE_DESCRIPTION_NESSTAR, HARVEST_STYLE_DESCRIPTION_ROPER, HARVEST_STYLE_DESCRIPTION_HGL, HARVEST_STYLE_DESCRIPTION_DEFAULT);
+ public static final List<String> HARVEST_STYLE_LIST = Arrays.asList(HARVEST_STYLE_DATAVERSE, HARVEST_STYLE_VDC, HARVEST_STYLE_ICPSR, HARVEST_STYLE_NESSTAR, HARVEST_STYLE_ROPER, HARVEST_STYLE_HGL, HARVEST_STYLE_DEFAULT);
+ public static final List<String> HARVEST_STYLE_DESCRIPTION_LIST = Arrays.asList(HARVEST_STYLE_DESCRIPTION_DATAVERSE, HARVEST_STYLE_DESCRIPTION_VDC, HARVEST_STYLE_DESCRIPTION_ICPSR, HARVEST_STYLE_DESCRIPTION_NESSTAR, HARVEST_STYLE_DESCRIPTION_ROPER, HARVEST_STYLE_DESCRIPTION_HGL, HARVEST_STYLE_DESCRIPTION_DEFAULT);
public static final Map<String, String> HARVEST_STYLE_INFOMAP = new LinkedHashMap<String, String>();
diff --git a/src/main/java/edu/harvard/iq/dataverse/harvest/server/OAIRecordServiceBean.java b/src/main/java/edu/harvard/iq/dataverse/harvest/server/OAIRecordServiceBean.java
index 99fb4f0c316..28322068519 100644
--- a/src/main/java/edu/harvard/iq/dataverse/harvest/server/OAIRecordServiceBean.java
+++ b/src/main/java/edu/harvard/iq/dataverse/harvest/server/OAIRecordServiceBean.java
@@ -31,6 +31,7 @@
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;
import javax.persistence.Query;
+import javax.persistence.TemporalType;
/**
*
@@ -303,18 +304,45 @@ public List findOaiRecordsBySetName(String setName) {
public List findOaiRecordsBySetName(String setName, Date from, Date until) {
- String queryString ="SELECT object(h) from OAIRecord as h";
- queryString += setName != null ? " where h.setName = :setName" : ""; // where h.setName is null";
+ String queryString ="SELECT object(h) from OAIRecord as h where h.id is not null";
+ queryString += setName != null ? " and h.setName = :setName" : ""; // where h.setName is null";
queryString += from != null ? " and h.lastUpdateTime >= :from" : "";
- queryString += until != null ? " and h.lastUpdateTime <= :until" : "";
+ queryString += until != null ? " and h.lastUpdateTime<=:until" : "";
logger.fine("Query: "+queryString);
Query query = em.createQuery(queryString);
if (setName != null) { query.setParameter("setName",setName); }
- if (from != null) { query.setParameter("from",from); }
- if (until != null) { query.setParameter("until",until); }
+ if (from != null) { query.setParameter("from",from,TemporalType.TIMESTAMP); }
+ // In order to achieve inclusivity on the "until" matching, we need to do
+ // the following (if the "until" parameter is supplied):
+ // 1) if the supplied "until" parameter has the time portion (and is not just
+ // a date), we'll increment it by one second. This is because the time stamps we
+ // keep in the database also have fractional thousands of a second.
+ // So, a record may be shown as "T17:35:45", but in the database it is
+ // actually "17:35:45.356", so "<= 17:35:45" isn't going to work on this
+ // time stamp! - So we want to try "<= 17:35:45" instead.
+ // 2) if it's just a date, we'll increment it by a *full day*. Otherwise
+ // our database time stamp of 2016-10-23T17:35:45.123Z is NOT going to
+ // match " <= 2016-10-23" - which is really going to be interpreted as
+ // "2016-10-23T00:00:00.000".
+ // -- L.A. 4.6
+ if (until != null) {
+ // 24 * 3600 * 1000 = number of milliseconds in a day.
+
+ if (until.getTime() % (24 * 3600 * 1000) == 0) {
+ // The supplied "until" parameter is a date, with no time
+ // portion.
+ logger.fine("plain date. incrementing by one day");
+ until.setTime(until.getTime()+(24 * 3600 * 1000));
+ } else {
+ logger.fine("date and time. incrementing by one second");
+ until.setTime(until.getTime()+1000);
+ }
+ query.setParameter("until",until,TemporalType.TIMESTAMP);
+ }
+
try {
return query.getResultList();
} catch (Exception ex) {
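The "until" adjustment above hinges on one observation: an epoch-millisecond value that divides evenly by the number of milliseconds in a day carries no time-of-day portion. A standalone sketch of that branch (hypothetical `UntilDemo` class; the millisecond values are illustrative and assume UTC, since a timezone offset would shift the midnight boundary):

```java
public class UntilDemo {
    static final long DAY_MS = 24 * 3600 * 1000L; // milliseconds in a day

    // Mirrors the branch above: a plain date advances by a full day,
    // a date+time advances by one second, so "<=" also matches
    // timestamps with fractional seconds (e.g. 17:35:45.356).
    static long inclusiveUntil(long untilMs) {
        if (untilMs % DAY_MS == 0) {
            return untilMs + DAY_MS; // date only: include the whole day
        }
        return untilMs + 1000L;      // date and time: add one second
    }

    public static void main(String[] args) {
        long date = 1477180800000L;       // 2016-10-23T00:00:00.000Z
        long dateTime = date + 63345000L; // 2016-10-23T17:35:45.000Z
        System.out.println(inclusiveUntil(date) - date);         // 86400000
        System.out.println(inclusiveUntil(dateTime) - dateTime); // 1000
    }
}
```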
diff --git a/src/main/java/edu/harvard/iq/dataverse/harvest/server/web/servlet/OAIServlet.java b/src/main/java/edu/harvard/iq/dataverse/harvest/server/web/servlet/OAIServlet.java
index b3b50c13d26..da43fccf744 100644
--- a/src/main/java/edu/harvard/iq/dataverse/harvest/server/web/servlet/OAIServlet.java
+++ b/src/main/java/edu/harvard/iq/dataverse/harvest/server/web/servlet/OAIServlet.java
@@ -6,31 +6,25 @@
package edu.harvard.iq.dataverse.harvest.server.web.servlet;
import com.lyncode.xml.exceptions.XmlWriteException;
-import com.lyncode.xoai.dataprovider.DataProvider;
import com.lyncode.xoai.dataprovider.builder.OAIRequestParametersBuilder;
-import com.lyncode.xoai.dataprovider.exceptions.BadArgumentException;
-import com.lyncode.xoai.dataprovider.exceptions.DuplicateDefinitionException;
-import com.lyncode.xoai.dataprovider.exceptions.IllegalVerbException;
import com.lyncode.xoai.dataprovider.exceptions.OAIException;
-import com.lyncode.xoai.dataprovider.exceptions.UnknownParameterException;
import com.lyncode.xoai.dataprovider.repository.Repository;
import com.lyncode.xoai.dataprovider.repository.RepositoryConfiguration;
import com.lyncode.xoai.dataprovider.model.Context;
import com.lyncode.xoai.dataprovider.model.MetadataFormat;
import com.lyncode.xoai.services.impl.SimpleResumptionTokenFormat;
-import static com.lyncode.xoai.dataprovider.model.MetadataFormat.identity;
-import com.lyncode.xoai.dataprovider.parameters.OAICompiledRequest;
-import static com.lyncode.xoai.dataprovider.parameters.OAIRequest.Parameter.MetadataPrefix;
import com.lyncode.xoai.dataprovider.repository.ItemRepository;
import com.lyncode.xoai.dataprovider.repository.SetRepository;
-import com.lyncode.xoai.exceptions.InvalidResumptionTokenException;
import com.lyncode.xoai.model.oaipmh.DeletedRecord;
-import com.lyncode.xoai.model.oaipmh.GetRecord;
import com.lyncode.xoai.model.oaipmh.Granularity;
import com.lyncode.xoai.model.oaipmh.OAIPMH;
+import static com.lyncode.xoai.model.oaipmh.OAIPMH.NAMESPACE_URI;
+import static com.lyncode.xoai.model.oaipmh.OAIPMH.SCHEMA_LOCATION;
+import com.lyncode.xoai.model.oaipmh.Verb;
+import com.lyncode.xoai.xml.XSISchema;
import com.lyncode.xoai.xml.XmlWriter;
-import edu.harvard.iq.dataverse.Dataset;
+import static com.lyncode.xoai.xml.XmlWriter.defaultContext;
import edu.harvard.iq.dataverse.DatasetServiceBean;
import edu.harvard.iq.dataverse.DataverseServiceBean;
import edu.harvard.iq.dataverse.export.ExportException;
@@ -38,16 +32,16 @@
import edu.harvard.iq.dataverse.export.spi.Exporter;
import edu.harvard.iq.dataverse.harvest.server.OAIRecordServiceBean;
import edu.harvard.iq.dataverse.harvest.server.OAISetServiceBean;
-import edu.harvard.iq.dataverse.harvest.server.web.XOAIItemRepository;
-import edu.harvard.iq.dataverse.harvest.server.web.XOAISetRepository;
-import edu.harvard.iq.dataverse.harvest.server.web.xMetadata;
+import edu.harvard.iq.dataverse.harvest.server.xoai.XdataProvider;
+import edu.harvard.iq.dataverse.harvest.server.xoai.XgetRecord;
+import edu.harvard.iq.dataverse.harvest.server.xoai.XitemRepository;
+import edu.harvard.iq.dataverse.harvest.server.xoai.XsetRepository;
+import edu.harvard.iq.dataverse.harvest.server.xoai.XlistRecords;
import edu.harvard.iq.dataverse.settings.SettingsServiceBean;
import edu.harvard.iq.dataverse.util.SystemConfig;
-import java.io.File;
-import java.io.FileInputStream;
+import java.io.ByteArrayOutputStream;
import java.io.IOException;
-import java.io.InputStream;
import java.io.OutputStream;
import java.io.OutputStreamWriter;
import java.io.Writer;
@@ -89,11 +83,12 @@ public class OAIServlet extends HttpServlet {
private static final Logger logger = Logger.getLogger("edu.harvard.iq.dataverse.harvest.server.web.servlet.OAIServlet");
protected HashMap attributesMap = new HashMap();
- private static boolean debug = false;
+ private static final String OAI_PMH = "OAI-PMH";
+ private static final String RESPONSEDATE_FIELD = "responseDate";
+ private static final String REQUEST_FIELD = "request";
private static final String DATAVERSE_EXTENDED_METADATA_FORMAT = "dataverse_json";
private static final String DATAVERSE_EXTENDED_METADATA_INFO = "Custom Dataverse metadata in JSON format (Dataverse4 to Dataverse4 harvesting only)";
private static final String DATAVERSE_EXTENDED_METADATA_SCHEMA = "JSON schema pending";
- private static final String DATAVERSE_EXTENDED_METADATA_API = "/api/datasets/export";
private Context xoaiContext;
@@ -101,7 +96,7 @@ public class OAIServlet extends HttpServlet {
private ItemRepository itemRepository;
private RepositoryConfiguration repositoryConfiguration;
private Repository xoaiRepository;
- private DataProvider dataProvider;
+ private XdataProvider dataProvider;
public void init(ServletConfig config) throws ServletException {
super.init(config);
@@ -112,8 +107,8 @@ public void init(ServletConfig config) throws ServletException {
xoaiContext = addDataverseJsonMetadataFormat(xoaiContext);
}
- setRepository = new XOAISetRepository(setService);
- itemRepository = new XOAIItemRepository(recordService);
+ setRepository = new XsetRepository(setService);
+ itemRepository = new XitemRepository(recordService, datasetService);
repositoryConfiguration = createRepositoryConfiguration();
@@ -123,7 +118,7 @@ public void init(ServletConfig config) throws ServletException {
.withResumptionTokenFormatter(new SimpleResumptionTokenFormat())
.withConfiguration(repositoryConfiguration);
- dataProvider = new DataProvider(getXoaiContext(), getXoaiRepository());
+ dataProvider = new XdataProvider(getXoaiContext(), getXoaiRepository());
}
private Context createContext() {
@@ -199,7 +194,21 @@ private RepositoryConfiguration createRepositoryConfiguration() {
return repositoryConfiguration;
}
-
+ /**
+ * Handles the HTTP POST method.
+ *
+ * @param request servlet request
+ * @param response servlet response
+ * @throws ServletException if a servlet-specific error occurs
+ * @throws IOException if an I/O error occurs
+ */
+ @Override
+ protected void doPost(HttpServletRequest request, HttpServletResponse response)
+ throws ServletException, IOException {
+ processRequest(request, response);
+ }
+
+
/**
* Handles the HTTP GET method.
*
@@ -211,7 +220,12 @@ private RepositoryConfiguration createRepositoryConfiguration() {
@Override
protected void doGet(HttpServletRequest request, HttpServletResponse response)
throws ServletException, IOException {
-
+ processRequest(request, response);
+ }
+
+
+ private void processRequest(HttpServletRequest request, HttpServletResponse response)
+ throws ServletException, IOException {
try {
if (!isHarvestingServerEnabled()) {
@@ -226,18 +240,17 @@ protected void doGet(HttpServletRequest request, HttpServletResponse response)
for (Object p : request.getParameterMap().keySet()) {
String parameterName = (String)p;
String parameterValue = request.getParameter(parameterName);
-
parametersBuilder = parametersBuilder.with(parameterName, parameterValue);
+
}
- logger.fine("executing dataProvider.handle():");
OAIPMH handle = dataProvider.handle(parametersBuilder);
- logger.fine("executed dataProvider.handle().");
response.setContentType("text/xml;charset=UTF-8");
-
- if (isGetRecord(request)) {
- String formatName = parametersBuilder.build().get(MetadataPrefix);
- writeGetRecord(response, handle, formatName);
+
+ if (isGetRecord(request) && !handle.hasErrors()) {
+ writeGetRecord(response, handle);
+ } else if (isListRecords(request) && !handle.hasErrors()) {
+ writeListRecords(response, handle);
} else {
XmlWriter xmlWriter = new XmlWriter(response.getOutputStream());
xmlWriter.write(handle);
@@ -263,140 +276,108 @@ protected void doGet(HttpServletRequest request, HttpServletResponse response)
}
}
+
+ // Custom methods for the potentially expensive GetRecord and ListRecords requests:
+
+ private void writeListRecords(HttpServletResponse response, OAIPMH handle) throws IOException {
+ OutputStream outputStream = response.getOutputStream();
- private void writeGetRecord(HttpServletResponse response, OAIPMH handle, String formatName) throws IOException, XmlWriteException, XMLStreamException {
- // TODO:
- // produce clean failure records when proper record cannot be
- // produced for some reason.
-
- String responseBody = XmlWriter.toString(handle);
+ outputStream.write(oaiPmhResponseToString(handle).getBytes());
- responseBody = responseBody.replaceFirst(" ", "");
- outputStream.write(responseBody.getBytes());
- outputStream.flush();
-
- writeMetadataStream(inputStream, outputStream);
- } else {
- // Custom Dataverse metadata extension:
- // (check again if the client has explicitly requested/advertised support
- // of the extensions?)
-
- responseBody = responseBody.concat(customMetadataExtensionAttribute(identifier)+">");
- outputStream.write(responseBody.getBytes());
- outputStream.flush();
+ Verb verb = handle.getVerb();
+
+ if (verb == null) {
+ throw new IOException("An error or a valid response must be set");
+ }
+
+ if (!verb.getType().equals(Verb.Type.ListRecords)) {
+ throw new IOException("writeListRecords() called on a non-ListRecords verb");
}
-
+ outputStream.write(("<" + verb.getType().displayName() + ">").getBytes());
+
+ outputStream.flush();
+
+ ((XlistRecords) verb).writeToStream(outputStream);
+
+ outputStream.write(("</" + verb.getType().displayName() + ">").getBytes());
+ outputStream.write(("</" + OAI_PMH + ">\n").getBytes());
- String responseFooter = " ";
- outputStream.write(responseFooter.getBytes());
outputStream.flush();
outputStream.close();
-
-
- }
-
- private String customMetadataExtensionAttribute(String identifier) {
- String ret = " directApiCall=\""
- + systemConfig.getDataverseSiteUrl()
- + DATAVERSE_EXTENDED_METADATA_API
- + "?exporter="
- + DATAVERSE_EXTENDED_METADATA_FORMAT
- + "&persistentId="
- + identifier
- + "\"";
-
- return ret;
+
}
- private void writeMetadataStream(InputStream inputStream, OutputStream outputStream) throws IOException {
- int bufsize;
- byte[] buffer = new byte[4 * 8192];
+ private void writeGetRecord(HttpServletResponse response, OAIPMH handle) throws IOException, XmlWriteException, XMLStreamException {
+ OutputStream outputStream = response.getOutputStream();
+
+ outputStream.write(oaiPmhResponseToString(handle).getBytes());
- while ((bufsize = inputStream.read(buffer)) != -1) {
- outputStream.write(buffer, 0, bufsize);
- outputStream.flush();
+ Verb verb = handle.getVerb();
+
+ if (verb == null) {
+ throw new IOException("An error or a valid response must be set");
+ }
+
+ if (!verb.getType().equals(Verb.Type.GetRecord)) {
+ throw new IOException("writeGetRecord() called on a non-GetRecord verb");
}
- inputStream.close();
+ outputStream.write(("<" + verb.getType().displayName() + ">").getBytes());
+
+ outputStream.flush();
+
+ ((XgetRecord) verb).writeToStream(outputStream);
+
+ outputStream.write(("</" + verb.getType().displayName() + ">").getBytes());
+ outputStream.write(("</" + OAI_PMH + ">\n").getBytes());
+
+ outputStream.flush();
+ outputStream.close();
+
+ }
+
+ // This function produces the string representation of the top level,
+ // "service" record of an OAIPMH response (i.e., the header that precedes
+ // the actual "payload" record, such as <GetRecord>, <ListRecords>,
+ // <ListIdentifiers>, etc.)
+
+ private String oaiPmhResponseToString(OAIPMH handle) {
+ try {
+ ByteArrayOutputStream byteOutputStream = new ByteArrayOutputStream();
+ XmlWriter writer = new XmlWriter(byteOutputStream, defaultContext());
+
+ writer.writeStartElement(OAI_PMH);
+ writer.writeDefaultNamespace(NAMESPACE_URI);
+ writer.writeNamespace(XSISchema.PREFIX, XSISchema.NAMESPACE_URI);
+ writer.writeAttribute(XSISchema.PREFIX, XSISchema.NAMESPACE_URI, "schemaLocation",
+ NAMESPACE_URI + " " + SCHEMA_LOCATION);
+
+ writer.writeElement(RESPONSEDATE_FIELD, handle.getResponseDate(), Granularity.Second);
+ writer.writeElement(REQUEST_FIELD, handle.getRequest());
+ writer.writeEndElement();
+ writer.flush();
+ writer.close();
+
+ String ret = byteOutputStream.toString().replaceFirst("</"+OAI_PMH+">", "");
+
+ return ret;
+ } catch (Exception ex) {
+ logger.warning("caught exception trying to convert an OAIPMH response header to string: " + ex.getMessage());
+ ex.printStackTrace();
+ return null;
+ }
}
private boolean isGetRecord(HttpServletRequest request) {
return "GetRecord".equals(request.getParameter("verb"));
}
-
-
- private boolean isExtendedDataverseMetadataMode(String formatName) {
- return DATAVERSE_EXTENDED_METADATA_FORMAT.equals(formatName);
- }
- /**
- * Get a response Writer depending on acceptable encodings
- * @param request the servlet's request information
- * @param response the servlet's response information
- * @exception IOException an I/O error occurred
- */
- public static Writer getWriter(HttpServletRequest request, HttpServletResponse response)
- throws IOException {
- Writer out;
- String encodings = request.getHeader("Accept-Encoding");
- if (debug) {
- System.out.println("encodings=" + encodings);
- }
- if (encodings != null && encodings.indexOf("gzip") != -1) {
- response.setHeader("Content-Encoding", "gzip");
- out = new OutputStreamWriter(new GZIPOutputStream(response.getOutputStream()),
- "UTF-8");
-
- } else if (encodings != null && encodings.indexOf("deflate") != -1) {
-
- response.setHeader("Content-Encoding", "deflate");
- out = new OutputStreamWriter(new DeflaterOutputStream(response.getOutputStream()),
- "UTF-8");
- } else {
- out = response.getWriter();
- }
- return out;
+
+ private boolean isListRecords(HttpServletRequest request) {
+ return "ListRecords".equals(request.getParameter("verb"));
}
- /**
- * Handles the HTTP POST method.
- *
- * @param request servlet request
- * @param response servlet response
- * @throws ServletException if a servlet-specific error occurs
- * @throws IOException if an I/O error occurs
- *//*
- @Override
- protected void doPost(HttpServletRequest request, HttpServletResponse response)
- throws ServletException, IOException {
- processRequest(request, response);
- }*/
-
protected Context getXoaiContext () {
return xoaiContext;
}
@@ -409,9 +390,6 @@ protected OAIRequestParametersBuilder newXoaiRequest() {
return new OAIRequestParametersBuilder();
}
- protected OAICompiledRequest compileXoaiRequest (OAIRequestParametersBuilder builder) throws BadArgumentException, InvalidResumptionTokenException, UnknownParameterException, IllegalVerbException, DuplicateDefinitionException {
- return OAICompiledRequest.compile(builder);
- }
public boolean isHarvestingServerEnabled() {
return systemConfig.isOAIServerEnabled();
diff --git a/src/main/java/edu/harvard/iq/dataverse/harvest/server/web/xMetadata.java b/src/main/java/edu/harvard/iq/dataverse/harvest/server/web/xMetadata.java
deleted file mode 100644
index 052f29f3d96..00000000000
--- a/src/main/java/edu/harvard/iq/dataverse/harvest/server/web/xMetadata.java
+++ /dev/null
@@ -1,82 +0,0 @@
- /*
- * To change this license header, choose License Headers in Project Properties.
- * To change this template file, choose Tools | Templates
- * and open the template in the editor.
- */
-package edu.harvard.iq.dataverse.harvest.server.web;
-
-import com.lyncode.xml.exceptions.XmlWriteException;
-import com.lyncode.xoai.model.oaipmh.Metadata;
-import com.lyncode.xoai.xml.XmlWriter;
-import edu.harvard.iq.dataverse.Dataset;
-import java.io.IOException;
-import java.io.InputStream;
-import java.io.OutputStream;
-
-/**
- *
- * @author Leonid Andreev
- */
-public class xMetadata extends Metadata {
- //private InputStream inputStream;
- //private Dataset dataset;
- //private boolean unread;
-
-
- public xMetadata(String value) {
- super(value);
- }
-
- /*public xMetadata(Dataset dataset) throws IOException {
- super((String)null);
- //this.inputStream = value;
- //this.unread = true;
- //this.dataset = dataset;
- }*/
-
-
- @Override
- public void write(XmlWriter writer) throws XmlWriteException {
- // Do nothing!
- // - rather than writing Metadata as an XML writer stram, we will write
- // the pre-exported *and pre-validated* content as a byte stream (below).
- }
-
- /*
- public Dataset getDataset() {
- return dataset;
- }
-
- public void setDataset(Dataset dataset) {
- this.dataset = dataset;
- }*/
-
- /*
- public void writeToStream(OutputStream outputStream) throws IOException {
- InputStream inputStream = getMetadataInputStream();
-
- outputStream.flush();
-
- int bufsize;
- byte[] buffer = new byte[4 * 8192];
-
- while ((bufsize = inputStream.read(buffer)) != -1) {
- outputStream.write(buffer, 0, bufsize);
- outputStream.flush();
- }
-
- inputStream.close();
- unread = false;
-
- }*/
-
- /*
- public InputStream getMetadataInputStream() throws IOException {
- if (unread && inputStream != null) {
- return inputStream;
- }
-
- throw new IOException ("No InputStream for the metadata record, or InputStream has already been read.");
- }
-*/
-}
diff --git a/src/main/java/edu/harvard/iq/dataverse/harvest/server/xoai/XdataProvider.java b/src/main/java/edu/harvard/iq/dataverse/harvest/server/xoai/XdataProvider.java
new file mode 100644
index 00000000000..63b9fc1799f
--- /dev/null
+++ b/src/main/java/edu/harvard/iq/dataverse/harvest/server/xoai/XdataProvider.java
@@ -0,0 +1,113 @@
+package edu.harvard.iq.dataverse.harvest.server.xoai;
+
+
+import com.lyncode.builder.Builder;
+import com.lyncode.xoai.dataprovider.exceptions.*;
+import com.lyncode.xoai.dataprovider.handlers.*;
+import com.lyncode.xoai.exceptions.InvalidResumptionTokenException;
+import com.lyncode.xoai.dataprovider.model.Context;
+import com.lyncode.xoai.model.oaipmh.OAIPMH;
+import com.lyncode.xoai.model.oaipmh.Request;
+import com.lyncode.xoai.dataprovider.parameters.OAICompiledRequest;
+import com.lyncode.xoai.dataprovider.parameters.OAIRequest;
+import com.lyncode.xoai.dataprovider.repository.Repository;
+import com.lyncode.xoai.services.api.DateProvider;
+import com.lyncode.xoai.services.impl.UTCDateProvider;
+import org.apache.log4j.Logger;
+
+import static com.lyncode.xoai.dataprovider.parameters.OAIRequest.Parameter.*;
+
+/**
+ *
+ * @author Leonid Andreev
+ */
+public class XdataProvider {
+ private static Logger log = Logger.getLogger(XdataProvider.class);
+
+ public static XdataProvider dataProvider (Context context, Repository repository) {
+ return new XdataProvider(context, repository);
+ }
+
+ private Repository repository;
+ private DateProvider dateProvider;
+
+ private final IdentifyHandler identifyHandler;
+ private final XgetRecordHandler getRecordHandler;
+ private final ListSetsHandler listSetsHandler;
+ private final XlistRecordsHandler listRecordsHandler;
+ private final ListIdentifiersHandler listIdentifiersHandler;
+ private final ListMetadataFormatsHandler listMetadataFormatsHandler;
+ private final ErrorHandler errorsHandler;
+
+ public XdataProvider (Context context, Repository repository) {
+ this.repository = repository;
+ this.dateProvider = new UTCDateProvider();
+
+ this.identifyHandler = new IdentifyHandler(context, repository);
+ this.listSetsHandler = new ListSetsHandler(context, repository);
+ this.listMetadataFormatsHandler = new ListMetadataFormatsHandler(context, repository);
+ this.listRecordsHandler = new XlistRecordsHandler(context, repository);
+ this.listIdentifiersHandler = new ListIdentifiersHandler(context, repository);
+ //this.getRecordHandler = new GetRecordHandler(context, repository);
+ this.getRecordHandler = new XgetRecordHandler(context, repository);
+ this.errorsHandler = new ErrorHandler();
+ }
+
+ public OAIPMH handle (Builder builder) throws OAIException {
+ return handle(builder.build());
+ }
+
+ public OAIPMH handle (OAIRequest requestParameters) throws OAIException {
+ log.debug("Handling OAI request");
+ Request request = new Request(repository.getConfiguration().getBaseUrl())
+ .withVerbType(requestParameters.get(Verb))
+ .withResumptionToken(requestParameters.get(ResumptionToken))
+ .withIdentifier(requestParameters.get(Identifier))
+ .withMetadataPrefix(requestParameters.get(MetadataPrefix))
+ .withSet(requestParameters.get(Set))
+ .withFrom(requestParameters.get(From))
+ .withUntil(requestParameters.get(Until));
+
+ OAIPMH response = new OAIPMH()
+ .withRequest(request)
+ .withResponseDate(dateProvider.now());
+ try {
+ OAICompiledRequest parameters = compileParameters(requestParameters);
+
+ switch (request.getVerbType()) {
+ case Identify:
+ response.withVerb(identifyHandler.handle(parameters));
+ break;
+ case ListSets:
+ response.withVerb(listSetsHandler.handle(parameters));
+ break;
+ case ListMetadataFormats:
+ response.withVerb(listMetadataFormatsHandler.handle(parameters));
+ break;
+ case GetRecord:
+ response.withVerb(getRecordHandler.handle(parameters));
+ break;
+ case ListIdentifiers:
+ response.withVerb(listIdentifiersHandler.handle(parameters));
+ break;
+ case ListRecords:
+ response.withVerb(listRecordsHandler.handle(parameters));
+ break;
+ }
+ } catch (HandlerException e) {
+ log.debug(e.getMessage(), e);
+ response.withError(errorsHandler.handle(e));
+ }
+
+ return response;
+ }
+
+ private OAICompiledRequest compileParameters(OAIRequest requestParameters) throws IllegalVerbException, UnknownParameterException, BadArgumentException, DuplicateDefinitionException, BadResumptionToken {
+ try {
+ return requestParameters.compile();
+ } catch (InvalidResumptionTokenException e) {
+ throw new BadResumptionToken("The resumption token is invalid");
+ }
+ }
+
+}
diff --git a/src/main/java/edu/harvard/iq/dataverse/harvest/server/xoai/XgetRecord.java b/src/main/java/edu/harvard/iq/dataverse/harvest/server/xoai/XgetRecord.java
new file mode 100644
index 00000000000..d86f555d105
--- /dev/null
+++ b/src/main/java/edu/harvard/iq/dataverse/harvest/server/xoai/XgetRecord.java
@@ -0,0 +1,52 @@
+/*
+ * To change this license header, choose License Headers in Project Properties.
+ * To change this template file, choose Tools | Templates
+ * and open the template in the editor.
+ */
+package edu.harvard.iq.dataverse.harvest.server.xoai;
+
+import com.lyncode.xoai.model.oaipmh.GetRecord;
+import com.lyncode.xoai.model.oaipmh.Record;
+import java.io.IOException;
+import java.io.OutputStream;
+
+/**
+ *
+ * @author Leonid Andreev
+ *
+ * This is the Dataverse extension of XOAI GetRecord,
+ * optimized to stream individual records to the output directly
+ */
+
+public class XgetRecord extends GetRecord {
+ private static final String RECORD_FIELD = "record";
+ private static final String RECORD_START_ELEMENT = "<"+RECORD_FIELD+">";
+ private static final String RECORD_CLOSE_ELEMENT = "</"+RECORD_FIELD+">";
+ private static final String RESUMPTION_TOKEN_FIELD = "resumptionToken";
+ private static final String EXPIRATION_DATE_ATTRIBUTE = "expirationDate";
+ private static final String COMPLETE_LIST_SIZE_ATTRIBUTE = "completeListSize";
+ private static final String CURSOR_ATTRIBUTE = "cursor";
+
+
+ public XgetRecord(Xrecord record) {
+ super(record);
+ }
+
+ public void writeToStream(OutputStream outputStream) throws IOException {
+
+ if (this.getRecord() == null) {
+ throw new IOException("XgetRecord: null Record");
+ }
+ Xrecord xrecord = (Xrecord) this.getRecord();
+
+ outputStream.write(RECORD_START_ELEMENT.getBytes());
+ outputStream.flush();
+
+ xrecord.writeToStream(outputStream);
+
+ outputStream.write(RECORD_CLOSE_ELEMENT.getBytes());
+ outputStream.flush();
+
+ }
+
+}
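XgetRecord.writeToStream() above wraps a pre-serialized payload in hand-built open/close record elements instead of going through an XmlWriter. The wrap-and-flush pattern in miniature (hypothetical `RecordWrapDemo`):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.OutputStream;

public class RecordWrapDemo {
    // Write a pre-serialized payload between hand-built <record> tags,
    // flushing afterwards - the pattern writeToStream() uses.
    static void writeRecord(OutputStream out, byte[] payload) throws IOException {
        out.write("<record>".getBytes());
        out.write(payload);
        out.write("</record>".getBytes());
        out.flush();
    }

    public static void main(String[] args) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        writeRecord(out, "<metadata/>".getBytes());
        System.out.println(out.toString()); // <record><metadata/></record>
    }
}
```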
diff --git a/src/main/java/edu/harvard/iq/dataverse/harvest/server/xoai/XgetRecordHandler.java b/src/main/java/edu/harvard/iq/dataverse/harvest/server/xoai/XgetRecordHandler.java
new file mode 100644
index 00000000000..ba28894482a
--- /dev/null
+++ b/src/main/java/edu/harvard/iq/dataverse/harvest/server/xoai/XgetRecordHandler.java
@@ -0,0 +1,92 @@
+package edu.harvard.iq.dataverse.harvest.server.xoai;
+
+import com.lyncode.xml.exceptions.XmlWriteException;
+import com.lyncode.xoai.dataprovider.exceptions.BadArgumentException;
+import com.lyncode.xoai.dataprovider.exceptions.CannotDisseminateFormatException;
+import com.lyncode.xoai.dataprovider.parameters.OAICompiledRequest;
+import com.lyncode.xoai.dataprovider.exceptions.CannotDisseminateRecordException;
+import com.lyncode.xoai.dataprovider.exceptions.HandlerException;
+import com.lyncode.xoai.dataprovider.exceptions.IdDoesNotExistException;
+import com.lyncode.xoai.dataprovider.exceptions.NoMetadataFormatsException;
+import com.lyncode.xoai.dataprovider.exceptions.OAIException;
+import com.lyncode.xoai.dataprovider.handlers.VerbHandler;
+import com.lyncode.xoai.dataprovider.handlers.helpers.ItemHelper;
+import com.lyncode.xoai.dataprovider.model.Context;
+import com.lyncode.xoai.dataprovider.model.Item;
+import com.lyncode.xoai.dataprovider.model.MetadataFormat;
+import com.lyncode.xoai.dataprovider.model.Set;
+import com.lyncode.xoai.model.oaipmh.*;
+import com.lyncode.xoai.dataprovider.repository.Repository;
+import com.lyncode.xoai.xml.XSLPipeline;
+import com.lyncode.xoai.xml.XmlWriter;
+import edu.harvard.iq.dataverse.Dataset;
+
+import javax.xml.stream.XMLStreamException;
+import javax.xml.transform.TransformerException;
+import java.io.ByteArrayInputStream;
+import java.io.ByteArrayOutputStream;
+import java.io.IOException;
+import java.util.logging.Logger;
+
+/*
+ * @author Leonid Andreev
+*/
+public class XgetRecordHandler extends VerbHandler<GetRecord> {
+ private static Logger logger = Logger.getLogger("edu.harvard.iq.dataverse.harvest.server.xoai.XgetRecordHandler");
+ public XgetRecordHandler(Context context, Repository repository) {
+ super(context, repository);
+ }
+
+ @Override
+ public GetRecord handle(OAICompiledRequest parameters) throws OAIException, HandlerException {
+
+ MetadataFormat format = getContext().formatForPrefix(parameters.getMetadataPrefix());
+ Item item = getRepository().getItemRepository().getItem(parameters.getIdentifier());
+
+ if (getContext().hasCondition() &&
+ !getContext().getCondition().getFilter(getRepository().getFilterResolver()).isItemShown(item))
+ throw new IdDoesNotExistException("This context does not include this item");
+
+ if (format.hasCondition() &&
+ !format.getCondition().getFilter(getRepository().getFilterResolver()).isItemShown(item))
+ throw new CannotDisseminateRecordException("Format not applicable to this item");
+
+
+ Xrecord record = this.createRecord(parameters, item);
+ GetRecord result = new XgetRecord(record);
+
+ return result;
+ }
+
+ private Xrecord createRecord(OAICompiledRequest parameters, Item item)
+ throws BadArgumentException, CannotDisseminateRecordException,
+ OAIException, NoMetadataFormatsException, CannotDisseminateFormatException {
+ MetadataFormat format = getContext().formatForPrefix(parameters.getMetadataPrefix());
+ Header header = new Header();
+
+ Dataset dataset = ((Xitem)item).getDataset();
+ Xrecord xrecord = new Xrecord().withFormatName(parameters.getMetadataPrefix()).withDataset(dataset);
+ header.withIdentifier(item.getIdentifier());
+
+ ItemHelper itemHelperWrap = new ItemHelper(item);
+ header.withDatestamp(item.getDatestamp());
+ for (Set set : itemHelperWrap.getSets(getContext(), getRepository().getFilterResolver()))
+ header.withSetSpec(set.getSpec());
+ if (item.isDeleted())
+ header.withStatus(Header.Status.DELETED);
+
+ xrecord.withHeader(header);
+ xrecord.withMetadata(item.getMetadata());
+
+ return xrecord;
+ }
+
+ private XSLPipeline toPipeline(Item item) throws XmlWriteException, XMLStreamException {
+ ByteArrayOutputStream output = new ByteArrayOutputStream();
+ XmlWriter writer = new XmlWriter(output);
+ Metadata metadata = item.getMetadata();
+ metadata.write(writer);
+ writer.close();
+ return new XSLPipeline(new ByteArrayInputStream(output.toByteArray()), true);
+ }
+}
diff --git a/src/main/java/edu/harvard/iq/dataverse/harvest/server/web/XOAIItem.java b/src/main/java/edu/harvard/iq/dataverse/harvest/server/xoai/Xitem.java
similarity index 68%
rename from src/main/java/edu/harvard/iq/dataverse/harvest/server/web/XOAIItem.java
rename to src/main/java/edu/harvard/iq/dataverse/harvest/server/xoai/Xitem.java
index 231e322aba3..66c589a4192 100644
--- a/src/main/java/edu/harvard/iq/dataverse/harvest/server/web/XOAIItem.java
+++ b/src/main/java/edu/harvard/iq/dataverse/harvest/server/xoai/Xitem.java
@@ -3,40 +3,29 @@
* To change this template file, choose Tools | Templates
* and open the template in the editor.
*/
-package edu.harvard.iq.dataverse.harvest.server.web;
+package edu.harvard.iq.dataverse.harvest.server.xoai;
-import com.google.common.base.Function;
-import com.lyncode.builder.ListBuilder;
import com.lyncode.xoai.dataprovider.model.Item;
import com.lyncode.xoai.dataprovider.model.Set;
import com.lyncode.xoai.model.oaipmh.About;
-import com.lyncode.xoai.model.oaipmh.Metadata;
-import com.lyncode.xoai.model.xoai.Element;
-import com.lyncode.xoai.model.xoai.XOAIMetadata;
import edu.harvard.iq.dataverse.Dataset;
import edu.harvard.iq.dataverse.harvest.server.OAIRecord;
-import java.io.File;
-import java.io.FileInputStream;
-import java.io.FileNotFoundException;
-import java.io.InputStream;
import java.util.ArrayList;
import java.util.Date;
-import java.util.HashMap;
import java.util.List;
-import java.util.Map;
-import static org.apache.commons.lang3.RandomStringUtils.randomAlphabetic;
-import static org.apache.commons.lang3.RandomStringUtils.randomNumeric;
+
/**
*
* @author Leonid Andreev
+ *
* This is an implemention of an Lyncode XOAI Item;
* You can think of it as an XOAI Item wrapper around the
* Dataverse OAIRecord entity.
*/
-public class XOAIItem implements Item {
+public class Xitem implements Item {
- public XOAIItem(OAIRecord oaiRecord) {
+ public Xitem(OAIRecord oaiRecord) {
super();
this.oaiRecord = oaiRecord;
}
@@ -51,14 +40,25 @@ public void setOaiRecord(OAIRecord oaiRecord) {
this.oaiRecord = oaiRecord;
}
+ private Dataset dataset;
+
+ public Dataset getDataset() {
+ return dataset;
+ }
+
+ public Xitem withDataset(Dataset dataset) {
+ this.dataset = dataset;
+ return this;
+ }
+
@Override
public List getAbout() {
return null;
}
@Override
- public xMetadata getMetadata() {
- return new xMetadata((String)null);
+ public Xmetadata getMetadata() {
+ return new Xmetadata((String)null);
}
@Override
diff --git a/src/main/java/edu/harvard/iq/dataverse/harvest/server/web/XOAIItemRepository.java b/src/main/java/edu/harvard/iq/dataverse/harvest/server/xoai/XitemRepository.java
similarity index 68%
rename from src/main/java/edu/harvard/iq/dataverse/harvest/server/web/XOAIItemRepository.java
rename to src/main/java/edu/harvard/iq/dataverse/harvest/server/xoai/XitemRepository.java
index ba5c9678aab..a147ddb3ddc 100644
--- a/src/main/java/edu/harvard/iq/dataverse/harvest/server/web/XOAIItemRepository.java
+++ b/src/main/java/edu/harvard/iq/dataverse/harvest/server/xoai/XitemRepository.java
@@ -3,8 +3,9 @@
* To change this template file, choose Tools | Templates
* and open the template in the editor.
*/
-package edu.harvard.iq.dataverse.harvest.server.web;
+package edu.harvard.iq.dataverse.harvest.server.xoai;
import com.lyncode.xoai.dataprovider.exceptions.IdDoesNotExistException;
import com.lyncode.xoai.dataprovider.exceptions.OAIException;
import com.lyncode.xoai.dataprovider.filter.ScopedFilter;
@@ -13,6 +14,8 @@
import com.lyncode.xoai.dataprovider.model.Item;
import com.lyncode.xoai.dataprovider.model.ItemIdentifier;
import com.lyncode.xoai.dataprovider.repository.ItemRepository;
+import edu.harvard.iq.dataverse.Dataset;
+import edu.harvard.iq.dataverse.DatasetServiceBean;
import edu.harvard.iq.dataverse.harvest.server.OAIRecord;
import edu.harvard.iq.dataverse.harvest.server.OAIRecordServiceBean;
import static java.lang.Math.min;
@@ -21,6 +24,9 @@
import java.util.Date;
import java.util.List;
import java.util.logging.Logger;
/**
*
@@ -30,27 +36,32 @@
* XOAI "items".
*/
-public class XOAIItemRepository implements ItemRepository {
- private static Logger logger = Logger.getLogger("edu.harvard.iq.dataverse.harvest.server.web.XOAIItemRepository");
+public class XitemRepository implements ItemRepository {
+ private static Logger logger = Logger.getLogger("edu.harvard.iq.dataverse.harvest.server.xoai.XitemRepository");
private OAIRecordServiceBean recordService;
+ private DatasetServiceBean datasetService;
- public XOAIItemRepository (OAIRecordServiceBean recordService) {
+ public XitemRepository (OAIRecordServiceBean recordService, DatasetServiceBean datasetService) {
super();
this.recordService = recordService;
+ this.datasetService = datasetService;
}
- private List<XOAIItem> list = new ArrayList<>();
+ private List<Xitem> list = new ArrayList<>();
@Override
public Item getItem(String identifier) throws IdDoesNotExistException, OAIException {
- logger.fine("getItem; calling findOAIRecordBySetNameandGlobalId, identifier "+identifier);
+ logger.fine("getItem; calling findOAIRecordBySetNameandGlobalId, identifier " + identifier);
OAIRecord oaiRecord = recordService.findOAIRecordBySetNameandGlobalId(null, identifier);
if (oaiRecord != null) {
- return new XOAIItem(oaiRecord);
+ Dataset dataset = datasetService.findByGlobalId(oaiRecord.getGlobalId());
+ if (dataset != null) {
+ return new Xitem(oaiRecord).withDataset(dataset);
+ }
}
-
+
throw new IdDoesNotExistException();
}
@@ -106,7 +117,7 @@ public ListItemIdentifiersResult getItemIdentifiers(List<ScopedFilter> filters,
for (int i = offset; i < offset + length && i < oaiRecords.size(); i++) {
OAIRecord record = oaiRecords.get(i);
- xoaiItems.add(new XOAIItem(record));
+ xoaiItems.add(new Xitem(record));
}
boolean hasMore = offset + length < oaiRecords.size();
ListItemIdentifiersResult result = new ListItemIdentifiersResult(hasMore, xoaiItems);
@@ -119,41 +130,68 @@ public ListItemIdentifiersResult getItemIdentifiers(List<ScopedFilter> filters,
@Override
public ListItemsResults getItems(List<ScopedFilter> filters, int offset, int length) throws OAIException {
- return new ListItemsResults(offset + length < list.size(), new ArrayList<>(list.subList(offset, min(offset + length, list.size()))));
+ return getItems(filters, offset, length, null, null, null);
}
@Override
public ListItemsResults getItems(List<ScopedFilter> filters, int offset, int length, Date from) throws OAIException {
- return null; //To change body of implemented methods use File | Settings | File Templates.
+ return getItems(filters, offset, length, null, from, null);
}
@Override
public ListItemsResults getItemsUntil(List<ScopedFilter> filters, int offset, int length, Date until) throws OAIException {
- return null; //To change body of implemented methods use File | Settings | File Templates.
+ return getItems(filters, offset, length, null, null, until);
}
@Override
public ListItemsResults getItems(List<ScopedFilter> filters, int offset, int length, Date from, Date until) throws OAIException {
- return null; //To change body of implemented methods use File | Settings | File Templates.
+ return getItems(filters, offset, length, null, from, until);
}
@Override
public ListItemsResults getItems(List<ScopedFilter> filters, int offset, int length, String setSpec) throws OAIException {
- return null; //To change body of implemented methods use File | Settings | File Templates.
+ return getItems(filters, offset, length, setSpec, null, null);
}
@Override
public ListItemsResults getItems(List<ScopedFilter> filters, int offset, int length, String setSpec, Date from) throws OAIException {
- return null; //To change body of implemented methods use File | Settings | File Templates.
+ return getItems(filters, offset, length, setSpec, from, null);
}
@Override
public ListItemsResults getItemsUntil(List<ScopedFilter> filters, int offset, int length, String setSpec, Date until) throws OAIException {
- return null; //To change body of implemented methods use File | Settings | File Templates.
+ return getItems(filters, offset, length, setSpec, null, until);
}
@Override
public ListItemsResults getItems(List<ScopedFilter> filters, int offset, int length, String setSpec, Date from, Date until) throws OAIException {
- return null; //To change body of implemented methods use File | Settings | File Templates.
+ logger.fine("calling getItems; offset=" + offset
+ + ", length=" + length
+ + ", setSpec=" + setSpec
+ + ", from=" + from
+ + ", until=" + until);
+
+ List<OAIRecord> oaiRecords = recordService.findOaiRecordsBySetName(setSpec, from, until);
+
+ logger.fine("total " + oaiRecords.size() + " returned");
+
+ List<Item> xoaiItems = new ArrayList<>();
+ if (oaiRecords != null && !oaiRecords.isEmpty()) {
+
+ for (int i = offset; i < offset + length && i < oaiRecords.size(); i++) {
+ OAIRecord oaiRecord = oaiRecords.get(i);
+ Dataset dataset = datasetService.findByGlobalId(oaiRecord.getGlobalId());
+ if (dataset != null) {
+ Xitem xItem = new Xitem(oaiRecord).withDataset(dataset);
+ xoaiItems.add(xItem);
+ }
+ }
+ boolean hasMore = offset + length < oaiRecords.size();
+ ListItemsResults result = new ListItemsResults(hasMore, xoaiItems);
+ logger.fine("returning result with " + xoaiItems.size() + " items.");
+ return result;
+ }
+
+ return new ListItemsResults(false, xoaiItems);
}
}
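The windowing logic in `getItemIdentifiers()`/`getItems()` above (walk from `offset`, take at most `length` entries, report whether more remain) can be sketched in isolation. The names below are illustrative stand-ins, not part of the Dataverse or XOAI API:

```java
import java.util.ArrayList;
import java.util.List;

// Sketch of the offset/length windowing used by the repository methods:
// collect at most `length` entries starting at `offset`, and report whether
// results remain past the window (which drives the resumption token).
public class PagingSketch {
    public static <T> List<T> page(List<T> all, int offset, int length) {
        List<T> window = new ArrayList<>();
        for (int i = offset; i < offset + length && i < all.size(); i++) {
            window.add(all.get(i));
        }
        return window;
    }

    public static boolean hasMore(int totalSize, int offset, int length) {
        return offset + length < totalSize;
    }
}
```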
diff --git a/src/main/java/edu/harvard/iq/dataverse/harvest/server/xoai/XlistRecords.java b/src/main/java/edu/harvard/iq/dataverse/harvest/server/xoai/XlistRecords.java
new file mode 100644
index 00000000000..e2366119f54
--- /dev/null
+++ b/src/main/java/edu/harvard/iq/dataverse/harvest/server/xoai/XlistRecords.java
@@ -0,0 +1,87 @@
+
+package edu.harvard.iq.dataverse.harvest.server.xoai;
+
+import com.lyncode.xml.exceptions.XmlWriteException;
+import static com.lyncode.xoai.model.oaipmh.Granularity.Second;
+import com.lyncode.xoai.model.oaipmh.ListRecords;
+import com.lyncode.xoai.model.oaipmh.Record;
+import com.lyncode.xoai.model.oaipmh.ResumptionToken;
+import com.lyncode.xoai.xml.XmlWriter;
+import static com.lyncode.xoai.xml.XmlWriter.defaultContext;
+import java.io.ByteArrayOutputStream;
+
+import java.io.IOException;
+import java.io.OutputStream;
+import javax.xml.stream.XMLStreamException;
+
+/**
+ *
+ * @author Leonid Andreev
+ *
+ * This is the Dataverse extension of XOAI ListRecords,
+ * optimized to stream individual records using fast dumping
+ * of pre-exported metadata fragments (and bypassing expensive
+ * XML parsing and writing).
+ */
+public class XlistRecords extends ListRecords {
+ private static final String RECORD_FIELD = "record";
+ private static final String RECORD_START_ELEMENT = "<"+RECORD_FIELD+">";
+ private static final String RECORD_CLOSE_ELEMENT = "</"+RECORD_FIELD+">";
+ private static final String RESUMPTION_TOKEN_FIELD = "resumptionToken";
+ private static final String EXPIRATION_DATE_ATTRIBUTE = "expirationDate";
+ private static final String COMPLETE_LIST_SIZE_ATTRIBUTE = "completeListSize";
+ private static final String CURSOR_ATTRIBUTE = "cursor";
+
+ public void writeToStream(OutputStream outputStream) throws IOException {
+ if (!this.records.isEmpty()) {
+ for (Record record : this.records) {
+ outputStream.write(RECORD_START_ELEMENT.getBytes());
+ outputStream.flush();
+
+ ((Xrecord)record).writeToStream(outputStream);
+
+ outputStream.write(RECORD_CLOSE_ELEMENT.getBytes());
+ outputStream.flush();
+ }
+ }
+
+ if (resumptionToken != null) {
+
+ String resumptionTokenString = resumptionTokenToString(resumptionToken);
+ if (resumptionTokenString == null) {
+ throw new IOException("XlistRecords: failed to output resumption token");
+ }
+ outputStream.write(resumptionTokenString.getBytes());
+ outputStream.flush();
+ }
+ }
+
+ private String resumptionTokenToString(ResumptionToken token) {
+ try {
+ ByteArrayOutputStream byteOutputStream = new ByteArrayOutputStream();
+ XmlWriter writer = new XmlWriter(byteOutputStream, defaultContext());
+
+ writer.writeStartElement(RESUMPTION_TOKEN_FIELD);
+
+ if (token.getExpirationDate() != null)
+ writer.writeAttribute(EXPIRATION_DATE_ATTRIBUTE, token.getExpirationDate(), Second);
+ if (token.getCompleteListSize() != null)
+ writer.writeAttribute(COMPLETE_LIST_SIZE_ATTRIBUTE, "" + token.getCompleteListSize());
+ if (token.getCursor() != null)
+ writer.writeAttribute(CURSOR_ATTRIBUTE, "" + token.getCursor());
+ if (token.getValue() != null)
+ writer.write(token.getValue());
+
+ writer.writeEndElement(); // resumptionToken;
+ writer.flush();
+ writer.close();
+
+ String ret = byteOutputStream.toString();
+
+ return ret;
+ } catch (XMLStreamException | XmlWriteException e) {
+ return null;
+ }
+ }
+
+}
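The "fast dump" approach in `XlistRecords.writeToStream()` above can be sketched on its own: each record body is already a serialized XML fragment, so it is framed with literal `<record>` tags and copied straight to the output stream, with no XML parser or writer involved. The class and method names below are illustrative, not Dataverse's:

```java
import java.io.IOException;
import java.io.OutputStream;
import java.io.UncheckedIOException;
import java.util.List;

// Sketch of the fast-dump streaming: frame each pre-serialized XML fragment
// with literal <record>...</record> tags and write the bytes directly.
public class RecordStreamSketch {
    private static final String RECORD_START = "<record>";
    private static final String RECORD_CLOSE = "</record>";

    public static void writeAll(List<String> preExportedFragments, OutputStream out) {
        try {
            for (String fragment : preExportedFragments) {
                out.write(RECORD_START.getBytes());
                out.write(fragment.getBytes());
                out.write(RECORD_CLOSE.getBytes());
                out.flush(); // flush per record so long lists stream incrementally
            }
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }
}
```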
diff --git a/src/main/java/edu/harvard/iq/dataverse/harvest/server/xoai/XlistRecordsHandler.java b/src/main/java/edu/harvard/iq/dataverse/harvest/server/xoai/XlistRecordsHandler.java
new file mode 100644
index 00000000000..5651545e505
--- /dev/null
+++ b/src/main/java/edu/harvard/iq/dataverse/harvest/server/xoai/XlistRecordsHandler.java
@@ -0,0 +1,160 @@
+package edu.harvard.iq.dataverse.harvest.server.xoai;
+
+import com.lyncode.xml.exceptions.XmlWriteException;
+import com.lyncode.xoai.dataprovider.handlers.*;
+import com.lyncode.xoai.dataprovider.exceptions.*;
+import com.lyncode.xoai.dataprovider.handlers.results.ListItemsResults;
+import com.lyncode.xoai.dataprovider.handlers.helpers.ItemHelper;
+import com.lyncode.xoai.dataprovider.handlers.helpers.ItemRepositoryHelper;
+import com.lyncode.xoai.dataprovider.handlers.helpers.ResumptionTokenHelper;
+import com.lyncode.xoai.dataprovider.handlers.helpers.SetRepositoryHelper;
+import com.lyncode.xoai.dataprovider.model.Context;
+import com.lyncode.xoai.dataprovider.model.Item;
+import com.lyncode.xoai.dataprovider.model.MetadataFormat;
+import com.lyncode.xoai.dataprovider.model.Set;
+import com.lyncode.xoai.model.oaipmh.*;
+import com.lyncode.xoai.dataprovider.parameters.OAICompiledRequest;
+import com.lyncode.xoai.dataprovider.repository.Repository;
+import com.lyncode.xoai.xml.XSLPipeline;
+import com.lyncode.xoai.xml.XmlWriter;
+import edu.harvard.iq.dataverse.Dataset;
+
+import javax.xml.stream.XMLStreamException;
+import javax.xml.transform.TransformerException;
+import java.io.ByteArrayInputStream;
+import java.io.ByteArrayOutputStream;
+import java.io.IOException;
+import java.util.List;
+
+/**
+ *
+ * @author Leonid Andreev
+ *
+ * This is Dataverse's own implementation of ListRecords Verb Handler
+ * (used instead of the ListRecordsHandler provided by XOAI).
+ * It is customized to support the optimizations that allow
+ * Dataverse to directly output pre-exported metadata records to the output
+ * stream, bypassing expensive XML parsing and writing.
+ */
+public class XlistRecordsHandler extends VerbHandler<ListRecords> {
+ private static java.util.logging.Logger logger = java.util.logging.Logger.getLogger("XlistRecordsHandler");
+ private final ItemRepositoryHelper itemRepositoryHelper;
+ private final SetRepositoryHelper setRepositoryHelper;
+
+ public XlistRecordsHandler(Context context, Repository repository) {
+ super(context, repository);
+ this.itemRepositoryHelper = new ItemRepositoryHelper(getRepository().getItemRepository());
+ this.setRepositoryHelper = new SetRepositoryHelper(getRepository().getSetRepository());
+ }
+
+ @Override
+ public ListRecords handle(OAICompiledRequest parameters) throws OAIException, HandlerException {
+ XlistRecords res = new XlistRecords();
+ int length = getRepository().getConfiguration().getMaxListRecords();
+
+ if (parameters.hasSet() && !getRepository().getSetRepository().supportSets())
+ throw new DoesNotSupportSetsException();
+
+ int offset = getOffset(parameters);
+ ListItemsResults result;
+ if (!parameters.hasSet()) {
+ if (parameters.hasFrom() && !parameters.hasUntil())
+ result = itemRepositoryHelper.getItems(getContext(), offset,
+ length, parameters.getMetadataPrefix(),
+ parameters.getFrom());
+ else if (!parameters.hasFrom() && parameters.hasUntil())
+ result = itemRepositoryHelper.getItemsUntil(getContext(), offset,
+ length, parameters.getMetadataPrefix(),
+ parameters.getUntil());
+ else if (parameters.hasFrom() && parameters.hasUntil())
+ result = itemRepositoryHelper.getItems(getContext(), offset,
+ length, parameters.getMetadataPrefix(),
+ parameters.getFrom(), parameters.getUntil());
+ else
+ result = itemRepositoryHelper.getItems(getContext(), offset,
+ length, parameters.getMetadataPrefix());
+ } else {
+ if (!setRepositoryHelper.exists(getContext(), parameters.getSet())) {
+ // throw new NoMatchesException();
+ }
+ if (parameters.hasFrom() && !parameters.hasUntil())
+ result = itemRepositoryHelper.getItems(getContext(), offset,
+ length, parameters.getMetadataPrefix(),
+ parameters.getSet(), parameters.getFrom());
+ else if (!parameters.hasFrom() && parameters.hasUntil())
+ result = itemRepositoryHelper.getItemsUntil(getContext(), offset,
+ length, parameters.getMetadataPrefix(),
+ parameters.getSet(), parameters.getUntil());
+ else if (parameters.hasFrom() && parameters.hasUntil())
+ result = itemRepositoryHelper.getItems(getContext(), offset,
+ length, parameters.getMetadataPrefix(),
+ parameters.getSet(), parameters.getFrom(),
+ parameters.getUntil());
+ else
+ result = itemRepositoryHelper.getItems(getContext(), offset,
+ length, parameters.getMetadataPrefix(),
+ parameters.getSet());
+ }
+
+ List<Item> results = result.getResults();
+ if (results.isEmpty()) throw new NoMatchesException();
+ for (Item i : results)
+ res.withRecord(this.createRecord(parameters, i));
+
+
+ ResumptionToken.Value currentResumptionToken = new ResumptionToken.Value();
+ if (parameters.hasResumptionToken()) {
+ currentResumptionToken = parameters.getResumptionToken();
+ } else if (result.hasMore()) {
+ currentResumptionToken = parameters.extractResumptionToken();
+ }
+
+ XresumptionTokenHelper resumptionTokenHelper = new XresumptionTokenHelper(currentResumptionToken,
+ getRepository().getConfiguration().getMaxListRecords());
+ res.withResumptionToken(resumptionTokenHelper.resolve(result.hasMore()));
+
+ return res;
+ }
+
+
+ private int getOffset(OAICompiledRequest parameters) {
+ if (!parameters.hasResumptionToken())
+ return 0;
+ if (parameters.getResumptionToken().getOffset() == null)
+ return 0;
+ return parameters.getResumptionToken().getOffset().intValue();
+ }
+
+ private Record createRecord(OAICompiledRequest parameters, Item item)
+ throws BadArgumentException, CannotDisseminateRecordException,
+ OAIException, NoMetadataFormatsException, CannotDisseminateFormatException {
+ MetadataFormat format = getContext().formatForPrefix(parameters.getMetadataPrefix());
+ Header header = new Header();
+
+ Dataset dataset = ((Xitem)item).getDataset();
+ Xrecord xrecord = new Xrecord().withFormatName(parameters.getMetadataPrefix()).withDataset(dataset);
+ header.withIdentifier(item.getIdentifier());
+
+ ItemHelper itemHelperWrap = new ItemHelper(item);
+ header.withDatestamp(item.getDatestamp());
+ for (Set set : itemHelperWrap.getSets(getContext(), getRepository().getFilterResolver()))
+ header.withSetSpec(set.getSpec());
+ if (item.isDeleted())
+ header.withStatus(Header.Status.DELETED);
+
+ xrecord.withHeader(header);
+ xrecord.withMetadata(item.getMetadata());
+
+ return xrecord;
+ }
+
+
+ private XSLPipeline toPipeline(Item item) throws XmlWriteException, XMLStreamException {
+ ByteArrayOutputStream output = new ByteArrayOutputStream();
+ XmlWriter writer = new XmlWriter(output);
+ Metadata metadata = item.getMetadata();
+ metadata.write(writer);
+ writer.close();
+ return new XSLPipeline(new ByteArrayInputStream(output.toByteArray()), true);
+ }
+}
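The `getOffset()` helper above defaults to 0 when the request carries no resumption token, or a token with no offset. The same null-safe fallback can be sketched independently; `TokenValue` here is a stand-in for XOAI's `ResumptionToken.Value`, not the real class:

```java
// Stand-in for XOAI's ResumptionToken.Value: it may be absent entirely, or
// present with a null offset; both cases must fall back to page start 0.
class TokenValue {
    private final Long offset;
    TokenValue(Long offset) { this.offset = offset; }
    Long getOffset() { return offset; }
}

// Mirrors the null-safe default used by the handler's getOffset().
public class OffsetSketch {
    public static int offsetOf(TokenValue token) {
        if (token == null) return 0;             // no resumption token in the request
        if (token.getOffset() == null) return 0; // token present, offset unset
        return token.getOffset().intValue();
    }
}
```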
diff --git a/src/main/java/edu/harvard/iq/dataverse/harvest/server/xoai/Xmetadata.java b/src/main/java/edu/harvard/iq/dataverse/harvest/server/xoai/Xmetadata.java
new file mode 100644
index 00000000000..fd8427251b3
--- /dev/null
+++ b/src/main/java/edu/harvard/iq/dataverse/harvest/server/xoai/Xmetadata.java
@@ -0,0 +1,27 @@
+
+package edu.harvard.iq.dataverse.harvest.server.xoai;
+
+import com.lyncode.xml.exceptions.XmlWriteException;
+import com.lyncode.xoai.model.oaipmh.Metadata;
+import com.lyncode.xoai.xml.XmlWriter;
+
+/**
+ *
+ * @author Leonid Andreev
+ */
+public class Xmetadata extends Metadata {
+
+
+ public Xmetadata(String value) {
+ super(value);
+ }
+
+
+ @Override
+ public void write(XmlWriter writer) throws XmlWriteException {
+ // Do nothing!
+ // - rather than writing Metadata as an XML writer stream, we will write
+ // the pre-exported *and pre-validated* content as a byte stream (below).
+ }
+
+}
diff --git a/src/main/java/edu/harvard/iq/dataverse/harvest/server/xoai/Xrecord.java b/src/main/java/edu/harvard/iq/dataverse/harvest/server/xoai/Xrecord.java
new file mode 100644
index 00000000000..192615090fc
--- /dev/null
+++ b/src/main/java/edu/harvard/iq/dataverse/harvest/server/xoai/Xrecord.java
@@ -0,0 +1,178 @@
+
+package edu.harvard.iq.dataverse.harvest.server.xoai;
+
+import com.lyncode.xoai.model.oaipmh.Header;
+import com.lyncode.xoai.model.oaipmh.Record;
+import com.lyncode.xoai.xml.XmlWriter;
+import static com.lyncode.xoai.xml.XmlWriter.defaultContext;
+
+import edu.harvard.iq.dataverse.Dataset;
+import edu.harvard.iq.dataverse.export.ExportException;
+import edu.harvard.iq.dataverse.export.ExportService;
+import static edu.harvard.iq.dataverse.util.SystemConfig.FQDN;
+import static edu.harvard.iq.dataverse.util.SystemConfig.SITE_URL;
+import java.io.ByteArrayOutputStream;
+import java.io.IOException;
+import java.io.InputStream;
+import java.io.OutputStream;
+import java.net.InetAddress;
+import java.net.UnknownHostException;
+
+/**
+ *
+ * @author Leonid Andreev
+ *
+ * This is the Dataverse extension of XOAI Record,
+ * optimized to directly output a pre-exported metadata record to the
+ * output stream, thus by-passing expensive parsing and writing by
+ * an XML writer, as in the original XOAI implementation.
+ */
+
+public class Xrecord extends Record {
+ private static final String METADATA_FIELD = "metadata";
+ private static final String METADATA_START_ELEMENT = "<"+METADATA_FIELD+">";
+ private static final String METADATA_END_ELEMENT = "</"+METADATA_FIELD+">";
+ private static final String HEADER_FIELD = "header";
+ private static final String STATUS_ATTRIBUTE = "status";
+ private static final String IDENTIFIER_FIELD = "identifier";
+ private static final String DATESTAMP_FIELD = "datestamp";
+ private static final String SETSPEC_FIELD = "setSpec";
+ private static final String DATAVERSE_EXTENDED_METADATA_FORMAT = "dataverse_json";
+ private static final String DATAVERSE_EXTENDED_METADATA_API = "/api/datasets/export";
+
+ protected Dataset dataset;
+ protected String formatName;
+
+
+ public Dataset getDataset() {
+ return dataset;
+ }
+
+ public Xrecord withDataset(Dataset dataset) {
+ this.dataset = dataset;
+ return this;
+ }
+
+
+ public String getFormatName() {
+ return formatName;
+ }
+
+
+ public Xrecord withFormatName(String formatName) {
+ this.formatName = formatName;
+ return this;
+ }
+
+ public void writeToStream(OutputStream outputStream) throws IOException {
+ outputStream.flush();
+
+ String headerString = itemHeaderToString(this.header);
+
+ if (headerString == null) {
+ throw new IOException("Xrecord: failed to stream item header.");
+ }
+
+ outputStream.write(headerString.getBytes());
+
+ if (!isExtendedDataverseMetadataMode(formatName)) {
+ outputStream.write(METADATA_START_ELEMENT.getBytes());
+
+ outputStream.flush();
+
+ if (dataset != null && formatName != null) {
+ InputStream inputStream = null;
+ try {
+ inputStream = ExportService.getInstance().getExport(dataset, formatName);
+ } catch (ExportException ex) {
+ inputStream = null;
+ }
+
+ if (inputStream == null) {
+ throw new IOException("Xrecord: failed to open metadata stream.");
+ }
+ writeMetadataStream(inputStream, outputStream);
+ }
+ outputStream.write(METADATA_END_ELEMENT.getBytes());
+ } else {
+ outputStream.write(customMetadataExtensionRef(this.dataset.getGlobalId()).getBytes());
+ }
+ outputStream.flush();
+
+ }
+
+ private String itemHeaderToString(Header header) {
+ try {
+ ByteArrayOutputStream byteOutputStream = new ByteArrayOutputStream();
+ XmlWriter writer = new XmlWriter(byteOutputStream, defaultContext());
+
+ writer.writeStartElement(HEADER_FIELD);
+
+ if (header.getStatus() != null) {
+ writer.writeAttribute(STATUS_ATTRIBUTE, header.getStatus().value());
+ }
+ writer.writeElement(IDENTIFIER_FIELD, header.getIdentifier());
+ writer.writeElement(DATESTAMP_FIELD, header.getDatestamp());
+ for (String setSpec : header.getSetSpecs()) {
+ writer.writeElement(SETSPEC_FIELD, setSpec);
+ }
+ writer.writeEndElement(); // header
+ writer.flush();
+ writer.close();
+
+ String ret = byteOutputStream.toString();
+
+ return ret;
+ } catch (Exception ex) {
+ return null;
+ }
+ }
+
+ private void writeMetadataStream(InputStream inputStream, OutputStream outputStream) throws IOException {
+ int bufsize;
+ byte[] buffer = new byte[4 * 8192];
+
+ while ((bufsize = inputStream.read(buffer)) != -1) {
+ outputStream.write(buffer, 0, bufsize);
+ outputStream.flush();
+ }
+
+ inputStream.close();
+ }
+
+ private String customMetadataExtensionRef(String identifier) {
+ String ret = "<" + METADATA_FIELD
+ + " directApiCall=\""
+ + getDataverseSiteUrl()
+ + DATAVERSE_EXTENDED_METADATA_API
+ + "?exporter="
+ + DATAVERSE_EXTENDED_METADATA_FORMAT
+ + "&persistentId="
+ + identifier
+ + "\""
+ + "/>";
+
+ return ret;
+ }
+
+ private boolean isExtendedDataverseMetadataMode(String formatName) {
+ return DATAVERSE_EXTENDED_METADATA_FORMAT.equals(formatName);
+ }
+
+ private String getDataverseSiteUrl() {
+ String hostUrl = System.getProperty(SITE_URL);
+ if (hostUrl != null && !"".equals(hostUrl)) {
+ return hostUrl;
+ }
+ String hostName = System.getProperty(FQDN);
+ if (hostName == null) {
+ try {
+ hostName = InetAddress.getLocalHost().getCanonicalHostName();
+ } catch (UnknownHostException e) {
+ return null;
+ }
+ }
+ hostUrl = "https://" + hostName;
+ return hostUrl;
+ }
+}
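The chunked copy in `Xrecord.writeMetadataStream()` above is a plain buffered stream copy: read the pre-exported metadata in fixed-size chunks, write and flush each chunk so large records stream without being buffered whole in memory. A self-contained sketch (names are illustrative; the 4 * 8192 buffer size matches the code above):

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.io.UncheckedIOException;

// Sketch of the buffered metadata copy: fixed-size chunks, flushed as we go.
public class CopySketch {
    private static final int BUF_SIZE = 4 * 8192; // same 32 KB buffer as above

    public static long copy(InputStream in, OutputStream out) {
        try (InputStream source = in) { // close the export stream when done
            byte[] buffer = new byte[BUF_SIZE];
            long total = 0;
            int n;
            while ((n = source.read(buffer)) != -1) {
                out.write(buffer, 0, n);
                total += n;
            }
            out.flush();
            return total;
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }
}
```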
diff --git a/src/main/java/edu/harvard/iq/dataverse/harvest/server/xoai/XresumptionTokenHelper.java b/src/main/java/edu/harvard/iq/dataverse/harvest/server/xoai/XresumptionTokenHelper.java
new file mode 100644
index 00000000000..7f9eac2cbe8
--- /dev/null
+++ b/src/main/java/edu/harvard/iq/dataverse/harvest/server/xoai/XresumptionTokenHelper.java
@@ -0,0 +1,61 @@
+
+package edu.harvard.iq.dataverse.harvest.server.xoai;
+
+import com.lyncode.xoai.dataprovider.handlers.helpers.ResumptionTokenHelper;
+import com.lyncode.xoai.model.oaipmh.ResumptionToken;
+import static java.lang.Math.round;
+import static com.google.common.base.Predicates.isNull;
+
+/**
+ *
+ * @author Leonid Andreev
+ * Dataverse's own version of the XOAI ResumptionTokenHelper.
+ * Fixes the issue with the offset cursor: the OAI validation spec
+ * insists that it starts with 0, while the XOAI implementation uses 1
+ * as the initial offset.
+ */
+public class XresumptionTokenHelper {
+
+ private ResumptionToken.Value current;
+ private long maxPerPage;
+ private Long totalResults;
+
+ public XresumptionTokenHelper(ResumptionToken.Value current, long maxPerPage) {
+ this.current = current;
+ this.maxPerPage = maxPerPage;
+ }
+
+ public XresumptionTokenHelper withTotalResults(long totalResults) {
+ this.totalResults = totalResults;
+ return this;
+ }
+
+ public ResumptionToken resolve (boolean hasMoreResults) {
+ if (isInitialOffset() && !hasMoreResults) return null;
+ else {
+ if (hasMoreResults) {
+ ResumptionToken.Value next = current.next(maxPerPage);
+ return populate(new ResumptionToken(next));
+ } else {
+ ResumptionToken resumptionToken = new ResumptionToken();
+ resumptionToken.withCursor(round((current.getOffset()) / maxPerPage));
+ if (totalResults != null)
+ resumptionToken.withCompleteListSize(totalResults);
+ return resumptionToken;
+ }
+ }
+ }
+
+ private boolean isInitialOffset() {
+ return isNull().apply(current.getOffset()) || current.getOffset() == 0;
+ }
+
+ private ResumptionToken populate(ResumptionToken resumptionToken) {
+ if (totalResults != null)
+ resumptionToken.withCompleteListSize(totalResults);
+ resumptionToken.withCursor(round((resumptionToken.getValue().getOffset() - maxPerPage)/ maxPerPage));
+ return resumptionToken;
+ }
+
+
+}
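The cursor fix above boils down to simple arithmetic: OAI-PMH expects the resumptionToken cursor to start at 0, so the cursor for a page is just `offset / maxPerPage`, with a missing offset treated as page 0. A minimal sketch of that arithmetic (class name is illustrative):

```java
// Sketch of the cursor arithmetic behind the helper: cursor starts at 0,
// and equals offset / maxPerPage (integer division) for later pages.
public class CursorSketch {
    public static long cursor(Long offset, long maxPerPage) {
        long off = (offset == null) ? 0L : offset;
        return off / maxPerPage;
    }

    // Mirrors isInitialOffset(): a null or zero offset means the first page.
    public static boolean isInitialOffset(Long offset) {
        return offset == null || offset == 0;
    }
}
```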
diff --git a/src/main/java/edu/harvard/iq/dataverse/harvest/server/web/XOAISetRepository.java b/src/main/java/edu/harvard/iq/dataverse/harvest/server/xoai/XsetRepository.java
similarity index 86%
rename from src/main/java/edu/harvard/iq/dataverse/harvest/server/web/XOAISetRepository.java
rename to src/main/java/edu/harvard/iq/dataverse/harvest/server/xoai/XsetRepository.java
index ae4b8eb8623..d932e160007 100644
--- a/src/main/java/edu/harvard/iq/dataverse/harvest/server/web/XOAISetRepository.java
+++ b/src/main/java/edu/harvard/iq/dataverse/harvest/server/xoai/XsetRepository.java
@@ -3,7 +3,7 @@
* To change this template file, choose Tools | Templates
* and open the template in the editor.
*/
-package edu.harvard.iq.dataverse.harvest.server.web;
+package edu.harvard.iq.dataverse.harvest.server.xoai;
import com.lyncode.xoai.model.xoai.Element;
import com.lyncode.xoai.dataprovider.repository.SetRepository;
@@ -16,19 +16,17 @@
import java.util.ArrayList;
import java.util.List;
import java.util.logging.Logger;
-import org.apache.commons.logging.Log;
-import org.apache.commons.logging.LogFactory;
/**
*
* @author Leonid Andreev
*/
-public class XOAISetRepository implements SetRepository {
- private static Logger logger = Logger.getLogger("edu.harvard.iq.dataverse.harvest.server.web.XOAISetRepository");
+public class XsetRepository implements SetRepository {
+ private static Logger logger = Logger.getLogger("edu.harvard.iq.dataverse.harvest.server.xoai.XsetRepository");
private OAISetServiceBean setService;
- public XOAISetRepository (OAISetServiceBean setService) {
+ public XsetRepository (OAISetServiceBean setService) {
super();
this.setService = setService;
}
@@ -44,7 +42,7 @@ public void setSetService(OAISetServiceBean setService) {
@Override
public boolean supportSets() {
- logger.info("calling supportSets()");
+ logger.fine("calling supportSets()");
List<OAISet> dataverseOAISets = setService.findAll();
if (dataverseOAISets == null || dataverseOAISets.isEmpty()) {
@@ -55,7 +53,7 @@ public boolean supportSets() {
@Override
public ListSetsResult retrieveSets(int offset, int length) {
- logger.info("calling retrieveSets()");
+ logger.fine("calling retrieveSets()");
List<OAISet> dataverseOAISets = setService.findAll();
List<Set> XOAISets = new ArrayList<>();
diff --git a/src/main/java/edu/harvard/iq/dataverse/ingest/IngestServiceBean.java b/src/main/java/edu/harvard/iq/dataverse/ingest/IngestServiceBean.java
index 77dbeebdf22..324bebc8cd6 100644
--- a/src/main/java/edu/harvard/iq/dataverse/ingest/IngestServiceBean.java
+++ b/src/main/java/edu/harvard/iq/dataverse/ingest/IngestServiceBean.java
@@ -20,6 +20,7 @@
package edu.harvard.iq.dataverse.ingest;
+import com.google.common.collect.Lists;
import edu.harvard.iq.dataverse.ControlledVocabularyValue;
import edu.harvard.iq.dataverse.datavariable.VariableServiceBean;
import edu.harvard.iq.dataverse.DatasetServiceBean;
@@ -37,7 +38,6 @@
import edu.harvard.iq.dataverse.MetadataBlock;
import edu.harvard.iq.dataverse.authorization.users.AuthenticatedUser;
import edu.harvard.iq.dataverse.dataaccess.DataFileIO;
-import edu.harvard.iq.dataverse.dataaccess.DataAccessOption;
import edu.harvard.iq.dataverse.dataaccess.FileAccessIO;
import edu.harvard.iq.dataverse.dataaccess.ImageThumbConverter;
import edu.harvard.iq.dataverse.dataaccess.TabularSubsetGenerator;
@@ -62,36 +62,27 @@
import edu.harvard.iq.dataverse.ingest.tabulardata.impl.plugins.por.PORFileReader;
import edu.harvard.iq.dataverse.ingest.tabulardata.impl.plugins.por.PORFileReaderSpi;
import edu.harvard.iq.dataverse.util.FileUtil;
-import edu.harvard.iq.dataverse.util.MD5Checksum;
-import edu.harvard.iq.dataverse.util.ShapefileHandler;
import edu.harvard.iq.dataverse.util.SumStatCalculator;
import edu.harvard.iq.dataverse.util.SystemConfig;
//import edu.harvard.iq.dvn.unf.*;
import org.dataverse.unf.*;
import java.io.BufferedInputStream;
-import java.io.BufferedOutputStream;
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.FileNotFoundException;
-import java.io.FileOutputStream;
import java.io.InputStream;
-import java.nio.channels.Channels;
import java.nio.channels.FileChannel;
-import java.nio.channels.ReadableByteChannel;
import java.nio.channels.WritableByteChannel;
-import java.nio.charset.Charset;
+import java.nio.file.DirectoryStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;
-import java.sql.Timestamp;
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.ArrayList;
import java.util.Collection;
-import java.util.Date;
-import java.util.HashSet;
import java.util.List;
import java.util.Iterator;
import java.util.LinkedHashSet;
@@ -99,7 +90,6 @@
import java.util.Set;
import java.util.Arrays;
import java.util.Comparator;
-import java.util.logging.Level;
import java.util.logging.Logger;
import javax.ejb.EJB;
import javax.ejb.Stateless;
@@ -114,17 +104,6 @@
import javax.jms.Message;
import javax.faces.bean.ManagedBean;
import javax.faces.application.FacesMessage;
-import java.util.zip.GZIPInputStream;
-import java.util.zip.ZipEntry;
-import java.util.zip.ZipException;
-import java.util.zip.ZipInputStream;
-import javax.annotation.PostConstruct;
-import javax.ejb.EJBException;
-import javax.ejb.Singleton;
-import javax.ejb.Startup;
-import org.apache.commons.io.FileUtils;
-import org.primefaces.push.EventBus;
-import org.primefaces.push.EventBusFactory;
/**
*
@@ -154,720 +133,39 @@ public class IngestServiceBean {
@Resource(mappedName = "jms/IngestQueueConnectionFactory")
QueueConnectionFactory factory;
- // TODO: [in process!]
- // move all the type-related lookups into the file service (L.A.)
-
- private static final String MIME_TYPE_STATA = "application/x-stata";
- private static final String MIME_TYPE_STATA13 = "application/x-stata-13";
- private static final String MIME_TYPE_RDATA = "application/x-rlang-transport";
-
- private static final String MIME_TYPE_CSV = "text/csv";
- private static final String MIME_TYPE_CSV_ALT = "text/comma-separated-values";
-
- private static final String MIME_TYPE_XLSX = "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet";
- private static final String MIME_TYPE_SPSS_SAV = "application/x-spss-sav";
- private static final String MIME_TYPE_SPSS_POR = "application/x-spss-por";
-
- private static final String MIME_TYPE_TAB = "text/tab-separated-values";
-
- private static final String MIME_TYPE_FITS = "application/fits";
-
- private static final String MIME_TYPE_ZIP = "application/zip";
-
- private static final String MIME_TYPE_UNDETERMINED_DEFAULT = "application/octet-stream";
- private static final String MIME_TYPE_UNDETERMINED_BINARY = "application/binary";
-
- private static final String SAVED_ORIGINAL_FILENAME_EXTENSION = "orig";
-
private static String timeFormat_hmsS = "HH:mm:ss.SSS";
private static String dateTimeFormat_ymdhmsS = "yyyy-MM-dd HH:mm:ss.SSS";
private static String dateFormat_ymd = "yyyy-MM-dd";
-
- /*
- Commenting out the @PostConstruct/init method.
- This was going through the datasets on startup and looking for ingests
- in progress, un-marking the progress status.
- This was before we realized that the JMS queue survived glassfish
- restarts.
- It appears that any purging of the queue will need to be done outside
- the application.
- -- L.A. May 4 2015
-
- @PostConstruct
- public void init() {
- logger.info("Initializing the Ingest Service.");
- try {
- List ingestsInProgress = fileService.findIngestsInProgress();
- if (ingestsInProgress != null && ingestsInProgress.size() > 0) {
- logger.log(Level.INFO, "Ingest Service: {0} files are in the queue.", ingestsInProgress.size());
- // go through the queue, remove the "ingest in progress" flags and the
- // any dataset locks found:
- Iterator dfit = ingestsInProgress.iterator();
- while (dfit.hasNext()) {
- DataFile datafile = (DataFile)dfit.next();
- logger.log(Level.INFO, "Ingest Service: removing ingest-in-progress status on datafile {0}", datafile.getId());
- datafile.setIngestDone();
- datafile = fileService.save(datafile);
-
- if (datafile.getOwner() != null && datafile.getOwner().isLocked()) {
- if (datafile.getOwner().getId() != null) {
- logger.log(Level.FINE, "Ingest Service: removing lock on dataset {0}", datafile.getOwner().getId());
- datasetService.removeDatasetLock(datafile.getOwner().getId());
- }
- }
- }
- } else {
- logger.info("Ingest Service: zero files in the ingest queue.");
- }
- } catch ( EJBException ex ) {
- logger.log(Level.WARNING, "Error initing the IngestServiceBean: {0}", ex.getMessage());
- }
- }
- */
-
- @Deprecated
- // All the parts of the app should use the createDataFiles() method instead,
- // that returns a list of DataFiles.
- public DataFile createDataFile(DatasetVersion version, InputStream inputStream, String fileName, String contentType) throws IOException {
- List fileList = createDataFiles(version, inputStream, fileName, contentType);
-
- if (fileList == null) {
- return null;
- }
-
- return fileList.get(0);
- }
-
- public List createDataFiles(DatasetVersion version, InputStream inputStream, String fileName, String suppliedContentType) throws IOException {
- List datafiles = new ArrayList();
-
- String warningMessage = null;
-
- // save the file, in the temporary location for now:
- Path tempFile = null;
-
-
- if (getFilesTempDirectory() != null) {
- tempFile = Files.createTempFile(Paths.get(getFilesTempDirectory()), "tmp", "upload");
- // "temporary" location is the key here; this is why we are not using
- // the DataStore framework for this - the assumption is that
- // temp files will always be stored on the local filesystem.
- // -- L.A. Jul. 2014
- logger.fine("Will attempt to save the file as: " + tempFile.toString());
- Files.copy(inputStream, tempFile, StandardCopyOption.REPLACE_EXISTING);
- } else {
- throw new IOException ("Temp directory is not configured.");
- }
- logger.fine("mime type supplied: "+suppliedContentType);
- // Let's try our own utilities (Jhove, etc.) to determine the file type
- // of the uploaded file. (We may already have a mime type supplied for this
- // file - maybe the type that the browser recognized on upload; or, if
- // it's a harvest, maybe the remote server has already given us the type
- // for this file... with our own type utility we may or may not do better
- // than the type supplied:
- // -- L.A.
- String recognizedType = null;
- String finalType = null;
- try {
- recognizedType = FileUtil.determineFileType(tempFile.toFile(), fileName);
- logger.fine("File utility recognized the file as " + recognizedType);
- if (recognizedType != null && !recognizedType.equals("")) {
- // is it any better than the type that was supplied to us,
- // if any?
- // This is not as trivial a task as one might expect...
- // We may need a list of "good" mime types, that should always
- // be chosen over other choices available. Maybe it should
- // even be a weighed list... as in, "application/foo" should
- // be chosen over "application/foo-with-bells-and-whistles".
-
- // For now the logic will be as follows:
- //
- // 1. If the contentType supplied (by the browser, most likely)
- // is some form of "unknown", we always discard it in favor of
- // whatever our own utilities have determined;
- // 2. We should NEVER trust the browser when it comes to the
- // following "ingestable" types: Stata, SPSS, R;
- // 2a. We are willing to TRUST the browser when it comes to
- // the CSV and XLSX ingestable types.
- // 3. We should ALWAYS trust our utilities when it comes to
- // ingestable types.
-
- if (suppliedContentType == null
- || suppliedContentType.equals("")
- || suppliedContentType.equalsIgnoreCase(MIME_TYPE_UNDETERMINED_DEFAULT)
- || suppliedContentType.equalsIgnoreCase(MIME_TYPE_UNDETERMINED_BINARY)
- || (ingestableAsTabular(suppliedContentType)
- && !suppliedContentType.equalsIgnoreCase(MIME_TYPE_CSV)
- && !suppliedContentType.equalsIgnoreCase(MIME_TYPE_CSV_ALT)
- && !suppliedContentType.equalsIgnoreCase(MIME_TYPE_XLSX))
- || ingestableAsTabular(recognizedType)
- || recognizedType.equals("application/fits-gzipped")
- || recognizedType.equalsIgnoreCase(ShapefileHandler.SHAPEFILE_FILE_TYPE)
- || recognizedType.equals(MIME_TYPE_ZIP)) {
- finalType = recognizedType;
- }
- }
-
- } catch (Exception ex) {
- logger.warning("Failed to run the file utility mime type check on file " + fileName);
- }
-
- if (finalType == null) {
- finalType = (suppliedContentType == null || suppliedContentType.equals(""))
- ? MIME_TYPE_UNDETERMINED_DEFAULT
- : suppliedContentType;
- }
-
- // A few special cases:
-
- // if this is a gzipped FITS file, we'll uncompress it, and ingest it as
- // a regular FITS file:
-
- if (finalType.equals("application/fits-gzipped")) {
-
- InputStream uncompressedIn = null;
- String finalFileName = fileName;
- // if the file name had the ".gz" extension, remove it,
- // since we are going to uncompress it:
- if (fileName != null && fileName.matches(".*\\.gz$")) {
- finalFileName = fileName.replaceAll("\\.gz$", "");
- }
-
- DataFile datafile = null;
- try {
- uncompressedIn = new GZIPInputStream(new FileInputStream(tempFile.toFile()));
- datafile = createSingleDataFile(version, uncompressedIn, finalFileName, MIME_TYPE_UNDETERMINED_DEFAULT);
- } catch (IOException ioex) {
- datafile = null;
- } finally {
- if (uncompressedIn != null) {
- try {uncompressedIn.close();} catch (IOException e) {}
- }
- }
-
- // If we were able to produce an uncompressed file, we'll use it
- // to create and return a final DataFile; if not, we're not going
- // to do anything - and then a new DataFile will be created further
- // down, from the original, uncompressed file.
- if (datafile != null) {
- // remove the compressed temp file:
- try {
- tempFile.toFile().delete();
- } catch (SecurityException ex) {
- // (this is very non-fatal)
- logger.warning("Failed to delete temporary file "+tempFile.toString());
- }
-
- datafiles.add(datafile);
- return datafiles;
- }
-
- // If it's a ZIP file, we are going to unpack it and create multiple
- // DataFile objects from its contents:
- } else if (finalType.equals("application/zip")) {
-
- ZipInputStream unZippedIn = null;
- ZipEntry zipEntry = null;
-
- int fileNumberLimit = systemConfig.getZipUploadFilesLimit();
-
- try {
- Charset charset = null;
- /*
- TODO: (?)
- We may want to investigate somehow letting the user specify
- the charset for the filenames in the zip file...
- - otherwise, ZipInputStream bails out if it encounters a file
- name that's not valid in the current charset (i.e., UTF-8, in
- our case). It would be a bit trickier than what we're doing for
- SPSS tabular ingests - with the lang. encoding pulldown menu -
- because this encoding needs to be specified *before* we upload and
- attempt to unzip the file.
- -- L.A. 4.0 beta12
- logger.info("default charset is "+Charset.defaultCharset().name());
- if (Charset.isSupported("US-ASCII")) {
- logger.info("charset US-ASCII is supported.");
- charset = Charset.forName("US-ASCII");
- if (charset != null) {
- logger.info("was able to obtain charset for US-ASCII");
- }
-
- }
- */
-
- if (charset != null) {
- unZippedIn = new ZipInputStream(new FileInputStream(tempFile.toFile()), charset);
- } else {
- unZippedIn = new ZipInputStream(new FileInputStream(tempFile.toFile()));
- }
-
- while (true) {
- try {
- zipEntry = unZippedIn.getNextEntry();
- } catch (IllegalArgumentException iaex) {
- // Note:
- // ZipInputStream documentation doesn't even mention that
- // getNextEntry() throws an IllegalArgumentException!
- // but that's what happens if the file name of the next
- // entry is not valid in the current CharSet.
- // -- L.A.
- warningMessage = "Failed to unpack Zip file. (Unknown Character Set used in a file name?) Saving the file as is.";
- logger.warning(warningMessage);
- throw new IOException();
- }
-
- if (zipEntry == null) {
- break;
- }
- // Note that some zip entries may be directories - we
- // simply skip them:
-
- if (!zipEntry.isDirectory()) {
- if (datafiles.size() > fileNumberLimit) {
- logger.warning("Zip upload - too many files.");
- warningMessage = "The number of files in the zip archive is over the limit (" + fileNumberLimit +
- "); please upload a zip archive with fewer files, if you want them to be ingested " +
- "as individual DataFiles.";
- throw new IOException();
- }
-
- String fileEntryName = zipEntry.getName();
- logger.fine("ZipEntry, file: "+fileEntryName);
-
- if (fileEntryName != null && !fileEntryName.equals("")) {
-
- String shortName = fileEntryName.replaceFirst("^.*[\\/]", "");
-
- // Check if it's a "fake" file - a zip archive entry
- // created for a MacOS X filesystem element: (these
- // start with "._")
- if (!shortName.startsWith("._") && !shortName.startsWith(".DS_Store") && !"".equals(shortName)) {
- // OK, this seems like an OK file entry - we'll try
- // to read it and create a DataFile with it:
-
- DataFile datafile = createSingleDataFile(version, unZippedIn, shortName, MIME_TYPE_UNDETERMINED_DEFAULT, false);
-
- if (!fileEntryName.equals(shortName)) {
- String categoryName = fileEntryName.replaceFirst("[\\/][^\\/]*$", "");
- if (!"".equals(categoryName)) {
- logger.fine("setting category to " + categoryName);
- //datafile.getFileMetadata().setCategory(categoryName.replaceAll("[\\/]", "-"));
- datafile.getFileMetadata().addCategoryByName(categoryName.replaceAll("[\\/]", "-"));
- }
- }
-
- if (datafile != null) {
- // We have created this datafile with the mime type "unknown";
- // Now that we have it saved in a temporary location,
- // let's try and determine its real type:
-
- String tempFileName = getFilesTempDirectory() + "/" + datafile.getStorageIdentifier();
-
- try {
- recognizedType = FileUtil.determineFileType(new File(tempFileName), shortName);
- logger.fine("File utility recognized unzipped file as " + recognizedType);
- if (recognizedType != null && !recognizedType.equals("")) {
- datafile.setContentType(recognizedType);
- }
- } catch (Exception ex) {
- logger.warning("Failed to run the file utility mime type check on file " + fileName);
- }
-
- datafiles.add(datafile);
- }
- }
- }
- }
- unZippedIn.closeEntry();
-
- }
-
- } catch (IOException ioex) {
- // just clear the datafiles list and let
- // ingest default to creating a single DataFile out
- // of the unzipped file.
- logger.warning("Unzipping failed; rolling back to saving the file as is.");
- if (warningMessage == null) {
- warningMessage = "Failed to unzip the file. Saving the file as is.";
- }
-
- datafiles.clear();
- } finally {
- if (unZippedIn != null) {
- try {unZippedIn.close();} catch (Exception zEx) {}
- }
- }
- if (datafiles.size() > 0) {
- // link the data files to the dataset/version:
- Iterator itf = datafiles.iterator();
- while (itf.hasNext()) {
- DataFile datafile = itf.next();
- datafile.setOwner(version.getDataset());
- if (version.getFileMetadatas() == null) {
- version.setFileMetadatas(new ArrayList());
- }
- version.getFileMetadatas().add(datafile.getFileMetadata());
- datafile.getFileMetadata().setDatasetVersion(version);
-
- /* TODO!!
- // re-implement this in some way that does not use the
- // deprecated .getCategory() on FileMetadata:
- if (datafile.getFileMetadata().getCategory() != null) {
- datafile.getFileMetadata().addCategoryByName(datafile.getFileMetadata().getCategory());
- datafile.getFileMetadata().setCategory(null);
- -- done? see above?
- }
- */
- version.getDataset().getFiles().add(datafile);
- }
- // remove the uploaded zip file:
- try {
- Files.delete(tempFile);
- } catch (IOException ioex) {
- // do nothing - it's just a temp file.
- logger.warning("Could not remove temp file "+tempFile.getFileName().toString());
- }
- // and return:
- return datafiles;
- }
-
- } else if (finalType.equalsIgnoreCase(ShapefileHandler.SHAPEFILE_FILE_TYPE)) {
- // Shape files may have to be split into multiple files,
- // one zip archive per each complete set of shape files:
-
- //File rezipFolder = new File(this.getFilesTempDirectory());
- File rezipFolder = this.getShapefileUnzipTempDirectory();
-
- IngestServiceShapefileHelper shpIngestHelper;
- shpIngestHelper = new IngestServiceShapefileHelper(tempFile.toFile(), rezipFolder);
-
- boolean didProcessWork = shpIngestHelper.processFile();
- if (!(didProcessWork)){
- logger.severe("Processing of zipped shapefile failed.");
- return null;
- }
- for (File finalFile : shpIngestHelper.getFinalRezippedFiles()){
- FileInputStream finalFileInputStream = new FileInputStream(finalFile);
- finalType = this.getContentType(finalFile);
- if (finalType==null){
- logger.warning("Content type is null; but should default to 'MIME_TYPE_UNDETERMINED_DEFAULT'");
- continue;
- }
- DataFile new_datafile = createSingleDataFile(version, finalFileInputStream, finalFile.getName(), finalType);
- if (new_datafile != null) {
- datafiles.add(new_datafile);
- }else{
- logger.severe("Could not add part of rezipped shapefile. new_datafile was null: " + finalFile.getName());
- }
- finalFileInputStream.close();
-
- }
-
- // Delete the temp directory used for unzipping
- /*
- logger.fine("Delete temp shapefile unzip directory: " + rezipFolder.getAbsolutePath());
- FileUtils.deleteDirectory(rezipFolder);
-
- // Delete rezipped files
- for (File finalFile : shpIngestHelper.getFinalRezippedFiles()){
- if (finalFile.isFile()){
- finalFile.delete();
- }
- }
- */
-
- if (datafiles.size() > 0) {
- return datafiles;
- }else{
- logger.severe("No files added from directory of rezipped shapefiles");
- }
- return null;
-
- }
-
-
- // Finally, if none of the special cases above were applicable (or
- // if we were unable to unpack an uploaded file, etc.), we'll just
- // create and return a single DataFile:
- // (Note that we are passing null for the InputStream; that's because
- // we already have the file saved; we'll just need to rename it, below)
-
- DataFile datafile = createSingleDataFile(version, null, fileName, finalType);
-
- if (datafile != null) {
- fileService.generateStorageIdentifier(datafile);
- if (!tempFile.toFile().renameTo(new File(getFilesTempDirectory() + "/" + datafile.getStorageIdentifier()))) {
- return null;
- }
-
- // MD5:
- MD5Checksum md5Checksum = new MD5Checksum();
- try {
- datafile.setmd5(md5Checksum.CalculateMD5(getFilesTempDirectory() + "/" + datafile.getStorageIdentifier()));
- } catch (Exception md5ex) {
- logger.warning("Could not calculate MD5 signature for new file " + fileName);
- }
-
- if (warningMessage != null) {
- createIngestFailureReport(datafile, warningMessage);
- datafile.SetIngestProblem();
- }
- datafiles.add(datafile);
-
- return datafiles;
- }
-
- return null;
- } // end createDataFiles
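Reviewer note: the type-selection rules spelled out in the comment block inside the removed createDataFiles() (discard "unknown" browser types; never trust the browser for Stata/SPSS/R; accept browser-supplied CSV/XLSX; always prefer our detector for ingestable types) can be sketched in isolation. This is a hypothetical, simplified illustration, not the production code; the boolean flags stand in for the service's own `ingestableAsTabular()` checks:

```java
// Hypothetical sketch of the MIME-type priority rules described in the
// comments above. Not the real implementation.
public class TypeChooser {
    public static final String UNDETERMINED = "application/octet-stream";

    // A supplied type counts as "unknown" if missing or a generic binary type.
    static boolean unknown(String t) {
        return t == null || t.isEmpty()
                || t.equalsIgnoreCase("application/octet-stream")
                || t.equalsIgnoreCase("application/binary");
    }

    // suppliedIngestable / recognizedIngestable stand in for ingestableAsTabular();
    // suppliedBrowserTrusted is true for the CSV/XLSX types we accept from browsers.
    public static String choose(String supplied, String recognized,
                                boolean suppliedIngestable,
                                boolean suppliedBrowserTrusted,
                                boolean recognizedIngestable) {
        String fallback = unknown(supplied) ? UNDETERMINED : supplied;
        if (recognized == null || recognized.isEmpty()) {
            return fallback; // our detector found nothing; keep the supplied type
        }
        if (unknown(supplied)                                   // rule 1
                || (suppliedIngestable && !suppliedBrowserTrusted) // rule 2 (2a exempts CSV/XLSX)
                || recognizedIngestable) {                      // rule 3
            return recognized;
        }
        return fallback;
    }
}
```

Under these rules a browser-supplied `text/plain` survives unless our detector finds something ingestable, while a browser-supplied Stata type is always re-checked.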
-
- // TODO:
- // add comments explaining what's going on in the 2 methods below.
- // -- L.A. 4.0 beta
- private String checkForDuplicateFileNames(DatasetVersion version, String fileName) {
- Set fileNamesExisting = new HashSet();
-
- Iterator fmIt = version.getFileMetadatas().iterator();
- while (fmIt.hasNext()) {
- FileMetadata fm = fmIt.next();
- String existingName = fm.getLabel();
-
- if (existingName != null) {
- // if it's a tabular file, we need to restore the original file name;
- // otherwise, we may miss a match. e.g. stata file foobar.dta becomes
- // foobar.tab once ingested!
- if (fm.getDataFile().isTabularData()) {
- String originalMimeType = fm.getDataFile().getDataTable().getOriginalFileFormat();
- if ( originalMimeType != null) {
- String origFileExtension = generateOriginalExtension(originalMimeType);
- fileNamesExisting.add(existingName.replaceAll(".tab$", origFileExtension));
- } else {
- fileNamesExisting.add(existingName.replaceAll(".tab$", ""));
- }
- }
- fileNamesExisting.add(existingName);
- }
- }
-
- while (fileNamesExisting.contains(fileName)) {
- fileName = generateNewFileName(fileName);
- }
-
- return fileName;
- }
-
- private void checkForDuplicateFileNamesFinal(DatasetVersion version, List newFiles) {
- Set fileNamesExisting = new HashSet();
-
- Iterator fmIt = version.getFileMetadatas().iterator();
- while (fmIt.hasNext()) {
- FileMetadata fm = fmIt.next();
- if (fm.getId() != null) {
- String existingName = fm.getLabel();
-
- if (existingName != null) {
- // if it's a tabular file, we need to restore the original file name;
- // otherwise, we may miss a match. e.g. stata file foobar.dta becomes
- // foobar.tab once ingested!
- if (fm.getDataFile().isTabularData()) {
- String originalMimeType = fm.getDataFile().getDataTable().getOriginalFileFormat();
- if ( originalMimeType != null) {
- String origFileExtension = generateOriginalExtension(originalMimeType);
- existingName = existingName.replaceAll(".tab$", origFileExtension);
- } else {
- existingName = existingName.replaceAll(".tab$", "");
- }
- }
- fileNamesExisting.add(existingName);
- }
- }
- }
-
- Iterator dfIt = newFiles.iterator();
- while (dfIt.hasNext()) {
- FileMetadata fm = dfIt.next().getFileMetadata();
- String fileName = fm.getLabel();
- while (fileNamesExisting.contains(fileName)) {
- fileName = generateNewFileName(fileName);
- }
- if (!fm.getLabel().equals(fileName)) {
- fm.setLabel(fileName);
- fileNamesExisting.add(fileName);
- }
- }
- }
-
- // TODO:
- // Move this method (duplicated in StoredOriginalFile.java) to
- // FileUtil.java.
- // -- L.A. 4.0 beta
-
- private static String generateOriginalExtension(String fileType) {
-
- if (fileType.equalsIgnoreCase("application/x-spss-sav")) {
- return ".sav";
- } else if (fileType.equalsIgnoreCase("application/x-spss-por")) {
- return ".por";
- } else if (fileType.equalsIgnoreCase("application/x-stata")) {
- return ".dta";
- } else if (fileType.equalsIgnoreCase( "application/x-rlang-transport")) {
- return ".RData";
- } else if (fileType.equalsIgnoreCase("text/csv")) {
- return ".csv";
- } else if (fileType.equalsIgnoreCase( "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet")) {
- return ".xlsx";
- }
-
- return "";
- }
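Reviewer note: the removed generateOriginalExtension() if/else chain (which the TODO above says is duplicated in StoredOriginalFile.java) is equivalent to a simple case-insensitive lookup table. A hedged sketch of that equivalence, with hypothetical class and method names:

```java
import java.util.Map;

// Sketch of the original-extension lookup as a map; behaviorally equivalent
// to the if/else chain above: unknown or null types map to "".
public class OrigExt {
    private static final Map<String, String> EXT = Map.of(
        "application/x-spss-sav", ".sav",
        "application/x-spss-por", ".por",
        "application/x-stata", ".dta",
        "application/x-rlang-transport", ".RData",
        "text/csv", ".csv",
        "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet", ".xlsx");

    public static String forType(String mimeType) {
        if (mimeType == null) {
            return "";
        }
        return EXT.getOrDefault(mimeType.toLowerCase(), "");
    }
}
```

Centralizing the table in FileUtil (per the TODO) would remove the duplication in one place.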
-
-
- private String generateNewFileName(String fileName) {
- String newName = null;
- String baseName = null;
- String extension = null;
-
- int extensionIndex = fileName.lastIndexOf(".");
- if (extensionIndex != -1 ) {
- extension = fileName.substring(extensionIndex+1);
- baseName = fileName.substring(0, extensionIndex);
- } else {
- baseName = fileName;
- }
-
- if (baseName.matches(".*-[0-9][0-9]*$")) {
- int dashIndex = baseName.lastIndexOf("-");
- String numSuffix = baseName.substring(dashIndex+1);
- String basePrefix = baseName.substring(0,dashIndex);
- int numSuffixValue = new Integer(numSuffix).intValue();
- numSuffixValue++;
- baseName = basePrefix + "-" + numSuffixValue;
- } else {
- baseName = baseName + "-1";
- }
-
- newName = baseName;
- if (extension != null) {
- newName = newName + "." + extension;
- }
-
- return newName;
- }
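Reviewer note: the duplicate-name convention implemented by the removed generateNewFileName() (`data.csv` → `data-1.csv` → `data-2.csv`; extensionless names get a bare `-1` suffix) can be re-stated as a compact standalone helper. This is an illustrative re-implementation, not the production method:

```java
// Simplified sketch of the duplicate-filename renaming convention above.
public class FileNameDedup {
    public static String next(String fileName) {
        String baseName;
        String extension = null;
        int dot = fileName.lastIndexOf('.');
        if (dot != -1) {
            extension = fileName.substring(dot + 1);
            baseName = fileName.substring(0, dot);
        } else {
            baseName = fileName;
        }
        if (baseName.matches(".*-[0-9]+$")) {
            // already has a numeric suffix; increment it
            int dash = baseName.lastIndexOf('-');
            int n = Integer.parseInt(baseName.substring(dash + 1)) + 1;
            baseName = baseName.substring(0, dash) + "-" + n;
        } else {
            baseName = baseName + "-1";
        }
        return extension != null ? baseName + "." + extension : baseName;
    }
}
```

The callers above loop this until the candidate no longer collides with the set of existing labels.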
-
- /**
- * Returns a content type string for a FileObject
- *
- */
- private String getContentType(File fileObject){
- if (fileObject==null){
- return null;
- }
- String contentType;
- try {
- contentType = FileUtil.determineFileType(fileObject, fileObject.getName());
- } catch (Exception ex) {
- logger.warning("FileUtil.determineFileType failed for file with name: " + fileObject.getName());
- contentType = null;
- }
+ // addFilesToDataset() takes a list of new DataFiles and attaches them to the parent
+ // Dataset (the files are attached to the dataset, and the fileMetadatas to the
+ // supplied version).
+ public void addFilesToDataset(DatasetVersion version, List newFiles) {
+ if (newFiles != null && newFiles.size() > 0) {
- if ((contentType==null)||(contentType.equals(""))){
- contentType = MIME_TYPE_UNDETERMINED_DEFAULT;
- }
- return contentType;
-
- }
- /*
- * This method creates a DataFile, and also saves the bytes from the supplied
- * InputStream in the temporary location.
- * This method should only be called by the upper-level methods that handle
- * file upload and creation for individual use cases - a single file upload,
- * an upload of a zip archive that needs to be unpacked and turned into
- * individual files, etc., and once the file name and mime type have already
- * been figured out.
- */
-
- private DataFile createSingleDataFile(DatasetVersion version, InputStream inputStream, String fileName, String contentType) {
- return createSingleDataFile(version, inputStream, fileName, contentType, true);
- }
-
- private DataFile createSingleDataFile(DatasetVersion version, InputStream inputStream, String fileName, String contentType, boolean addToDataset) {
-
- DataFile datafile = new DataFile(contentType);
- datafile.setModificationTime(new Timestamp(new Date().getTime()));
- /**
- * @todo Think more about when permissions on files are modified.
- * Obviously, here at create time files have some sort of permissions,
- * even if these permissions are *implied*, by ViewUnpublishedDataset at
- * the dataset level, for example.
- */
- datafile.setPermissionModificationTime(new Timestamp(new Date().getTime()));
- FileMetadata fmd = new FileMetadata();
-
- fmd.setLabel(checkForDuplicateFileNames(version,fileName));
+ Dataset dataset = version.getDataset();
- if (addToDataset) {
- datafile.setOwner(version.getDataset());
- }
- fmd.setDataFile(datafile);
- datafile.getFileMetadatas().add(fmd);
- if (addToDataset) {
- if (version.getFileMetadatas() == null) {
- version.setFileMetadatas(new ArrayList());
- }
- version.getFileMetadatas().add(fmd);
- fmd.setDatasetVersion(version);
- version.getDataset().getFiles().add(datafile);
- }
+ for (DataFile dataFile : newFiles) {
- // And save the file - but only if the InputStream is not null;
- // (the temp file may be saved already - if this is a single
- // file upload case - and in that case this method gets called
- // with null for the inputStream)
-
- if (inputStream != null) {
-
- fileService.generateStorageIdentifier(datafile);
- BufferedOutputStream outputStream = null;
-
- // Once again, at this point we are dealing with *temp*
- // files only; these are always stored on the local filesystem,
- // so we are using FileInput/Output Streams to read and write
- // these directly, instead of going through the Data Access
- // framework.
- // -- L.A.
-
- try {
- outputStream = new BufferedOutputStream(new FileOutputStream(getFilesTempDirectory() + "/" + datafile.getStorageIdentifier()));
+ // These are all brand new files, so they should all have
+ // one filemetadata total. -- L.A.
+ FileMetadata fileMetadata = dataFile.getFileMetadatas().get(0);
+ String fileName = fileMetadata.getLabel();
- byte[] dataBuffer = new byte[8192];
- int i = 0;
+ // Attach the file to the dataset and to the version:
+ dataFile.setOwner(dataset);
- while ((i = inputStream.read(dataBuffer)) > 0) {
- outputStream.write(dataBuffer, 0, i);
- outputStream.flush();
- }
- } catch (IOException ioex) {
- datafile = null;
- } finally {
- try {
- outputStream.close();
- } catch (IOException ioex) {}
- }
-
- // MD5:
- if (datafile != null) {
- MD5Checksum md5Checksum = new MD5Checksum();
- try {
- datafile.setmd5(md5Checksum.CalculateMD5(getFilesTempDirectory() + "/" + datafile.getStorageIdentifier()));
- } catch (Exception md5ex) {
- logger.warning("Could not calculate MD5 signature for new file " + fileName);
- }
+ version.getFileMetadatas().add(dataFile.getFileMetadata());
+ dataFile.getFileMetadata().setDatasetVersion(version);
+ dataset.getFiles().add(dataFile);
}
}
-
- return datafile;
}
+ // This method tries to permanently store the files on the filesystem.
+ // It should be called before we attempt to permanently save the files in
+ // the database by calling the Save command on the dataset and/or version.
+ // TODO: rename the method finalizeFiles()? or something like that?
public void addFiles (DatasetVersion version, List newFiles) {
if (newFiles != null && newFiles.size() > 0) {
// final check for duplicate file names;
@@ -875,7 +173,7 @@ public void addFiles (DatasetVersion version, List newFiles) {
// the user may have edited them on the "add files" page, and
// renamed FOOBAR-1.txt back to FOOBAR.txt...
- checkForDuplicateFileNamesFinal(version, newFiles);
+ IngestUtil.checkForDuplicateFileNamesFinal(version, newFiles);
Dataset dataset = version.getDataset();
@@ -889,7 +187,7 @@ public void addFiles (DatasetVersion version, List newFiles) {
Files.createDirectories(dataset.getFileSystemDirectory());
}
} catch (IOException dirEx) {
- logger.severe("Failed to create study directory " + dataset.getFileSystemDirectory().toString());
+ logger.severe("Failed to create dataset directory " + dataset.getFileSystemDirectory().toString());
return;
// TODO:
// Decide how we are communicating failure information back to
@@ -900,19 +198,29 @@ public void addFiles (DatasetVersion version, List newFiles) {
if (dataset.getFileSystemDirectory() != null && Files.exists(dataset.getFileSystemDirectory())) {
for (DataFile dataFile : newFiles) {
- String tempFileLocation = getFilesTempDirectory() + "/" + dataFile.getStorageIdentifier();
+ String tempFileLocation = FileUtil.getFilesTempDirectory() + "/" + dataFile.getStorageIdentifier();
+ // These are all brand new files, so they should all have
+ // one filemetadata total. -- L.A.
FileMetadata fileMetadata = dataFile.getFileMetadatas().get(0);
String fileName = fileMetadata.getLabel();
// temp dbug line
- System.out.println("ADDING FILE: " + fileName + "; for dataset: " + dataset.getGlobalId());
+ //System.out.println("ADDING FILE: " + fileName + "; for dataset: " + dataset.getGlobalId());
+
+ // Make sure the file is attached to the dataset and to the version, if this
+ // hasn't been done yet:
+ if (dataFile.getOwner() == null) {
+ dataFile.setOwner(dataset);
+
+ version.getFileMetadatas().add(dataFile.getFileMetadata());
+ dataFile.getFileMetadata().setDatasetVersion(version);
+ dataset.getFiles().add(dataFile);
+ }
- // These are all brand new files, so they should all have
- // one filemetadata total. -- L.A.
boolean metadataExtracted = false;
- if (ingestableAsTabular(dataFile)) {
+ if (FileUtil.ingestableAsTabular(dataFile)) {
/*
* Note that we don't try to ingest the file right away -
* instead we mark it as "scheduled for ingest", then at
@@ -945,13 +253,20 @@ public void addFiles (DatasetVersion version, List newFiles) {
String storageId = dataFile.getStorageIdentifier().replaceFirst("^tmp://", "");
- Path tempLocationPath = Paths.get(getFilesTempDirectory() + "/" + storageId);
+ Path tempLocationPath = Paths.get(FileUtil.getFilesTempDirectory() + "/" + storageId);
WritableByteChannel writeChannel = null;
FileChannel readChannel = null;
+ boolean localFile = false;
+ boolean savedSuccess = false;
+
try {
DataFileIO dataAccess = dataFile.getAccessObject();
+
+ if (dataAccess.isLocalFile()) {
+ localFile = true;
+ }
/*
This commented-out code demonstrates how to copy bytes
@@ -988,6 +303,7 @@ from a local InputStream (or a readChannel) into the
// Set filesize in bytes
//
dataFile.setFilesize(dataAccess.getSize());
+ savedSuccess = true;
} catch (IOException ioex) {
logger.warning("Failed to save the file, storage id " + dataFile.getStorageIdentifier());
@@ -996,21 +312,39 @@ from a local InputStream (or a readChannel) into the
if (writeChannel != null) {try{writeChannel.close();}catch(IOException e){}}
}
- // delete the temporary file:
+ // Since we may have already spent some CPU cycles scaling down image thumbnails,
+ // we may as well save them, by moving these generated images to the permanent
+ // dataset directory. We should also remember to delete any such files in the
+ // temp directory:
+
+ List generatedTempFiles = listGeneratedTempFiles(Paths.get(FileUtil.getFilesTempDirectory()), dataFile.getStorageIdentifier());
+ if (generatedTempFiles != null) {
+ for (Path generated : generatedTempFiles) {
+ if (savedSuccess && localFile) {
+ logger.fine("Will try to permanently save generated file "+generated.toString());
+ try {
+ Files.copy(generated, Paths.get(dataset.getFileSystemDirectory().toString(), generated.getFileName().toString()));
+ } catch (IOException ioex) {
+ logger.warning("Failed to save generated file "+generated.toString());
+ }
+
+ try {
+ Files.delete(generated);
+ } catch (IOException ioex) {
+ logger.warning("Failed to delete generated file "+generated.toString());
+ }
+ }
+ }
+ }
+
try {
logger.fine("Will attempt to delete the temp file "+tempLocationPath.toString());
- // also, delete a temporary thumbnail image file, if exists:
- // (TODO: probably not a very good style, that the size of the thumbnail
- // is hard-coded here; it may change in the future...)
- Path tempThumbnailPath = Paths.get(tempLocationPath.toString() + ".thumb64");
Files.delete(tempLocationPath);
- if (tempThumbnailPath.toFile().exists()) {
- Files.delete(tempThumbnailPath);
- }
} catch (IOException ex) {
// (non-fatal - it's just a temp file.)
logger.warning("Failed to delete temp file "+tempLocationPath.toString());
}
+
// Any necessary post-processing:
performPostProcessingTasks(dataFile);
}
@@ -1018,65 +352,36 @@ from a local InputStream (or a readChannel) into the
}
}
- /**
- For the restructuring of zipped shapefiles, create a timestamped directory.
- This directory is deleted after successful restructuring.
-
- Naming convention: getFilesTempDirectory() + "shp_" + "yyyy-MM-dd-hh-mm-ss-SSS"
- */
- private File getShapefileUnzipTempDirectory(){
-
- String tempDirectory = this.getFilesTempDirectory();
- if (tempDirectory == null){
- logger.severe("Failed to retrieve tempDirectory, null was returned" );
+ private List listGeneratedTempFiles(Path tempDirectory, String baseName) {
+ List generatedFiles = new ArrayList<>();
+
+ // for example, .thumb64 or .thumb400.
+
+ if (baseName == null || baseName.equals("")) {
return null;
}
- String datestampedFileName = "shp_" + new SimpleDateFormat("yyyy-MM-dd-hh-mm-ss-SSS").format(new Date());
- String datestampedFolderName = tempDirectory + "/" + datestampedFileName;
-
- File datestampedFolder = new File(datestampedFolderName);
- if (!datestampedFolder.isDirectory()) {
- /* Note that "createDirectories()" must be used - not
- * "createDirectory()", to make sure all the parent
- * directories that may not yet exist are created as well.
- */
- try {
- Files.createDirectories(Paths.get(datestampedFolderName));
- } catch (IOException ex) {
- logger.severe("Failed to create temp. directory to unzip shapefile: " + datestampedFolderName );
- return null;
- }
- }
- return datestampedFolder;
- }
-
- public String getFilesTempDirectory() {
- String filesRootDirectory = System.getProperty("dataverse.files.directory");
- if (filesRootDirectory == null || filesRootDirectory.equals("")) {
- filesRootDirectory = "/tmp/files";
- }
- String filesTempDirectory = filesRootDirectory + "/temp";
+ DirectoryStream.Filter