Merge branch 'develop' into 5052-fix-explore-button-render
sekmiller committed Sep 27, 2018
2 parents 1ff3a25 + bdedf46 commit 423e372
Showing 53 changed files with 1,913 additions and 1,378 deletions.
2 changes: 1 addition & 1 deletion conf/docker/build.sh
@@ -100,7 +100,7 @@ fi
 #
 # Build init-container
 #
-cp ../../scripts/installer/postgres-setup dataverse-glassfish/init-container
+cp ../../scripts/installer/install dataverse-glassfish/init-container
 docker build -t $HUBORG/init-container:$TAG dataverse-glassfish/init-container
 if [ "$1" == 'internal' ]; then
 echo "Skipping docker push because we're using the internal Minishift registry."
5 changes: 2 additions & 3 deletions conf/docker/dataverse-glassfish/init-container/Dockerfile
@@ -11,7 +11,6 @@ RUN yum install -y \
 postgresql \
 sha1sum
 
-COPY postgres-setup /
+COPY install /
 
-ENTRYPOINT ["/postgres-setup"]
-CMD ["dataverse"]
+ENTRYPOINT ["/install", "--pg_only", "--yes"]
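Editor's note (not part of the commit): the init container's new entrypoint runs the regular installer non-interactively, restricted to database setup. A rough by-hand equivalent from a repository checkout might look like the following — the flag semantics here are inferred from their names, not documented in this diff:

```shell
# Sketch only: roughly what ENTRYPOINT ["/install", "--pg_only", "--yes"] does
# when the container starts (flag meanings inferred, not confirmed by this diff).
cd scripts/installer
./install --pg_only --yes   # --yes: no prompts; --pg_only: database setup only
```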
@@ -8,6 +8,7 @@ ExecStart = /usr/bin/java -jar /usr/local/glassfish4/glassfish/lib/client/appser
 ExecStop = /usr/bin/java -jar /usr/local/glassfish4/glassfish/lib/client/appserver-cli.jar stop-domain
 ExecReload = /usr/bin/java -jar /usr/local/glassfish4/glassfish/lib/client/appserver-cli.jar restart-domain
 LimitNOFILE=32768
+DefaultTimeoutStartSec=120s # current default is 90s
 Type = forking
 
 [Install]
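Editor's note (not part of the commit): two things are worth double-checking when deploying this unit change. Within a unit file's ``[Service]`` section the per-service directive is ``TimeoutStartSec``; ``DefaultTimeoutStartSec`` is normally a manager-wide setting in system.conf, and systemd does not honor trailing ``#`` comments on directive lines. After deploying, the effective timeout can be inspected like this (the service name ``glassfish`` is an assumption):

```shell
# Reload unit definitions, then inspect the effective start timeout.
# Unit name assumed; adjust to match the installed .service file.
sudo systemctl daemon-reload
systemctl show glassfish -p TimeoutStartUSec
```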
9 changes: 8 additions & 1 deletion doc/sphinx-guides/source/admin/dataverses-datasets.rst
@@ -41,7 +41,7 @@ Moves a dataset whose id is passed to a dataverse whose alias is passed. If the
 Link a Dataset
 ^^^^^^^^^^^^^^
 
-Creates a link between a dataset and a dataverse (see the Linked Dataverses + Linked Datasets section of the :doc:`/user/dataverse-management` guide for more information). Only accessible to superusers. ::
+Creates a link between a dataset and a dataverse (see the Linked Dataverses + Linked Datasets section of the :doc:`/user/dataverse-management` guide for more information). ::
 
 curl -H "X-Dataverse-key: $API_TOKEN" -X PUT http://$SERVER/api/datasets/$linked-dataset-id/link/$linking-dataverse-alias
 
@@ -51,3 +51,10 @@ Unlink a Dataset
 Removes a link between a dataset and a dataverse. Only accessible to superusers. ::
 
 curl -H "X-Dataverse-key: $API_TOKEN" -X DELETE http://$SERVER/api/datasets/$linked-dataset-id/deleteLink/$linking-dataverse-alias
+
+Mint new PID for a Dataset
+^^^^^^^^^^^^^^^^^^^^^^^^^^
+
+Mints a new identifier for a dataset previously registered with a handle. Only accessible to superusers. ::
+
+curl -H "X-Dataverse-key: $API_TOKEN" -X POST http://$SERVER/api/admin/$dataset-id/reregisterHDLToPID
7 changes: 7 additions & 0 deletions doc/sphinx-guides/source/api/native-api.rst
@@ -501,6 +501,13 @@ In the example below, the curator has saved the JSON file as :download:`reason-f
 
 The review process can sometimes resemble a tennis match, with the authors submitting and resubmitting the dataset over and over until the curators are satisfied. Each time the curators send a "reason for return" via API, that reason is persisted into the database, stored at the dataset version level.
 
+Link a Dataset
+~~~~~~~~~~~~~~
+
+Creates a link between a dataset and a dataverse (see the Linked Dataverses + Linked Datasets section of the :doc:`/user/dataverse-management` guide for more information). ::
+
+curl -H "X-Dataverse-key: $API_TOKEN" -X PUT http://$SERVER/api/datasets/$linked-dataset-id/link/$linking-dataverse-alias
+
 Dataset Locks
 ~~~~~~~~~~~~~
 
7 changes: 6 additions & 1 deletion doc/sphinx-guides/source/installation/config.rst
@@ -102,7 +102,12 @@ Persistent Identifiers and Publishing Datasets
 
 Persistent identifiers are a required and integral part of the Dataverse platform. They provide a URL that is guaranteed to resolve to the datasets or files they represent. Dataverse currently supports creating identifiers using DOI and Handle.
 
-By default and for testing convenience, the installer configures a temporary DOI test namespace through EZID. This is sufficient to create and publish datasets and files, but they are not citable nor guaranteed to be preserved. Note that any datasets or files created using the test configuration cannot be directly migrated and would need to be created again once a valid DOI namespace is configured.
+By default, the installer configures a test DOI namespace (10.5072) with DataCite as the registration provider. Please note that as of the release 4.9.3, we can no longer use EZID as the provider. Unlike EZID, DataCite requires that you register for a test account (please contact support@datacite.org). Once you receive the login name and password for the account, configure it in your domain.xml, as the following two JVM options::
+
+<jvm-options>-Ddoi.username=...</jvm-options>
+<jvm-options>-Ddoi.password=...</jvm-options>
+
+and restart Glassfish. Once this is done, you will be able to publish datasets and files, but the persistent identifiers will not be citable or guaranteed to be preserved. Note that any datasets or files created using the test configuration cannot be directly migrated and would need to be created again once a valid DOI namespace is configured.
 
 To properly configure persistent identifiers for a production installation, an account and associated namespace must be acquired for a fee from a DOI or HDL provider: **EZID** (http://ezid.cdlib.org), **DataCite** (https://www.datacite.org), **Handle.Net** (https://www.handle.net).
 
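Editor's note (not part of the commit): the domain.xml edit described in this hunk can also be applied from the command line. A sketch — the credentials are placeholders for whatever DataCite sends you, and the ``asadmin`` path depends on where Glassfish is installed:

```shell
# Same two JVM options as the domain.xml snippet above, set via asadmin.
# YOUR_TEST_LOGIN / YOUR_TEST_PASSWORD are placeholders, not real values.
./asadmin create-jvm-options '-Ddoi.username=YOUR_TEST_LOGIN'
./asadmin create-jvm-options '-Ddoi.password=YOUR_TEST_PASSWORD'
./asadmin restart-domain   # new JVM options take effect after a restart
```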
5 changes: 4 additions & 1 deletion doc/sphinx-guides/source/installation/installation-main.rst
@@ -22,11 +22,14 @@ You should have already downloaded the installer from https://github.com/IQSS/da
 
 Unpack the zip file - this will create the directory ``dvinstall``.
 
+**Important:** The installer will need to use the PostgreSQL command line utility ``psql`` in order to configure the database. If the executable is not in your system PATH, the installer will try to locate it on your system. However, we strongly recommend that you check and make sure it is in the PATH. This is especially important if you have multiple versions of PostgreSQL installed on your system. Make sure the psql that came with the version that you want to use with your Dataverse is the first on your path. For example, if the PostgreSQL distribution you are running is installed in /Library/PostgreSQL/9.6, add /Library/PostgreSQL/9.6/bin to the beginning of your $PATH variable. If you are *running* multiple PostgreSQL servers, make sure you know the port number of the one you want to use, as the installer will need it in order to connect to the database (the first PostgreSQL distribution installed on your system is likely using the default port 5432; but the second will likely be on 5433, etc.) Does every word in this paragraph make sense? If it does, great - because you definitely need to be comfortable with basic system tasks in order to install Dataverse. If not - if you don't know how to check where your PostgreSQL is installed, or what port it is running on, or what a $PATH is... it's not too late to stop. Because it will most likely not work. And if you contact us for help, these will be the questions we'll be asking you - so, again, you need to be able to answer them comfortably for it to work.
+
 Execute the installer script like this (but first read the note below about not running the installer as root)::
 
 $ cd dvinstall
 $ ./install
 
+
 **It is no longer necessary to run the installer as root!**
 
 Just make sure the user running the installer has write permission to:
@@ -173,7 +176,7 @@ mail.smtp.socketFactory.class javax.net.ssl.SSLSocketFactory
 
 The mail session can also be set from command line. To use this method, you will need to delete your notifyMailSession and create a new one. See the below example:
 
-- Delete: ``./asadmin delete-javamail-resource mail/MyMailSession``
+- Delete: ``./asadmin delete-javamail-resource mail/notifyMailSession``
 - Create (remove brackets and replace the variables inside): ``./asadmin create-javamail-resource --mailhost [smtp.gmail.com] --mailuser [test\@test\.com] --fromaddress [test\@test\.com] --property mail.smtp.auth=[true]:mail.smtp.password=[password]:mail.smtp.port=[465]:mail.smtp.socketFactory.port=[465]:mail.smtp.socketFactory.fallback=[false]:mail.smtp.socketFactory.class=[javax.net.ssl.SSLSocketFactory] mail/notifyMailSession``
 
 Be sure you save the changes made here and then restart your Glassfish server to test it out.
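Editor's note (not part of the commit): the PATH and port sanity checks that the **Important:** paragraph above asks for can be scripted. A minimal sketch — the ``/Library/PostgreSQL/9.6`` install path is the guide's own example, not a requirement:

```shell
# Pre-flight check before running ./install: is the right psql first on PATH,
# and is a PostgreSQL server listening on the port you plan to give the installer?
export PATH=/Library/PostgreSQL/9.6/bin:$PATH   # example path; adjust to your install

if command -v psql >/dev/null 2>&1; then
    echo "psql found at: $(command -v psql)"
    psql --version
else
    echo "psql is not on the PATH - fix this before running the installer"
fi

# The first PostgreSQL server usually listens on 5432; a second one on 5433.
(ss -lnt 2>/dev/null || netstat -an 2>/dev/null) | grep -E '(5432|5433)' \
    || echo "no listener seen on 5432/5433 - double-check your server's port"
```

The script is deliberately tolerant: it reports problems rather than failing, since the point is to answer the paragraph's questions before the installer does.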
22 changes: 17 additions & 5 deletions doc/sphinx-guides/source/user/dataverse-management.rst
@@ -189,14 +189,26 @@ Click on Featured Dataverses and a pop up will appear. Select which sub datavers
 
 Note: Featured Dataverses can only be used with published dataverses.
 
-Linked Dataverses + Linked Datasets
-======================================================
+Dataset Linking
+===============
 
+Dataset linking allows a dataverse owner to "link" their dataverse to a dataset that exists outside of that dataverse, so it appears in the dataverse’s list of contents without actually *being* in that dataverse. You can link other users' datasets to your dataverse, but that does not transfer editing or other special permissions to you. The linked dataset will still be under the original user's control.
+
+For example, researchers working on a collaborative study across institutions can each link their own individual institutional dataverses to the one collaborative dataset, making it easier for interested parties from each institution to find the study.
+
+In order to link a dataset, you will need your account to have the "Add Dataset" permission on the Dataverse that is doing the linking. If you created the dataverse then you should have this permission already, but if not then you will need to ask the admin of that dataverse to assign that permission to your account. You do not need any special permissions on the dataset being linked.
+
-Currently, the ability to link a dataverse to another dataverse or a dataset to a dataverse is a super user only feature.
+To link a dataset to your dataverse, you must navigate to that dataset and click the white "Link" button in the upper-right corner of the dataset page. This will open up a window where you can type in the name of the dataverse that you would like to link the dataset to. Select your dataverse and click the save button. This will establish the link, and the dataset will now appear under your dataverse.
+
+There is currently no way to remove established links in the UI. If you need to remove a link between a dataverse and a dataset, please contact the support team for the Dataverse installation you are using.
+
+
+Dataverse Linking
+======================================================
 
-If you link a dataset to your dataverse, that means that the dataset will appear in the list of datasets contained within your dataverse. Linking another dataverse to your dataverse works the same way. You can link other users' dataverses and datasets to your dataverse, but that does not transfer editing or other special permissions to you. The linked dataverse or dataset will still be under the original user's control.
+Similarly to dataset linking, dataverse linking allows a dataverse owner to "link" their dataverse to another dataverse, so the dataverse being linked will appear in the linking dataverse's list of contents without actually *being* in that dataverse. Currently, the ability to link a dataverse to another dataverse is a superuser only feature.
 
-If you need to have a dataverse or dataset linked to your dataverse, please contact the support team for the Dataverse installation you are using.
+If you need to have a dataverse linked to your dataverse, please contact the support team for the Dataverse installation you are using.
 
 Publish Your Dataverse
 =================================================================
31 changes: 26 additions & 5 deletions pom.xml
@@ -17,6 +17,11 @@
 <project.timezone>UTC</project.timezone>
 <project.language>en</project.language>
 <project.region>US</project.region>
+
+<junit.version>4.12</junit.version>
+<junit.jupiter.version>5.3.1</junit.jupiter.version>
+<junit.vintage.version>5.3.1</junit.vintage.version>
+<junit.platform.version>1.3.1</junit.platform.version>
 </properties>
 
 <repositories>
@@ -66,12 +71,27 @@
 <artifactId>passay</artifactId>
 <version>1.1.0</version>
 </dependency>
+<dependency>
+<groupId>org.junit.jupiter</groupId>
+<artifactId>junit-jupiter-api</artifactId>
+<version>${junit.jupiter.version}</version>
+<scope>test</scope>
+</dependency>
 <dependency>
 <groupId>junit</groupId>
 <artifactId>junit</artifactId>
-<version>4.8.1</version>
+<version>${junit.version}</version>
 <scope>test</scope>
-<type>jar</type>
 </dependency>
+<dependency>
+<groupId>org.junit.jupiter</groupId>
+<artifactId>junit-jupiter-engine</artifactId>
+<version>${junit.jupiter.version}</version>
+</dependency>
+<dependency>
+<groupId>org.junit.vintage</groupId>
+<artifactId>junit-vintage-engine</artifactId>
+<version>${junit.vintage.version}</version>
+</dependency>
 <dependency>
 <groupId>org.glassfish</groupId>
@@ -354,11 +374,12 @@
 <groupId>org.slf4j</groupId>
 <artifactId>slf4j-log4j12</artifactId>
 <version>1.7.7</version>
-</dependency>
+</dependency>
 <dependency>
 <groupId>org.mockito</groupId>
 <artifactId>mockito-core</artifactId>
-<version>1.10.19</version>
+<version>2.22.0</version>
+<scope>test</scope>
 </dependency>
 <!-- Added for DataCite -->
 <!--dependency>
@@ -638,7 +659,7 @@
 <configuration>
 <!-- testsToExclude come from the profile-->
 <excludedGroups>${testsToExclude}</excludedGroups>
-<argLine>-Duser.timezone=${project.timezone} -Dfile.encoding=${project.build.sourceEncoding} -Duser.language=${project.language} -Duser.region=${project.region}</argLine>
+<argLine>${argLine} -Duser.timezone=${project.timezone} -Dfile.encoding=${project.build.sourceEncoding} -Duser.language=${project.language} -Duser.region=${project.region}</argLine>
 </configuration>
 </plugin>
 </plugins>
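Editor's note (not part of the commit): with the vintage engine on the test classpath, existing JUnit 4 tests keep running under the JUnit 5 platform alongside new Jupiter tests. The ``${argLine}`` prefix on the surefire argument line is the usual pattern for letting an agent (commonly JaCoCo's ``prepare-agent`` goal) contribute its own JVM arguments instead of being overwritten, though this diff does not show which agent is intended. Exercising the new setup needs nothing project-specific:

```shell
# Run the unit tests against the combined JUnit 4 (vintage) + JUnit 5 setup.
mvn test
# Run a single test class; SomeExistingTest is a placeholder name.
mvn test -Dtest=SomeExistingTest
```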
2 changes: 1 addition & 1 deletion scripts/api/setup-all.sh
@@ -51,7 +51,7 @@ curl -X PUT -d /dataverseuser.xhtml?editMode=CREATE "$SERVER/admin/settings/:Sig
 curl -X PUT -d doi "$SERVER/admin/settings/:Protocol"
 curl -X PUT -d 10.5072 "$SERVER/admin/settings/:Authority"
 curl -X PUT -d "FK2/" "$SERVER/admin/settings/:Shoulder"
-curl -X PUT -d EZID "$SERVER/admin/settings/:DoiProvider"
+curl -X PUT -d DataCite "$SERVER/admin/settings/:DoiProvider"
 curl -X PUT -d burrito $SERVER/admin/settings/BuiltinUsers.KEY
 curl -X PUT -d localhost-only $SERVER/admin/settings/:BlockedApiPolicy
 echo
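Editor's note (not part of the commit): after the script runs, the provider switch can be spot-checked by reading the setting back — assuming the settings endpoint answers GET as well as PUT, with ``$SERVER`` as in the script above:

```shell
# Read back the setting written above; the response should mention DataCite.
curl "$SERVER/admin/settings/:DoiProvider"
```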
2 changes: 1 addition & 1 deletion scripts/database/reference_data.sql
@@ -30,7 +30,7 @@ INSERT INTO guestbook(
 -- gets an answer. See also https://github.com/IQSS/dataverse/issues/2598#issuecomment-158219334
 CREATE UNIQUE INDEX dataverse_alias_unique_idx on dataverse (LOWER(alias));
 CREATE UNIQUE INDEX index_authenticateduser_lower_email ON authenticateduser (lower(email));
-CREATE UNIQUE INDEX index_builtinuser_lower_email ON builtinuser (lower(email));
+-- this field has been removed from builtinuser; CREATE UNIQUE INDEX index_builtinuser_lower_email ON builtinuser (lower(email));
 
 --Edit Dataset: Investigate and correct multiple draft issue: https://github.com/IQSS/dataverse/issues/2132
 --This unique index will prevent the multiple draft issue
16 changes: 2 additions & 14 deletions scripts/installer/Makefile
@@ -9,17 +9,15 @@ JHOVE_SCHEMA=${INSTALLER_ZIP_DIR}/jhoveConfig.xsd
 SOLR_SCHEMA=${INSTALLER_ZIP_DIR}/schema.xml
 SOLR_CONFIG=${INSTALLER_ZIP_DIR}/solrconfig.xml
 INSTALL_SCRIPT=${INSTALLER_ZIP_DIR}/install
-GLASSFISH_STARTUP_SCRIPT=${INSTALLER_ZIP_DIR}/glassfish-startup
-POSTGRES_SCRIPT=${INSTALLER_ZIP_DIR}/postgres-setup
 
 installer: dvinstall.zip
 
 clean:
 /bin/rm -rf ${INSTALLER_ZIP_DIR} dvinstall.zip
 
-dvinstall.zip: ${GLASSFISH_SETUP_SCRIPT} ${POSTGRES_DRIVERS} ${DISTRIBUTION_WAR_FILE} ${API_SCRIPTS} ${DB_SCRIPT} ${JHOVE_CONFIG} ${JHOVE_SCHEMA} ${SOLR_SCHEMA} ${SOLR_CONFIG} ${INSTALL_SCRIPT} ${GLASSFISH_STARTUP_SCRIPT} ${POSTGRES_SCRIPT}
+dvinstall.zip: ${GLASSFISH_SETUP_SCRIPT} ${POSTGRES_DRIVERS} ${DISTRIBUTION_WAR_FILE} ${API_SCRIPTS} ${DB_SCRIPT} ${JHOVE_CONFIG} ${JHOVE_SCHEMA} ${SOLR_SCHEMA} ${SOLR_CONFIG} ${INSTALL_SCRIPT}
 @echo making installer...
-zip -r dvinstall.zip ${GLASSFISH_SETUP_SCRIPT} ${POSTGRES_DRIVERS} ${DISTRIBUTION_WAR_FILE} ${API_SCRIPTS} ${DB_SCRIPT} ${JHOVE_CONFIG} ${JHOVE_SCHEMA} ${SOLR_SCHEMA} ${SOLR_CONFIG} ${INSTALL_SCRIPT} ${GLASSFISH_STARTUP_SCRIPT} ${POSTGRES_SCRIPT}
+zip -r dvinstall.zip ${GLASSFISH_SETUP_SCRIPT} ${POSTGRES_DRIVERS} ${DISTRIBUTION_WAR_FILE} ${API_SCRIPTS} ${DB_SCRIPT} ${JHOVE_CONFIG} ${JHOVE_SCHEMA} ${SOLR_SCHEMA} ${SOLR_CONFIG} ${INSTALL_SCRIPT}
 @echo
 @echo "Done!"
 
@@ -47,16 +45,6 @@ ${GLASSFISH_SETUP_SCRIPT}: glassfish-setup.sh
 mkdir -p ${INSTALLER_ZIP_DIR}
 /bin/cp glassfish-setup.sh ${INSTALLER_ZIP_DIR}
 
-${GLASSFISH_STARTUP_SCRIPT}: glassfish-startup
-@echo copying glassfish startup
-mkdir -p ${INSTALLER_ZIP_DIR}
-/bin/cp glassfish-startup ${INSTALLER_ZIP_DIR}
-
-${POSTGRES_SCRIPT}: postgres-setup
-@echo copying postgres-setup
-mkdir -p ${INSTALLER_ZIP_DIR}
-/bin/cp postgres-setup ${INSTALLER_ZIP_DIR}
-
 ${POSTGRES_DRIVERS}: pgdriver/postgresql-42.2.2.jar
 @echo copying postgres driver
 @mkdir -p ${POSTGRES_DRIVERS}
22 changes: 17 additions & 5 deletions scripts/installer/glassfish-setup.sh
@@ -66,10 +66,11 @@ function preliminary_setup()
 # password reset token timeout in minutes
 ./asadmin $ASADMIN_OPTS create-jvm-options "\-Ddataverse.auth.password-reset-timeout-in-minutes=60"
 
-# EZID DOI Settings
-./asadmin $ASADMIN_OPTS create-jvm-options "\-Ddoi.password=apitest"
-./asadmin $ASADMIN_OPTS create-jvm-options "\-Ddoi.username=apitest"
-./asadmin $ASADMIN_OPTS create-jvm-options "\-Ddoi.baseurlstring=https\://ezid.cdlib.org"
+# DataCite DOI Settings
+# (we can no longer offer EZID with their shared test account)
+#./asadmin $ASADMIN_OPTS create-jvm-options "\-Ddoi.password=apitest"
+#./asadmin $ASADMIN_OPTS create-jvm-options "\-Ddoi.username=apitest"
+./asadmin $ASADMIN_OPTS create-jvm-options "\-Ddoi.baseurlstring=https\://mds.test.datacite.org"
 
 ./asadmin $ASADMIN_OPTS create-jvm-options "-Ddataverse.timerServer=true"
 # enable comet support
@@ -120,7 +121,18 @@ function final_setup(){
 
 ./asadmin $ASADMIN_OPTS create-jvm-options "\-Djavax.xml.parsers.SAXParserFactory=com.sun.org.apache.xerces.internal.jaxp.SAXParserFactoryImpl"
 
-./asadmin $ASADMIN_OPTS create-javamail-resource --mailhost "$SMTP_SERVER" --mailuser "dataversenotify" --fromaddress "do-not-reply@${HOST_ADDRESS}" mail/notifyMailSession
+###
+# Mail server setup:
+# delete any existing mail/notifyMailSession; configure port, if provided:
+
+./asadmin delete-javamail-resource mail/notifyMailSession
+
+if [ $SMTP_SERVER_PORT"x" != "x" ]
+then
+./asadmin $ASADMIN_OPTS create-javamail-resource --mailhost "$SMTP_SERVER" --mailuser "dataversenotify" --fromaddress "do-not-reply@${HOST_ADDRESS}" --property mail.smtp.port="${SMTP_SERVER_PORT}" mail/notifyMailSession
+else
+./asadmin $ASADMIN_OPTS create-javamail-resource --mailhost "$SMTP_SERVER" --mailuser "dataversenotify" --fromaddress "do-not-reply@${HOST_ADDRESS}" mail/notifyMailSession
+fi
 
 }
 
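Editor's note (not part of the commit): the port test in the new mail-setup block is the classic Bourne "x" idiom for possibly-unset variables. A standalone sketch showing both branches, with ``echo`` standing in for the real ``asadmin`` call:

```shell
#!/bin/sh
# Demonstrates the [ $VAR"x" != "x" ] test from the hunk above.
# echo stands in for asadmin; the hostname below is a placeholder.
build_mail_cmd() {
    if [ "${SMTP_SERVER_PORT}"x != "x" ]; then
        echo "create-javamail-resource --mailhost $SMTP_SERVER --property mail.smtp.port=${SMTP_SERVER_PORT} mail/notifyMailSession"
    else
        echo "create-javamail-resource --mailhost $SMTP_SERVER mail/notifyMailSession"
    fi
}

SMTP_SERVER=smtp.example.edu

SMTP_SERVER_PORT=465
build_mail_cmd    # port branch: includes mail.smtp.port=465

SMTP_SERVER_PORT=""
build_mail_cmd    # no-port branch: the port property is omitted entirely
```

Appending ``"x"`` to both sides keeps the test well-formed even when the variable is empty or unset, which is why the script can treat the port as optional.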