Formatting clean up to resolve warnings in sphinx builds [ref #7066] #7071

Merged 1 commit on Jul 10, 2020
16 changes: 8 additions & 8 deletions doc/sphinx-guides/source/developers/deployment.rst
@@ -84,7 +84,7 @@ Once you have done the configuration above, you are ready to try running the "ec2

Download :download:`ec2-create-instance.sh<https://raw.githubusercontent.com/GlobalDataverseCommunityConsortium/dataverse-ansible/master/ec2/ec2-create-instance.sh>` and put it somewhere reasonable. For the purpose of these instructions we'll assume it's in the "Downloads" directory in your home directory.

-To run it with default values you just need the script, but you may also want a current copy of the ansible :download:`group vars<https://raw.githubusercontent.com/GlobalDataverseCommunityConsortium/dataverse-ansible/master/defaults/main.yml>`_ file.
+To run it with default values you just need the script, but you may also want a current copy of the ansible `group vars <https://raw.githubusercontent.com/GlobalDataverseCommunityConsortium/dataverse-ansible/master/defaults/main.yml>`_ file.

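Both files can also be fetched from the command line. A minimal sketch using curl (an assumption that curl is available; the URLs are the same ones linked above)::

    # fetch the create-instance script and the default group vars
    curl -L -O https://raw.githubusercontent.com/GlobalDataverseCommunityConsortium/dataverse-ansible/master/ec2/ec2-create-instance.sh
    curl -L -O https://raw.githubusercontent.com/GlobalDataverseCommunityConsortium/dataverse-ansible/master/defaults/main.yml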
ec2-create-instance accepts a number of command-line switches, including:

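The switch list itself is elided from this hunk. As an illustration only (the ``-g`` flag below is an assumption drawn from common usage of this script, so verify it against the script's own usage output)::

    # point the script at a local copy of the group vars file
    bash ~/Downloads/ec2-create-instance.sh -g ~/Downloads/main.yml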
@@ -109,18 +109,18 @@ Migrating Datafiles from Local Storage to S3

A number of pilot Dataverse installations start on local storage, then administrators are tasked with migrating datafiles into S3 or similar object stores. The files may be copied with a command-line utility such as `s3cmd<https://s3tools.org/s3cmd>`. You will want to retain the local file hierarchy, keeping the authority (for example: 10.5072) at the bucket "root."

-The below example queries may assist with updating dataset and datafile locations in the Dataverse PostgresQL database. Depending on the initial version of Dataverse and subsequent upgrade path, Datafile storage identifiers may or may not include a file:// prefix, so you'll want to catch both cases.
+The below example queries may assist with updating dataset and datafile locations in the Dataverse PostgresQL database. Depending on the initial version of Dataverse and subsequent upgrade path, Datafile storage identifiers may or may not include a ``file://`` prefix, so you'll want to catch both cases.

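Before running any updates, it may help to audit which prefix forms are actually present. A sketch, assuming ``psql`` access and a database named ``dvndb`` (both assumptions; substitute your own connection details)::

    psql -d dvndb -c "
      SELECT CASE WHEN storageidentifier LIKE 'file://%' THEN 'file:// prefix'
                  WHEN storageidentifier LIKE 's3://%'   THEN 's3:// prefix'
                  ELSE 'no prefix' END AS form,
             COUNT(*)
        FROM dvobject
       WHERE dtype IN ('Dataset','DataFile')
       GROUP BY 1;"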
-To Update Dataset Location to S3, Assuming a file:// Prefix:
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+To Update Dataset Location to S3, Assuming a ``file://`` Prefix
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

::

UPDATE dvobject SET storageidentifier=REPLACE(storageidentifier,'file://','s3://')
WHERE dtype='Dataset';

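Since these updates rewrite identifiers in place, one cautious pattern is to run them inside a transaction and check the reported row count before committing. A sketch (``dvndb`` is again an assumed database name)::

    psql -d dvndb <<'SQL'
    BEGIN;
    UPDATE dvobject SET storageidentifier=REPLACE(storageidentifier,'file://','s3://')
        WHERE dtype='Dataset';
    -- psql prints "UPDATE n"; if n looks right, replace ROLLBACK with COMMIT and rerun
    ROLLBACK;
    SQL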
-To Update Datafile Location to your-s3-bucket, Assuming a file:// Prefix:
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+To Update Datafile Location to your-s3-bucket, Assuming a ``file://`` Prefix
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

::

@@ -130,8 +130,8 @@ To Update Datafile Location to your-s3-bucket, Assuming a file:// Prefix:
AND s.id = o.owner_id AND s.harvestingclient_id IS null
AND o.storageidentifier NOT LIKE 's3://%');

-To Update Datafile Location to your-s3-bucket, Assuming no file:// Prefix:
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
+To Update Datafile Location to your-s3-bucket, Assuming no ``file://`` Prefix
+^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

::

6 changes: 3 additions & 3 deletions doc/sphinx-guides/source/developers/testing.rst
@@ -137,10 +137,10 @@ Running the Full API Test Suite Using EC2

To run the API test suite in an EC2 instance you should first follow the steps in the :doc:`deployment` section to get set up for AWS in general and EC2 in particular.

-You may always retrieve a current copy of the ec2-create-instance.sh script and accompanying group_var.yml file from the `dataverse-ansible repo<https://github.com/GlobalDataverseCommunityConsortium/dataverse-ansible/>`_:
+You may always retrieve a current copy of the ec2-create-instance.sh script and accompanying group_var.yml file from the `dataverse-ansible repo <https://github.com/GlobalDataverseCommunityConsortium/dataverse-ansible/>`_:

-- `ec2-create-instance.sh<https://raw.githubusercontent.com/GlobalDataverseCommunityConsortium/dataverse-ansible/master/ec2/ec2-create-instance.sh>`_
-- `main.yml<https://raw.githubusercontent.com/GlobalDataverseCommunityConsortium/dataverse-ansible/master/defaults/main.yml>`_
+- `ec2-create-instance.sh <https://raw.githubusercontent.com/GlobalDataverseCommunityConsortium/dataverse-ansible/master/ec2/ec2-create-instance.sh>`_
+- `main.yml <https://raw.githubusercontent.com/GlobalDataverseCommunityConsortium/dataverse-ansible/master/defaults/main.yml>`_

Edit ``main.yml`` to set the desired GitHub repo, branch, and to ensure that the API test suite is enabled:

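The ``main.yml`` excerpt that follows in the original guide is elided from this hunk. Putting the steps together, the flow is roughly: download the two files (see the curl sketch under deployment above), edit ``main.yml``, and pass it to the script. A sketch, with the ``-g`` flag assumed as before::

    $EDITOR main.yml                            # set repo, branch, and the API test suite toggle
    bash ec2-create-instance.sh -g main.yml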