
Commit

User Guide revisions for Cloud Storage + Computing on the Dataset + File Management pg. Minor revisions to Configuration and Installation pgs in Installation Guide. [ref #3747]
mheppler committed May 15, 2017
1 parent ba5839a commit 4fcaad6
Showing 3 changed files with 13 additions and 8 deletions.
4 changes: 2 additions & 2 deletions doc/sphinx-guides/source/installation/config.rst
@@ -154,7 +154,7 @@ Enabling a second authentication provider will result in the Log In page showing
File Storage: Local Filesystem vs. Swift
----------------------------------------

By default, a Dataverse installation stores data files (files uploaded by end users) on the filesystem at ``/usr/local/glassfish4/glassfish/domains/domain1/files`` but this path can vary based on answers you gave to the installer (see "Running the Dataverse Installer" under the :doc:`installation-main` section) or afterward by reconfiguring the ``dataverse.files.directory`` JVM option described below.
By default, a Dataverse installation stores data files (files uploaded by end users) on the filesystem at ``/usr/local/glassfish4/glassfish/domains/domain1/files`` but this path can vary based on answers you gave to the installer (see the :ref:`dataverse-installer` section of the Installation Guide) or afterward by reconfiguring the ``dataverse.files.directory`` JVM option described below.
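
For example, on a default Glassfish 4 installation the JVM option could be set with ``asadmin`` along these lines (a sketch only; the target path is a placeholder and should point at a directory writable by the Glassfish user):

``./asadmin create-jvm-options "-Ddataverse.files.directory=/path/to/your/files"``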

Alternatively, rather than storing data files on the filesystem, you can opt for an experimental setup with a `Swift Object Storage <http://swift.openstack.org>`_ backend. Each dataset that users create gets a corresponding "container" on the Swift side, and each data file is saved as a file within that container.

@@ -698,7 +698,7 @@ Set the base URL for the "Compute" button for a dataset.
``curl -X PUT -d 'https://giji.massopencloud.org/application/dataverse?containerName=' http://localhost:8080/api/admin/settings/:ComputeBaseUrl``

:CloudEnvironmentName
++++++++++++++++
+++++++++++++++++++++

Set the name of the cloud environment associated with the "Compute" button for a dataset.
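
Like ``:ComputeBaseUrl`` above, this setting is presumably set through the admin settings API; a hedged sketch, with a placeholder value:

``curl -X PUT -d 'Your Cloud Environment Name' http://localhost:8080/api/admin/settings/:CloudEnvironmentName``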

2 changes: 2 additions & 0 deletions doc/sphinx-guides/source/installation/installation-main.rst
@@ -6,6 +6,8 @@ Now that the :doc:`prerequisites` are in place, we are ready to execute the Data

.. contents:: :local:

.. _dataverse-installer:

Running the Dataverse Installer
-------------------------------
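
In broad strokes, and assuming the standard distribution layout, the installer is run from the unpacked ``dvinstall`` directory, for example:

``cd dvinstall && ./install``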

15 changes: 9 additions & 6 deletions doc/sphinx-guides/source/user/dataset-management.rst
@@ -120,19 +120,22 @@ There are several advanced options available for certain file types.

.. _cloud-storage:

Cloud Storage & Computing
Cloud Storage + Computing
-------------------------

Some Dataverse installations are specifically set up to facilitate cloud-based computing. While vanilla Dataverse uses a traditional file system for storing data, Cloud Dataverse uses a Swift object storage database. This allows users to perform computations on data using an integrated cloud computing environment. This feature is considered experimental at this time, and some of the kinks are still being worked out.
Dataverse installations can be configured to facilitate cloud-based storage and/or computing. (This feature is considered experimental at this time, and some of the kinks are still being worked out.) While the default configuration for Dataverse uses a traditional, local filesystem for storing data, a cloud-enabled Dataverse installation can use a Swift object storage backend instead. This allows users to perform computations on data using an integrated cloud computing environment.

You'll know you're using Cloud Dataverse if you see a "Compute" button on dataset and file pages. Clicking that Compute button will take you directly to the cloud computing environment that is integrated with Cloud Dataverse, allowing you to perform computations on the file or dataset you were just viewing.

**Note:** At present, any file restrictions that users apply in Cloud Dataverse will not be honored in Swift. This means: if you set a file on Cloud Dataverse as "restricted", a user without proper permissions **could bypass that restriction** by accessing the file through Swift. For now, do not rely on file restrictions to limit access to data in Cloud Dataverse.
**Note:** At present, any file restrictions that users apply in Dataverse will not be enforced in Swift. This means that if you set a file in Dataverse as "restricted", a user without the proper permissions **could bypass that restriction** by accessing the file directly through Swift. For now, do not rely on file restrictions to limit access to data in a cloud-enabled Dataverse installation.

Cloud Storage Access
~~~~~~~~~~~~~~~~~~~~

If you need to access a dataset in a more flexible way than Dataverse's Compute button provides, then you can use the Cloud Storage Access box on the dataset page to copy the dataset's container name. This unique identifer can then be pasted into whatever script or field you may be running, to allow direct access to the dataset.
For each dataset, the Cloud Storage Access box on the dataset page provides a unique identifier (the dataset's container name) that can be used to access the dataset directly. Click the "Copy" button to copy the identifier to your clipboard.
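
If you have credentials for the underlying Swift endpoint, that container name can also be used directly with the standard OpenStack Swift command-line client; a sketch, where ``<container-name>`` stands in for the identifier you copied:

``swift download <container-name>``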

Cloud Computing
~~~~~~~~~~~~~~~

The "Compute" button on dataset and file pages will take you directly to the cloud computing environment that is configured with Dataverse, allowing you to perform computations on the file or dataset.

Edit Files
==========
