update docs
Martin Durant committed Jan 10, 2018
1 parent 789459f commit 2dbb124
Showing 1 changed file with 15 additions and 13 deletions.
28 changes: 15 additions & 13 deletions docs/source/index.rst
@@ -3,7 +3,7 @@ GCSFS

A pythonic file-system interface to `Google Cloud Storage`_.

-This software is alpha, use at your own risk.
+This software is beta, use at your own risk.

Please file issues and requests on github_ and we welcome pull requests.

@@ -36,7 +36,7 @@ Locate and read a file:
.. code-block:: python

   >>> import gcsfs
-   >>> fs = gcsfs.GCSFileSystem(project='my-google-project', token='/path/to/token')
+   >>> fs = gcsfs.GCSFileSystem(project='my-google-project')
   >>> fs.ls('my-bucket')
   ['my-file.txt']
   >>> with fs.open('my-bucket/my-file.txt', 'rb') as f:
@@ -77,29 +77,31 @@ Credentials

Several modes of authentication are supported, as sketched below:

-- if ``token=None``, GCSFS will attempt to use your default gcloud
-  credentials or, if that fails,
-  will print a "device code" and a link you must follow to
-  authenticate with your Google identity.
+- if ``token=None`` (default), GCSFS will attempt to use your default gcloud
+  credentials, or attempt to get credentials from the google metadata
+  service, or fall back to anonymous access. This will work for most
+  users without further action. Note that the default project may also
+  be found, but it is often best to supply this anyway (it only affects
+  bucket-level operations).

- if ``token='cloud'``, we assume we are running within google (compute
or container engine) and fetch the credentials automatically from the
metadata service.

- you may supply a token generated by the
gcloud_ utility; this is either a python dictionary, or the name of a file
-  containing the JSON returned by logging in with the gcloud CLI tool. On
-  a posix system this may be at
+  containing the JSON returned by logging in with the gcloud CLI tool (e.g.,
   ``~/.config/gcloud/application_default_credentials.json`` or
-  ``~/.config/gcloud/legacy_credentials/<YOUR GOOGLE USERNAME>/adc.json``
+  ``~/.config/gcloud/legacy_credentials/<YOUR GOOGLE USERNAME>/adc.json``)
   or any valid google ``Credentials`` object.

-Authorizations are cached in a local file, for a given project/access level, so
-you should not need to authorize again.
+- you can also generate tokens via oauth2 in the browser, with ``token='browser'``,
+  which, once done, will be saved in a local cache for future use.
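
A minimal sketch of how these modes might be selected; the project name and
token path are placeholders, not values taken from this repository:

.. code-block:: python

   >>> import gcsfs
   >>> # default: gcloud credentials, then the metadata service, then anonymous
   >>> fs = gcsfs.GCSFileSystem(project='my-google-project')
   >>> # running inside google compute or container engine
   >>> fs = gcsfs.GCSFileSystem(project='my-google-project', token='cloud')
   >>> # a token previously generated by the gcloud CLI (placeholder path)
   >>> fs = gcsfs.GCSFileSystem(project='my-google-project',
   ...                          token='/path/to/adc.json')
   >>> # interactive oauth2 flow in the browser
   >>> fs = gcsfs.GCSFileSystem(project='my-google-project', token='browser')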

The acquired session tokens are *not* preserved when serializing the instances, so
it is safe to pass them to worker processes on other machines if using in a
-distributed computation context. Credentials must be given by a file path, however,
-and this file must exist on every machine.
+distributed computation context. If credentials are given by a file path, however,
+then this file must exist on every machine.
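
For illustration, a sketch of the behaviour described above (project, bucket and
token path are placeholders): a pickle round-trip, as a distributed scheduler
would perform when shipping the instance to a worker, carries no session token,
so the receiving process re-authenticates, which is why a file-based token must
also exist on that machine:

.. code-block:: python

   >>> import pickle
   >>> import gcsfs
   >>> fs = gcsfs.GCSFileSystem(project='my-google-project', token='/path/to/token')
   >>> payload = pickle.dumps(fs)   # the payload carries no session token
   >>> fs2 = pickle.loads(payload)  # the new instance re-authenticates as needed,
   >>> fs2.ls('my-bucket')          # so '/path/to/token' must exist here too
   ['my-file.txt']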

Connection with Dask and Zarr
-----------------------------
