Changing auth to use implicit environment. #419

Merged (3 commits) on Dec 16, 2014
20 changes: 9 additions & 11 deletions CONTRIBUTING.rst
@@ -164,23 +164,19 @@ Running Regression Tests
bamboo-shift-455).
- ``GCLOUD_TESTS_DATASET_ID``: The name of the dataset your tests connect to.
This is typically the same as ``GCLOUD_TESTS_PROJECT_ID``.
- ``GCLOUD_TESTS_CLIENT_EMAIL``: The email for the service account you're
authenticating with
- ``GCLOUD_TESTS_KEY_FILE``: The path to an encrypted key file.
See private key
- ``GOOGLE_APPLICATION_CREDENTIALS``: The path to a JSON key file;
see ``regression/app_credentials.json.sample`` as an example. Such a file
can be downloaded directly from the developer's console by clicking
"Generate new JSON key". See private key
`docs <https://cloud.google.com/storage/docs/authentication#generating-a-private-key>`__
for explanation on how to get a private key.
for more details.

- Examples of these can be found in ``regression/local_test_setup.sample``. We
recommend copying this to ``regression/local_test_setup``, editing the values
and sourcing them into your environment::

$ source regression/local_test_setup

- The ``GCLOUD_TESTS_KEY_FILE`` value should point to a valid path (relative or
absolute) on your system where the key file for your service account can
be found.

- For datastore tests, you'll need to create composite
`indexes <https://cloud.google.com/datastore/docs/tools/indexconfig>`__
with the ``gcloud`` command line
@@ -194,8 +190,10 @@ Running Regression Tests
$ export CLOUDSDK_PYTHON_SITEPACKAGES=1

# Authenticate the gcloud tool with your account.
$ gcloud auth activate-service-account $GCLOUD_TESTS_CLIENT_EMAIL \
> --key-file=$GCLOUD_TESTS_KEY_FILE
$ SERVICE_ACCOUNT_EMAIL="some-account@developer.gserviceaccount.com"
$ P12_CREDENTIALS_FILE="path/to/keyfile.p12"
$ gcloud auth activate-service-account $SERVICE_ACCOUNT_EMAIL \
> --key-file=$P12_CREDENTIALS_FILE

# Create the indexes
$ gcloud preview datastore create-indexes regression/data/ \
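
Once ``GOOGLE_APPLICATION_CREDENTIALS`` is exported (for example by sourcing
``regression/local_test_setup``), the implicit flow can be sanity-checked
before running the regression tests. A minimal sketch, not part of this diff,
assuming ``oauth2client`` is installed and the key path below is replaced with
a real one::

    import os

    from oauth2client import client

    # Illustrative path; point this at the JSON key downloaded from the
    # developer's console via "Generate new JSON key".
    os.environ.setdefault('GOOGLE_APPLICATION_CREDENTIALS',
                          '/path/to/app_credentials.json')

    # Same lookup that gcloud.credentials.get_credentials() performs under
    # the hood; raises if no application default credentials are found.
    credentials = client.GoogleCredentials.get_application_default()
    print(credentials)
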
8 changes: 2 additions & 6 deletions README.rst
@@ -48,9 +48,7 @@ Library.
.. code:: python

from gcloud import datastore
dataset = datastore.get_dataset('dataset-id-here',
'long-email@googleapis.com',
'/path/to/private.key')
dataset = datastore.get_dataset('dataset-id-here')
# Then do other things...
query = dataset.query().kind('EntityKind')
entity = dataset.entity('EntityKind')
@@ -75,9 +73,7 @@ to learn how to connect to the Cloud Storage using this Client Library.
.. code:: python

import gcloud.storage
bucket = gcloud.storage.get_bucket('bucket-id-here',
'long-email@googleapis.com',
'/path/to/private.key')
bucket = gcloud.storage.get_bucket('bucket-id-here', 'project-id')
# Then do other things...
key = bucket.get_key('/remote/path/to/file.txt')
print key.get_contents_as_string()
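
With the explicit e-mail and key path gone from these snippets, the only
prerequisite is that application default credentials are discoverable, for
example through ``GOOGLE_APPLICATION_CREDENTIALS``. A combined sketch under
that assumption (the dataset, project, bucket, and file names are
placeholders)::

    from gcloud import datastore
    import gcloud.storage

    # Assumes the environment already provides application default
    # credentials, e.g. GOOGLE_APPLICATION_CREDENTIALS=/path/to/key.json.
    dataset = datastore.get_dataset('dataset-id-here')
    print(dataset.query().kind('EntityKind').fetch())

    bucket = gcloud.storage.get_bucket('bucket-id-here', 'project-id')
    key = bucket.get_key('/remote/path/to/file.txt')
    print(key.get_contents_as_string())
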
5 changes: 1 addition & 4 deletions docs/_components/datastore-getting-started.rst
@@ -38,10 +38,7 @@ Add some data to your dataset
Open a Python console and...

>>> from gcloud import datastore
>>> dataset = datastore.get_dataset(
>>> '<your-project-id-here',
>>> '<the e-mail address you copied here>',
>>> '/path/to/<your project>.key')
>>> dataset = datastore.get_dataset('<your-dataset-id>')
>>> dataset.query().fetch()
[]
>>> entity = dataset.entity('Person')
13 changes: 6 additions & 7 deletions docs/_components/datastore-quickstart.rst
@@ -22,12 +22,12 @@ authentication to your project:
bamboo-shift-455).
- ``GCLOUD_TESTS_DATASET_ID``: The name of the dataset your tests connect to.
This is typically the same as ``GCLOUD_TESTS_PROJECT_ID``.
- ``GCLOUD_TESTS_CLIENT_EMAIL``: The email for the service account you're
authenticating with
- ``GCLOUD_TESTS_KEY_FILE``: The path to an encrypted key file.
See private key
- ``GOOGLE_APPLICATION_CREDENTIALS``: The path to a JSON key file;
see ``regression/app_credentials.json.sample`` as an example. Such a file
can be downloaded directly from the developer's console by clicking
"Generate new JSON key". See private key
`docs <https://cloud.google.com/storage/docs/authentication#generating-a-private-key>`__
for explanation on how to get a private key.
for more details.

Run the
`example script <https://github.com/GoogleCloudPlatform/gcloud-python/blob/master/gcloud/datastore/demo/demo.py>`_
@@ -68,7 +68,6 @@ you can create entities and save them::

>>> from gcloud import datastore
>>> from gcloud.datastore import demo
>>> dataset = datastore.get_dataset(
>>> demo.DATASET_ID, demo.CLIENT_EMAIL, demo.PRIVATE_KEY_PATH)
>>> dataset = datastore.get_dataset(demo.DATASET_ID)

----
2 changes: 1 addition & 1 deletion docs/_components/storage-getting-started.rst
@@ -45,7 +45,7 @@ The first step in accessing Cloud Storage
is to create a connection to the service::

>>> from gcloud import storage
>>> connection = storage.get_connection(project_name, email, key_path)
>>> connection = storage.get_connection(project_name)

We're going to use this
:class:`connection <gcloud.storage.connection.Connection>` object
13 changes: 6 additions & 7 deletions docs/_components/storage-quickstart.rst
@@ -22,12 +22,12 @@ authentication to your project:
bamboo-shift-455).
- ``GCLOUD_TESTS_DATASET_ID``: The name of the dataset your tests connect to.
This is typically the same as ``GCLOUD_TESTS_PROJECT_ID``.
- ``GCLOUD_TESTS_CLIENT_EMAIL``: The email for the service account you're
authenticating with
- ``GCLOUD_TESTS_KEY_FILE``: The path to an encrypted key file.
See private key
- ``GOOGLE_APPLICATION_CREDENTIALS``: The path to a JSON key file;
see ``regression/app_credentials.json.sample`` as an example. Such a file
can be downloaded directly from the developer's console by clicking
"Generate new JSON key". See private key
`docs <https://cloud.google.com/storage/docs/authentication#generating-a-private-key>`__
for explanation on how to get a private key.
for more details.

Run the
`example script <https://github.com/GoogleCloudPlatform/gcloud-python/blob/master/gcloud/storage/demo/demo.py>`_
@@ -76,7 +76,6 @@ you can create buckets and keys::

>>> from gcloud import storage
>>> from gcloud.storage import demo
>>> connection = storage.get_connection(
>>> demo.PROJECT_NAME, demo.CLIENT_EMAIL, demo.PRIVATE_KEY_PATH)
>>> connection = storage.get_connection(demo.PROJECT_ID)

----
14 changes: 3 additions & 11 deletions docs/index.rst
@@ -29,10 +29,7 @@ Cloud Datastore
.. code-block:: python

from gcloud import datastore
dataset = datastore.get_dataset(
'<your-project-id>',
'<service-account-email>',
'/path/to/your/key')
dataset = datastore.get_dataset('<dataset-id>')
entity = dataset.entity('Person')
entity['name'] = 'Your name'
entity['age'] = 25
@@ -46,13 +43,8 @@ Cloud Storage
.. _Google Cloud Storage: https://developers.google.com/storage/

.. code-block:: python

from gcloud import storage
bucket = storage.get_bucket(
'<your-bucket-name>',
'<your-project-id>',
'<service-account-email>',
'/path/to/your/key')
bucket = storage.get_bucket('<your-bucket-name>', '<your-project-id>')
key = bucket.new_key('my-test-file.txt')
key = key.upload_contents_from_string('this is test content!')

56 changes: 48 additions & 8 deletions gcloud/credentials.py
@@ -17,24 +17,64 @@
from oauth2client import client


def get_for_service_account(client_email, private_key_path, scope=None):
def get_credentials():
"""Gets credentials implicitly from the current environment.

.. note::
You should not need to use this function directly. Instead, use the
helper methods provided in
:func:`gcloud.datastore.__init__.get_connection` and
:func:`gcloud.datastore.__init__.get_dataset` which use this method
under the hood.

Checks environment in order of precedence:
- Google App Engine (production and testing)
- Environment variable GOOGLE_APPLICATION_CREDENTIALS pointing to
a file with stored credentials information.
- Stored "well known" file associated with `gcloud` command line tool.
- Google Compute Engine production environment.

The file referred to in GOOGLE_APPLICATION_CREDENTIALS is expected to
contain information about credentials that are ready to use. This means
either service account information or user account information with
a ready-to-use refresh token:
    {                                       {
        'type': 'authorized_user',              'type': 'service_account',
        'client_id': '...',                     'client_id': '...',
        'client_secret': '...',       OR        'client_email': '...',
        'refresh_token': '...',                 'private_key_id': '...',
    }                                           'private_key': '...',
                                            }
The second of these is simply a JSON key downloaded from the Google APIs
console. The first is a close cousin of the "client secrets" JSON file
used by `oauth2client.clientsecrets` but differs in formatting.

:rtype: :class:`oauth2client.client.GoogleCredentials`,
:class:`oauth2client.appengine.AppAssertionCredentials`,
:class:`oauth2client.gce.AppAssertionCredentials`,
:class:`oauth2client.service_account._ServiceAccountCredentials`
:returns: A new credentials instance corresponding to the implicit
environment.
"""
return client.GoogleCredentials.get_application_default()


def get_for_service_account_p12(client_email, private_key_path, scope=None):
"""Gets the credentials for a service account.

.. note::
You should not need to use this function directly.
Instead, use the helper methods provided in
:func:`gcloud.datastore.__init__.get_connection`
and
:func:`gcloud.datastore.__init__.get_dataset`
which use this method under the hood.
This method is not used by default; instead, :func:`get_credentials`
is used. It is intended for cases where the environment is known
explicitly, so detecting it implicitly would be superfluous.

:type client_email: string
:param client_email: The e-mail attached to the service account.

:type private_key_path: string
:param private_key_path: The path to a private key file (this file was
given to you when you created the service
account).
account). This file must be in P12 format.

:type scope: string or tuple of strings
:param scope: The scope against which to authenticate. (Different services
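
The new ``get_credentials`` helper mirrors what the higher-level shortcuts do:
resolve implicit credentials, scope them, and hand them to a connection. A
rough sketch of that flow for Cloud Datastore, assuming the module-level
``SCOPE`` exported by ``gcloud.datastore`` (variable names here are
illustrative)::

    from gcloud import credentials
    from gcloud.datastore import SCOPE
    from gcloud.datastore.connection import Connection

    # Resolve credentials from the implicit environment (App Engine, the
    # GOOGLE_APPLICATION_CREDENTIALS file, the gcloud CLI, or GCE).
    implicit_credentials = credentials.get_credentials()

    # Scope them for Cloud Datastore and build a connection, the same steps
    # datastore.get_connection() now performs.
    scoped_credentials = implicit_credentials.create_scoped(SCOPE)
    connection = Connection(credentials=scoped_credentials)
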
43 changes: 13 additions & 30 deletions gcloud/datastore/__init__.py
@@ -17,9 +17,7 @@
You'll typically use these to get started with the API:

>>> from gcloud import datastore
>>> dataset = datastore.get_dataset('dataset-id-here',
... 'long-email@googleapis.com',
... '/path/to/private.key')
>>> dataset = datastore.get_dataset('dataset-id-here')
>>> # Then do other things...
>>> query = dataset.query().kind('EntityKind')
>>> entity = dataset.entity('EntityKind')
@@ -53,37 +51,30 @@
"""The scope required for authenticating as a Cloud Datastore consumer."""


def get_connection(client_email, private_key_path):
from gcloud import credentials
from gcloud.datastore.connection import Connection


def get_connection():
"""Shortcut method to establish a connection to the Cloud Datastore.

Use this if you are going to access several datasets
with the same set of credentials (unlikely):

>>> from gcloud import datastore
>>> connection = datastore.get_connection(email, key_path)
>>> connection = datastore.get_connection()
>>> dataset1 = connection.dataset('dataset1')
>>> dataset2 = connection.dataset('dataset2')

:type client_email: string
:param client_email: The e-mail attached to the service account.

:type private_key_path: string
:param private_key_path: The path to a private key file (this file was
given to you when you created the service
account).

:rtype: :class:`gcloud.datastore.connection.Connection`
:returns: A connection defined with the proper credentials.
"""
from gcloud import credentials
from gcloud.datastore.connection import Connection
implicit_credentials = credentials.get_credentials()
scoped_credentials = implicit_credentials.create_scoped(SCOPE)
return Connection(credentials=scoped_credentials)

svc_account_credentials = credentials.get_for_service_account(
client_email, private_key_path, scope=SCOPE)
return Connection(credentials=svc_account_credentials)


def get_dataset(dataset_id, client_email, private_key_path):
def get_dataset(dataset_id):
"""Establish a connection to a particular dataset in the Cloud Datastore.

This is a shortcut method for creating a connection and using it
@@ -92,7 +83,7 @@ def get_dataset(dataset_id, client_email, private_key_path):
You'll generally use this as the first call to working with the API:

>>> from gcloud import datastore
>>> dataset = datastore.get_dataset('dataset-id', email, key_path)
>>> dataset = datastore.get_dataset('dataset-id')
>>> # Now you can do things with the dataset.
>>> dataset.query().kind('TestKind').fetch()
[...]
@@ -103,16 +94,8 @@ def get_dataset(dataset_id, client_email, private_key_path):
and is usually the same as your Cloud Datastore project
name.

:type client_email: string
:param client_email: The e-mail attached to the service account.

:type private_key_path: string
:param private_key_path: The path to a private key file (this file was
given to you when you created the service
account).

:rtype: :class:`gcloud.datastore.dataset.Dataset`
:returns: A dataset with a connection using the provided credentials.
"""
connection = get_connection(client_email, private_key_path)
connection = get_connection()
return connection.dataset(dataset_id)
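
For the case ``get_for_service_account_p12`` covers, where the environment is
known up front and a P12 key is in hand, explicit credentials can still be
wired into a connection by hand. A sketch under that assumption (the e-mail
address and key path are placeholders)::

    from gcloud import credentials
    from gcloud.datastore import SCOPE
    from gcloud.datastore.connection import Connection

    # Placeholder service-account e-mail and P12 key path.
    p12_credentials = credentials.get_for_service_account_p12(
        'long-email@googleapis.com', '/path/to/private.key', scope=SCOPE)

    connection = Connection(credentials=p12_credentials)
    dataset = connection.dataset('dataset-id-here')
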
4 changes: 2 additions & 2 deletions gcloud/datastore/connection.py
@@ -180,7 +180,7 @@ def lookup(self, dataset_id, key_pbs):

>>> from gcloud import datastore
>>> from gcloud.datastore.key import Key
>>> connection = datastore.get_connection(email, key_path)
>>> connection = datastore.get_connection()
>>> dataset = connection.dataset('dataset-id')
>>> key = Key(dataset=dataset).kind('MyKind').id(1234)

@@ -248,7 +248,7 @@ def run_query(self, dataset_id, query_pb, namespace=None):
uses this method to fetch data:

>>> from gcloud import datastore
>>> connection = datastore.get_connection(email, key_path)
>>> connection = datastore.get_connection()
>>> dataset = connection.dataset('dataset-id')
>>> query = dataset.query().kind('MyKind').filter('property =', 'val')

10 changes: 4 additions & 6 deletions gcloud/datastore/demo/__init__.py
@@ -12,17 +12,15 @@
# See the License for the specific language governing permissions and
# limitations under the License.

# pragma NO COVER
import os
from gcloud import datastore

__all__ = ['get_dataset', 'CLIENT_EMAIL', 'DATASET_ID', 'KEY_FILENAME']

__all__ = ['get_dataset', 'DATASET_ID']


DATASET_ID = os.getenv('GCLOUD_TESTS_DATASET_ID')
CLIENT_EMAIL = os.getenv('GCLOUD_TESTS_CLIENT_EMAIL')
KEY_FILENAME = os.getenv('GCLOUD_TESTS_KEY_FILE')


def get_dataset(): # pragma NO COVER
return datastore.get_dataset(DATASET_ID, CLIENT_EMAIL, KEY_FILENAME)
def get_dataset():
return datastore.get_dataset(DATASET_ID)
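
With the e-mail and key-file variables dropped from the demo module, only
``GCLOUD_TESTS_DATASET_ID`` (plus discoverable application default
credentials) needs to be set before using it. A small sketch with an
illustrative dataset id::

    import os

    # Illustrative id; normally exported via regression/local_test_setup.
    os.environ.setdefault('GCLOUD_TESTS_DATASET_ID', 'my-dataset-id')

    from gcloud.datastore import demo

    # demo.DATASET_ID is read from the environment at import time, and
    # demo.get_dataset() now relies on implicit credentials.
    dataset = demo.get_dataset()
    print(dataset.query().fetch())
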
4 changes: 2 additions & 2 deletions gcloud/datastore/query.py
@@ -43,7 +43,7 @@ class Query(object):
generates a query that can be executed without any additional work::

>>> from gcloud import datastore
>>> dataset = datastore.get_dataset('dataset-id', email, key_path)
>>> dataset = datastore.get_dataset('dataset-id')
>>> query = dataset.query('MyKind')

:type kind: string
@@ -319,7 +319,7 @@ def fetch(self, limit=None):
For example::

>>> from gcloud import datastore
>>> dataset = datastore.get_dataset('dataset-id', email, key_path)
>>> dataset = datastore.get_dataset('dataset-id')
>>> query = dataset.query('Person').filter('name =', 'Sally')
>>> query.fetch()
[<Entity object>, <Entity object>, ...]