
Commit

IlyaFaer committed Aug 20, 2019
2 parents ef24044 + 504cc24 commit 2f13d36
Showing 78 changed files with 479 additions and 416 deletions.
6 changes: 1 addition & 5 deletions api_core/README.rst
@@ -1,7 +1,7 @@
Core Library for Google Client Libraries
========================================

|pypi| |versions| |compat_check_pypi| |compat_check_github|
|pypi| |versions|

This library is not meant to stand-alone. Instead it defines
common helpers used by all Google API clients. For more information, see the
@@ -12,10 +12,6 @@ common helpers used by all Google API clients. For more information, see the
.. |versions| image:: https://img.shields.io/pypi/pyversions/google-api_core.svg
:target: https://pypi.org/project/google-api_core/
.. _documentation: https://googleapis.dev/python/google-api-core/latest
.. |compat_check_pypi| image:: https://python-compatibility-tools.appspot.com/one_badge_image?package=google-api-core
:target: https://python-compatibility-tools.appspot.com/one_badge_target?package=google-api-core
.. |compat_check_github| image:: https://python-compatibility-tools.appspot.com/one_badge_image?package=git%2Bgit%3A//github.com/googleapis/google-cloud-python.git%23subdirectory%3Dapi_core
:target: https://python-compatibility-tools.appspot.com/one_badge_target?package=git%2Bgit%3A//github.com/googleapis/google-cloud-python.git%23subdirectory%3Dapi_core


Supported Python Versions
12 changes: 6 additions & 6 deletions api_core/google/api_core/retry.py
@@ -125,9 +125,9 @@ def exponential_sleep_generator(initial, maximum, multiplier=_DEFAULT_DELAY_MULT
https://cloud.google.com/storage/docs/exponential-backoff
Args:
initial (float): The minimum about of time to delay. This must
initial (float): The minimum amount of time to delay. This must
be greater than 0.
maximum (float): The maximum about of time to delay.
maximum (float): The maximum amount of time to delay.
multiplier (float): The multiplier applied to the delay.
Yields:
@@ -223,9 +223,9 @@ class Retry(object):
Args:
predicate (Callable[Exception]): A callable that should return ``True``
if the given exception is retryable.
initial (float): The minimum about of time to delay in seconds. This
initial (float): The minimum amount of time to delay in seconds. This
must be greater than 0.
maximum (float): The maximum about of time to delay in seconds.
maximum (float): The maximum amount of time to delay in seconds.
multiplier (float): The multiplier applied to the delay.
deadline (float): How long to keep retrying in seconds.
"""
@@ -314,9 +314,9 @@ def with_delay(self, initial=None, maximum=None, multiplier=None):
"""Return a copy of this retry with the given delay options.
Args:
initial (float): The minimum about of time to delay. This must
initial (float): The minimum amount of time to delay. This must
be greater than 0.
maximum (float): The maximum about of time to delay.
maximum (float): The maximum amount of time to delay.
multiplier (float): The multiplier applied to the delay.
Returns:
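As a hedged illustration (not part of this diff), here is how the parameters documented in the hunks above fit together on the public google.api_core.retry.Retry interface; the wrapped function and the numeric values are invented for the example.

from google.api_core import exceptions, retry


def fetch_row():
    """Stand-in for a call that can raise transient errors."""
    ...


# Retry only ServiceUnavailable, sleeping roughly 1s, 2s, 4s, ... capped at 60s,
# and giving up once 5 minutes of total retrying have elapsed.
default_retry = retry.Retry(
    predicate=retry.if_exception_type(exceptions.ServiceUnavailable),
    initial=1.0,
    maximum=60.0,
    multiplier=2.0,
    deadline=300.0,
)

# with_delay() returns a copy with different delay options; the original is unchanged.
patient_retry = default_retry.with_delay(initial=5.0, maximum=120.0)

fetch_with_retry = patient_retry(fetch_row)  # Retry objects act as decorators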
6 changes: 1 addition & 5 deletions asset/README.rst
@@ -1,7 +1,7 @@
Python Client for Cloud Asset API
=================================

|alpha| |pypi| |versions| |compat_check_pypi| |compat_check_github|
|alpha| |pypi| |versions|

`Cloud Asset API`_: The cloud asset API manages the history and inventory of cloud resources.

@@ -14,10 +14,6 @@ Python Client for Cloud Asset API
:target: https://pypi.org/project/google-cloud-asset/
.. |versions| image:: https://img.shields.io/pypi/pyversions/google-cloud-asset.svg
:target: https://pypi.org/project/google-cloud-asset/
.. |compat_check_pypi| image:: https://python-compatibility-tools.appspot.com/one_badge_image?package=google-cloud-asset
:target: https://python-compatibility-tools.appspot.com/one_badge_target?package=google-cloud-asset
.. |compat_check_github| image:: https://python-compatibility-tools.appspot.com/one_badge_image?package=git%2Bgit%3A//github.com/googleapis/google-cloud-python.git%23subdirectory%3Dasset
:target: https://python-compatibility-tools.appspot.com/one_badge_target?package=git%2Bgit%3A//github.com/googleapis/google-cloud-python.git%23subdirectory%3Dasset
.. _Cloud Asset API: https://cloud.google.com/resource-manager/docs/cloud-asset-inventory/reference/rest/
.. _Client Library Documentation: https://googleapis.dev/python/cloudasset/latest
.. _Product Documentation: https://cloud.google.com/resource-manager/docs/cloud-asset-inventory/overview
6 changes: 1 addition & 5 deletions automl/README.rst
@@ -1,7 +1,7 @@
Python Client for Cloud AutoML API
==================================

|alpha| |pypi| |versions| |compat_check_pypi| |compat_check_github|
|alpha| |pypi| |versions|

The `Cloud AutoML API`_ is a suite of machine learning products that enables
developers with limited machine learning expertise to train high-quality models
@@ -17,10 +17,6 @@ transfer learning, and Neural Architecture Search technology.
:target: https://pypi.org/project/google-cloud-automl/
.. |versions| image:: https://img.shields.io/pypi/pyversions/google-cloud-automl.svg
:target: https://pypi.org/project/google-cloud-automl/
.. |compat_check_pypi| image:: https://python-compatibility-tools.appspot.com/one_badge_image?package=google-cloud-automl
:target: https://python-compatibility-tools.appspot.com/one_badge_target?package=google-cloud-automl
.. |compat_check_github| image:: https://python-compatibility-tools.appspot.com/one_badge_image?package=git%2Bgit%3A//github.com/googleapis/google-cloud-python.git%23subdirectory%3Dautoml
:target: https://python-compatibility-tools.appspot.com/one_badge_target?package=git%2Bgit%3A//github.com/googleapis/google-cloud-python.git%23subdirectory%3Dautoml
.. _Cloud AutoML API: https://cloud.google.com/automl
.. _Client Library Documentation: https://googleapis.dev/python/automl/latest
.. _Product Documentation: https://cloud.google.com/automl
6 changes: 1 addition & 5 deletions bigquery/README.rst
@@ -1,7 +1,7 @@
Python Client for Google BigQuery
=================================

|GA| |pypi| |versions| |compat_check_pypi| |compat_check_github|
|GA| |pypi| |versions|

Querying massive datasets can be time consuming and expensive without the
right hardware and infrastructure. Google `BigQuery`_ solves this problem by
@@ -17,10 +17,6 @@ processing power of Google's infrastructure.
:target: https://pypi.org/project/google-cloud-bigquery/
.. |versions| image:: https://img.shields.io/pypi/pyversions/google-cloud-bigquery.svg
:target: https://pypi.org/project/google-cloud-bigquery/
.. |compat_check_pypi| image:: https://python-compatibility-tools.appspot.com/one_badge_image?package=google-cloud-bigquery
:target: https://python-compatibility-tools.appspot.com/one_badge_target?package=google-cloud-bigquery
.. |compat_check_github| image:: https://python-compatibility-tools.appspot.com/one_badge_image?package=git%2Bgit%3A//github.com/googleapis/google-cloud-python.git%23subdirectory%3Dbigquery
:target: https://python-compatibility-tools.appspot.com/one_badge_target?package=git%2Bgit%3A//github.com/googleapis/google-cloud-python.git%23subdirectory%3Dbigquery
.. _BigQuery: https://cloud.google.com/bigquery/what-is-bigquery
.. _Client Library Documentation: https://googleapis.dev/python/bigquery/latest
.. _Product Documentation: https://cloud.google.com/bigquery/docs/reference/v2/
23 changes: 23 additions & 0 deletions bigquery/google/cloud/bigquery/enums.py
@@ -67,3 +67,26 @@ def _make_sql_scalars_enum():


StandardSqlDataTypes = _make_sql_scalars_enum()


# See also: https://cloud.google.com/bigquery/data-types#legacy_sql_data_types
# and https://cloud.google.com/bigquery/docs/reference/standard-sql/data-types
class SqlTypeNames(str, enum.Enum):
"""Enum of allowed SQL type names in schema.SchemaField."""

STRING = "STRING"
BYTES = "BYTES"
INTEGER = "INTEGER"
INT64 = "INTEGER"
FLOAT = "FLOAT"
FLOAT64 = "FLOAT"
NUMERIC = "NUMERIC"
BOOLEAN = "BOOLEAN"
BOOL = "BOOLEAN"
GEOGRAPHY = "GEOGRAPHY" # NOTE: not available in legacy types
RECORD = "RECORD"
STRUCT = "RECORD"
TIMESTAMP = "TIMESTAMP"
DATE = "DATE"
TIME = "TIME"
DATETIME = "DATETIME"
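As a hedged sketch (not part of the diff): because SqlTypeNames subclasses str, its members' values are the plain type-name strings that schema.SchemaField already expects, so they can be used when declaring a schema. The field names below are invented for illustration.

from google.cloud import bigquery
from google.cloud.bigquery.enums import SqlTypeNames

schema = [
    bigquery.SchemaField("full_name", SqlTypeNames.STRING.value, mode="REQUIRED"),
    bigquery.SchemaField("age", SqlTypeNames.INT64.value),  # INT64 aliases "INTEGER"
]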
80 changes: 50 additions & 30 deletions bigquery/tests/unit/test_table.py
@@ -13,7 +13,6 @@
# limitations under the License.

import itertools
import json
import logging
import time
import unittest
@@ -2271,26 +2270,26 @@ def test_to_dataframe_w_bqstorage_logs_session(self):
@unittest.skipIf(
bigquery_storage_v1beta1 is None, "Requires `google-cloud-bigquery-storage`"
)
@unittest.skipIf(pyarrow is None, "Requires `pyarrow`")
def test_to_dataframe_w_bqstorage_empty_streams(self):
from google.cloud.bigquery import schema
from google.cloud.bigquery import table as mut
from google.cloud.bigquery_storage_v1beta1 import reader

arrow_fields = [
pyarrow.field("colA", pyarrow.int64()),
# Not alphabetical to test column order.
pyarrow.field("colC", pyarrow.float64()),
pyarrow.field("colB", pyarrow.utf8()),
]
arrow_schema = pyarrow.schema(arrow_fields)

bqstorage_client = mock.create_autospec(
bigquery_storage_v1beta1.BigQueryStorageClient
)
session = bigquery_storage_v1beta1.types.ReadSession(
streams=[{"name": "/projects/proj/dataset/dset/tables/tbl/streams/1234"}]
)
session.avro_schema.schema = json.dumps(
{
"fields": [
{"name": "colA"},
# Not alphabetical to test column order.
{"name": "colC"},
{"name": "colB"},
]
}
streams=[{"name": "/projects/proj/dataset/dset/tables/tbl/streams/1234"}],
arrow_schema={"serialized_schema": arrow_schema.serialize().to_pybytes()},
)
bqstorage_client.create_read_session.return_value = session
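As a hedged aside (not part of the diff): the ReadSession's arrow_schema.serialized_schema field carries the IPC-serialized bytes of a pyarrow schema, which a reader can turn back into a schema object. A minimal round-trip, assuming pyarrow's ipc helpers:

import pyarrow

original = pyarrow.schema(
    [pyarrow.field("colA", pyarrow.int64()), pyarrow.field("colB", pyarrow.utf8())]
)
serialized = original.serialize().to_pybytes()  # the bytes stored on the session
recovered = pyarrow.ipc.read_schema(pyarrow.py_buffer(serialized))
assert recovered.equals(original)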

@@ -2327,11 +2326,20 @@ def test_to_dataframe_w_bqstorage_empty_streams(self):
@unittest.skipIf(
bigquery_storage_v1beta1 is None, "Requires `google-cloud-bigquery-storage`"
)
@unittest.skipIf(pyarrow is None, "Requires `pyarrow`")
def test_to_dataframe_w_bqstorage_nonempty(self):
from google.cloud.bigquery import schema
from google.cloud.bigquery import table as mut
from google.cloud.bigquery_storage_v1beta1 import reader

arrow_fields = [
pyarrow.field("colA", pyarrow.int64()),
# Not alphabetical to test column order.
pyarrow.field("colC", pyarrow.float64()),
pyarrow.field("colB", pyarrow.utf8()),
]
arrow_schema = pyarrow.schema(arrow_fields)

bqstorage_client = mock.create_autospec(
bigquery_storage_v1beta1.BigQueryStorageClient
)
@@ -2340,16 +2348,9 @@ def test_to_dataframe_w_bqstorage_nonempty(self):
{"name": "/projects/proj/dataset/dset/tables/tbl/streams/1234"},
{"name": "/projects/proj/dataset/dset/tables/tbl/streams/5678"},
]
session = bigquery_storage_v1beta1.types.ReadSession(streams=streams)
session.avro_schema.schema = json.dumps(
{
"fields": [
{"name": "colA"},
# Not alphabetical to test column order.
{"name": "colC"},
{"name": "colB"},
]
}
session = bigquery_storage_v1beta1.types.ReadSession(
streams=streams,
arrow_schema={"serialized_schema": arrow_schema.serialize().to_pybytes()},
)
bqstorage_client.create_read_session.return_value = session

@@ -2400,17 +2401,23 @@ def test_to_dataframe_w_bqstorage_nonempty(self):
@unittest.skipIf(
bigquery_storage_v1beta1 is None, "Requires `google-cloud-bigquery-storage`"
)
@unittest.skipIf(pyarrow is None, "Requires `pyarrow`")
def test_to_dataframe_w_bqstorage_multiple_streams_return_unique_index(self):
from google.cloud.bigquery import schema
from google.cloud.bigquery import table as mut
from google.cloud.bigquery_storage_v1beta1 import reader

arrow_fields = [pyarrow.field("colA", pyarrow.int64())]
arrow_schema = pyarrow.schema(arrow_fields)

streams = [
{"name": "/projects/proj/dataset/dset/tables/tbl/streams/1234"},
{"name": "/projects/proj/dataset/dset/tables/tbl/streams/5678"},
]
session = bigquery_storage_v1beta1.types.ReadSession(streams=streams)
session.avro_schema.schema = json.dumps({"fields": [{"name": "colA"}]})
session = bigquery_storage_v1beta1.types.ReadSession(
streams=streams,
arrow_schema={"serialized_schema": arrow_schema.serialize().to_pybytes()},
)

bqstorage_client = mock.create_autospec(
bigquery_storage_v1beta1.BigQueryStorageClient
@@ -2448,6 +2455,7 @@ def test_to_dataframe_w_bqstorage_multiple_streams_return_unique_index(self):
bigquery_storage_v1beta1 is None, "Requires `google-cloud-bigquery-storage`"
)
@unittest.skipIf(tqdm is None, "Requires `tqdm`")
@unittest.skipIf(pyarrow is None, "Requires `pyarrow`")
@mock.patch("tqdm.tqdm")
def test_to_dataframe_w_bqstorage_updates_progress_bar(self, tqdm_mock):
from google.cloud.bigquery import schema
@@ -2457,6 +2465,9 @@ def test_to_dataframe_w_bqstorage_updates_progress_bar(self, tqdm_mock):
# Speed up testing.
mut._PROGRESS_INTERVAL = 0.01

arrow_fields = [pyarrow.field("testcol", pyarrow.int64())]
arrow_schema = pyarrow.schema(arrow_fields)

bqstorage_client = mock.create_autospec(
bigquery_storage_v1beta1.BigQueryStorageClient
)
@@ -2466,8 +2477,10 @@ def test_to_dataframe_w_bqstorage_updates_progress_bar(self, tqdm_mock):
{"name": "/projects/proj/dataset/dset/tables/tbl/streams/1234"},
{"name": "/projects/proj/dataset/dset/tables/tbl/streams/5678"},
]
session = bigquery_storage_v1beta1.types.ReadSession(streams=streams)
session.avro_schema.schema = json.dumps({"fields": [{"name": "testcol"}]})
session = bigquery_storage_v1beta1.types.ReadSession(
streams=streams,
arrow_schema={"serialized_schema": arrow_schema.serialize().to_pybytes()},
)
bqstorage_client.create_read_session.return_value = session

mock_rowstream = mock.create_autospec(reader.ReadRowsStream)
@@ -2521,6 +2534,7 @@ def blocking_to_dataframe(*args, **kwargs):
@unittest.skipIf(
bigquery_storage_v1beta1 is None, "Requires `google-cloud-bigquery-storage`"
)
@unittest.skipIf(pyarrow is None, "Requires `pyarrow`")
def test_to_dataframe_w_bqstorage_exits_on_keyboardinterrupt(self):
from google.cloud.bigquery import schema
from google.cloud.bigquery import table as mut
@@ -2529,6 +2543,14 @@ def test_to_dataframe_w_bqstorage_exits_on_keyboardinterrupt(self):
# Speed up testing.
mut._PROGRESS_INTERVAL = 0.01

arrow_fields = [
pyarrow.field("colA", pyarrow.int64()),
# Not alphabetical to test column order.
pyarrow.field("colC", pyarrow.float64()),
pyarrow.field("colB", pyarrow.utf8()),
]
arrow_schema = pyarrow.schema(arrow_fields)

bqstorage_client = mock.create_autospec(
bigquery_storage_v1beta1.BigQueryStorageClient
)
@@ -2539,10 +2561,8 @@ def test_to_dataframe_w_bqstorage_exits_on_keyboardinterrupt(self):
# ends early.
{"name": "/projects/proj/dataset/dset/tables/tbl/streams/1234"},
{"name": "/projects/proj/dataset/dset/tables/tbl/streams/5678"},
]
)
session.avro_schema.schema = json.dumps(
{"fields": [{"name": "colA"}, {"name": "colB"}, {"name": "colC"}]}
],
arrow_schema={"serialized_schema": arrow_schema.serialize().to_pybytes()},
)
bqstorage_client.create_read_session.return_value = session

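The tests above all exercise the same call path. As a hedged sketch of that path (client construction and the table reference are illustrative, not taken from this diff): passing a BigQueryStorageClient to RowIterator.to_dataframe() downloads the results through the BigQuery Storage API instead of the REST API.

from google.cloud import bigquery
from google.cloud import bigquery_storage_v1beta1

bq_client = bigquery.Client()
bqstorage_client = bigquery_storage_v1beta1.BigQueryStorageClient()

rows = bq_client.query("SELECT name, age FROM `my-project.my_dataset.my_table`").result()
df = rows.to_dataframe(bqstorage_client=bqstorage_client)  # returns a pandas DataFrame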
6 changes: 1 addition & 5 deletions bigquery_datatransfer/README.rst
@@ -1,7 +1,7 @@
Python Client for BigQuery Data Transfer API
============================================

|alpha| |pypi| |versions| |compat_check_pypi| |compat_check_github|
|alpha| |pypi| |versions|

The `BigQuery Data Transfer API`_ allows users to transfer data from partner
SaaS applications to Google BigQuery on a scheduled, managed basis.
@@ -15,10 +15,6 @@ SaaS applications to Google BigQuery on a scheduled, managed basis.
:target: https://pypi.org/project/google-cloud-bigquery-datatransfer/
.. |versions| image:: https://img.shields.io/pypi/pyversions/google-cloud-bigquery-datatransfer.svg
:target: https://pypi.org/project/google-cloud-bigquery-datatransfer/
.. |compat_check_pypi| image:: https://python-compatibility-tools.appspot.com/one_badge_image?package=google-cloud-bigquery-datatransfer
:target: https://python-compatibility-tools.appspot.com/one_badge_target?package=google-cloud-bigquery-datatransfer
.. |compat_check_github| image:: https://python-compatibility-tools.appspot.com/one_badge_image?package=git%2Bgit%3A//github.com/googleapis/google-cloud-python.git%23subdirectory%3Dbigquery_datatransfer
:target: https://python-compatibility-tools.appspot.com/one_badge_target?package=git%2Bgit%3A//github.com/googleapis/google-cloud-python.git%23subdirectory%3Dbigquery_datatransfer
.. _BigQuery Data Transfer API: https://cloud.google.com/bigquery/transfer
.. _Client Library Documentation: https://googleapis.dev/python/bigquerydatatransfer/latest
.. _Product Documentation: https://cloud.google.com/bigquery/docs/transfer-service-overview
6 changes: 1 addition & 5 deletions bigquery_storage/README.rst
@@ -1,17 +1,13 @@
Python Client for BigQuery Storage API (`Beta`_)
=================================================

|compat_check_pypi| |compat_check_github|


`BigQuery Storage API`_:

- `Client Library Documentation`_
- `Product Documentation`_

.. |compat_check_pypi| image:: https://python-compatibility-tools.appspot.com/one_badge_image?package=google-cloud-bigquery-storage
:target: https://python-compatibility-tools.appspot.com/one_badge_target?package=google-cloud-bigquery-storage
.. |compat_check_github| image:: https://python-compatibility-tools.appspot.com/one_badge_image?package=git%2Bgit%3A//github.com/googleapis/google-cloud-python.git%23subdirectory%3Dbigquery_storage
:target: https://python-compatibility-tools.appspot.com/one_badge_target?package=git%2Bgit%3A//github.com/googleapis/google-cloud-python.git%23subdirectory%3Dbigquery_storage
.. _Beta: https://github.com/googleapis/google-cloud-python/blob/master/README.rst
.. _BigQuery Storage API: https://cloud.google.com/bigquery/docs/reference/storage/
.. _Client Library Documentation: https://googleapis.dev/python/bigquerystorage/latest
6 changes: 1 addition & 5 deletions bigtable/README.rst
@@ -1,7 +1,7 @@
Python Client for Google Cloud Bigtable
=======================================

|beta| |pypi| |versions| |compat_check_pypi| |compat_check_github|
|beta| |pypi| |versions|

`Google Cloud Bigtable`_ is Google's NoSQL Big Data database service. It's the
same database that powers many core Google services, including Search,
@@ -16,10 +16,6 @@ Analytics, Maps, and Gmail.
:target: https://pypi.org/project/google-cloud-bigtable/
.. |versions| image:: https://img.shields.io/pypi/pyversions/google-cloud-bigtable.svg
:target: https://pypi.org/project/google-cloud-bigtable/
.. |compat_check_pypi| image:: https://python-compatibility-tools.appspot.com/one_badge_image?package=google-cloud-bigtable
:target: https://python-compatibility-tools.appspot.com/one_badge_target?package=google-cloud-bigtable
.. |compat_check_github| image:: https://python-compatibility-tools.appspot.com/one_badge_image?package=git%2Bgit%3A//github.com/googleapis/google-cloud-python.git%23subdirectory%3Dbigtable
:target: https://python-compatibility-tools.appspot.com/one_badge_target?package=git%2Bgit%3A//github.com/googleapis/google-cloud-python.git%23subdirectory%3Dbigtable
.. _Google Cloud Bigtable: https://cloud.google.com/bigtable
.. _Client Library Documentation: https://googleapis.dev/python/bigtable/latest
.. _Product Documentation: https://cloud.google.com/bigtable/docs