Prep bigquery docs for repo split (#5955)
* Move 'docs/bigquery' to 'bigquery/docs' and leave symlink behind.

* Rename BQ's 'usage.rst' -> 'index.rst'.

* DRY 'bigquery/README.rst'<->'bigquery/docs/index.rst'.

* Add Sphinx logic for managing static redirect files.

  Add a redirect for BigQuery from 'usage.html' -> 'index.html'.

* Find snippets under 'bigquery/docs/'.
tseaver committed Sep 13, 2018
1 parent 64caa5f commit c2a7bff
Showing 14 changed files with 109 additions and 106 deletions.
1 change: 1 addition & 0 deletions .gitignore
@@ -51,6 +51,7 @@ htmlcov
# Built documentation
docs/_build
docs/_build_doc2dash
bigquery/docs/generated

# Virtual environment
env/
93 changes: 54 additions & 39 deletions bigquery/README.rst
@@ -1,50 +1,76 @@
Python Client for Google BigQuery
=================================

Python idiomatic client for `Google BigQuery`_

.. _Google BigQuery: https://cloud.google.com/bigquery/what-is-bigquery

|pypi| |versions|

- `Documentation`_
Querying massive datasets can be time consuming and expensive without the
right hardware and infrastructure. Google `BigQuery`_ solves this problem by
enabling super-fast, SQL queries against append-mostly tables, using the
processing power of Google's infrastructure.

.. _Documentation: https://googlecloudplatform.github.io/google-cloud-python/latest/bigquery/usage.html
- `Client Library Documentation`_
- `Product Documentation`_

.. |pypi| image:: https://img.shields.io/pypi/v/google-cloud-bigquery.svg
:target: https://pypi.org/project/google-cloud-bigquery/
.. |versions| image:: https://img.shields.io/pypi/pyversions/google-cloud-bigquery.svg
:target: https://pypi.org/project/google-cloud-bigquery/
.. _BigQuery: https://cloud.google.com/bigquery/what-is-bigquery
.. _Client Library Documentation: https://googlecloudplatform.github.io/google-cloud-python/latest/bigquery/index.html
.. _Product Documentation: https://cloud.google.com/bigquery/docs/reference/v2/

Quick Start
-----------

.. code-block:: console
In order to use this library, you first need to go through the following steps:

$ pip install --upgrade google-cloud-bigquery
1. `Select or create a Cloud Platform project.`_
2. `Enable billing for your project.`_
3. `Enable the Google BigQuery API.`_
4. `Setup Authentication.`_

For more information on setting up your Python development environment,
such as installing ``pip`` and ``virtualenv`` on your system, please refer
to `Python Development Environment Setup Guide`_ for Google Cloud Platform.
.. _Select or create a Cloud Platform project.: https://console.cloud.google.com/project
.. _Enable billing for your project.: https://cloud.google.com/billing/docs/how-to/modify-project#enable_billing_for_a_project
.. _Enable the Google BigQuery API.: https://cloud.google.com/bigquery
.. _Setup Authentication.: https://googlecloudplatform.github.io/google-cloud-python/latest/core/auth.html

.. _Python Development Environment Setup Guide: https://cloud.google.com/python/setup
Installation
~~~~~~~~~~~~

Authentication
--------------
Install this library in a `virtualenv`_ using pip. `virtualenv`_ is a tool to
create isolated Python environments. The basic problem it addresses is one of
dependencies and versions, and indirectly permissions.

With ``google-cloud-python`` we try to make authentication as painless as
possible. Check out the `Authentication section`_ in our documentation to
learn more. You may also find the `authentication document`_ shared by all
the ``google-cloud-*`` libraries to be helpful.
With `virtualenv`_, it's possible to install this library without needing system
install permissions, and without clashing with the installed system
dependencies.

.. _Authentication section: https://google-cloud-python.readthedocs.io/en/latest/core/auth.html
.. _authentication document: https://github.com/GoogleCloudPlatform/google-cloud-common/tree/master/authentication
.. _`virtualenv`: https://virtualenv.pypa.io/en/latest/

Using the API
-------------

Querying massive datasets can be time consuming and expensive without the
right hardware and infrastructure. Google `BigQuery`_ (`BigQuery API docs`_)
solves this problem by enabling super-fast, SQL queries against
append-mostly tables, using the processing power of Google's infrastructure.
Mac/Linux
^^^^^^^^^

.. _BigQuery: https://cloud.google.com/bigquery/what-is-bigquery
.. _BigQuery API docs: https://cloud.google.com/bigquery/docs/reference/v2/
.. code-block:: console

   pip install virtualenv
   virtualenv <your-env>
   source <your-env>/bin/activate
   <your-env>/bin/pip install google-cloud-bigquery

Windows
^^^^^^^

.. code-block:: console

   pip install virtualenv
   virtualenv <your-env>
   <your-env>\Scripts\activate
   <your-env>\Scripts\pip.exe install google-cloud-bigquery

Example Usage
-------------

Create a dataset
~~~~~~~~~~~~~~~~
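
A minimal sketch of what creating a dataset with the client looks like (the
dataset ID here is assumed):

.. code-block:: python

   from google.cloud import bigquery

   client = bigquery.Client()
   dataset_ref = client.dataset('my_new_dataset')  # assumed dataset ID
   dataset = bigquery.Dataset(dataset_ref)
   dataset = client.create_dataset(dataset)  # API request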
@@ -106,14 +132,3 @@ Perform a query
   for row in rows:
       print(row.name)

See the ``google-cloud-python`` API `BigQuery documentation`_ to learn how
to connect to BigQuery using this Client Library.

.. _BigQuery documentation: https://googlecloudplatform.github.io/google-cloud-python/latest/bigquery/usage.html

.. |pypi| image:: https://img.shields.io/pypi/v/google-cloud-bigquery.svg
:target: https://pypi.org/project/google-cloud-bigquery/
.. |versions| image:: https://img.shields.io/pypi/pyversions/google-cloud-bigquery.svg
:target: https://pypi.org/project/google-cloud-bigquery/
File renamed without changes.
1 change: 1 addition & 0 deletions bigquery/docs/changelog.md
File renamed without changes.
82 changes: 22 additions & 60 deletions docs/bigquery/usage.rst → bigquery/docs/index.rst
@@ -1,57 +1,7 @@
BigQuery
========

.. toctree::
   :maxdepth: 2
   :hidden:

   reference
   dbapi

.. contents:: :local:

Installation
------------

Install the ``google-cloud-bigquery`` library using ``pip``:

.. code-block:: console

   $ pip install google-cloud-bigquery

.. note::

   This library changed significantly before the 1.0.0 release, especially
   between version 0.27 and 0.28. See `Migrating from the BigQuery Python
   client library version 0.27
   <https://cloud.google.com/bigquery/docs/python-client-migration>`__ for
   instructions on how to migrate your code to the most recent version of
   this library.

Authentication / Configuration
------------------------------

- Use :class:`Client <google.cloud.bigquery.client.Client>` objects to configure
your applications.

- :class:`Client <google.cloud.bigquery.client.Client>` objects hold both a ``project``
and an authenticated connection to the BigQuery service.

- The authentication credentials can be implicitly determined from the
environment or directly via
:meth:`from_service_account_json <google.cloud.bigquery.client.Client.from_service_account_json>`
and
:meth:`from_service_account_p12 <google.cloud.bigquery.client.Client.from_service_account_p12>`.

- After setting :envvar:`GOOGLE_APPLICATION_CREDENTIALS` and
:envvar:`GOOGLE_CLOUD_PROJECT` environment variables, create an instance of
:class:`Client <google.cloud.bigquery.client.Client>`.

.. code-block:: python

   >>> from google.cloud import bigquery
   >>> client = bigquery.Client()

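For example, constructing a client from an explicit service account key file
looks roughly like this (a minimal sketch; the key-file path is assumed):

.. code-block:: python

   >>> from google.cloud import bigquery
   >>> # Assumed path to a downloaded service account key file.
   >>> client = bigquery.Client.from_service_account_json(
   ...     '/path/to/service-account.json')
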
.. include:: /../bigquery/README.rst

Using the Library
=================

Projects
--------
@@ -68,8 +18,8 @@ To override the project inferred from the environment, pass an explicit

.. code-block:: python

   >>> from google.cloud import bigquery
   >>> client = bigquery.Client(project='PROJECT_ID')

   from google.cloud import bigquery
   client = bigquery.Client(project='PROJECT_ID')

Project ACLs
@@ -155,7 +105,7 @@ Tables exist within datasets. See BigQuery documentation for more information
on `Tables <https://cloud.google.com/bigquery/docs/tables>`_.

Table operations
~~~~~~~~~~~~~~~~~~
~~~~~~~~~~~~~~~~
List tables for the dataset:

.. literalinclude:: snippets.py
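
Listing tables amounts to iterating over ``Client.list_tables`` for a dataset
reference (a minimal sketch; the dataset ID is assumed):

.. code-block:: python

   from google.cloud import bigquery

   client = bigquery.Client()
   dataset_ref = client.dataset('my_dataset')  # assumed dataset ID
   for table in client.list_tables(dataset_ref):  # API request(s)
       print(table.table_id)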
@@ -237,7 +187,7 @@ Upload table data from a file:
:end-before: [END bigquery_load_from_file]
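
Loading from a local file goes through ``Client.load_table_from_file`` and a
load job (a minimal sketch; the file name, dataset ID, and table ID are
assumed):

.. code-block:: python

   from google.cloud import bigquery

   client = bigquery.Client()
   table_ref = client.dataset('my_dataset').table('my_table')  # assumed IDs
   job_config = bigquery.LoadJobConfig()
   job_config.source_format = bigquery.SourceFormat.CSV
   job_config.autodetect = True

   with open('data.csv', 'rb') as source_file:  # assumed local file
       job = client.load_table_from_file(
           source_file, table_ref, job_config=job_config)
   job.result()  # wait for the load to complete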

Load table data from Google Cloud Storage
*****************************************
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

See also: `Loading JSON data from Cloud Storage
<https://cloud.google.com/bigquery/docs/loading-data-cloud-storage-json>`_.
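
Loading from Cloud Storage uses ``Client.load_table_from_uri`` with a
``gs://`` URI (a minimal sketch; the bucket, object, dataset, and table names
are assumed):

.. code-block:: python

   from google.cloud import bigquery

   client = bigquery.Client()
   table_ref = client.dataset('my_dataset').table('my_table')  # assumed IDs
   job_config = bigquery.LoadJobConfig()
   job_config.source_format = bigquery.SourceFormat.NEWLINE_DELIMITED_JSON

   uri = 'gs://my-bucket/data.json'  # assumed Cloud Storage object
   load_job = client.load_table_from_uri(uri, table_ref, job_config=job_config)
   load_job.result()  # wait for the load to complete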
@@ -324,7 +274,7 @@ Queries


Querying data
~~~~~~~~~~~~~~~~~~~~~~~~~~~
~~~~~~~~~~~~~

Run a query and wait for it to finish:
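
A minimal sketch of the query flow (the project, dataset, and table in the SQL
are assumed):

.. code-block:: python

   from google.cloud import bigquery

   client = bigquery.Client()
   query_job = client.query(
       'SELECT name FROM `my_project.my_dataset.my_table` LIMIT 10')
   for row in query_job.result():  # waits for the job to finish
       print(row.name)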

@@ -371,8 +321,11 @@ See BigQuery documentation for more information on
:end-before: [END bigquery_query_params_named]
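
The named-parameter snippet referenced above boils down to attaching
``ScalarQueryParameter`` objects to a ``QueryJobConfig`` (a minimal sketch;
the query text and parameter value are assumed):

.. code-block:: python

   from google.cloud import bigquery

   client = bigquery.Client()
   job_config = bigquery.QueryJobConfig()
   job_config.query_parameters = [
       bigquery.ScalarQueryParameter('min_word_count', 'INT64', 250),
   ]
   query_job = client.query(
       'SELECT word FROM `bigquery-public-data.samples.shakespeare` '
       'WHERE word_count >= @min_word_count',
       job_config=job_config)
   for row in query_job.result():
       print(row.word)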


Jobs
----

List jobs for a project
-----------------------
~~~~~~~~~~~~~~~~~~~~~~~

Jobs describe actions performed on data in BigQuery tables:
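
A minimal sketch of listing recent jobs with the client:

.. code-block:: python

   from google.cloud import bigquery

   client = bigquery.Client()
   for job in client.list_jobs(max_results=10):  # API request(s)
       print(job.job_id, job.job_type, job.state)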

@@ -451,8 +404,17 @@ and load it into a new table:
:start-after: [START bigquery_load_table_dataframe]
:end-before: [END bigquery_load_table_dataframe]
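
Loading a ``pandas.DataFrame`` goes through ``Client.load_table_from_dataframe``
(a minimal sketch; it needs ``pandas`` plus a serialization backend such as
``pyarrow``, and the dataset and table IDs are assumed):

.. code-block:: python

   import pandas
   from google.cloud import bigquery

   client = bigquery.Client()
   table_ref = client.dataset('my_dataset').table('monty_python')  # assumed IDs
   dataframe = pandas.DataFrame(
       {'title': ['The Meaning of Life'], 'release_year': [1983]})

   job = client.load_table_from_dataframe(dataframe, table_ref)  # API request
   job.result()  # wait for the load to complete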

API Reference
=============

.. toctree::
   :maxdepth: 2

   reference
   dbapi

Changelog
---------
=========

For a list of all ``google-cloud-bigquery`` releases:

File renamed without changes.
File renamed without changes.
8 changes: 8 additions & 0 deletions bigquery/docs/usage.html
@@ -0,0 +1,8 @@
<html>
<head>
<meta http-equiv="refresh" content="1; url=./index.html" />
<script>
window.location.href = "./index.html"
</script>
</head>
</html>
7 changes: 2 additions & 5 deletions bigquery/nox.py
@@ -147,10 +147,7 @@ def snippets(session, py):

# Run py.test against the system tests.
session.run(
'py.test',
os.path.join(os.pardir, 'docs', 'bigquery', 'snippets.py'),
*session.posargs
)
'py.test', os.path.join('docs', 'snippets.py'), *session.posargs)


@nox.session
@@ -167,7 +164,7 @@ def lint(session):
session.run('flake8', os.path.join('google', 'cloud', 'bigquery'))
session.run('flake8', 'tests')
session.run(
'flake8', os.path.join(os.pardir, 'docs', 'bigquery', 'snippets.py'))
'flake8', os.path.join('docs', 'snippets.py'))


@nox.session
1 change: 1 addition & 0 deletions docs/bigquery
1 change: 0 additions & 1 deletion docs/bigquery/changelog.md

This file was deleted.

19 changes: 19 additions & 0 deletions docs/conf.py
@@ -26,6 +26,7 @@
import email
import os
import pkg_resources
import shutil

# If extensions (or modules to document with autodoc) are in another directory,
# add these directories to sys.path here. If the directory is relative to the
@@ -319,3 +320,21 @@
'pandas': ('http://pandas.pydata.org/pandas-docs/stable/', None),
'python': ('https://docs.python.org/3', None),
}

# Static HTML pages, e.g. to support redirects
# See: https://tech.signavio.com/2017/managing-sphinx-redirects
# HTML pages to be copied from source to target
static_html_pages = [
    'bigquery/usage.html',
]


def copy_static_html_pages(app, exception):
    # The 'build-finished' event passes (app, exception).
    if app.builder.name == 'html':
        for static_html_page in static_html_pages:
            target_path = app.outdir + '/' + static_html_page
            src_path = app.srcdir + '/' + static_html_page
            if os.path.isfile(src_path):
                shutil.copyfile(src_path, target_path)


def setup(app):
    app.connect('build-finished', copy_static_html_pages)
2 changes: 1 addition & 1 deletion docs/index.rst
@@ -5,7 +5,7 @@
Core Libraries <core/index>
Asset Management <asset/index>
AutoML <automl/index>
BigQuery <bigquery/usage>
BigQuery <bigquery/index>
BigQuery Data-Transfer <bigquery_datatransfer/index>
Bigtable <bigtable/usage>
Container <container/index>
