Commits on Apr 9, 2019
  1. Add bigquerystorage_jupyter_tutorial_query_default region tag.

    tswast committed Apr 9, 2019
  2. Add magics tutorial with BigQuery Storage API integration. (#2087)

    tswast committed Apr 9, 2019
    * Add magics tutorial with BigQuery Storage API integration.
    
    This is a notebooks tutorial, modeled after the Jupyter notebook example
    code for BigQuery. Use some caution when running these tests, as they
    run some large-ish (5 GB processed) queries and download about 500 MB
    worth of data. This is intentional, as the BigQuery Storage API is most
    useful for downloading large results.
    
    * Update deps.
    
    * Don't run big queries on Travis.
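A hedged sketch of the notebook cells such a magics tutorial uses. The `%%bigquery` cell magic is real; the `--use_bqstorage_api` argument is assumed to be available in the installed google-cloud-bigquery version, and the query is an illustrative placeholder:

```python
# Cell 1: load the BigQuery IPython magics.
%load_ext google.cloud.bigquery

# Cell 2: run a query and save the results to a pandas DataFrame named
# `df`, downloading via the BigQuery Storage API (assumed flag).
%%bigquery df --use_bqstorage_api
SELECT name, SUM(number) AS total
FROM `bigquery-public-data.usa_names.usa_1910_current`
GROUP BY name
ORDER BY total DESC
LIMIT 10
```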
Commits on Apr 4, 2019
  1. Remove temporary dataset from bqstorage pandas tutorial (#2088)

    tswast committed Apr 4, 2019
    * Remove temporary dataset from bqstorage pandas tutorial
    
    As of google-cloud-bigquery version 1.11.1, the `to_dataframe` method
    falls back to the tabledata.list API when the BigQuery Storage API
    fails to read the query results.
    
    * Remove unused imports
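A minimal sketch of the fallback behavior described above, using the real `to_dataframe(bqstorage_client=...)` parameter; the client construction matches the v1beta1 surface of that era, and the query is a placeholder:

```python
from google.cloud import bigquery
from google.cloud import bigquery_storage_v1beta1

bqclient = bigquery.Client()
bqstorage_client = bigquery_storage_v1beta1.BigQueryStorageClient()

query_job = bqclient.query(
    "SELECT name FROM `bigquery-public-data.usa_names.usa_1910_current`")

# As of google-cloud-bigquery 1.11.1, if the BigQuery Storage API cannot
# read these query results, to_dataframe falls back to tabledata.list
# instead of raising, so no temporary destination dataset is needed.
df = query_job.to_dataframe(bqstorage_client=bqstorage_client)
```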
Commits on Feb 7, 2019
  1. BigQuery Storage API sample for reading pandas dataframe (#1994)

    tswast committed Feb 7, 2019
    * BigQuery Storage API sample for reading pandas dataframe
    
    How to get a pandas DataFrame, fast!
    
    The first two examples use the existing BigQuery client. These examples
    create a thread pool and read in parallel. The final example shows using
    just the new BigQuery Storage client, but only shows how to read with a
    single thread.
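A sketch of the "existing BigQuery client" approach from the first examples, under the same era assumptions (v1beta1 storage client; the table name is a public-dataset placeholder):

```python
from google.cloud import bigquery
from google.cloud import bigquery_storage_v1beta1

bqclient = bigquery.Client()
bqstorage_client = bigquery_storage_v1beta1.BigQueryStorageClient()

table_ref = bigquery.TableReference.from_string(
    "bigquery-public-data.utility_us.country_code_iso")
table = bqclient.get_table(table_ref)

# With a bqstorage_client, to_dataframe downloads the table over the
# BigQuery Storage API, reading multiple streams in a thread pool rather
# than paging through tabledata.list.
df = bqclient.list_rows(table).to_dataframe(bqstorage_client=bqstorage_client)
```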
Commits on Jan 9, 2019
  1. Add missing explanation about local_server callback URL

    tswast committed Jan 9, 2019
    Per feedback on CL 227863277
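For context, a sketch of the flow this explanation covers, using google-auth-oauthlib; the secrets file name and scope are placeholders. `run_local_server` starts a temporary web server on localhost, and that server's URL is the OAuth callback (redirect) URL:

```python
from google_auth_oauthlib import flow

appflow = flow.InstalledAppFlow.from_client_secrets_file(
    "client_secrets.json",  # placeholder: downloaded OAuth client secrets
    scopes=["https://www.googleapis.com/auth/bigquery"])

# Opens a browser for consent; the local server receives the callback at
# a http://localhost:<port>/ redirect URL and captures the credentials.
credentials = appflow.run_local_server()
```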
Commits on Jan 8, 2019
  1. Make TODO clearer what action to take. (#1963)

    tswast committed Jan 8, 2019
    * Make TODO clearer what action to take.
    
    Describe the variables to set explicitly (rather than "this"), as
    suggested in internal CL 227863277.
    
    * Use standard 'uncomment the variable below' for TODO
    
    * Move variable below TODO.
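The resulting TODO style, reconstructed from the commit message (the variable is illustrative):

```python
# TODO: Uncomment the variable below and set it to your project ID.
# project = 'your-project-id'
```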
Commits on Jan 2, 2019
  1. Refactor BQ user credentials sample to use region tags (#1952)

    tswast committed Jan 2, 2019
    Also add comments for the needed values and use a concrete value where
    possible according to the sample rubric.
Commits on Sep 23, 2018
  1. Test pythonvirtualenvoperator sample in Python 3. (#1714)

    tswast committed Sep 23, 2018
Commits on Sep 18, 2018
  1. Add Composer samples that explicitly call python2 (#1711)

    tswast committed Sep 18, 2018
    * Add Composer samples that explicitly call python2
    
    * Use new unit_testing module for Python 2 DAG samples.
  2. Test Composer DAG samples to verify they contain a valid DAG

    tswast committed Sep 18, 2018
    Inspiration from Circle 1 of
    https://medium.com/wbaa/datas-inferno-7-circles-of-data-testing-hell-with-airflow-cef4adff58d8
    
    Beyond just asserting there are no syntax errors, we can verify that the
    DAG files actually contain a DAG and that Airflow detects no cycles in
    that DAG.
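A minimal sketch of this style of test, assuming Airflow 1.x and a local dags/ folder; DagBag collects import errors (including cycle errors found while bagging DAGs) rather than raising them:

```python
from airflow import models


def test_dag_files_contain_valid_dags():
    # Parsing the folder into a DagBag surfaces syntax errors, import
    # errors, and DAG cycles as entries in import_errors.
    dag_bag = models.DagBag(dag_folder="dags/", include_examples=False)
    assert not dag_bag.import_errors
    # Beyond "no syntax errors": the files must actually define a DAG.
    assert len(dag_bag.dags) > 0
```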
Commits on Sep 13, 2018
  1. Add Composer Environment Migration script (#1705)

    tswast committed Sep 13, 2018
    Add a script for creating a clone of an existing Cloud Composer
    Environment using the latest Composer image version.
Commits on Sep 11, 2018
  1. Remove appengine BigQuery sample (#1700)

    tswast committed Sep 11, 2018
    google-api-python-client no longer uses oauth2client by default. Since
    this sample is only linked to and not included in any tutorials,
    removing it for now rather than updating it to use google-auth.
  2. Test Composer samples on Python 3 (#1688)

    tswast committed Sep 11, 2018
    * Test Composer samples on Python 3
    
    Note: I had to skip the use_local_deps workflow sample because of a
    change in relative versus absolute imports between Python 2 and Python
    3. I have not yet checked the workflow within Airflow to see if it runs
    under Python 3.
    
    * Composer: Use slugify for unicode. Run local deps test in Python 3.
Commits on Aug 31, 2018
  1. Remove ipython Python 2 modifier from requirements.txt (#1675)

    tswast committed Aug 31, 2018
    If ipython has tagged its packages correctly, the modifier is not necessary. Sending a PR to check. Bug 113341391.
Commits on Aug 22, 2018
  1. Composer: Move trigger response DAG to GitHub. (#1645)

    tswast committed Aug 22, 2018
    * Composer: Move trigger response DAG to GitHub.
    
    Sample originally was posted at https://cloud.google.com/composer/docs/how-to/using/triggering-with-gcf#wzxhzdk15gcs_response_dagpywzxhzdk16
    
    * Composer: Add snippet to get the client ID for a composer env.
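A sketch of the technique such a snippet typically relies on, assuming an IAP-protected Airflow web server URL: an unauthenticated request is redirected to the OAuth sign-in page, whose URL carries the environment's client ID.

```python
import requests
from six.moves.urllib import parse

# Placeholder: the Airflow web server URL of the Composer environment.
airflow_uri = "https://example-tp.appspot.com"

# IAP redirects unauthenticated requests to an OAuth sign-in URL that
# includes the client_id as a query parameter.
redirect = requests.get(airflow_uri, allow_redirects=False)
redirect_location = redirect.headers["location"]
query = parse.parse_qs(parse.urlparse(redirect_location).query)
client_id = query["client_id"][0]
```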
Commits on Aug 9, 2018
  1. Add guide on using multiple Python versions for local development (#1621)

    tswast committed Aug 9, 2018

    * Add guide on using multiple Python versions for local development
    * Add note about why these instructions differ from the recommended GCP setup.
Commits on Aug 1, 2018
  1. Update vision web_detect test image (#1607)

    tswast committed Aug 1, 2018
    The original image no longer appears on cloud.google.com/vision
Commits on Jul 25, 2018
  1. Use example node pool names for affinity argument (#1601)

    tswast committed Jul 25, 2018
    I believe this will make it clearer what the actual value should be for the affinity label.
Commits on Jul 24, 2018
  1. Add samples for using GCP connections in an Airflow DAG (#1590)

    tswast committed Jul 24, 2018
Commits on Jul 19, 2018
  1. Use absolute / implicit relative imports for local deps

    tswast committed Jul 19, 2018
    Since Composer is Python 2.7 only for now, this sample can use implicit
    relative imports. Airflow doesn't seem to support explicit relative
    imports when I try to run the use_local_deps.py file in Composer.
    
    Aside: Airflow is using the imp.load_source method to load the DAG
    modules. This will be problematic for Python 3 support, see:
    https://issues.apache.org/jira/browse/AIRFLOW-2243.
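To illustrate the distinction (the local module name is hypothetical):

```python
# Python 2: an absolute / implicit relative import finds local_module.py
# sitting next to this DAG file (Airflow puts the DAG folder on sys.path).
import local_module

# Python 3 spelling: an explicit relative import. This breaks when the
# DAG file is loaded as a top-level module via imp.load_source:
# from . import local_module
```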
Commits on Jun 15, 2018
  1. Composer: Add schedule to simple workflow. (#1520)

    tswast committed Jun 15, 2018
    Without a schedule this DAG gets scheduled way too often when I upload it to a Cloud Composer environment.
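A sketch of the fix this describes; the DAG ID and dates are placeholders:

```python
import datetime

from airflow import models

with models.DAG(
        "composer_sample_simple",  # placeholder DAG ID
        start_date=datetime.datetime(2018, 6, 1),
        # An explicit schedule keeps Airflow from backfilling runs far
        # more often than intended when the DAG is first uploaded.
        schedule_interval=datetime.timedelta(days=1)) as dag:
    pass  # operators go here
```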
Commits on May 7, 2018
  1. composer: add region tags to simple dag example (#1473)

    tswast committed May 7, 2018
Commits on May 2, 2018
  1. Composer: add simple workflow example. Use 1.10beta instead of git. (#1469)

    tswast committed May 2, 2018
    
    The `composer_simple` sample is intended to demonstrate the concepts of
    constructing Apache Airflow DAGs rather than any specific set of
    operators.
    
    By using the 1.10beta instead of git for the dependency, we have a
    stabler package than building on master. Also, this will be easier for
    Travis to cache.
Commits on May 1, 2018
  1. Composer: initial workflow and REST samples (#1468)

    tswast committed May 1, 2018
    * Add quickstart example DAG.
    
    Add first airflow example DAG. This DAG demonstrates how to call
    Dataproc from airflow.
    
    Change-Id: I2f0464da37f11085dfc927f4f4d96d6a99634aad
    
    * Add region tags for cloud.google.com.
    
    Also, adjusts indentation to 4 spaces so that `flake8` passes.
    
    Change-Id: Id3e1fbfa5bd41a4e11e0cd7991b058b063241e6b
    
    * Add composer_quickstart_steps region tag.
    
    Change-Id: Ib0bbacd91529fac272e2da0c72abca5db9b4fad8
    
    * Test Composer workflow sample.
    
    The GCP Airflow operators only work on Python 2.7 (a constraint
    they inherited from Apache Beam), so I've made some changes to nox.py to
    exclude them from the Python 3 builds.
    
    Change-Id: Ia10971dd5a7b14279b236041836e317e79693258
    
    * Add DAG that runs BigQuery and notifies user.
    
    Add a sample Airflow DAG that will create BigQuery dataset, run
    BigQuery, export results to Cloud Storage, notify user by email when
    results are ready, and clean up the dataset.
    
    Change-Id: Ia8242df29223d910b2d269a9bb93720b35470b7a
    
    * Composer: add sample to get GCS bucket for DAGs
    
    Since there are no Cloud client libraries for Composer yet, this sample
    uses the REST API directly via the google-auth and requests libraries
    (see the sketch after this commit message).
    
    Sample to be used at https://cloud.google.com/composer/docs/how-to/using/managing-dags
    
    Also, enables Kokoro testing for Composer samples. (Uses Python 2.7
    since Cloud Composer is currently restricted to Python 2.7)
    
    Change-Id: Icb37e079992c88eedc06cdcc3d72db5106d10ef5
    
    * Add tests for BQ notify DAG.
    
    Requires the master copy of Airflow for the `bigquery_get_data` operator.
    
    Change-Id: I73cd2cfb2458b67bed1a77e65966d5018e8bb45d
    
    * Composer: Fix flake8 errors.
    
    Change-Id: I2856bc6cb866bd6f7abbac8de3323797a83c9857
    
    * Composer: add region tags to notification DAG sample.
    
    Change-Id: I657e052fa851daa7c72045762090a2e27dd406d3
    
    * Set machine type in quickstart to n1-standard-1.
    
    The default machine type was n1-standard-4, which exceeds trial
    project quota. This CL changes the machine type to n1-standard-1
    since a more powerful machine is not necessary for quickstart
    sample data.
    
    Change-Id: I46af68c29145f7a7ce303afdad4708bda7d9e6dd
    
    * Add composer config to travis test env.
    
    Change-Id: I9c27c75cbea8c5ed4edf859d26980e24ea270eea
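A sketch of the REST approach used by the "get GCS bucket for DAGs" sample mentioned above, via google-auth and requests; the project, location, and environment names are placeholders, and the field of interest in the response is config.dagGcsPrefix:

```python
import google.auth
from google.auth.transport.requests import AuthorizedSession

credentials, _ = google.auth.default(
    scopes=["https://www.googleapis.com/auth/cloud-platform"])
session = AuthorizedSession(credentials)

environment_url = (
    "https://composer.googleapis.com/v1beta1/projects/{}/locations/{}"
    "/environments/{}").format("your-project", "us-central1", "your-env")

response = session.get(environment_url)
response.raise_for_status()
# e.g. "gs://<bucket>/dags" -- where the environment reads DAG files.
dag_gcs_prefix = response.json()["config"]["dagGcsPrefix"]
```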
Commits on Mar 21, 2018
  1. bigquery_datatransfer -> bigquerydatatransfer

    tswast committed Mar 21, 2018
    Update region tags because we are treating BigQuery Data Transfer Service as its own product in the samples tracker.
Commits on Dec 27, 2017
  1. BigQuery: Add Data Transfer Service quickstart. (#1295)

    tswast committed Dec 27, 2017
    * BigQuery: Add Data Transfer Service quickstart.
    
    Client library docs:
    https://googlecloudplatform.github.io/google-cloud-python/latest/bigquery_datatransfer/index.html
    
    * Enable BigQuery Data Transfer API in test project.
    * Remove project from quickstart test assertion. Don't depend on specific data sources being available.
    
    I believe the tests were failing because the data sources weren't
    available to the test project: the API was enabled, but the project
    wasn't enrolled as described in
    https://cloud.google.com/bigquery/docs/enable-transfer-service
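A sketch of what the quickstart could look like with the client library of that era; the client surface has changed across releases, so treat the call shapes as assumptions:

```python
from google.cloud import bigquery_datatransfer

client = bigquery_datatransfer.DataTransferServiceClient()
project = "your-project-id"  # placeholder

# List the data sources available for transfers in this project; this is
# the call that fails if the project is not enrolled in the service.
parent = "projects/{}".format(project)
for data_source in client.list_data_sources(parent=parent):
    print(data_source.data_source_id)
```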
Commits on Dec 12, 2017
  1. Merge pull request #1270 from GoogleCloudPlatform/tswast-simple-app

    tswast committed Dec 12, 2017
    BigQuery: rewrite simple app tutorial.
Commits on Dec 7, 2017
  1. Fix broken link to core Client service account helper. (#1256)

    tswast authored and theacodes committed Dec 7, 2017
Commits on Dec 4, 2017
  1. BigQuery: rewrite simple app tutorial.

    tswast committed Dec 2, 2017
    - Add region tags for needed dependencies.
    - Use more relevant query from public datasets.
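The shape of the rewritten simple app, sketched against the client.query API of the time; the Stack Overflow public-dataset query is an illustrative stand-in:

```python
from google.cloud import bigquery

client = bigquery.Client()
query_job = client.query("""
    SELECT CONCAT('https://stackoverflow.com/questions/',
                  CAST(id AS STRING)) AS url, view_count
    FROM `bigquery-public-data.stackoverflow.posts_questions`
    WHERE tags LIKE '%google-bigquery%'
    ORDER BY view_count DESC
    LIMIT 10""")

for row in query_job.result():  # waits for the query to finish
    print("{} : {} views".format(row.url, row.view_count))
```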
Commits on Dec 1, 2017
  1. /s/buckets/datasets in BigQuery auth sample (#1242)

    tswast authored and theacodes committed Dec 1, 2017
Commits on Aug 22, 2017
  1. Merge pull request #1086 from GoogleCloudPlatform/tswast-patch-1

    tswast committed Aug 22, 2017
    BigQuery: max_results changes page size, not full list size
  2. Remove unnecessary list() call

    tswast committed Aug 22, 2017
  3. Remove max_results argument from table.fetch_data() call.

    tswast committed Aug 22, 2017
    The sample uses islice instead, which demonstrates that fetch_data returns an iterable.
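To illustrate with the pre-0.28 google-cloud-bigquery surface being discussed; the `table` construction noted in the comment is the historical API, so treat it as an assumption:

```python
from itertools import islice

# Historical (pre-0.28) API: table = client.dataset('ds').table('tbl').
# max_results changed the page size of the underlying tabledata.list
# requests, not the total row count; slice the iterator to cap the rows
# actually fetched:
first_five = list(islice(table.fetch_data(), 5))
```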
Commits on Aug 21, 2017
  1. BigQuery: max_results changes page size, not full list size

    tswast committed Aug 21, 2017
    Also, running a query is not "preferred" if you want all the data: if you do run a query, you end up reading from the destination table anyway.