Google Datalab Library
datalab

The Google Cloud Datalab Python package. It is used in Google Cloud Datalab and can also be used in Jupyter Notebook.

This package adds a number of Python modules, such as google.datalab.bigquery and google.datalab.storage, for accessing Google Cloud Platform services, as well as new cell magics such as %chart, %bigquery, and %storage.
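
For example, here is a minimal sketch of querying BigQuery from Python with this package. The SQL string and column alias are placeholders, and the execute()/result()/to_dataframe() calls assume the google.datalab.bigquery API described under Documentation below; credentials and a default project are assumed to be configured already (see Using in Jupyter below).

import google.datalab.bigquery as bq

# Run a trivial query and pull the results into a pandas DataFrame.
# (Assumes authentication and a default project are already set up; see below.)
query = bq.Query('SELECT 17 AS answer')
results = query.execute().result()  # start the query job, then wait for its result table
df = results.to_dataframe()
print(df)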

See https://github.com/googledatalab/notebooks for samples of using this package.

Installation

This package is available on PyPI as datalab:

pip install datalab

Using in Jupyter

After installing datalab, enable its Jupyter frontend by running:

jupyter nbextension install --py datalab.notebook --sys-prefix

See Jupyter Kernel and Notebook Extensions for further details.

Then in a notebook cell, enable datalab's magics with:

%load_ext google.datalab.kernel

(Note: If you hit an error "module traceback cannot be imported", try setting the following environment variable: CLOUDSDK_PYTHON_SITEPACKAGES=1)

Alternatively, add this to the ipython_config.py file in your IPython profile:

c = get_config()
c.InteractiveShellApp.extensions = [
    'google.datalab.kernel'
]

You will typically put this under ~/.ipython/profile_default. See the IPython docs for more about IPython profiles.

If you want to access Google Cloud Platform services such as BigQuery, you will also need to install gcloud and use it to authenticate, e.g. with:

gcloud auth login

You will also need to set the project ID to use: either set a PROJECT_ID environment variable to your project ID, or call set_datalab_project_id(name) from within your notebook.
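
As a rough sketch of those two options (my-gcp-project is a placeholder project ID; setting the variable from Python before loading the extension is assumed to behave the same as exporting it before starting Jupyter):

# Option 1: set the PROJECT_ID environment variable
# (here from Python, before running %load_ext google.datalab.kernel).
import os
os.environ['PROJECT_ID'] = 'my-gcp-project'  # placeholder project ID

# Option 2: call the helper from a notebook cell after loading the extension.
# set_datalab_project_id('my-gcp-project')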

Documentation

You can read the Sphinx-generated docs at: http://googledatalab.github.io/pydatalab/

Development installation

If you'd like to work on the package, it's useful to be able to install it from source. You will need the TypeScript compiler installed.

First:

git clone https://github.com/googledatalab/pydatalab.git
cd pydatalab

Then run one of the following:

./install-virtualenv.sh  # For use in Python virtual environments
./install-no-virtualenv.sh  # For installing in a non-virtual environment

You can ignore the message about running jupyter nbextension enable; it is not required.