| Original file line number | Diff line number | Diff line change |
|---|---|---|
@@ -4,8 +4,3 @@ jobs:
    parameters:
      name: Linux
      vmImage: ubuntu-16.04
@@ -1,23 +1,16 @@
ARG PYTHON_VERSION
FROM ibis:$PYTHON_VERSION

COPY . /ibis
WORKDIR /ibis

# fonts are for docs
RUN apt-get -qq update --yes \
    && apt-get -qq install --yes ttf-dejavu iputils-ping \
    && rm -rf /var/lib/apt/lists/* \
    && /opt/conda/bin/conda config --add channels conda-forge \
    && /opt/conda/bin/conda update --all --yes \
    && /opt/conda/bin/conda clean --all --yes \
    && pip install -e . --no-deps --ignore-installed --no-cache-dir

SHELL ["conda", "run", "-n", "ibis-env", "/bin/bash", "-c"]
@@ -0,0 +1,2 @@
google-cloud-bigquery-core >=1.12.0,<1.24.0dev
pydata-google-auth
@@ -0,0 +1,5 @@
sqlalchemy>=1.3
clickhouse-cityhash
clickhouse-driver>=0.1.3
clickhouse-sqlalchemy
lz4
@@ -0,0 +1,6 @@
sqlalchemy>=1.3
impyla>=0.15.0
requests>=2.24
thrift>=0.9.3
thriftpy2>=0.4
thrift_sasl>=0.2.1
@@ -0,0 +1,2 @@
sqlalchemy>=1.3
pymysql
@@ -0,0 +1,2 @@
pymapd==0.24
pyarrow
@@ -0,0 +1 @@
pyarrow>=0.13
@@ -0,0 +1,3 @@
sqlalchemy>=1.3
psycopg2>=2.8
geoalchemy2>=0.6
@@ -0,0 +1,4 @@
# double-conversion must be installed, otherwise `import pyarrow` fails with ImportError: libdouble-conversion.so.3
double-conversion
pyarrow=0.12.1
pyspark=2.4.3
@@ -0,0 +1 @@
pyspark>=2.4.3
@@ -0,0 +1,4 @@
# double-conversion must be installed, otherwise `import pyarrow` fails with ImportError: libdouble-conversion.so.3
double-conversion
pyarrow=0.12.1
pyspark=2.4.3
@@ -0,0 +1 @@
pyspark>=2.4.3
@@ -1,6 +1,6 @@
#!/bin/bash -e

export PYTHON_VERSION="3.7"

docker-compose build ibis
docker-compose build ibis-docs
@@ -0,0 +1,94 @@
# This is a copy of https://github.com/conda-forge/ibis-framework-feedstock/blob/master/recipe/meta.yaml
# Changes required to the recipe will be made and tested with this file during
# development of Ibis, and this file needs to replace the original one on releases.
#
# Changes to the original file that need to be restored when copying to release:
# - Set the version in the first line: {% set version = "1.3.0" %}
# - Add a `sha256` key to the `source` section, with the hash of the tar.gz
# - Set the `number` in the `build` section to the appropriate build number
# - Remove this comment from the beginning of the file

package:
  name: ibis-framework
  version: {{ version }}

source:
  url: https://github.com/ibis-project/ibis/archive/{{ version }}.tar.gz

build:
  number: 1
  script: {{ PYTHON }} -m pip install . --no-deps --ignore-installed --no-cache-dir -vvv
  # uncomment noarch when pymapd and pyspark issues are fixed for py38
  # noarch: python

requirements:
  host:
    - pip
    - python
    - setuptools

  run:
    - clickhouse-driver >=0.1.3
    - clickhouse-cityhash  # [not win]
    - clickhouse-sqlalchemy
    - geoalchemy2
    - geopandas
    - google-cloud-bigquery-core >=1.12.0,<1.24.0dev
    - graphviz
    - impyla >=0.15.0
    - lz4
    - multipledispatch >=0.6
    - numpy >=1.15
    - pandas >=0.25.3
    - psycopg2
    - pyarrow >=0.15
    - pydata-google-auth
    - pymapd 0.24  # [py<38]
    - pymysql
    - pyspark >=2.4.3  # [py<38]
    - pytables >=3.0.0
    - python
    - python-graphviz
    - python-hdfs >=2.0.16
    - pytz
    - regex
    - requests
    - shapely
    - setuptools
    - sqlalchemy >=1.1
    - thrift >=0.11
    - thriftpy2
    - toolz

test:
  imports:
    - ibis
    - ibis.backends.bigquery
    - ibis.backends.clickhouse
    - ibis.backends.csv
    - ibis.backends.parquet
    - ibis.backends.hdf5
    - ibis.backends.impala
    - ibis.backends.mysql
    - ibis.backends.omniscidb  # [py<38]
    - ibis.backends.pandas
    - ibis.backends.postgres
    - ibis.backends.pyspark  # [py<38]
    - ibis.backends.spark
    - ibis.backends.sqlite

about:
  license: Apache-2.0
  license_family: Apache
  license_file: LICENSE.txt
  home: http://www.ibis-project.org
  summary: Productivity-centric Python Big Data Framework

extra:
  recipe-maintainers:
    - cpcloud
    - mariusvniekerk
    - wesm
    - kszucs
    - xmnlab
    - jreback
@@ -0,0 +1,22 @@
#!/bin/bash -e
# Run the Ibis tests. Two environment variables are considered:
# - PYTEST_BACKENDS: Space-separated list of backends to run
# - PYTEST_EXPRESSION: Marker expression, for example "not udf"

TESTS_DIRS="ibis/tests"
for BACKEND in $PYTEST_BACKENDS; do
    if [[ -d ibis/$BACKEND/tests ]]; then
        TESTS_DIRS="$TESTS_DIRS ibis/$BACKEND/tests"
    fi
done

echo "TESTS_DIRS: $TESTS_DIRS"
echo "PYTEST_EXPRESSION: $PYTEST_EXPRESSION"

pytest $TESTS_DIRS \
    -m "${PYTEST_EXPRESSION}" \
    -ra \
    --junitxml=junit.xml \
    --cov=ibis \
    --cov-report=xml:coverage.xml
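For reference, a standalone sketch of how the loop above assembles the pytest directory list from `PYTEST_BACKENDS`; the backend names and directory layout here are made up for illustration only:

```shell
# Build a throwaway tree mimicking the assumed ibis/<backend>/tests layout.
tmp=$(mktemp -d)
mkdir -p "$tmp/ibis/tests" "$tmp/ibis/sqlite/tests"   # `pandas` gets no tests dir
cd "$tmp"

PYTEST_BACKENDS="sqlite pandas"
TESTS_DIRS="ibis/tests"
for BACKEND in $PYTEST_BACKENDS; do
    # Only backends that ship a tests directory are added to the list.
    if [ -d "ibis/$BACKEND/tests" ]; then
        TESTS_DIRS="$TESTS_DIRS ibis/$BACKEND/tests"
    fi
done
echo "$TESTS_DIRS"   # ibis/tests ibis/sqlite/tests
```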
@@ -0,0 +1,41 @@
CREATE OR REPLACE TABLE `testing.functional_alltypes_parted`
(
    index INT64,
    Unnamed_0 INT64,
    id INT64,
    bool_col BOOL,
    tinyint_col INT64,
    smallint_col INT64,
    int_col INT64,
    bigint_col INT64,
    float_col FLOAT64,
    double_col FLOAT64,
    date_string_col STRING,
    string_col STRING,
    timestamp_col TIMESTAMP,
    year INT64,
    month INT64
)
PARTITION BY DATE(_PARTITIONTIME)
OPTIONS (
    require_partition_filter=false
);

CREATE OR REPLACE TABLE `testing.functional_alltypes`
(
    index INT64,
    Unnamed_0 INT64,
    id INT64,
    bool_col BOOL,
    tinyint_col INT64,
    smallint_col INT64,
    int_col INT64,
    bigint_col INT64,
    float_col FLOAT64,
    double_col FLOAT64,
    date_string_col STRING,
    string_col STRING,
    timestamp_col TIMESTAMP,
    year INT64,
    month INT64
);
@@ -0,0 +1,64 @@
#!/bin/bash -e
# Set up the conda environment for Ibis in GitHub Actions.
# The base environment of the provided conda is used.
# This script only installs the base dependencies;
# dependencies for the backends need to be installed separately.

PYTHON_VERSION="${1:-3.7}"
BACKENDS="$2"

echo "PYTHON_VERSION: $PYTHON_VERSION"
echo "BACKENDS: $BACKENDS"

if [[ -n "$CONDA" ]]; then
    # Add conda to PATH
    OS_NAME=$(uname)
    case $OS_NAME in
        Linux)
            CONDA_PATH="$CONDA/bin"
            ;;
        MINGW*)
            # Windows
            CONDA_POSIX=$(cygpath -u "$CONDA")
            CONDA_PATH="$CONDA_POSIX:$CONDA_POSIX/Scripts:$CONDA_POSIX/Library:$CONDA_POSIX/Library/bin:$CONDA_POSIX/Library/mingw-w64/bin"
            ;;
        *)
            echo "$OS_NAME not supported."
            exit 1
            ;;
    esac
    PATH=${CONDA_PATH}:${PATH}
    # Prepend the conda path to the system PATH for subsequent GitHub Actions steps
    echo "${CONDA_PATH}" >> "$GITHUB_PATH"
else
    echo "Running without adding conda to PATH."
fi

conda update -n base -c anaconda --all --yes conda
conda install -n base -c anaconda --yes python=${PYTHON_VERSION}
conda env update -n base --file=environment.yml
python -m pip install -e .

if [[ -n "$BACKENDS" ]]; then
    python ci/datamgr.py download
    for BACKEND in $BACKENDS; do
        # For the oldest supported Python version (currently 3.7) we first try
        # to install the minimum supported dependencies, `ci/deps/$BACKEND-min.yml`.
        # If that file does not exist, we install the normal dependencies
        # (if there are any). For other Python versions we simply install
        # the normal dependencies, if they exist.
        if [[ $PYTHON_VERSION == "3.7" && -f "ci/deps/$BACKEND-min.yml" ]]; then
            conda install -n base -c conda-forge --file="ci/deps/$BACKEND-min.yml"
        else
            if [[ -f "ci/deps/$BACKEND.yml" ]]; then
                conda install -n base -c conda-forge --file="ci/deps/$BACKEND.yml"
            fi
        fi

        # TODO: load impala data in the same way as the rest of the backends
        if [[ "$BACKEND" == "impala" ]]; then
            python ci/impalamgr.py load --data
        else
            python ci/datamgr.py $BACKEND
        fi
    done
fi
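The dependency-file selection in the loop above can be exercised in isolation; in this sketch the `postgres` backend name and the `ci/deps` layout are created on the spot, purely for illustration:

```shell
# Throwaway layout mimicking ci/deps/<backend>[-min].yml.
tmp=$(mktemp -d)
mkdir -p "$tmp/ci/deps"
cd "$tmp"
touch ci/deps/postgres-min.yml ci/deps/postgres.yml

PYTHON_VERSION="3.7"
BACKEND="postgres"
DEPS_FILE=""
if [ "$PYTHON_VERSION" = "3.7" ] && [ -f "ci/deps/$BACKEND-min.yml" ]; then
    DEPS_FILE="ci/deps/$BACKEND-min.yml"    # oldest Python: minimum pins
elif [ -f "ci/deps/$BACKEND.yml" ]; then
    DEPS_FILE="ci/deps/$BACKEND.yml"        # otherwise: normal dependencies
fi
echo "$DEPS_FILE"   # ci/deps/postgres-min.yml
```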
@@ -0,0 +1,176 @@
.. currentmodule:: ibis.bigquery.api

.. _backends.bigquery:

BigQuery
========

To use the BigQuery client, you will need a Google Cloud Platform account.
Use the `BigQuery sandbox <https://cloud.google.com/bigquery/docs/sandbox>`__
to try the service for free.

.. _install.bigquery:

`BigQuery <https://cloud.google.com/bigquery/>`_ Quickstart
-----------------------------------------------------------

Install dependencies for Ibis's BigQuery dialect:

::

    pip install ibis-framework[bigquery]

Create a client by passing in the project id and dataset id you wish to operate
with:

.. code-block:: python

    >>> con = ibis.bigquery.connect(project_id='ibis-gbq', dataset_id='testing')

By default, Ibis assumes that the BigQuery project that is billed for queries
is also the project where the data lives.

However, it is easy to query data that does **not** live in the billing
project.

.. note::

    When you run queries against data from other projects, **the billing
    project will still be billed for any and all queries**.

If you want to query data that lives in a different project than the billing
project, you can use the :meth:`ibis.bigquery.client.BigQueryClient.database`
method of :class:`ibis.bigquery.client.BigQueryClient` objects:

.. code-block:: python

    >>> db = con.database('other-data-project.other-dataset')
    >>> t = db.my_awesome_table
    >>> t.sweet_column.sum().execute()  # runs against the billing project
.. _api.bigquery:

API
---

.. currentmodule:: ibis.backends.bigquery

The BigQuery client is accessible through the ``ibis.bigquery`` namespace.
See :ref:`backends.bigquery` for a tutorial on using this backend.

Use the ``ibis.bigquery.connect`` function to create a BigQuery client. If no
``credentials`` are provided, the :func:`pydata_google_auth.default` function
fetches default credentials.

.. autosummary::
    :toctree: ../generated/

    connect
    BigQueryClient.database
    BigQueryClient.list_databases
    BigQueryClient.list_tables
    BigQueryClient.table

The BigQuery client object
--------------------------

To use Ibis with BigQuery, you must first connect to BigQuery using the
:func:`ibis.bigquery.connect` function, optionally supplying Google API
credentials:

.. code-block:: python

    import ibis

    client = ibis.bigquery.connect(
        project_id=YOUR_PROJECT_ID,
        dataset_id='bigquery-public-data.stackoverflow'
    )
.. _udf.bigquery:

User-defined functions (UDF)
----------------------------

.. note::

    BigQuery only supports element-wise UDFs at this time.

BigQuery supports UDFs through JavaScript. Ibis provides support for this by
turning Python code into JavaScript.

The interface is very similar to the pandas UDF API:

.. code-block:: python

    import ibis.expr.datatypes as dt
    from ibis.bigquery import udf

    @udf([dt.double], dt.double)
    def my_bigquery_add_one(x):
        return x + 1.0

Ibis will parse the source of the function and turn the resulting Python AST
into JavaScript source code (technically, ECMAScript 2015). Most of the Python
language is supported, including classes, functions, and generators.

When you want to use this function, you call it like any other Python
function, except that it must be called on an ibis expression:

.. code-block:: python

    t = ibis.table([('a', 'double')])
    expr = my_bigquery_add_one(t.a)
    print(ibis.bigquery.compile(expr))
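The first step of that source-to-source translation, parsing a function's source into a Python AST, can be illustrated with the standard ``ast`` module. This is only a hypothetical sketch of the general technique, not Ibis's actual translator:

```python
import ast

# A stand-in UDF body, supplied as a source string so the sketch is
# self-contained (a real translator would recover it via inspect.getsource).
source = """
def my_add_one(x):
    return x + 1.0
"""

# Parse the source into an AST; a translator walks this tree and emits
# equivalent JavaScript for each node.
tree = ast.parse(source)
func = tree.body[0]   # the FunctionDef node for my_add_one
ret = func.body[0]    # the Return node

print(type(func).__name__)       # FunctionDef
print(type(ret.value).__name__)  # BinOp, i.e. the `x + 1.0` expression
```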
.. _bigquery-privacy:

Privacy
-------

This package is subject to the `NumFocus privacy policy
<https://numfocus.org/privacy-policy>`_. Your use of Google APIs with this
module is subject to each API's respective `terms of service
<https://developers.google.com/terms/>`_.

Google account and user data
^^^^^^^^^^^^^^^^^^^^^^^^^^^^

Accessing user data
~~~~~~~~~~~~~~~~~~~

The :func:`~ibis.bigquery.api.connect` function provides access to data
stored in Google BigQuery and other sources such as Google Sheets or Cloud
Storage, via the federated query feature. Your machine communicates directly
with the Google APIs.

Storing user data
~~~~~~~~~~~~~~~~~

By default, your credentials are stored in a local file, such as
``~/.config/pydata/ibis.json``. All user data is stored on your local machine.
**Use caution when using this library on a shared machine**.

Sharing user data
~~~~~~~~~~~~~~~~~

The BigQuery client only communicates with Google APIs. No user data is
shared with PyData, NumFocus, or any other servers.

Policies for application authors
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

Do not use the default client ID when using Ibis from an application,
library, or tool. Per the `Google User Data Policy
<https://developers.google.com/terms/api-services-user-data-policy>`_, your
application must accurately represent itself when authenticating to Google
API services.

Extending the BigQuery backend
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

* Create a Google Cloud project.
* Set the ``GOOGLE_BIGQUERY_PROJECT_ID`` environment variable.
* Populate test data: ``python ci/datamgr.py bigquery``
* Run the test suite: ``pytest ibis/bigquery/tests``
@@ -0,0 +1,40 @@
.. _install.clickhouse:

`ClickHouse <https://clickhouse.yandex/>`_
-------------------------------------------

Install dependencies for Ibis's ClickHouse dialect (the minimum supported
version of ``clickhouse-driver`` is 0.1.3):

::

    pip install ibis-framework[clickhouse]

Create a client by passing in database connection parameters such as ``host``,
``port``, ``database``, and ``user`` to :func:`ibis.clickhouse.connect`:

.. code-block:: python

    con = ibis.clickhouse.connect(host='clickhouse', port=9000)

.. _api.clickhouse:

API
===

.. currentmodule:: ibis.backends.clickhouse

The ClickHouse client is accessible through the ``ibis.clickhouse`` namespace.

Use ``ibis.clickhouse.connect`` to create a client.

.. autosummary::
    :toctree: ../generated/

    connect
    ClickhouseClient.close
    ClickhouseClient.exists_table
    ClickhouseClient.exists_database
    ClickhouseClient.get_schema
    ClickhouseClient.set_database
    ClickhouseClient.list_databases
    ClickhouseClient.list_tables
@@ -0,0 +1,43 @@
.. _install.mysql:

`MySQL <https://www.mysql.com/>`_
=================================

Install dependencies for Ibis's MySQL dialect:

::

    pip install ibis-framework[mysql]

Create a client by passing a connection string or individual parameters to
:func:`ibis.mysql.connect`:

.. code-block:: python

    con = ibis.mysql.connect(url='mysql+pymysql://ibis:ibis@mysql/ibis_testing')
    con = ibis.mysql.connect(
        user='ibis',
        password='ibis',
        host='mysql',
        database='ibis_testing',
    )

.. _api.mysql:

API
---

.. currentmodule:: ibis.backends.mysql

The MySQL client is accessible through the ``ibis.mysql`` namespace.

Use ``ibis.mysql.connect`` with a SQLAlchemy-compatible connection string to
create a client.

.. autosummary::
    :toctree: ../generated/

    connect
    MySQLClient.database
    MySQLClient.list_databases
    MySQLClient.list_tables
    MySQLClient.table
@@ -0,0 +1,46 @@
.. _install.postgres:

`PostgreSQL <https://www.postgresql.org/>`_
===========================================

Install dependencies for Ibis's PostgreSQL dialect:

::

    pip install ibis-framework[postgres]

Create a client by passing a connection string to the ``url`` parameter or
individual parameters to :func:`ibis.postgres.connect`:

.. code-block:: python

    con = ibis.postgres.connect(
        url='postgresql://postgres:postgres@postgres:5432/ibis_testing'
    )
    con = ibis.postgres.connect(
        user='postgres',
        password='postgres',
        host='postgres',
        port=5432,
        database='ibis_testing',
    )

.. _api.postgres:

API
---

.. currentmodule:: ibis.backends.postgres

The PostgreSQL client is accessible through the ``ibis.postgres`` namespace.

Use ``ibis.postgres.connect`` with a SQLAlchemy-compatible connection string to
create a client.

.. autosummary::
    :toctree: ../generated/

    connect
    PostgreSQLClient.database
    PostgreSQLClient.list_tables
    PostgreSQLClient.list_databases
    PostgreSQLClient.table
@@ -0,0 +1,58 @@
.. _install.spark:

`PySpark/Spark SQL <https://spark.apache.org/sql/>`_
====================================================

Install dependencies for Ibis's Spark dialect:

::

    pip install ibis-framework[spark]

Create a client by passing in the Spark session as a parameter to
:func:`ibis.spark.connect`:

.. code-block:: python

    con = ibis.spark.connect(spark_session)

.. _api.spark:

API
---

SparkSQL client (Experimental)
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. currentmodule:: ibis.backends.spark

The Spark SQL client is accessible through the ``ibis.spark`` namespace.

Use ``ibis.spark.connect`` to create a client.

.. autosummary::
    :toctree: ../generated/

    connect
    SparkClient.database
    SparkClient.list_databases
    SparkClient.list_tables
    SparkClient.table

.. _api.pyspark:

PySpark client (Experimental)
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

.. currentmodule:: ibis.backends.pyspark

The PySpark client is accessible through the ``ibis.pyspark`` namespace.

Use ``ibis.pyspark.connect`` to create a client.

.. autosummary::
    :toctree: ../generated/

    connect
    PySparkClient.database
    PySparkClient.list_databases
    PySparkClient.list_tables
    PySparkClient.table
@@ -0,0 +1,40 @@
.. _install.sqlite:

`SQLite <https://www.sqlite.org/>`_
===================================

Install dependencies for Ibis's SQLite dialect:

::

    pip install ibis-framework[sqlite]

Create a client by passing a path to a SQLite database to
:func:`ibis.sqlite.connect`:

.. code-block:: python

    >>> import ibis
    >>> ibis.sqlite.connect('path/to/my/sqlite.db')

See http://blog.ibis-project.org/sqlite-crunchbase-quickstart/ for a quickstart
using SQLite.

.. _api.sqlite:

API
---

.. currentmodule:: ibis.backends.sqlite

The SQLite client is accessible through the ``ibis.sqlite`` namespace.

Use ``ibis.sqlite.connect`` to create a SQLite client.

.. autosummary::
    :toctree: ../generated/

    connect
    SQLiteClient.attach
    SQLiteClient.database
    SQLiteClient.list_tables
    SQLiteClient.table