@@ -0,0 +1,97 @@
---
title: Ibis for streaming
---

Ibis supports streaming operations, which can be executed on Flink,
Spark Structured Streaming, and RisingWave.

## Setup

We demonstrate the streaming operations with a real-time fraud detection
example. If you have Kafka set up in your infrastructure, you can connect to
your existing Kafka topics as well.

You can find our setup code [here](https://github.com/ibis-project/realtime-fraud-detection).
Feel free to clone the repository if you want to follow along.

## Window aggregation

Computes aggregations over windows.

The output schema consists of `window_start`, `window_end`, the optional
group-by columns, and the aggregation results.

Tumble and hop windows are supported. Tumbling windows have a fixed size and
do not overlap. Hopping windows (also known as sliding windows) are configured
by both a window size and a window slide; the slide controls how frequently a
new window starts.

For more, see [Flink's documentation on Windowing TVFs](https://nightlies.apache.org/flink/flink-docs-release-1.19/docs/dev/table/sql/queries/window-tvf/)
and [Spark's documentation on time windows](https://spark.apache.org/docs/latest/structured-streaming-programming-guide.html#types-of-time-windows).
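As a plain-Python illustration of the bucketing arithmetic (not Ibis code; the
helper names below are hypothetical), a tumbling window assigns each event to
exactly one bucket, while a hopping window can assign it to up to
`size / slide` overlapping buckets:

```python
from datetime import datetime, timedelta


def tumble_window(ts, size):
    # The single tumbling window containing `ts`: align down to a
    # multiple of `size` since the epoch.
    epoch = datetime(1970, 1, 1)
    start = ts - (ts - epoch) % size
    return (start, start + size)


def hop_windows(ts, size, slide):
    # All hopping windows containing `ts`: a new window of `size` starts
    # every `slide`, so `ts` can fall into several of them.
    epoch = datetime(1970, 1, 1)
    start = ts - (ts - epoch) % slide
    windows = []
    while start + size > ts:
        windows.append((start, start + size))
        start -= slide
    return list(reversed(windows))


ts = datetime(2024, 1, 1, 0, 0, 40)
print(tumble_window(ts, timedelta(seconds=30)))
# one 30s bucket, starting at 00:00:30
print(hop_windows(ts, timedelta(seconds=30), timedelta(seconds=15)))
# two overlapping 30s buckets, starting at 00:00:15 and 00:00:30
```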

```python
import ibis
from ibis import _

t = con.table("payment")  # table corresponding to the `payment` topic

# tumble window: fixed size, non-overlapping
expr = (
    t.window_by(time_col=t.createTime)
    .tumble(size=ibis.interval(seconds=30))
    .agg(by=["provinceId"], avgPayAmount=_.payAmount.mean())
)

# hop window: a new 30s window starts every 15s
expr = (
    t.window_by(time_col=t.createTime)
    .hop(size=ibis.interval(seconds=30), slide=ibis.interval(seconds=15))
    .agg(by=["provinceId"], avgPayAmount=_.payAmount.mean())
)
```

## Over aggregation

Computes aggregate values for every input row, over either a row range or a
time range.

::: {.callout-note}
Spark Structured Streaming does not support aggregation using the `OVER`
syntax. You need to use window aggregation to aggregate over time windows.
:::

```python
expr = t.select(
    province_id=t.provinceId,
    pay_amount=t.payAmount.sum().over(
        range=(-ibis.interval(seconds=10), 0),
        group_by=t.provinceId,
        order_by=t.createTime,
    ),
)
```
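To build intuition for the time-range computation, here is a plain-Python
sketch (not Ibis, and ignoring the incremental state a streaming engine would
keep) of a 10-second trailing sum per group:

```python
from collections import defaultdict


def trailing_sum(events, window_seconds=10):
    # events: (group, timestamp_seconds, amount) tuples, sorted by timestamp.
    # For each row, sum the amounts in the same group whose timestamp lies in
    # [t - window_seconds, t] -- mirroring range=(-10s, 0) above.
    out = []
    history = defaultdict(list)
    for group, t, amount in events:
        history[group].append((t, amount))
        total = sum(a for ts, a in history[group] if t - window_seconds <= ts <= t)
        out.append((group, t, total))
    return out


events = [("A", 0, 5), ("A", 4, 7), ("B", 5, 1), ("A", 12, 2)]
print(trailing_sum(events))
# [('A', 0, 5), ('A', 4, 12), ('B', 5, 1), ('A', 12, 9)]
```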

## Stream-table join

Joining a stream with a static table.

```python
import pandas as pd

provinces = (
    "Beijing",
    "Shanghai",
    "Hangzhou",
    "Shenzhen",
    "Jiangxi",
    "Chongqing",
    "Xizang",
)
province_id_to_name_df = pd.DataFrame(
    enumerate(provinces), columns=["provinceId", "province"]
)
expr = t.join(province_id_to_name_df, ["provinceId"])
```

## Stream-stream join

Joining two streams.

```python
order = con.table("order")  # table corresponding to the `order` topic
expr = t.join(
    order, [t.orderId == order.orderId, t.createTime == order.createTime]
)
```
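Conceptually, a streaming equi-join buffers rows from each side keyed on the
join key and emits matches as rows arrive. A plain-Python hash-join sketch
(not Ibis code; it ignores the watermarks and state expiry a real engine
would apply):

```python
from collections import defaultdict


def stream_stream_join(events):
    # events: sequence of (side, row) pairs, side is "payment" or "order".
    # Buffer rows per orderId on each side; emit joined pairs on match.
    buffers = {"payment": defaultdict(list), "order": defaultdict(list)}
    other = {"payment": "order", "order": "payment"}
    out = []
    for side, row in events:
        key = row["orderId"]
        buffers[side][key].append(row)
        for match in buffers[other[side]][key]:
            out.append((row, match) if side == "payment" else (match, row))
    return out


events = [
    ("payment", {"orderId": 1, "payAmount": 9.5}),
    ("order", {"orderId": 2}),
    ("order", {"orderId": 1}),
]
print(stream_stream_join(events))
# [({'orderId': 1, 'payAmount': 9.5}, {'orderId': 1})]
```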
@@ -0,0 +1,4 @@
ibis-bench
tpch_data
results_data
bench_logs_v*
@@ -0,0 +1,3 @@
tpch_data
results_data
bench_logs_v*
@@ -0,0 +1,146 @@
---
title: "Ibis - Now flying on Snowflake"
author:
  - Phillip Cloud
  - Tyler White
error: false
date: "2024-06-19"
categories:
  - blog
  - new feature
  - snowflake
---

Ibis lets you push compute down to the backend where your data lives, with
performance determined by the backend you're connected to. But what happens if
Ibis is running _inside_ the backend you're connected to?

In this post, we will discuss how we got Ibis running on a Snowflake virtual
warehouse.

## Why would we want to do this?

Snowflake has released several features that enable users to execute native
Python code on the platform. These features include a new notebook development
interface, Streamlit in Snowflake, the Native App framework, and Python within
functions and stored procedures.

If users could use Ibis directly within the platform, developers could more
easily switch between a local execution engine during development and
efficiently deploy and operationalize that same code on Snowflake.

But this isn't without its challenges; there were a few things we needed to
figure out, and these are the questions we will answer throughout the post.

- How can we get an Ibis connection to Snowflake - from within Snowflake?
- How can we use third-party packages in Snowflake?
- How are we going to test this to ensure it works?

## Getting the Ibis connection

Ibis 9.0 introduces a new method,
[`from_snowpark`](../../backends/snowflake.qmd#ibis.backends.snowflake.Backend.from_snowpark),
which gives users a convenient mechanism to take an existing Snowpark session
and create an Ibis Snowflake backend instance from it.

Here's what this looks like:

```python
import ibis
import snowflake.snowpark as sp

session = sp.Session.builder.create()
con = ibis.snowflake.from_snowpark(session)
```

This connection uses the same session within Snowflake, so temporary objects
can be accessed using Snowpark or Ibis in the same process! The contexts of
stored procedures already have a session available, meaning we can use this
new method and start writing Ibis expressions.

The way this works is that Ibis plucks out an attribute on the Snowpark
session, which gives us the [`snowflake-connector-python`](https://github.com/snowflakedb/snowflake-connector-python) [`SnowflakeConnection`](https://github.com/snowflakedb/snowflake-connector-python/blob/42fa6ebe9404e0e17afdacfcaceb311dda5cde3e/src/snowflake/connector/connection.py#L313) instance used
by Snowpark.

Since Ibis uses `snowflake-connector-python` for all Snowflake-related
connections, we just reuse that existing instance.

## Uploading third-party packages

Snowflake makes many packages available out of the box through the Snowflake
Anaconda channel, but unfortunately, Ibis and a few of its dependencies aren't
among them. Packages containing pure Python code can be uploaded to stages for
use within the platform, so we devised a clever solution to upload and
reference these to get them working.

```python
import os
import shutil
import tempfile


def add_packages(d, session):
    import parsy
    import pyarrow_hotfix
    import rich
    import sqlglot

    import ibis

    # Copy each pure-Python package's source tree into `d` and register it
    # as an import on the Snowpark session.
    for module in (ibis, parsy, pyarrow_hotfix, rich, sqlglot):
        pkgname = module.__name__
        pkgpath = os.path.join(d, pkgname)
        shutil.copytree(os.path.dirname(module.__file__), pkgpath)
        session.add_import(pkgname, import_path=pkgname)


d = tempfile.TemporaryDirectory()
os.chdir(d.name)
add_packages(d.name, session)
```
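The core trick above -- locating a pure-Python package on disk via its
`__file__` and copying its source tree -- can be demonstrated in isolation
with only the standard library (`stage_package` is a hypothetical helper
name; any pure-Python package, such as the stdlib's `json`, works as the
demo subject):

```python
import json  # any pure-Python stdlib package works as a demo subject
import os
import shutil
import tempfile


def stage_package(module, dest):
    # Copy a pure-Python package's source tree into `dest`, returning the
    # path that would then be handed to something like session.add_import.
    pkgdir = os.path.dirname(module.__file__)
    target = os.path.join(dest, module.__name__)
    shutil.copytree(pkgdir, target)
    return target


with tempfile.TemporaryDirectory() as d:
    path = stage_package(json, d)
    print(os.path.basename(path), os.path.isdir(path))
# json True
```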

We can now register a stored procedure that imports these modules and can
reference some of the additional dependencies that are already available.

```python
session.sproc.register(
    ibis_sproc,
    return_type=sp.types.StructType(),
    name="THE_IBIS_SPROC",
    imports=["ibis", "parsy", "pyarrow_hotfix", "sqlglot", "rich"],
    packages=[
        "snowflake-snowpark-python",
        "toolz",
        "atpublic",
        "pyarrow",
        "pandas",
        "numpy",
    ],
)
```

::: {.callout-note}
## More permanent solutions to packaging

It's possible that a more permanent solution could be achieved with a `put` or
`put_stream` method rather than using the `add_import` method. This would
allow the packages to be referenced across multiple stored procedures or other
places within the Snowflake platform.
:::

## Testing!

While this is a clever solution, we must ensure it works consistently. A
special unit test has been written for exactly this case! The test creates a
stored procedure by adding the necessary imports to the Snowpark session.
Within the stored procedure, we define an Ibis expression, then use Ibis's
`to_sql` method to extract the generated SQL, which we pass to Snowpark to
return a Snowpark DataFrame!

## Conclusion

While it's usually pretty easy to add new backends to Ibis, this was the
first instance of supporting an additional interface to an existing backend.

We hope you take this for a spin! If you run into any challenges or want
additional support, open an [issue](https://github.com/ibis-project/ibis/issues)
or join us on [Zulip](https://ibis-project.zulipchat.com/) and let us know!
@@ -1,2 +1,2 @@
SELECT
  MOD(EXTRACT(dayofweek FROM DATETIME('2017-01-01T04:55:59')) + 5, 7) AS `DayOfWeekIndex_datetime_datetime_2017_1_1_4_55_59`
@@ -1,2 +1,2 @@
SELECT
  INITCAP(CAST(DATETIME('2017-01-01T04:55:59') AS STRING FORMAT 'DAY')) AS `DayOfWeekName_datetime_datetime_2017_1_1_4_55_59`
@@ -1,2 +1,2 @@
SELECT
  MOD(EXTRACT(dayofweek FROM DATETIME('2017-01-01T04:55:59')) + 5, 7) AS `DayOfWeekIndex_datetime_datetime_2017_1_1_4_55_59`
@@ -1,2 +1,2 @@
SELECT
  INITCAP(CAST(DATETIME('2017-01-01T04:55:59') AS STRING FORMAT 'DAY')) AS `DayOfWeekName_datetime_datetime_2017_1_1_4_55_59`
@@ -1,2 +1,2 @@
SELECT
  MOD(EXTRACT(dayofweek FROM DATETIME('2017-01-01T04:55:59')) + 5, 7) AS `DayOfWeekIndex_datetime_datetime_2017_1_1_4_55_59`
@@ -1,2 +1,2 @@
SELECT
  INITCAP(CAST(DATETIME('2017-01-01T04:55:59') AS STRING FORMAT 'DAY')) AS `DayOfWeekName_datetime_datetime_2017_1_1_4_55_59`
@@ -1,3 +1,3 @@
SELECT
  TIME(`t0`.`ts`) AS `tmp`
FROM `t` AS `t0`
@@ -1,2 +1,2 @@
SELECT
  EXTRACT(hour FROM DATETIME('2017-01-01T04:55:59')) AS `tmp`
@@ -1,2 +1,2 @@
SELECT
  EXTRACT(hour FROM TIME(4, 55, 59)) AS `tmp`
@@ -1,2 +1,2 @@
SELECT
  EXTRACT(hour FROM DATETIME('2017-01-01T04:55:59')) AS `tmp`
@@ -1,2 +1,2 @@
SELECT
  EXTRACT(hour FROM TIME(4, 55, 59)) AS `tmp`
@@ -1,2 +1,2 @@
SELECT
  EXTRACT(hour FROM DATETIME('2017-01-01T04:55:59')) AS `tmp`
@@ -1,2 +1,2 @@
SELECT
  EXTRACT(year FROM DATETIME('2017-01-01T04:55:59')) AS `ExtractYear_datetime_datetime_2017_1_1_4_55_59`
@@ -1,2 +1,2 @@
SELECT
  EXTRACT(year FROM DATETIME('2017-01-01T04:55:59')) AS `ExtractYear_datetime_datetime_2017_1_1_4_55_59`
@@ -1,2 +1,2 @@
SELECT
  EXTRACT(year FROM DATETIME('2017-01-01T04:55:59')) AS `ExtractYear_datetime_datetime_2017_1_1_4_55_59`
@@ -1,11 +1,7 @@
SELECT
  *
FROM `t0` AS `t0`
EXCEPT DISTINCT
SELECT
  *
FROM `t1` AS `t1`
@@ -1,11 +1,7 @@
SELECT
  *
FROM `t0` AS `t0`
INTERSECT DISTINCT
SELECT
  *
FROM `t1` AS `t1`
@@ -1,11 +1,7 @@
SELECT
  *
FROM `t0` AS `t0`
UNION ALL
SELECT
  *
FROM `t1` AS `t1`
@@ -1,11 +1,7 @@
SELECT
  *
FROM `t0` AS `t0`
UNION DISTINCT
SELECT
  *
FROM `t1` AS `t1`
@@ -0,0 +1,16 @@
SELECT
  COUNT(`t2`.`foo`) AS `count`
FROM (
  SELECT
    `t1`.`string_col`,
    SUM(`t1`.`float_col`) AS `foo`
  FROM (
    SELECT
      *
    FROM `alltypes` AS `t0`
    WHERE
      `t0`.`timestamp_col` < DATETIME('2014-01-01T00:00:00')
  ) AS `t1`
  GROUP BY
    1
) AS `t2`
@@ -0,0 +1,2 @@
SELECT
  TIME_ADD(TIME(12, 34, 56), INTERVAL 789101 MICROSECOND) AS `datetime_time_12_34_56_789101`
@@ -0,0 +1,2 @@
SELECT
  TIME(12, 34, 56) AS `datetime_time_12_34_56`
@@ -1,3 +1,3 @@
SELECT
  parse_timestamp('%F %Z', CONCAT(`t0`.`date_string_col`, ' America/New_York'), 'UTC') AS `StringToTimestamp_StringConcat_date_string_col_' America_New_York'_'%F %Z'`
FROM `functional_alltypes` AS `t0`
@@ -1,11 +1,7 @@
SELECT
  *
FROM `functional_alltypes` AS `t0`
UNION ALL
SELECT
  *
FROM `functional_alltypes` AS `t0`
@@ -1,11 +1,7 @@
SELECT
  *
FROM `functional_alltypes` AS `t0`
UNION DISTINCT
SELECT
  *
FROM `functional_alltypes` AS `t0`
@@ -1,3 +1,3 @@
SELECT
  "t0"."double_col" AS "double_col"
FROM "functional_alltypes" AS "t0"
@@ -1,3 +1,3 @@
SELECT
  GREATEST("t0"."int_col", 10) AS "Greatest((int_col, 10))"
FROM "functional_alltypes" AS "t0"
@@ -1,3 +1,3 @@
SELECT
  GREATEST("t0"."int_col", "t0"."bigint_col") AS "Greatest((int_col, bigint_col))"
FROM "functional_alltypes" AS "t0"
@@ -1,3 +1,3 @@
SELECT
  LEAST("t0"."int_col", 10) AS "Least((int_col, 10))"
FROM "functional_alltypes" AS "t0"
@@ -1,3 +1,3 @@
SELECT
  LEAST("t0"."int_col", "t0"."bigint_col") AS "Least((int_col, bigint_col))"
FROM "functional_alltypes" AS "t0"
@@ -1,3 +1,3 @@
SELECT
  "t0"."bigint_col" AS "bigint_col"
FROM "functional_alltypes" AS "t0"
@@ -1,3 +1,3 @@
SELECT
  "t0"."bool_col" AS "bool_col"
FROM "functional_alltypes" AS "t0"
@@ -1,3 +1,3 @@
SELECT
  "t0"."date_string_col" AS "date_string_col"
FROM "functional_alltypes" AS "t0"
@@ -1,3 +1,3 @@
SELECT
  "t0"."double_col" AS "double_col"
FROM "functional_alltypes" AS "t0"
@@ -1,3 +1,3 @@
SELECT
  "t0"."float_col" AS "float_col"
FROM "functional_alltypes" AS "t0"
@@ -1,3 +1,3 @@
SELECT
  "t0"."id" AS "id"
FROM "functional_alltypes" AS "t0"
@@ -1,3 +1,3 @@
SELECT
  "t0"."int_col" AS "int_col"
FROM "functional_alltypes" AS "t0"
@@ -1,3 +1,3 @@
SELECT
  "t0"."month" AS "month"
FROM "functional_alltypes" AS "t0"
@@ -1,3 +1,3 @@
SELECT
  "t0"."smallint_col" AS "smallint_col"
FROM "functional_alltypes" AS "t0"
@@ -1,3 +1,3 @@
SELECT
  "t0"."string_col" AS "string_col"
FROM "functional_alltypes" AS "t0"
@@ -1,3 +1,3 @@
SELECT
  "t0"."timestamp_col" AS "timestamp_col"
FROM "functional_alltypes" AS "t0"
@@ -1,3 +1,3 @@
SELECT
  "t0"."tinyint_col" AS "tinyint_col"
FROM "functional_alltypes" AS "t0"
@@ -1,3 +1,3 @@
SELECT
  "t0"."year" AS "year"
FROM "functional_alltypes" AS "t0"
@@ -1,3 +1,3 @@
SELECT
  indexOf(['a','b','c'], "t0"."string_col") - 1 AS "FindInSet(string_col, ('a', 'b', 'c'))"
FROM "functional_alltypes" AS "t0"
@@ -1,3 +1,3 @@
SELECT
  CASE "t0"."string_col" WHEN 'foo' THEN 'bar' WHEN 'baz' THEN 'qux' ELSE 'default' END AS "SimpleCase(string_col, ('foo', 'baz'), ('bar', 'qux'), 'default')"
FROM "functional_alltypes" AS "t0"
@@ -1,5 +1,5 @@
SELECT
  "t0"."a" AS "a",
  COALESCE(countIf(NOT (
    "t0"."b"
  )), 0) AS "A",
@@ -1,17 +1,17 @@
SELECT
  "t1"."id" AS "id",
  "t1"."bool_col" AS "bool_col",
  "t1"."tinyint_col" AS "tinyint_col",
  "t1"."smallint_col" AS "smallint_col",
  "t1"."int_col" AS "int_col",
  "t1"."bigint_col" AS "bigint_col",
  "t1"."float_col" AS "float_col",
  "t1"."double_col" AS "double_col",
  "t1"."date_string_col" AS "date_string_col",
  "t1"."string_col" AS "string_col",
  "t1"."timestamp_col" AS "timestamp_col",
  "t1"."year" AS "year",
  "t1"."month" AS "month"
FROM "functional_alltypes" AS "t1"
INNER JOIN "functional_alltypes" AS "t2"
  ON "t1"."id" = "t2"."id"
@@ -1,5 +1,5 @@
SELECT
  "t1"."key" AS "key",
  SUM((
    (
      "t1"."value" + 1
@@ -1,5 +1,5 @@
SELECT
  "t1"."key" AS "key",
  SUM((
    (
      "t1"."value" + 1
@@ -1,26 +1,26 @@
SELECT
  "t2"."playerID" AS "playerID",
  "t2"."yearID" AS "yearID",
  "t2"."stint" AS "stint",
  "t2"."teamID" AS "teamID",
  "t2"."lgID" AS "lgID",
  "t2"."G" AS "G",
  "t2"."AB" AS "AB",
  "t2"."R" AS "R",
  "t2"."H" AS "H",
  "t2"."X2B" AS "X2B",
  "t2"."X3B" AS "X3B",
  "t2"."HR" AS "HR",
  "t2"."RBI" AS "RBI",
  "t2"."SB" AS "SB",
  "t2"."CS" AS "CS",
  "t2"."BB" AS "BB",
  "t2"."SO" AS "SO",
  "t2"."IBB" AS "IBB",
  "t2"."HBP" AS "HBP",
  "t2"."SH" AS "SH",
  "t2"."SF" AS "SF",
  "t2"."GIDP" AS "GIDP"
FROM "batting" AS "t2"
ANY JOIN "awards_players" AS "t3"
  ON "t2"."playerID" = "t3"."awardID"
@@ -1,26 +1,26 @@
SELECT
  "t2"."playerID" AS "playerID",
  "t2"."yearID" AS "yearID",
  "t2"."stint" AS "stint",
  "t2"."teamID" AS "teamID",
  "t2"."lgID" AS "lgID",
  "t2"."G" AS "G",
  "t2"."AB" AS "AB",
  "t2"."R" AS "R",
  "t2"."H" AS "H",
  "t2"."X2B" AS "X2B",
  "t2"."X3B" AS "X3B",
  "t2"."HR" AS "HR",
  "t2"."RBI" AS "RBI",
  "t2"."SB" AS "SB",
  "t2"."CS" AS "CS",
  "t2"."BB" AS "BB",
  "t2"."SO" AS "SO",
  "t2"."IBB" AS "IBB",
  "t2"."HBP" AS "HBP",
  "t2"."SH" AS "SH",
  "t2"."SF" AS "SF",
  "t2"."GIDP" AS "GIDP"
FROM "batting" AS "t2"
LEFT ANY JOIN "awards_players" AS "t3"
  ON "t2"."playerID" = "t3"."awardID"
@@ -1,26 +1,26 @@
SELECT
  "t2"."playerID" AS "playerID",
  "t2"."yearID" AS "yearID",
  "t2"."stint" AS "stint",
  "t2"."teamID" AS "teamID",
  "t2"."lgID" AS "lgID",
  "t2"."G" AS "G",
  "t2"."AB" AS "AB",
  "t2"."R" AS "R",
  "t2"."H" AS "H",
  "t2"."X2B" AS "X2B",
  "t2"."X3B" AS "X3B",
  "t2"."HR" AS "HR",
  "t2"."RBI" AS "RBI",
  "t2"."SB" AS "SB",
  "t2"."CS" AS "CS",
  "t2"."BB" AS "BB",
  "t2"."SO" AS "SO",
  "t2"."IBB" AS "IBB",
  "t2"."HBP" AS "HBP",
  "t2"."SH" AS "SH",
  "t2"."SF" AS "SF",
  "t2"."GIDP" AS "GIDP"
FROM "batting" AS "t2"
INNER JOIN "awards_players" AS "t3"
  ON "t2"."playerID" = "t3"."awardID"