add doc about Yandex Query operator (apache#39445)
* add yq sample

* fix style and links to doc from provider.yaml

* fix style again

* more links

* reorg operators doc

* fix links to how-to-guide
uzhastik authored and RodrigoGanancia committed May 10, 2024
1 parent d46fca5 commit 99cb6bc
Showing 9 changed files with 116 additions and 9 deletions.
4 changes: 2 additions & 2 deletions airflow/providers/yandex/provider.yaml
@@ -62,14 +62,14 @@ integrations:
- integration-name: Yandex.Cloud Dataproc
external-doc-url: https://cloud.yandex.com/dataproc
how-to-guide:
- /docs/apache-airflow-providers-yandex/operators.rst
- /docs/apache-airflow-providers-yandex/operators/dataproc.rst
logo: /integration-logos/yandex/Yandex-Cloud.png
tags: [service]

- integration-name: Yandex.Cloud YQ
external-doc-url: https://cloud.yandex.com/en/services/query
how-to-guide:
- /docs/apache-airflow-providers-yandex/operators.rst
- /docs/apache-airflow-providers-yandex/operators/yq.rst
logo: /integration-logos/yandex/Yandex-Cloud.png
tags: [service]

2 changes: 1 addition & 1 deletion docs/apache-airflow-providers-yandex/index.rst
@@ -37,7 +37,7 @@
Configuration <configurations-ref>
Connection types <connections/yandexcloud>
Lockbox Secret Backend <secrets-backends/yandex-cloud-lockbox-secret-backend>
Operators <operators>
Operators <operators/index>

.. toctree::
:hidden:
28 changes: 28 additions & 0 deletions docs/apache-airflow-providers-yandex/operators/index.rst
@@ -0,0 +1,28 @@
.. Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
distributed with this work for additional information
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at
.. http://www.apache.org/licenses/LICENSE-2.0
.. Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied. See the License for the
specific language governing permissions and limitations
under the License.
Yandex.Cloud Operators
======================


.. toctree::
:maxdepth: 1
:glob:

*
28 changes: 28 additions & 0 deletions docs/apache-airflow-providers-yandex/operators/yq.rst
@@ -0,0 +1,28 @@
.. Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
distributed with this work for additional information
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at
.. http://www.apache.org/licenses/LICENSE-2.0
.. Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied. See the License for the
specific language governing permissions and limitations
under the License.
Yandex Query Operators
======================
`Yandex Query <https://yandex.cloud/en/services/query>`__ is a Yandex Cloud service for processing data from different sources, such as
`Object Storage <https://yandex.cloud/ru/services/storage>`__, `MDB ClickHouse <https://yandex.cloud/ru/services/managed-clickhouse>`__,
`MDB PostgreSQL <https://yandex.cloud/ru/services/managed-postgresql>`__, and `Yandex DataStreams <https://yandex.cloud/ru/services/data-streams>`__, using SQL scripts.

Using the operators
^^^^^^^^^^^^^^^^^^^
To learn how to use the Yandex Query operator,
see the `example DAG <https://github.com/apache/airflow/tree/providers-yandex/|version|/tests/system/providers/yandex/example_yandexcloud_yq.py>`__.
4 changes: 2 additions & 2 deletions tests/system/providers/yandex/example_yandexcloud.py
@@ -16,7 +16,6 @@
# under the License.
from __future__ import annotations

import os
from datetime import datetime

import yandex.cloud.dataproc.v1.cluster_pb2 as cluster_pb
@@ -32,8 +31,9 @@
from airflow import DAG
from airflow.decorators import task
from airflow.providers.yandex.hooks.yandex import YandexCloudBaseHook
from tests.system.utils import get_test_env_id

ENV_ID = os.environ.get("SYSTEM_TESTS_ENV_ID")
ENV_ID = get_test_env_id()
DAG_ID = "example_yandexcloud_hook"

# Fill it with your identifiers
4 changes: 2 additions & 2 deletions tests/system/providers/yandex/example_yandexcloud_dataproc.py
@@ -16,7 +16,6 @@
# under the License.
from __future__ import annotations

import os
import uuid
from datetime import datetime

@@ -32,6 +31,7 @@

# Name of the datacenter where Dataproc cluster will be created
from airflow.utils.trigger_rule import TriggerRule
from tests.system.utils import get_test_env_id

# should be filled with appropriate ids

@@ -41,7 +41,7 @@
# Dataproc cluster jobs will produce logs in specified s3 bucket
S3_BUCKET_NAME_FOR_JOB_LOGS = ""

ENV_ID = os.environ.get("SYSTEM_TESTS_ENV_ID")
ENV_ID = get_test_env_id()
DAG_ID = "example_yandexcloud_dataproc_operator"

with DAG(
@@ -16,7 +16,6 @@
# under the License.
from __future__ import annotations

import os
from datetime import datetime

from airflow import DAG
@@ -28,6 +27,7 @@

# Name of the datacenter where Dataproc cluster will be created
from airflow.utils.trigger_rule import TriggerRule
from tests.system.utils import get_test_env_id

# should be filled with appropriate ids

@@ -37,7 +37,7 @@
# Dataproc cluster will use this bucket as distributed storage
S3_BUCKET_NAME = ""

ENV_ID = os.environ.get("SYSTEM_TESTS_ENV_ID")
ENV_ID = get_test_env_id()
DAG_ID = "example_yandexcloud_dataproc_lightweight"

with DAG(
51 changes: 51 additions & 0 deletions tests/system/providers/yandex/example_yandexcloud_yq.py
@@ -0,0 +1,51 @@
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
from __future__ import annotations

from datetime import datetime

from airflow.models.dag import DAG
from airflow.operators.empty import EmptyOperator
from airflow.providers.yandex.operators.yq import YQExecuteQueryOperator
from tests.system.utils import get_test_env_id

ENV_ID = get_test_env_id()
DAG_ID = "example_yandexcloud_yq"

with DAG(
DAG_ID,
schedule=None,
start_date=datetime(2021, 1, 1),
tags=["example"],
) as dag:
run_this_last = EmptyOperator(
task_id="run_this_last",
)

yq_operator = YQExecuteQueryOperator(task_id="sample_query", sql="select 33 as d, 44 as t")
yq_operator >> run_this_last

from tests.system.utils.watcher import watcher

# This test needs watcher in order to properly mark success/failure
# when "teardown" task with trigger rule is part of the DAG
list(dag.tasks) >> watcher()

from tests.system.utils import get_test_run # noqa: E402

# Needed to run the example DAG with pytest (see: tests/system/README.md#run_via_pytest)
test_run = get_test_run(dag)
