diff --git a/docs/security/authorization.md b/docs/security/authorization.md
deleted file mode 100644
index 15078088f9e..00000000000
--- a/docs/security/authorization.md
+++ /dev/null
@@ -1,49 +0,0 @@
-
-# ACL Management Guide
-
-- [Authorization Modes](#1)
-  - [Storage-Based Authorization](#1.1)
-  - [SQL-Standard Based Authorization](#1.2)
-  - [Ranger Security Support](#1.3)
-

-## Authorization Modes
-
-Three primary modes for Kyuubi authorization are available by [Submarine Spark Security](https://github.com/apache/submarine/tree/master/submarine-security/spark-security):
-
-### Storage-Based Authorization
-
-Enabling Storage Based Authorization in the `Hive Metastore Server` uses the HDFS permissions to act as the main source for verification and allows for consistent data and metadata authorization policy. This allows control over metadata access by verifying if the user has permission to access corresponding directories on the HDFS. Similar with `HiveServer2`, files and directories will be translated into hive metadata objects, such as dbs, tables, partitions, and be protected from end user's queries through Kyuubi.
-
-Storage-Based Authorization offers users with Database, Table and Partition-level coarse-gained access control.
-
-### SQL-Standard Based Authorization
-
-Enabling SQL-Standard Based Authorization gives users more fine-gained control over access comparing with Storage Based Authorization. Besides of the ability of Storage Based Authorization, SQL-Standard Based Authorization can improve it to Views and Column-level. Unfortunately, Spark SQL does not support grant/revoke statements which controls access, this might be done only through the HiveServer2. But it's gratifying that [Submarine Spark Security](https://github.com/apache/submarine/tree/master/submarine-security/spark-security) makes Spark SQL be able to understand this fine-grain access control granted or revoked by Hive.
-
-With [Kyuubi](https://github.com/apache/incubator-kyuubi), the SQL-Standard Based Authorization is guaranteed for the security configurations, metadata, and storage information is preserved from end users.
-
-Please refer to the [Submarine Spark Security](https://submarine.apache.org/docs/userDocs/submarine-security/spark-security/README) in the online documentation for an overview on how to configure SQL-Standard Based Authorization for Spark SQL.
-
-### Ranger Security Support (Recommended)
-
-[Apache Ranger](https://ranger.apache.org/) is a framework to enable, monitor and manage comprehensive data security across the Hadoop platform but end before Spark or Spark SQL. The [Submarine Spark Security](https://github.com/apache/submarine/tree/master/submarine-security/spark-security) enables Kyuubi with control access ability reusing [Ranger Plugin for Hive MetaStore](https://cwiki.apache.org/confluence/display/RANGER/Ranger+Plugin+for+Hive+MetaStore). [Apache Ranger](https://ranger.apache.org/) makes the scope of existing SQL-Standard Based Authorization expanded but without supporting Spark SQL. [Submarine Spark Security](https://github.com/apache/submarine/tree/master/submarine-security/spark-security) sticks them together.
-
-Please refer to the [Submarine Spark Security](https://submarine.apache.org/docs/userDocs/submarine-security/spark-security/README) in the online documentation for an overview on how to configure Ranger for Spark SQL.
diff --git a/docs/security/authorization/index.rst b/docs/security/authorization/index.rst
new file mode 100644
index 00000000000..f7241847a20
--- /dev/null
+++ b/docs/security/authorization/index.rst
@@ -0,0 +1,26 @@
+.. Licensed to the Apache Software Foundation (ASF) under one or more
+   contributor license agreements. See the NOTICE file distributed with
+   this work for additional information regarding copyright ownership.
+   The ASF licenses this file to You under the Apache License, Version 2.0
+   (the "License"); you may not use this file except in compliance with
+   the License. You may obtain a copy of the License at
+
+.. http://www.apache.org/licenses/LICENSE-2.0
+
+.. Unless required by applicable law or agreed to in writing, software
+   distributed under the License is distributed on an "AS IS" BASIS,
+   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+   See the License for the specific language governing permissions and
+   limitations under the License.
+
+.. image:: https://svn.apache.org/repos/asf/comdev/project-logos/originals/kyuubi-1.svg
+   :align: center
+   :width: 25%
+
+Kyuubi Authorization Guide
+==========================
+
+.. toctree::
+   :maxdepth: 2
+
+   Spark AuthZ Plugin
diff --git a/docs/security/authorization/spark/build.md b/docs/security/authorization/spark/build.md
new file mode 100644
index 00000000000..c4d71d1aaa7
--- /dev/null
+++ b/docs/security/authorization/spark/build.md
@@ -0,0 +1,104 @@
+
+# Building Kyuubi Spark AuthZ Plugin
+
+Kyuubi logo
+
+## Build with Apache Maven
+
+Kyuubi Spark AuthZ Plugin is built using [Apache Maven](http://maven.apache.org).
+To build it, `cd` to the root directory of the kyuubi project and run:
+
+```shell
+build/mvn clean package -pl :kyuubi-spark-authz_2.12 -DskipTests
+```
+
+After a while, if everything goes well, you will finally get the plugin in two parts:
+
+- The main plugin jar, which is under `./extensions/spark/kyuubi-spark-authz/target/kyuubi-spark-authz_${scala.binary.version}-${project.version}.jar`
+- The minimal set of transitive dependencies needed, which are under `./extensions/spark/kyuubi-spark-authz/target/scala-${scala.binary.version}/jars`
+
+### Build against Different Apache Spark Versions
+
+The maven option `spark.version` is used for specifying the Spark version to compile with and to generate the corresponding transitive dependencies.
+By default, it is always built with the latest `spark.version` defined in the kyuubi project main pom file.
+Sometimes, it may be incompatible with other Spark distributions, so you may need to build the plugin on your own, targeting the Spark version you use.
+
+For example,
+
+```shell
+build/mvn clean package -pl :kyuubi-spark-authz_2.12 -DskipTests -Dspark.version=3.0.2
+```
+
+The available `spark.version`s are shown in the following table.
+
+| Spark Version     | Supported | Remark |
+|:-----------------:|:---------:|:------:|
+| master            | √         | -      |
+| 3.3.x             | √         | -      |
+| 3.2.x             | √         | -      |
+| 3.1.x             | √         | -      |
+| 3.0.x             | √         | -      |
+| 2.4.x             | √         | -      |
+| 2.3.x and earlier | ×         | [PR 2367](https://github.com/apache/incubator-kyuubi/pull/2367) is used to track how we work with older releases with Scala 2.11 |
+
+Currently, Spark releases built with Scala 2.12 are supported.
+
+### Build against Different Apache Ranger Versions
+
+The maven option `ranger.version` is used for specifying the Ranger version to compile with and to generate the corresponding transitive dependencies.
+By default, it is always built with the latest `ranger.version` defined in the kyuubi project main pom file.
+Sometimes, it may be incompatible with other Ranger Admins, so you may need to build the plugin on your own, targeting the Ranger Admin version you connect with.
+
+```shell
+build/mvn clean package -pl :kyuubi-spark-authz_2.12 -DskipTests -Dranger.version=0.7.0
+```
+
+The available `ranger.version`s are shown in the following table.
+
+| Ranger Version | Supported | Remark |
+|:--------------:|:---------:|:------:|
+| 2.2.x          | √         | -      |
+| 2.1.x          | √         | -      |
+| 2.0.x          | √         | -      |
+| 1.2.x          | √         | -      |
+| 1.1.x          | √         | -      |
+| 1.0.x          | √         | -      |
+| 0.7.x          | √         | -      |
+| 0.6.x          | √         | -      |
+
+Currently, all Ranger releases are supported.
+
+## Test with ScalaTest Maven plugin
+
+If you omit the `-DskipTests` option in the command above, you will also get all unit tests run.
+
+```shell
+build/mvn clean package -pl :kyuubi-spark-authz_2.12
+```
+
+If any bug occurs and you want to debug the plugin yourself, you can configure `-DdebugForkedProcess=true` and `-DdebuggerPort=5005` (optional).
+
+```shell
+build/mvn clean package -pl :kyuubi-spark-authz_2.12 -DdebugForkedProcess=true
+```
+
+The tests will suspend at startup and wait for a remote debugger to attach to the configured port.
+
+We would appreciate it if you could share the bug or the fix with the Kyuubi community.
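The `spark.version` and `ranger.version` options documented above can also be combined in a single build. A minimal sketch, with illustrative version numbers taken from the support tables above:

```shell
# Build the plugin against a specific Spark and Ranger combination (versions are illustrative)
build/mvn clean package -pl :kyuubi-spark-authz_2.12 -DskipTests \
  -Dspark.version=3.2.1 \
  -Dranger.version=2.2.0
```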
diff --git a/docs/security/authorization/spark/index.rst b/docs/security/authorization/spark/index.rst
new file mode 100644
index 00000000000..73d0f3475c3
--- /dev/null
+++ b/docs/security/authorization/spark/index.rst
@@ -0,0 +1,28 @@
+.. Licensed to the Apache Software Foundation (ASF) under one or more
+   contributor license agreements. See the NOTICE file distributed with
+   this work for additional information regarding copyright ownership.
+   The ASF licenses this file to You under the Apache License, Version 2.0
+   (the "License"); you may not use this file except in compliance with
+   the License. You may obtain a copy of the License at
+
+.. http://www.apache.org/licenses/LICENSE-2.0
+
+.. Unless required by applicable law or agreed to in writing, software
+   distributed under the License is distributed on an "AS IS" BASIS,
+   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+   See the License for the specific language governing permissions and
+   limitations under the License.
+
+.. image:: https://svn.apache.org/repos/asf/comdev/project-logos/originals/kyuubi-1.svg
+   :align: center
+   :width: 25%
+
+Kyuubi Spark AuthZ Plugin
+=========================
+
+.. toctree::
+   :maxdepth: 2
+
+   Overview
+   Building
+   Installing
diff --git a/docs/security/authorization/spark/install.md b/docs/security/authorization/spark/install.md
new file mode 100644
index 00000000000..5d3dc22a820
--- /dev/null
+++ b/docs/security/authorization/spark/install.md
@@ -0,0 +1,126 @@
+
+Kyuubi logo
+
+# Installing and Configuring Kyuubi Spark AuthZ Plugin
+
+## Pre-install
+
+- [Apache Ranger](https://ranger.apache.org/)
+
+  This plugin works as a Ranger REST client against the Apache Ranger admin server to do privilege checks.
+  Thus, a Ranger server needs to be installed in advance and available to use.
+
+- Building (optional)
+
+  If your Ranger admin or Spark distribution is not compatible with the official pre-built [artifact](https://mvnrepository.com/artifact/org.apache.kyuubi/kyuubi-spark-authz) in Maven Central,
+  you need to [build](build.md) the plugin yourself, targeting the Spark/Ranger versions you are using.
+
+## Install
+
+Make the `kyuubi-spark-authz_*.jar` and its transitive dependencies available on the Spark runtime classpath, such as
+- copied to `$SPARK_HOME/jars`, or
+- specified via the `spark.jars` configuration
+
+## Configure
+
+### Settings for Connecting Ranger Admin
+
+#### ranger-spark-security.xml
+
+Create `ranger-spark-security.xml` in `$SPARK_HOME/conf` and add the following configurations
+for pointing to the right Ranger admin server.
+
+```xml
+<configuration>
+  <property>
+    <name>ranger.plugin.spark.policy.rest.url</name>
+    <value>ranger admin address like http://ranger-admin.org:6080</value>
+  </property>
+
+  <property>
+    <name>ranger.plugin.spark.service.name</name>
+    <value>a ranger hive service name</value>
+  </property>
+
+  <property>
+    <name>ranger.plugin.spark.policy.cache.dir</name>
+    <value>./a ranger hive service name/policycache</value>
+  </property>
+
+  <property>
+    <name>ranger.plugin.spark.policy.pollIntervalMs</name>
+    <value>5000</value>
+  </property>
+
+  <property>
+    <name>ranger.plugin.spark.policy.source.impl</name>
+    <value>org.apache.ranger.admin.client.RangerAdminRESTClient</value>
+  </property>
+</configuration>
+```
+
+#### ranger-spark-audit.xml
+
+Create `ranger-spark-audit.xml` in `$SPARK_HOME/conf` and add the following configurations
+to enable/disable auditing.
+
+```xml
+<configuration>
+  <property>
+    <name>xasecure.audit.is.enabled</name>
+    <value>true</value>
+  </property>
+
+  <property>
+    <name>xasecure.audit.destination.db</name>
+    <value>false</value>
+  </property>
+
+  <property>
+    <name>xasecure.audit.destination.db.jdbc.driver</name>
+    <value>com.mysql.jdbc.Driver</value>
+  </property>
+
+  <property>
+    <name>xasecure.audit.destination.db.jdbc.url</name>
+    <value>jdbc:mysql://10.171.161.78/ranger</value>
+  </property>
+
+  <property>
+    <name>xasecure.audit.destination.db.password</name>
+    <value>rangeradmin</value>
+  </property>
+
+  <property>
+    <name>xasecure.audit.destination.db.user</name>
+    <value>rangeradmin</value>
+  </property>
+</configuration>
+```
+
+### Settings for Spark Session Extensions
+
+Add `org.apache.kyuubi.plugin.spark.authz.ranger.RangerSparkExtension` to the Spark configuration `spark.sql.extensions`.
+
+```properties
+spark.sql.extensions=org.apache.kyuubi.plugin.spark.authz.ranger.RangerSparkExtension
+```
diff --git a/docs/security/authorization/spark/overview.md b/docs/security/authorization/spark/overview.md
new file mode 100644
index 00000000000..52cae38801e
--- /dev/null
+++ b/docs/security/authorization/spark/overview.md
@@ -0,0 +1,57 @@
+
+# Kyuubi AuthZ Plugin For Spark SQL
+
+Kyuubi logo
+
+Security is one of the fundamental features for enterprise adoption with Kyuubi.
+When deploying Kyuubi against secured clusters,
+storage-based authorization is enabled by default, which only provides a file-level, coarse-grained authorization mode.
+When row/column-level fine-grained access control is required,
+we can enhance the data access model with the Kyuubi Spark AuthZ plugin.
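For example, once the plugin has been installed and configured as described in the install guide above, a fine-grained-authorized Spark SQL session might be started like this. This is a minimal sketch: the jar paths are illustrative and assume the plugin was built as shown in the build guide, and the Ranger config files are assumed to already be in `$SPARK_HOME/conf`.

```shell
# Put the plugin jar and its transitive dependencies on the Spark classpath
cp ./extensions/spark/kyuubi-spark-authz/target/kyuubi-spark-authz_2.12-*.jar "$SPARK_HOME/jars/"
cp ./extensions/spark/kyuubi-spark-authz/target/scala-2.12/jars/*.jar "$SPARK_HOME/jars/"

# ranger-spark-security.xml and ranger-spark-audit.xml are expected in $SPARK_HOME/conf
ls "$SPARK_HOME"/conf/ranger-spark-*.xml

# Start a Spark SQL session with the AuthZ extension enabled
"$SPARK_HOME"/bin/spark-sql \
  --conf spark.sql.extensions=org.apache.kyuubi.plugin.spark.authz.ranger.RangerSparkExtension
```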
+
+## Authorization in Kyuubi
+
+### Storage-based Authorization
+
+As Kyuubi supports multi-tenancy, a tenant can only access authorized resources,
+including computing resources, data, etc.
+Most file systems, such as HDFS, support ACL management based on files and directories.
+
+A so-called storage-based authorization mode is supported by Kyuubi by default.
+In this model, all objects in the meta layer, such as databases, tables and partitions, are mapped to folders or files in the storage layer,
+as are their permissions.
+
+Storage-based authorization offers database, table and partition-level coarse-grained access control.
+
+### SQL-standard authorization with Ranger
+
+SQL-standard authorization usually offers row/column-level fine-grained access control to meet real-world data security needs.
+
+[Apache Ranger](https://ranger.apache.org/) is a framework to enable, monitor and manage comprehensive data security across the Hadoop platform.
+This plugin enables Kyuubi with data and metadata access control for Spark SQL engines, including:
+
+- Column-level fine-grained authorization
+- Row-level fine-grained authorization, a.k.a. row-level filtering
+- Data masking
+
+## The Plugin Itself
+
+The Kyuubi Spark AuthZ plugin itself provides general-purpose ACL management for data and metadata while using Spark SQL.
+It does not have to be deployed with the Kyuubi server and engine, and it can be used as an extension for any Spark SQL job.
+However, authorization always requires a robust authentication layer and multi-tenancy support, so Kyuubi is a perfect match.
diff --git a/docs/security/index.rst b/docs/security/index.rst
index 63e6d896ccc..2e8cee8228b 100644
--- a/docs/security/index.rst
+++ b/docs/security/index.rst
@@ -13,18 +13,20 @@ See the License for the specific language governing permissions and
    limitations under the License.
 
-.. image:: ../imgs/kyuubi_logo.png
+.. image:: https://svn.apache.org/repos/asf/comdev/project-logos/originals/kyuubi-1.svg
    :align: center
+   :width: 25%
 
 Kyuubi Security Overview
 ========================
 
+Securing Kyuubi involves enabling authentication (authn), authorization (authz), encryption, etc.
+
 .. toctree::
    :maxdepth: 2
-   :numbered: 3
 
    Authentication
+   Authorization
    kinit
    hadoop_credentials_manager
-   authorization
diff --git a/extensions/spark/kyuubi-spark-authz/src/main/scala/org/apache/kyuubi/plugin/spark/authz/ranger/RangerSparkExtension.scala b/extensions/spark/kyuubi-spark-authz/src/main/scala/org/apache/kyuubi/plugin/spark/authz/ranger/RangerSparkExtension.scala
index daaa6754ac2..979fc550b54 100644
--- a/extensions/spark/kyuubi-spark-authz/src/main/scala/org/apache/kyuubi/plugin/spark/authz/ranger/RangerSparkExtension.scala
+++ b/extensions/spark/kyuubi-spark-authz/src/main/scala/org/apache/kyuubi/plugin/spark/authz/ranger/RangerSparkExtension.scala
@@ -26,7 +26,7 @@ import org.apache.kyuubi.plugin.spark.authz.util.RuleEliminateMarker
  *   <ul>
  *     <li>Table/Column level authorization(yes)</li>
  *     <li>Row level filtering(yes)</li>
- *     <li>Data masking(no)</li>
+ *     <li>Data masking(yes)</li>
  *   </ul>
  *
  * To work with Spark SQL, we need to enable it via spark extensions
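As the Scaladoc above notes, the plugin is activated through Spark's extension mechanism. A minimal sketch of registering it once for every application launched from a Spark distribution, via `spark-defaults.conf` (the location is illustrative; any equivalent way of setting `spark.sql.extensions` works):

```shell
# Register the Kyuubi Spark AuthZ extension for all Spark applications of this distribution
echo "spark.sql.extensions=org.apache.kyuubi.plugin.spark.authz.ranger.RangerSparkExtension" \
  >> "$SPARK_HOME/conf/spark-defaults.conf"
```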