4 changes: 2 additions & 2 deletions core/src/main/scala/org/apache/spark/ui/JWSFilter.scala
@@ -31,7 +31,7 @@ import jakarta.servlet.http.{HttpServletRequest, HttpServletResponse}
  * Like the other UI filters, the following configurations are required to use this filter.
  * {{{
  * - spark.ui.filters=org.apache.spark.ui.JWSFilter
- * - spark.org.apache.spark.ui.JWSFilter.param.key=BASE64URL-ENCODED-YOUR-PROVIDED-KEY
+ * - spark.org.apache.spark.ui.JWSFilter.param.secretKey=BASE64URL-ENCODED-YOUR-PROVIDED-KEY
  * }}}
  * The HTTP request should have {@code Authorization: Bearer <jws>} header.
  * {{{
@@ -52,7 +52,7 @@ private class JWSFilter extends Filter {
   * - WeakKeyException will happen if the user-provided value is insufficient
   */
  override def init(config: FilterConfig): Unit = {
-    key = Keys.hmacShaKeyFor(Decoders.BASE64URL.decode(config.getInitParameter("key")));
+    key = Keys.hmacShaKeyFor(Decoders.BASE64URL.decode(config.getInitParameter("secretKey")));
Member (Author) commented:
Currently, the key can be exposed if the `spark.ui.filters` configuration is mistakenly removed.
[Screenshot: 2024-08-03 at 22 02 55]
Member commented:
`secretKey` will be redacted by Spark, right?
Member commented:
As you made the change in #47596 (comment), do we still need to change this param to `secretKey`?
  }

  override def doFilter(req: ServletRequest, res: ServletResponse, chain: FilterChain): Unit = {
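For reference, the following is a minimal sketch, not part of this diff, of how a user might generate a compatible `secretKey` value with the jjwt library that `JWSFilter` builds on (assuming jjwt 0.12+ with `jjwt-api`, `jjwt-impl`, and `jjwt-jackson` on the classpath; `GenerateSecretKey` is a hypothetical name):

```scala
import io.jsonwebtoken.Jwts
import io.jsonwebtoken.io.Encoders

object GenerateSecretKey {
  def main(args: Array[String]): Unit = {
    // Build a random key meeting the HS256 minimum (256 bits), so that
    // Keys.hmacShaKeyFor on the server side will not throw WeakKeyException.
    val key = Jwts.SIG.HS256.key().build()
    // Base64URL-encode it, mirroring Decoders.BASE64URL.decode in JWSFilter.init.
    println(Encoders.BASE64URL.encode(key.getEncoded))
  }
}
```

The printed value is what would be set as `spark.org.apache.spark.ui.JWSFilter.param.secretKey`.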
4 changes: 2 additions & 2 deletions core/src/test/scala/org/apache/spark/deploy/rest/StandaloneRestSubmitSuite.scala
@@ -509,7 +509,7 @@ class StandaloneRestSubmitSuite extends SparkFunSuite {
    rpcEnv = Some(RpcEnv.create("rest-with-filter", localhost, 0, conf, securityManager))
    val fakeMasterRef = rpcEnv.get.setupEndpoint("fake-master", new DummyMaster(rpcEnv.get))
    conf.set(MASTER_REST_SERVER_FILTERS.key, "org.apache.spark.ui.JWSFilter")
-   conf.set("spark.org.apache.spark.ui.JWSFilter.param.key", TEST_KEY)
+   conf.set("spark.org.apache.spark.ui.JWSFilter.param.secretKey", TEST_KEY)
    server = Some(new StandaloneRestServer(localhost, 0, conf, fakeMasterRef, "spark://fake:7077"))
    server.get.start()
  }
@@ -521,7 +521,7 @@
    rpcEnv = Some(RpcEnv.create("rest-with-filter", localhost, 0, conf, securityManager))
    val fakeMasterRef = rpcEnv.get.setupEndpoint("fake-master", new DummyMaster(rpcEnv.get))
    conf.set(MASTER_REST_SERVER_FILTERS.key, "org.apache.spark.ui.JWSFilter")
-   conf.set("spark.org.apache.spark.ui.JWSFilter.param.key", TEST_KEY)
+   conf.set("spark.org.apache.spark.ui.JWSFilter.param.secretKey", TEST_KEY)
    server = Some(new StandaloneRestServer(localhost, 0, conf, fakeMasterRef, "spark://fake:7077"))
    val port = server.get.start()
    val masterUrl = s"spark://$localhost:$port"
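For illustration, a client call against the filtered REST endpoint could look like the following sketch (the endpoint path, port, and `JWS_TOKEN` environment variable are placeholders, not part of the suite); it uses the JDK 11+ HTTP client:

```scala
import java.net.URI
import java.net.http.{HttpClient, HttpRequest, HttpResponse}

object RestClientSketch {
  def main(args: Array[String]): Unit = {
    val port = 6066                  // placeholder: use the port returned by server.get.start()
    val token = sys.env("JWS_TOKEN") // placeholder: a JWS signed with the same TEST_KEY
    val request = HttpRequest.newBuilder()
      .uri(URI.create(s"http://localhost:$port/v1/submissions/status/driver-0"))
      .header("Authorization", s"Bearer $token")
      .GET()
      .build()
    val response = HttpClient.newHttpClient()
      .send(request, HttpResponse.BodyHandlers.ofString())
    // Without the header the filter should reject the request (e.g. 403);
    // with a valid token the endpoint responds normally.
    println(s"${response.statusCode()}: ${response.body()}")
  }
}
```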
6 changes: 3 additions & 3 deletions core/src/test/scala/org/apache/spark/ui/JWSFilterSuite.scala
@@ -48,7 +48,7 @@ class JWSFilterSuite extends SparkFunSuite {
  test("Succeed to initialize") {
    val filter = new JWSFilter()
    val params = new JHashMap[String, String]
-   params.put("key", TEST_KEY)
+   params.put("secretKey", TEST_KEY)
    filter.init(new DummyFilterConfig(params))
  }

@@ -59,7 +59,7 @@

    val filter = new JWSFilter()
    val params = new JHashMap[String, String]
-   params.put("key", TEST_KEY)
+   params.put("secretKey", TEST_KEY)
    val conf = new DummyFilterConfig(params)
    filter.init(conf)

@@ -84,7 +84,7 @@

    val filter = new JWSFilter()
    val params = new JHashMap[String, String]
-   params.put("key", TEST_KEY)
+   params.put("secretKey", TEST_KEY)
    val conf = new DummyFilterConfig(params)
    filter.init(conf)
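As a side note on the `WeakKeyException` mentioned in the `JWSFilter` scaladoc, the following hedged sketch (assuming jjwt on the classpath; `WeakKeyDemo` is a hypothetical name, not part of this suite) shows why `TEST_KEY` must decode to at least 256 bits:

```scala
import java.util.Base64
import io.jsonwebtoken.io.Decoders
import io.jsonwebtoken.security.{Keys, WeakKeyException}

object WeakKeyDemo {
  def main(args: Array[String]): Unit = {
    // Anything under 256 bits is rejected by Keys.hmacShaKeyFor.
    val tooShort = Base64.getUrlEncoder.encodeToString("short".getBytes("UTF-8"))
    try {
      Keys.hmacShaKeyFor(Decoders.BASE64URL.decode(tooShort))
    } catch {
      case e: WeakKeyException => println(s"rejected as expected: ${e.getMessage}")
    }
    // A 32-byte (256-bit) key is the minimum accepted for HMAC-SHA256.
    val longEnough = Base64.getUrlEncoder.encodeToString(new Array[Byte](32))
    println(Keys.hmacShaKeyFor(Decoders.BASE64URL.decode(longEnough)).getAlgorithm)
  }
}
```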
6 changes: 5 additions & 1 deletion docs/configuration.md
@@ -794,7 +794,7 @@ Apart from these, the following properties are also available, and may be useful
 </tr>
 <tr>
   <td><code>spark.redaction.regex</code></td>
-  <td>(?i)secret|password|token|access[.]key</td>
+  <td>(?i)secret|password|token|access[.]?key</td>
Member (Author) commented:
Actually, this was missed in #47392.
  <td>
    Regex to decide which Spark configuration properties and environment variables in driver and
    executor environments contain sensitive information. When this regex matches a property key or
@@ -1733,6 +1733,10 @@ Apart from these, the following properties are also available, and may be useful
    <br /><code>spark.ui.filters=com.test.filter1</code>
    <br /><code>spark.com.test.filter1.param.name1=foo</code>
    <br /><code>spark.com.test.filter1.param.name2=bar</code>
    <br />
    <br />Note that some filters require additional dependencies. For example,
    the built-in <code>org.apache.spark.ui.JWSFilter</code> requires the
    <code>jjwt-impl</code> and <code>jjwt-jackson</code> jar files.
  </td>
  <td>1.0.0</td>
</tr>
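To see what the extra `?` buys, the following standalone sketch (the property name is hypothetical, and it does not go through Spark's actual redaction code path) compares the old and new patterns:

```scala
object RedactionRegexDemo {
  def main(args: Array[String]): Unit = {
    val oldRegex = "(?i)secret|password|token|access[.]key".r
    val newRegex = "(?i)secret|password|token|access[.]?key".r
    val prop = "spark.hadoop.fs.s3a.accessKey" // hypothetical property name
    println(oldRegex.findFirstIn(prop))        // None: requires a literal dot
    println(newRegex.findFirstIn(prop))        // Some("accessKey"): dot is optional
  }
}
```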
12 changes: 9 additions & 3 deletions docs/security.md
@@ -49,9 +49,13 @@ specified below, the secret must be defined by setting the `spark.authenticate.s
option. The same secret is shared by all Spark applications and daemons in that case, which limits
the security of these deployments, especially on multi-tenant clusters.

-The REST Submission Server does not support authentication. You should
-ensure that all network access to the REST API (port 6066 by default)
-is restricted to hosts that are trusted to submit jobs.
+The REST Submission Server supports the HTTP `Authorization` header with
+a cryptographically signed JSON Web Token via `JWSFilter`.
+To enable authorization, the Spark master should set the
+`spark.master.rest.filters=org.apache.spark.ui.JWSFilter` and
+`spark.org.apache.spark.ui.JWSFilter.param.secretKey=BASE64URL-ENCODED-KEY` configurations,
+and the client should provide an HTTP `Authorization` header containing a JSON Web Token
+signed with the shared secret key.

### YARN

@@ -351,6 +355,8 @@ The following options control the authentication of Web UIs:
<td><code>spark.ui.filters</code></td>
<td>None</td>
<td>
    Spark supports the HTTP <code>Authorization</code> header with a cryptographically signed
    JSON Web Token via <code>org.apache.spark.ui.JWSFilter</code>. <br />
See the <a href="configuration.html#spark-ui">Spark UI</a> configuration for how to configure
filters.
</td>
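A client could mint that token with the same jjwt library, for example with a sketch like this (assuming jjwt 0.12+; the `JWS_SECRET_KEY` variable and the `admin` subject are placeholders):

```scala
import io.jsonwebtoken.Jwts
import io.jsonwebtoken.io.Decoders
import io.jsonwebtoken.security.Keys

object MintToken {
  def main(args: Array[String]): Unit = {
    // The shared key must be the same BASE64URL-ENCODED-KEY configured on the master.
    val encodedKey = sys.env("JWS_SECRET_KEY")
    val key = Keys.hmacShaKeyFor(Decoders.BASE64URL.decode(encodedKey))
    // Sign a compact JWS; the algorithm is inferred from the key size (HS256 here).
    val jws = Jwts.builder().subject("admin").signWith(key).compact()
    println(s"Authorization: Bearer $jws")
  }
}
```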
19 changes: 19 additions & 0 deletions docs/spark-standalone.md
@@ -263,6 +263,14 @@ SPARK_MASTER_OPTS supports the following system properties:
</td>
<td>1.3.0</td>
</tr>
<tr>
<td><code>spark.master.rest.filters</code></td>
<td>(None)</td>
<td>
    Comma-separated list of filter class names to apply to the Master REST API.
</td>
<td>4.0.0</td>
</tr>
<tr>
<td><code>spark.master.useAppNameAsAppId.enabled</code></td>
<td><code>false</code></td>
@@ -676,6 +684,17 @@ The following is the response from the REST API for the above <code>create</code>
}
```

When the Spark master requires the HTTP <code>Authorization</code> header via the
<code>spark.master.rest.filters=org.apache.spark.ui.JWSFilter</code> and
<code>spark.org.apache.spark.ui.JWSFilter.param.secretKey=BASE64URL-ENCODED-KEY</code>
configurations, the <code>curl</code> CLI command can provide the required header as follows.

```bash
$ curl -XPOST http://IP:PORT/v1/submissions/create \
--header "Authorization: Bearer USER-PROVIDED-WEB-TOKEN-SIGNED-BY-THE-SAME-SHARED-KEY"
...
```

For <code>sparkProperties</code> and <code>environmentVariables</code>, users can use
placeholders for server-side environment variables like the following.
