NAME STATUS ROLES AGE VERSION
sigma01 Ready control-plane,master 642d v1.21.5
sigma02 Ready control-plane,master 642d v1.21.5
sigma03 Ready <none> 641d v1.21.5
sigma04 Ready <none> 641d v1.21.5
However, Spark 3.2.0 also tries to list persistentvolumeclaims, and the request is rejected:
2021-11-19 17:51:23.534 ERROR [ main] o.a.s.i.Logging : Uncaught exception in thread main
io.fabric8.kubernetes.client.KubernetesClientException: Failure executing: GET at: https://kubernetes.default.svc/api/v1/namespaces/development/persistentvolumeclaims?labelSelector=spark-app-selector%3Dspark-632f340566e44cd68a2d6f34c2ff7bb7. Message: Forbidden!Configured service account doesn't have access. Service account may have been revoked. persistentvolumeclaims is forbidden: User "system:serviceaccount:development:spark-operator-development-spark" cannot list resource "persistentvolumeclaims" in API group "" in the namespace "development".
at io.fabric8.kubernetes.client.dsl.base.OperationSupport.requestFailure(OperationSupport.java:639) ~[processing-assembly-0.15.78.jar:0.15.78]
at io.fabric8.kubernetes.client.dsl.base.OperationSupport.assertResponseCode(OperationSupport.java:576) ~[processing-assembly-0.15.78.jar:0.15.78]
at io.fabric8.kubernetes.client.dsl.base.OperationSupport.handleResponse(OperationSupport.java:543) ~[processing-assembly-0.15.78.jar:0.15.78]
at io.fabric8.kubernetes.client.dsl.base.OperationSupport.handleResponse(OperationSupport.java:504) ~[processing-assembly-0.15.78.jar:0.15.78]
at io.fabric8.kubernetes.client.dsl.base.OperationSupport.handleResponse(OperationSupport.java:487) ~[processing-assembly-0.15.78.jar:0.15.78]
at io.fabric8.kubernetes.client.dsl.base.BaseOperation.listRequestHelper(BaseOperation.java:163) ~[processing-assembly-0.15.78.jar:0.15.78]
at io.fabric8.kubernetes.client.dsl.base.BaseOperation.list(BaseOperation.java:672) ~[processing-assembly-0.15.78.jar:0.15.78]
at io.fabric8.kubernetes.client.dsl.base.BaseOperation.deleteList(BaseOperation.java:786) ~[processing-assembly-0.15.78.jar:0.15.78]
at io.fabric8.kubernetes.client.dsl.base.BaseOperation.delete(BaseOperation.java:704) ~[processing-assembly-0.15.78.jar:0.15.78]
at org.apache.spark.scheduler.cluster.k8s.KubernetesClusterSchedulerBackend.$anonfun$stop$6(KubernetesClusterSchedulerBackend.scala:138) ~[processing-assembly-0.15.78.jar:0.15.78]
at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1442) ~[processing-assembly-0.15.78.jar:0.15.78]
at org.apache.spark.scheduler.cluster.k8s.KubernetesClusterSchedulerBackend.stop(KubernetesClusterSchedulerBackend.scala:139) ~[processing-assembly-0.15.78.jar:0.15.78]
at org.apache.spark.scheduler.TaskSchedulerImpl.stop(TaskSchedulerImpl.scala:927) ~[processing-assembly-0.15.78.jar:0.15.78]
at org.apache.spark.scheduler.DAGScheduler.stop(DAGScheduler.scala:2516) ~[processing-assembly-0.15.78.jar:0.15.78]
at org.apache.spark.SparkContext.$anonfun$stop$12(SparkContext.scala:2086) ~[processing-assembly-0.15.78.jar:0.15.78]
at org.apache.spark.util.Utils$.tryLogNonFatalError(Utils.scala:1442) ~[processing-assembly-0.15.78.jar:0.15.78]
at org.apache.spark.SparkContext.stop(SparkContext.scala:2086) ~[processing-assembly-0.15.78.jar:0.15.78]
at org.apache.spark.deploy.SparkSubmit.$anonfun$runMain$13(SparkSubmit.scala:963) ~[processing-assembly-0.15.78.jar:0.15.78]
at org.apache.spark.deploy.SparkSubmit.$anonfun$runMain$13$adapted(SparkSubmit.scala:963) ~[processing-assembly-0.15.78.jar:0.15.78]
at scala.Option.foreach(Option.scala:437) [processing-assembly-0.15.78.jar:0.15.78]
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:963) [processing-assembly-0.15.78.jar:0.15.78]
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180) [processing-assembly-0.15.78.jar:0.15.78]
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203) [processing-assembly-0.15.78.jar:0.15.78]
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90) [processing-assembly-0.15.78.jar:0.15.78]
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1043) [processing-assembly-0.15.78.jar:0.15.78]
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1052) [processing-assembly-0.15.78.jar:0.15.78]
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) [processing-assembly-0.15.78.jar:0.15.78]
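One way to unblock this, sketched below, is to grant the job's service account the PVC verbs that Spark 3.2.0 uses. The namespace (development) and service account name (spark-operator-development-spark) are taken from the error message above; the Role/RoleBinding name (spark-pvc-access) is made up for illustration, and the verb list is a guess that covers the list/delete calls visible in the stack trace:

```yaml
# Hypothetical Role granting the persistentvolumeclaims access that the
# driver is denied above. Names and namespace come from the error log;
# adjust them to match your deployment.
apiVersion: rbac.authorization.k8s.io/v1
kind: Role
metadata:
  name: spark-pvc-access        # illustrative name
  namespace: development
rules:
  - apiGroups: [""]             # core API group, as in the error message
    resources: ["persistentvolumeclaims"]
    verbs: ["get", "list", "watch", "create", "delete", "deletecollection"]
---
apiVersion: rbac.authorization.k8s.io/v1
kind: RoleBinding
metadata:
  name: spark-pvc-access        # illustrative name
  namespace: development
roleRef:
  apiGroup: rbac.authorization.k8s.io
  kind: Role
  name: spark-pvc-access
subjects:
  - kind: ServiceAccount
    name: spark-operator-development-spark
    namespace: development
```

After applying this with kubectl apply -f, the deleteList call in KubernetesClusterSchedulerBackend.stop (visible in the trace above) should no longer be forbidden. Alternatively, the equivalent rule can be added to the operator chart's ClusterRole rather than a namespaced Role.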
The Spark operator was deployed with the chart's values (not reproduced here), and the resulting ClusterRole includes no rule for persistentvolumeclaims. Kubernetes is running on Flatcar stable; node versions are listed above.