
[SPARK-29021][K8S] Use mount path key to configure hostPath volume specific conf #25730

Closed
yaooqinn wants to merge 1 commit from the SPARK-29021 branch

Conversation

yaooqinn
Member

@yaooqinn yaooqinn commented Sep 9, 2019

What changes were proposed in this pull request?

Use mount.path as the key suffix to retrieve the host path value.
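
A minimal sketch of that idea (illustrative only, not the actual KubernetesVolumeUtils code; the object and method names are made up): when options.path is absent for a hostPath volume, reuse the value stored under mount.path instead of failing with the exception shown in the next section.

object HostPathKeySketch {
  def hostPathFor(properties: Map[String, String], volumeName: String): String = {
    val optionsKey = s"hostPath.$volumeName.options.path"
    val mountKey   = s"hostPath.$volumeName.mount.path"
    // Prefer the explicit options.path, otherwise fall back to mount.path.
    properties.getOrElse(optionsKey, properties(mountKey))
  }

  def main(args: Array[String]): Unit = {
    val props = Map("hostPath.spark-local-dir-0.mount.path" -> "/mnt/dfs/0")
    println(hostPathFor(props, "spark-local-dir-0")) // prints /mnt/dfs/0
  }
}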

Why are the changes needed?

Got the following exception while configuring a hostPath volume as spark.local.dir for k8s pods:

Exception in thread "main" java.util.NoSuchElementException: key not found: hostPath.spark-local-dir-5.options.path
	at scala.collection.MapLike.default(MapLike.scala:235)
	at scala.collection.MapLike.default$(MapLike.scala:234)
	at scala.collection.AbstractMap.default(Map.scala:63)
	at scala.collection.MapLike.apply(MapLike.scala:144)
	at scala.collection.MapLike.apply$(MapLike.scala:143)
	at scala.collection.AbstractMap.apply(Map.scala:63)
	at org.apache.spark.deploy.k8s.KubernetesVolumeUtils$.parseVolumeSpecificConf(KubernetesVolumeUtils.scala:70)
	at org.apache.spark.deploy.k8s.KubernetesVolumeUtils$.$anonfun$parseVolumesWithPrefix$1(KubernetesVolumeUtils.scala:43)
	at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:237)
	at scala.collection.immutable.HashSet$HashSet1.foreach(HashSet.scala:321)
	at scala.collection.immutable.HashSet$HashTrieSet.foreach(HashSet.scala:977)
	at scala.collection.TraversableLike.map(TraversableLike.scala:237)
	at scala.collection.TraversableLike.map$(TraversableLike.scala:230)
	at scala.collection.AbstractSet.scala$collection$SetLike$$super$map(Set.scala:51)
	at scala.collection.SetLike.map(SetLike.scala:104)
	at scala.collection.SetLike.map$(SetLike.scala:104)
	at scala.collection.AbstractSet.map(Set.scala:51)
	at org.apache.spark.deploy.k8s.KubernetesVolumeUtils$.parseVolumesWithPrefix(KubernetesVolumeUtils.scala:33)
	at org.apache.spark.deploy.k8s.KubernetesConf$.createDriverConf(KubernetesConf.scala:179)
	at org.apache.spark.deploy.k8s.submit.KubernetesClientApplication.run(KubernetesClientApplication.scala:214)
	at org.apache.spark.deploy.k8s.submit.KubernetesClientApplication.start(KubernetesClientApplication.scala:198)
	at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:920)
	at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:179)
	at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:202)
	at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:89)
	at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:999)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1008)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

Otherwise, you need to configure the path redundantly via both mount.path and options.path:

spark.kubernetes.driver.volumes.hostPath.spark-local-dir-0.mount.path=/mnt/dfs/0
spark.kubernetes.driver.volumes.hostPath.spark-local-dir-0.options.path=/mnt/dfs/0
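
With the proposed change, presumably only the mount path entry would be needed, since the host path would be taken from the same key, for example:

spark.kubernetes.driver.volumes.hostPath.spark-local-dir-0.mount.path=/mnt/dfs/0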

Does this PR introduce any user-facing change?

no

How was this patch tested?

manually verified

@SparkQA

SparkQA commented Sep 9, 2019

Test build #110345 has finished for PR 25730 at commit 82ce2fd.

  • This patch fails Spark unit tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@yaooqinn
Member Author

yaooqinn commented Sep 9, 2019

I guess I misunderstood the meaning of mount.path and options.path (options.path is the directory on the host node, while mount.path is the mount point inside the container), so I'm closing this.

@yaooqinn yaooqinn closed this Sep 9, 2019
@yaooqinn yaooqinn deleted the SPARK-29021 branch September 9, 2019 11:20