[bitnami/spark] Allow to access Spark worker page easily #16227
Comments
Hi, it seems to me that this would be a case where adding a public service endpoint to each node might be necessary. You can check the …
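One way to expose a worker UI directly would be an extra Service. This is a minimal sketch, not from the thread: it assumes the Bitnami chart's standard label conventions and the Spark worker web UI's default port 8081, and the Service name and namespace are illustrative.

```yaml
# Illustrative NodePort Service exposing the Spark worker web UI.
# Label selectors follow the Bitnami chart's conventions and may need
# adjusting for your release.
apiVersion: v1
kind: Service
metadata:
  name: spark-worker-ui   # hypothetical name
  namespace: hm-spark
spec:
  type: NodePort
  selector:
    app.kubernetes.io/name: spark
    app.kubernetes.io/component: worker
  ports:
    - name: http
      port: 8081          # Spark worker web UI default port
      targetPort: 8081
```

Note that a single Service selecting all worker pods will load-balance across them, so reliable per-worker access would still need one Service per pod (or the reverse-proxy approach).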
Thanks @javsalgar! I think … However, for the redis-cluster Helm chart, …
Thanks! I will forward this to the engineering team, but we cannot guarantee an ETA. However, if you want to speed up the process, would you like to submit a PR? |
Thanks @javsalgar! Unfortunately, I don't think I have enough knowledge for this task.
Thanks @jeluizferreira! And thanks @javsalgar and @rafariossaa too!

I found the related documentation at https://github.com/bitnami/charts/tree/main/bitnami/spark#configuring-spark-master-as-reverse-proxy

In my case I am using Cloudflare Tunnel. Here are my steps: first I pointed spark.mydomain.com at the tunnel. Then I deployed Spark by:

```shell
helm upgrade \
  spark \
  spark \
  --install \
  --repo=https://charts.bitnami.com/bitnami \
  --namespace=hm-spark \
  --create-namespace \
  --values=my-values.yaml
```

my-values.yaml:

```yaml
master:
  configOptions:
    -Dspark.ui.reverseProxy=true
    -Dspark.ui.reverseProxyUrl=https://spark.mydomain.com
worker:
  configOptions:
    -Dspark.ui.reverseProxy=true
    -Dspark.ui.reverseProxyUrl=https://spark.mydomain.com
```

Then I am able to visit both master and worker pages! 😃
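For anyone verifying a setup like this: with `spark.ui.reverseProxy` enabled, Spark serves each worker UI through the master under a `/proxy/<worker-id>/` path. A quick check might look like the following sketch (the domain comes from the values above; the worker ID is illustrative, and your IDs will differ):

```shell
# List registered worker IDs via the master UI's JSON endpoint.
curl -s https://spark.mydomain.com/json/ | grep -o '"id" *: *"[^"]*"'

# Each worker page is then reachable through the master at /proxy/<worker-id>/,
# e.g. (illustrative worker ID):
curl -sI https://spark.mydomain.com/proxy/worker-20230301000000-10.42.0.5-35000/
```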
Name and Version
bitnami/spark 3.3.2-debian-11-r19
What is the problem this feature will solve?
It would be great to allow access to the Spark worker pages.
Prefect also has a master/worker pattern.
In the Prefect server (master) Helm chart, there is a publicApiUrl value, so that all UI-related URLs point to that URL.
If the existing Helm values cannot do this, it would be great to provide something similar, thanks! 😃
What is the feature you are proposing to solve the problem?
Originally asked at Stack Overflow. Below is a copy:
I have a local k3s Kubernetes created by Rancher Desktop.
I installed Spark by
Currently I am doing a port-forward to access the Spark master UI at http://localhost:4040/.
However, I won't be able to open the Spark worker job pages, as they use Kubernetes-internal cluster IPs.
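The port-forward step can be sketched as follows. This assumes the Bitnami chart's usual master Service naming (`<release>-master-svc`, HTTP on port 80) and the `hm-spark` namespace used elsewhere in this issue; adjust for your release.

```shell
# Forward the Spark master web UI to localhost:4040.
# "spark-master-svc" is the Bitnami chart's default service name for a
# release named "spark" and may differ for your installation.
kubectl port-forward --namespace hm-spark svc/spark-master-svc 4040:80
```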
Here is this chart values: https://github.com/bitnami/charts/blob/main/bitnami/spark/values.yaml
Is there any value I can set to help me access the Spark worker job page? Thanks!
I found a similar question about a Docker Swarm deployment, but it also has no accepted answer.
What alternatives have you considered?
spark-on-k8s-operator would be a potential option.