
Question/Request: Using one Argo controller to manage workflows in multiple clusters #1802

Closed
danxmoran opened this issue Nov 27, 2019 · 7 comments


@danxmoran
Contributor

Is this a BUG REPORT or FEATURE REQUEST?:

Question

What happened:

Depending on the project, my team needs to run different workflows in different k8s clusters. Our original plan was to run one instance of Argo in each cluster and update our client-side code to pick the appropriate cluster to use for each submission. This isn't ideal because:

  1. We have N Argo UIs to check instead of 1
  2. We always need at least one pod running in each cluster, which means we can't use node auto-scaling to scale down to 0

I just found this config, which makes it look like the Argo controller can orchestrate calls to external k8s clusters. How granular is that functionality? Is there a way to register multiple cluster contexts in the mounted kubeconfig, and specify the context on a per-submission basis?
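For reference, a minimal illustration (not from the issue) of what "registering multiple cluster contexts" in one mounted kubeconfig would look like; the cluster names, server URLs, and credentials below are placeholders:

```yaml
# Hypothetical kubeconfig with two contexts, one per target cluster.
apiVersion: v1
kind: Config
clusters:
  - name: cluster-a
    cluster:
      server: https://cluster-a.example.com   # placeholder API server URL
  - name: cluster-b
    cluster:
      server: https://cluster-b.example.com   # placeholder API server URL
users:
  - name: argo-sa
    user:
      token: <service-account-token>          # placeholder credential
contexts:
  - name: cluster-a
    context:
      cluster: cluster-a
      user: argo-sa
  - name: cluster-b
    context:
      cluster: cluster-b
      user: argo-sa
current-context: cluster-a
```

The question is whether a submission could select `cluster-a` or `cluster-b` at runtime rather than being fixed to `current-context`.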

What you expected to happen:

It would be ideal for our use-case if Workflow objects accepted an optional contextName field specifying the external cluster where its pods should run.
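For illustration only, and purely hypothetical (no such field exists in Argo), a Workflow manifest with the requested field might look like:

```yaml
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: example-
spec:
  contextName: cluster-b        # hypothetical: kubeconfig context whose cluster should run this workflow's pods
  entrypoint: main
  templates:
    - name: main
      container:
        image: alpine:3.10
        command: [echo, "hello"]
```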

@simster7
Member

Hmmm, I believe the config that you pointed out only allows one workflow controller to dispatch workflows to a single other K8s cluster. So if this were used as a solution to your problem, you would still need one controller per cluster you want to deploy to.

Don't take this for granted, but I believe that supporting the multi-context behavior you described would be a pretty large-scale change, and one that isn't currently on our roadmap, unfortunately.

@danxmoran
Contributor Author

Thanks for the info @simster7; it's understandable that the multi-context behavior is a ways out. As a workaround, do you think it'd be feasible for me to do something like the following:

  1. Run N Argo controller deployments within our "core" cluster, each in a different namespace and configured to point at a different external cluster using the existing kubeconfig options
  2. Run a single deployment of the Argo UI in the "core" cluster, with namespace isolation disabled

I think that setup would still address the main pain points I mentioned in the issue description (a rough sketch is below). I'll experiment with it on my own, but if you can see any immediate problems I'd appreciate the heads-up 😄
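An untested sketch of step 1, assuming the controller can be pointed at an external cluster via the kubeconfig options the issue references (the exact wiring depends on the Argo version). The namespace, secret name, and image tag are placeholders:

```yaml
# One of N namespaces in the "core" cluster, each holding its own controller.
apiVersion: v1
kind: Namespace
metadata:
  name: argo-cluster-a
---
# Kubeconfig for the external cluster this controller should dispatch to.
apiVersion: v1
kind: Secret
metadata:
  name: external-kubeconfig
  namespace: argo-cluster-a
stringData:
  kubeconfig: |
    # contents of the external cluster's kubeconfig go here
---
# Per-namespace workflow controller, otherwise configured like the standard
# install, with the external cluster's kubeconfig mounted so the controller's
# kubeconfig options can point at it.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: workflow-controller
  namespace: argo-cluster-a
spec:
  replicas: 1
  selector:
    matchLabels:
      app: workflow-controller
  template:
    metadata:
      labels:
        app: workflow-controller
    spec:
      serviceAccountName: argo                          # service account from the standard install
      containers:
        - name: workflow-controller
          image: argoproj/workflow-controller:v2.4.3    # placeholder tag
          volumeMounts:
            - name: kubeconfig
              mountPath: /kube
      volumes:
        - name: kubeconfig
          secret:
            secretName: external-kubeconfig
```

Repeating these manifests in a second namespace with a kubeconfig for the other cluster would give one controller per target cluster, while a single Argo UI deployment (step 2) watches all namespaces.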

@simster7
Member

I don't see any issues with that from the get-go, but I have not tried something like that myself, so let me know how it turns out!

Also, we are currently working on a refactor of the UI, and one of the ideas floating around is to let the user specify a kubeconfig file to use when communicating with the workflow controller. Do you think that could solve your problem?

@simster7
Member

simster7 commented Dec 3, 2019

Closing this, feel free to reopen if necessary

@simster7 simster7 closed this as completed Dec 3, 2019
@idristarwala

@simster7 we have a similar requirement, where we want to be able to deploy workflows to multiple clusters and also view all the workflows from a single Argo UI instance. Is the UI refactor still in the works?

@luozhaoyu

similar issue: #3523

@dabenson4

dabenson4 commented Mar 8, 2022

@danxmoran Hi, were you able to figure something out? I tried to implement your idea, but I only ended up with the kubeconfig set up on the init and wait containers; the pod still deploys on the core cluster instead of the external cluster tied to my second workflow controller. I even ran the workflow with the instanceid, and I can see it registered in that controller's logs.
