EKS authentication failing #71
Comments
Having the same issue. I believe the root cause is that the plugin does not use the kubectl command, and its implementation does not support the custom command needed to authenticate to AWS.
Facing the same issue with
Will this plugin support blue/green deployments to EKS?
Also facing this issue when using EKS with the plugin. I'm not sure how this plugin works, but the official Kubernetes client supports
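For reference, the official clients authenticate to EKS through an exec-based credential plugin configured in the kubeconfig's user entry. A minimal sketch of such a block for `aws-iam-authenticator` (the cluster name `my-eks-cluster` and the `AWS_PROFILE` value are placeholders, and the exec API version shown is the v1alpha1 one in use around the time of this thread):

```yaml
# kubeconfig "users" entry delegating token generation to an external command.
# "my-eks-cluster" and the AWS_PROFILE value are placeholders.
users:
- name: aws
  user:
    exec:
      apiVersion: client.authentication.k8s.io/v1alpha1
      command: aws-iam-authenticator
      args:
        - token
        - -i
        - my-eks-cluster
      env:
        - name: AWS_PROFILE
          value: jenkins
```

A client that does not implement this exec mechanism will ignore the block and fall back to anonymous access.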
Having the same issue. Any update on this point?
Does anyone have a hint about this?
An alternative design would be to store this sensitive data in Secrets in the Kubernetes cluster and just use them in the Jenkins slave images.
Beware that 4.1.1 is broken in this regard.
Facing the same issue, any updates to this?
For me, it works only with 4.1.2 (the latest version of kubernetes-client at the moment).
It looks like the plugin cannot handle working with the
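One workaround, not confirmed in this thread but commonly used when a tool can only consume static kubeconfig credentials, is to create a Kubernetes service account for Jenkins and put its bearer token directly in the kubeconfig the plugin uses, bypassing `aws-iam-authenticator` entirely. A sketch (the user name and token are placeholders):

```yaml
# kubeconfig "users" entry with a static bearer token taken from a
# service-account secret; "jenkins-deployer" and the token are placeholders.
users:
- name: jenkins-deployer
  user:
    token: <decoded service-account token>
```

The trade-off is that the token is long-lived and grants whatever RBAC permissions the service account holds, so it should be scoped narrowly.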
Is this project still being maintained? cc: @ArieShout
+1
+1
Obligatory 6-month follow-up. Any updates? I would love to use this plugin for EKS.
I'm trying to set up a deployment job to an EKS cluster, which has been set up properly and used manually, but whenever the `kubernetesDeploy` pipeline step runs, it fails due to an authentication error.

In order to replicate manual cluster authentication, I've made sure the `aws-iam-authenticator` tool is available in all slave PATHs, and my pre-deploy stage generates the `~/.aws/credentials` file required for the authenticator to generate a token. It then appends the Jenkins IAM access key and ID as required for the profile. I've verified that the file is generated with the correct secrets and format.
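The generated file would look roughly like this, with the `jenkins` profile name described below; the key values shown are placeholders, not real credentials:

```ini
# ~/.aws/credentials -- sketch; both values are placeholders
[jenkins]
aws_access_key_id = AKIAXXXXXXXXXXXXXXXX
aws_secret_access_key = <secret access key>
```

`aws-iam-authenticator` reads this file when `AWS_PROFILE` (or the `--profile`-equivalent configuration) selects the `jenkins` profile.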
The stored kubeconfig in Jenkins is the one generated with the AWS CLI, as specified here, with a modification in the `user` block to specify that the profile used from the credentials file I generated previously should be the `jenkins` one. I also edited the cluster permissions as described in the EKS docs with the correct jenkins IAM user and permissions block.
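The cluster-side permissions edit referred to here is the `aws-auth` ConfigMap in `kube-system`, which maps IAM identities to Kubernetes users and groups. A sketch, assuming a placeholder account ID and the broad `system:masters` group (a narrower group would normally be preferable):

```yaml
# aws-auth ConfigMap mapping the jenkins IAM user to cluster permissions.
# The account ID (111122223333) and the group are placeholders.
apiVersion: v1
kind: ConfigMap
metadata:
  name: aws-auth
  namespace: kube-system
data:
  mapUsers: |
    - userarn: arn:aws:iam::111122223333:user/jenkins
      username: jenkins
      groups:
        - system:masters
```

If this mapping is missing or the ARN is wrong, authenticated requests are treated as `system:anonymous`.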
This works when using the `jenkins` profile to manually affect the cluster.

My expected behaviour is as follows:
1. The `~/.aws/credentials` file is generated on the slave with access keys from the credential binding plugin.
2. The plugin calls `aws-iam-authenticator` to generate a token for its API use, resulting in the tool generating a token with the credentials specified in `~/.aws/credentials`, and it proceeds to deploy as usual.

However, it seems that despite proper credentials, per my initial error log the plugin is not authenticating to the cluster correctly, resulting in `system:anonymous` permissions being applied and subsequent deployment failure. Have I just made a huge mistake in configuration somewhere along the line?
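One way to confirm that the slave environment itself can authenticate, independent of the plugin, is to generate a token manually on the slave; `my-eks-cluster` is a placeholder cluster name:

```sh
# Run on the Jenkins slave; "my-eks-cluster" is a placeholder cluster name.
# A JSON ExecCredential document containing a "token" field indicates the
# jenkins profile and authenticator setup work outside the plugin.
AWS_PROFILE=jenkins aws-iam-authenticator token -i my-eks-cluster
```

If this succeeds while the plugin still falls back to `system:anonymous`, the problem is in how the plugin reads the kubeconfig rather than in the credentials.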