
Cannot create Fargate profile with existing VPC #2746

Closed
timharsch opened this issue Oct 16, 2020 · 2 comments

timharsch commented Oct 16, 2020

What happened?
Could not create a cluster with Fargate when using an existing VPC.

What you expected to happen?
The cluster should have been created without failing.

Command used:
eksctl create cluster -f outputs/fargate-with-vpc.yaml.beta11Vpc.yaml --verbose=4

How to reproduce it?
Include the steps to reproduce the bug.
If using a config, include it here, removing any sensitive information!

apiVersion: eksctl.io/v1alpha5
kind: ClusterConfig

metadata:
  name: beta11-eks
  region: us-east-1

vpc:
  id: "vpc-aaa"
  cidr: "10.191.0.0/16"
  subnets:
    public:
      us-east-1a:
        id: "subnet-xxx"
        cidr: "10.191.0.0/20"
      us-east-1b:
        id: "subnet-yyy"
        cidr: "10.191.32.0/20"

fargateProfiles:
  - name: fargate-default
    selectors:
      - namespace: default
      - namespace: kube-system

Anything else we need to know?
What OS are you using, are you using a downloaded binary or did you compile eksctl, what type of AWS credentials are you using (i.e. default/named profile, MFA) - please don't include actual credentials though!

Downloaded binary.

Versions
Please paste in the output of these commands:

eksctl version; echo -----; kubectl version
0.30.0-rc.1
-----
Client Version: version.Info{Major:"1", Minor:"19", GitVersion:"v1.19.2", GitCommit:"f5743093fd1c663cb0cbc89748f730662345d44d", GitTreeState:"clean", BuildDate:"2020-09-16T13:41:02Z", GoVersion:"go1.15", Compiler:"gc", Platform:"linux/amd64"}
Server Version: version.Info{Major:"1", Minor:"17+", GitVersion:"v1.17.9-eks-4c6976", GitCommit:"4c6976793196d70bc5cd29d56ce5440c9473648e", GitTreeState:"clean", BuildDate:"2020-07-17T18:46:04Z", GoVersion:"go1.13.9", Compiler:"gc", Platform:"linux/amd64"}

Logs
Include the output of the command line when running eksctl. If possible, eksctl should be run with debug logs. For example:
eksctl get clusters -v 4
Make sure you redact any sensitive information before posting.
If the output is long, please consider a Gist.

Log seen during create:

2020-10-16T03:46:12Z [✖]  failed to create Fargate profile "fargate-default" on EKS cluster "beta11-eks": failed to create Fargate profile "fargate-default": InvalidParameterException: No subnets were given, and the subnets provided during cluster creation are all public
{
  RespMetadata: {
    StatusCode: 400,
    RequestID: "ddd5910a-2425-4c8d-a81f-8b76eXXX"
  },
  Message_: "No subnets were given, and the subnets provided during cluster creation are all public"
}

The same error occurs with the create fargateprofile command:

$ eksctl create fargateprofile --namespace default --cluster beta11-eks --name fp-de
[ℹ]  creating Fargate profile "fp-de" on EKS cluster "beta11-eks"
Error: failed to create Fargate profile "fp-de" on EKS cluster "beta11-eks": failed to create Fargate profile "fp-de": InvalidParameterException: No subnets were given, and the subnets provided during cluster creation are all public
{
  RespMetadata: {
    StatusCode: 400,
    RequestID: "e237b53f-7816-4b68-840e-046031de7XXX"
  },
  Message_: "No subnets were given, and the subnets provided during cluster creation are all public"
}
@michaelbeaumont (Contributor) commented:

> Message_: "No subnets were given, and the subnets provided during cluster creation are all public"

Fargate profiles require private subnets, and it appears you haven't provided any during cluster creation or explicitly as a subnets property under fargateProfiles.
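
For anyone hitting this who isn't sure whether their subnets count as public: what matters is routing, not naming. A quick way to check (a sketch; subnet-xxx is a placeholder for a real subnet ID) is to inspect the route table associated with each subnet: a 0.0.0.0/0 route via an internet gateway (igw-...) makes the subnet public, while a 0.0.0.0/0 route via a NAT gateway keeps it private.

# Placeholder subnet ID; substitute your own.
# Note: subnets without an explicit association use the VPC's main route table
# and will not match this filter.
aws ec2 describe-route-tables \
  --filters Name=association.subnet-id,Values=subnet-xxx \
  --query "RouteTables[].Routes[].[DestinationCidrBlock,GatewayId,NatGatewayId]" \
  --output table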

@timharsch (Author) commented:

Thanks for the hints @michaelbeaumont. I was able to get past the error. For future travelers, this is what I did (verification sketches follow the config at the end):

  1. Switched to private subnets per the Fargate docs you provided.
  2. Tagged my subnets per the eksctl docs (see 6th bullet: tagging of subnets):

aws ec2 create-tags \
  --resources "${SUBA_ID}" "${SUBB_ID}" \
  --tags Key="kubernetes.io/cluster/${EKS_CLUSTER_NAME}",Value=shared
aws ec2 create-tags \
  --resources "${SUBA_ID}" "${SUBB_ID}" \
  --tags Key=kubernetes.io/role/internal-elb,Value=1

  3. Added a subnets property to fargateProfiles. Here's a sanitized version of my template:
apiVersion: eksctl.io/v1alpha5
kind: ClusterConfig

metadata:
  name: beta11-eks
  region: us-east-1

vpc:
  id: "vpc-vvvvvvvvvvvvvvvvv"
  cidr: "10.191.0.0/16"
  subnets: # Fargate requires private subnets
    private:
      us-east-1a:
        id: "subnet-xxxxxxxxxxxxxxxxx"
        cidr: "10.191.16.0/20"
      us-east-1b:
        id: "subnet-yyyyyyyyyyyyyyyyy"
        cidr: "10.191.48.0/20"

fargateProfiles:
  - name: fargate-default
    selectors:
      - namespace: default
      - namespace: kube-system
    subnets: [ "subnet-xxxxxxxxxxxxxxxxx", "subnet-yyyyyyyyyyyyyyyyy" ]
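
To confirm the tags landed, something like this should show them (a sketch; same placeholder subnet IDs as above):

# List each subnet's ID and tags to verify the kubernetes.io/* tags are present.
aws ec2 describe-subnets \
  --subnet-ids subnet-xxxxxxxxxxxxxxxxx subnet-yyyyyyyyyyyyyyyyy \
  --query "Subnets[].[SubnetId,Tags]" \
  --output json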
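
And after recreating the cluster with the fixed config, the profile itself can be checked with eksctl:

# Recreate with the fixed config, then list Fargate profiles to confirm
# fargate-default exists.
eksctl create cluster -f outputs/fargate-with-vpc.yaml.beta11Vpc.yaml --verbose=4
eksctl get fargateprofile --cluster beta11-eks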
