Issue in Chapter 4 - Building K8s clusters with Ansible #41

Closed
benheu opened this issue Apr 3, 2020 · 3 comments

Comments

benheu commented Apr 3, 2020

I'm not sure whether more people are having this problem, but here's my situation:

  • I reach the section "Running the cluster build playbook"
  • I run the playbook with ansible-playbook -i inventory main.yml
  • The playbook can't connect to any of the VMs using their 192.168.7.x IPs

This is what I get from Ansible in verbose mode:

<192.168.7.4> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o StrictHostKeyChecking=no -o 'IdentityFile="/home/ben/.vagrant.d/insecure_private_key"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="vagrant"' -o ConnectTimeout=10 -o ControlPath=/home/ben/.ansible/cp/236294e04e 192.168.7.4 '/bin/sh -c '"'"'echo ~vagrant && sleep 0'"'"''
<192.168.7.2> (255, b'', b'Received disconnect from 192.168.7.2 port 22:2: Too many authentication failures\r\nDisconnected from 192.168.7.2 port 22\r\n')
fatal: [kube1]: UNREACHABLE! => {
"changed": false,
"msg": "Failed to connect to the host via ssh: Received disconnect from 192.168.7.2 port 22:2: Too many authentication failures\r\nDisconnected from 192.168.7.2 port 22",
"unreachable": true
}
...

The only way I could make it work was to add the hosts to my ~/.ssh/config file:

Host 192.168.7.2
Hostname 192.168.7.2
User vagrant
IdentityFile ~/.vagrant.d/insecure_private_key
IdentitiesOnly yes
Port 22
....

I'm not sure whether the problem comes from my local config or maybe my Vagrant version. Anyway, I thought I'd share this in case it helps someone running into the same issue.

Super interesting book, by the way 👍

My config:

Ubuntu 18.04.4
Vagrant 2.2.6
VirtualBox 6.0.18
Ansible 2.9.6
Python 3.6.9

geerlingguy (Owner) commented

@benheu - That is very strange. The only major difference I can see is that you're using Vagrant 2.2.6 (2.2.7 is the latest) and VirtualBox 6.0.18 (6.1.6 is the latest). Maybe there's a difference there?

If it works with the config in ~/.ssh/config, that seems to be the same as what Ansible was trying to use via Vagrant...

geerlingguy (Owner) commented

Interesting. While researching this further, I was led to an issue on a different project I maintain: geerlingguy/drupal-vm#70

In it, someone mentioned that IdentitiesOnly yes was the fix for them, too. It seems this problem can affect you if you use SSH identities and have more than 5 keys loaded via ssh-add: the agent offers every loaded key in turn, and sshd's default MaxAuthTries limit of 6 disconnects the client before the right key gets tried.

By specifying the key directly in your ~/.ssh/config file with IdentitiesOnly yes, ssh offers that key first (no matter how many identities are in your active SSH agent), and authentication works.
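
If you'd rather not add per-host entries, here's a sketch of the same fix applied through Ansible itself, assuming an ansible.cfg next to the playbook (note that setting ssh_args replaces Ansible's default SSH options, so the defaults are restated here):

[ssh_connection]
# Keep Ansible's default compression and connection multiplexing, then
# force ssh to offer only the configured key instead of every agent identity
ssh_args = -C -o ControlMaster=auto -o ControlPersist=60s -o IdentitiesOnly=yes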

Check how many keys are active with ssh-add -l and see if there are five or more.
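
For reference, these are standard OpenSSH agent commands (nothing project-specific):

# List the identities currently loaded in the ssh agent
ssh-add -l

# If there are five or more, one workaround is to clear them all
# (they can be re-added as needed afterwards)
ssh-add -D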

Closing this issue, as I believe that's both the cause and the fix. Thanks for posting your experience; hopefully it helps future readers who Google the same error!

benheu commented Apr 18, 2020

Nice catch! Indeed, I have 5 of them. Thanks for taking the time to look into this.
