
authorized_key with delegate_to not always adding keys to ~/.ssh/authorized_keys #29693

Closed
ansibot opened this issue Sep 11, 2017 · 14 comments
Labels
affects_2.0 This issue/PR affects Ansible v2.0 bot_closed bug This issue/PR relates to a bug. collection:ansible.posix collection Related to Ansible Collections work module This issue/PR relates to a module. needs_collection_redirect https://github.com/ansible/ansibullbot/blob/master/docs/collection_migration.md support:community This issue/PR relates to code supported by the Ansible community. system System category

Comments

@ansibot
Contributor

ansibot commented Sep 11, 2017

From @zachradtka on 2016-08-25T19:13:13Z

ISSUE TYPE
  • Bug Report
COMPONENT NAME

authorized_key

ANSIBLE VERSION
ansible 2.0.2.0
CONFIGURATION
OS / ENVIRONMENT

N/A

SUMMARY

Using delegate_to with authorized_key to set up passwordless SSH on a cluster does not work reliably. Only some of the keys get added to ~/.ssh/authorized_keys, even though Ansible reports that all keys have been added.

STEPS TO REPRODUCE

I have a cluster of 4 machines. I am attempting to set up passwordless SSH across the entire cluster for a single user (e.g. the user hdfs on a Hadoop cluster). What I am trying to do is:

  1. Generate passwordless SSH keys on each host, if they do not already exist
  2. Get the public key (id_rsa.pub) from each machine
  3. Add each machine's public key to every other machine's ~/.ssh/authorized_keys file

An example of what I am trying to accomplish is below.

For clarity, the variables I am using are defined as:

  • user is the target username
  • user_home_dir is the user's home directory
  • hosts is a group named "hosts" in the inventory file

- name: Ensure SSH keys don't already exist
  stat:
    path: "{{ user_home_dir }}/.ssh/id_rsa"
  become: yes
  become_user: "{{ user }}"
  become_method: sudo
  register: ssh_key
  tags:
    - ssh-keys

- name: Generate SSH keys for passwordless SSH
  shell: ssh-keygen -N '' -f {{ user_home_dir }}/.ssh/id_rsa
  become: yes
  become_user: "{{ user }}"
  become_method: sudo
  when: not ssh_key.stat.exists

- name: Get `id_rsa.pub`
  shell: cat {{ user_home_dir }}/.ssh/id_rsa.pub
  register: id_rsa_pub

# TODO: For debug purposes only
- name: DEBUG Check Keys
  debug:
    msg: "{{ id_rsa_pub.stdout }}"

- name: Add authorized key to all hosts
  authorized_key: 
    user: "{{ user }}"
    key: "{{ id_rsa_pub.stdout }}"
  delegate_to: "{{ item }}"
  with_items: "{{ groups['hosts'] }}"
  when: id_rsa_pub.stdout is defined

EXPECTED RESULTS

I expected all hosts to contain the public key of every other host. The easiest way to check this is to cat .ssh/authorized_keys on each host. If this worked, I would expect the following (assuming the hosts host_A, host_B, host_C, host_D).

The authorized keys on host_A:

[user@host_A ~]$ cat .ssh/authorized_keys
ssh-rsa <key_redacted> user@host_D
ssh-rsa <key_redacted> user@host_C
ssh-rsa <key_redacted> user@host_A
ssh-rsa <key_redacted> user@host_B

The authorized keys on host_B:

[user@host_B ~]$ cat .ssh/authorized_keys
ssh-rsa <key_redacted> user@host_D
ssh-rsa <key_redacted> user@host_C
ssh-rsa <key_redacted> user@host_A
ssh-rsa <key_redacted> user@host_B

The authorized keys on host_C:

[user@host_C ~]$ cat .ssh/authorized_keys
ssh-rsa <key_redacted> user@host_D
ssh-rsa <key_redacted> user@host_C
ssh-rsa <key_redacted> user@host_A
ssh-rsa <key_redacted> user@host_B

The authorized keys on host_D:

[user@host_D ~]$ cat .ssh/authorized_keys
ssh-rsa <key_redacted> user@host_D
ssh-rsa <key_redacted> user@host_C
ssh-rsa <key_redacted> user@host_A
ssh-rsa <key_redacted> user@host_B
ACTUAL RESULTS

The results vary every time the command is run. What is problematic is that Ansible reports that all hosts have added the public keys from all other hosts. The output of the command is below.

TASK [role_name : Add authorized key to all hosts] ***********
changed: [host_A -> host_A] => (item=host_A)
changed: [host_B -> host_A] => (item=host_A)
changed: [host_D -> host_A] => (item=host_A)
changed: [host_C -> host_A] => (item=host_A)
changed: [host_A -> host_B] => (item=host_B)
changed: [host_B -> host_B] => (item=host_B)
changed: [host_D -> host_B] => (item=host_B)
changed: [host_C -> host_B] => (item=host_B)
changed: [host_D -> host_C] => (item=host_C)
changed: [host_C -> host_C] => (item=host_C)
changed: [host_B -> host_C] => (item=host_C)
changed: [host_A -> host_C] => (item=host_C)
changed: [host_D -> host_D] => (item=host_D)
changed: [host_A -> host_D] => (item=host_D)
changed: [host_C -> host_D] => (item=host_D)
changed: [host_B -> host_D] => (item=host_D)

Looking at each host's authorized_keys file shows that only one host (host_B) has all of the other hosts' keys:

The authorized keys on host_A:

[user@host_A ~]$ cat .ssh/authorized_keys
ssh-rsa <key_redacted> user@host_A

The authorized keys on host_B:

[user@host_B ~]$ cat .ssh/authorized_keys
ssh-rsa <key_redacted> user@host_D
ssh-rsa <key_redacted> user@host_C
ssh-rsa <key_redacted> user@host_A
ssh-rsa <key_redacted> user@host_B

The authorized keys on host_C:

[user@host_C ~]$ cat .ssh/authorized_keys
ssh-rsa <key_redacted> user@host_C

The authorized keys on host_D:

[user@host_D ~]$ cat .ssh/authorized_keys
ssh-rsa <key_redacted> user@host_D

As further evidence of the error: when I run the playbook again, the authorized_key module should report a status of ok for each action that was already completed. Instead, a subsequent run shows that some of the statuses are changed.

changed: [host_B -> host_A] => (item=host_A)
changed: [host_C -> host_A] => (item=host_A)
ok: [host_A -> host_A] => (item=host_A)
changed: [host_D -> host_A] => (item=host_A)
ok: [host_A -> host_B] => (item=host_B)
ok: [host_B -> host_B] => (item=host_B)
ok: [host_C -> host_B] => (item=host_B)
ok: [host_D -> host_B] => (item=host_B)
changed: [host_B -> host_C] => (item=host_C)
changed: [host_A -> host_C] => (item=host_C)
ok: [host_C -> host_C] => (item=host_C)
changed: [host_D -> host_C] => (item=host_C)
changed: [host_B -> host_D] => (item=host_D)
changed: [host_A -> host_D] => (item=host_D)
changed: [host_C -> host_D] => (item=host_D)
ok: [host_D -> host_D] => (item=host_D)

For brevity, checking only host_A's authorized_keys file reveals that after a second run, host_C's public key is still missing.

[user@host_A ~]$ cat .ssh/authorized_keys
ssh-rsa <key_redacted> user@host_D
ssh-rsa <key_redacted> user@host_A
ssh-rsa <key_redacted> user@host_B
WORKAROUND

Because of this error, I have created the following workaround. Instead of using the authorized_key module, I just run a simple shell command.

- name: Add authorized key to all hosts
  shell: echo {{ id_rsa_pub.stdout }} >> {{ user_home_dir }}/.ssh/authorized_keys
  delegate_to: "{{ item }}"
  with_items: "{{ groups['hosts'] }}"
  become: yes
  become_user: "{{ user }}"
  become_method: sudo
  when: id_rsa_pub.stdout is defined
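
The shell workaround above appends unconditionally (so reruns duplicate keys) and is still exposed to the same concurrent-write race this issue describes. A hardened variant might serialize writers with flock(1) and guard the append with grep for idempotence. This is an editorial sketch, not part of the original report, and assumes flock is available on the targets:

```yaml
- name: Add authorized key to all hosts (locked, idempotent sketch)
  shell: |
    # flock serializes concurrent writers; grep -qxF skips keys already present
    flock {{ user_home_dir }}/.ssh/authorized_keys -c \
      'grep -qxF "{{ id_rsa_pub.stdout }}" {{ user_home_dir }}/.ssh/authorized_keys || \
       echo "{{ id_rsa_pub.stdout }}" >> {{ user_home_dir }}/.ssh/authorized_keys'
  delegate_to: "{{ item }}"
  with_items: "{{ groups['hosts'] }}"
  become: yes
  become_user: "{{ user }}"
  become_method: sudo
  when: id_rsa_pub.stdout is defined
```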

Copied from original issue: ansible/ansible-modules-core#4542

@ansibot
Contributor Author

ansibot commented Sep 11, 2017

From @ansibot on 2016-08-25T19:13:13Z

@ansible ping, this issue is waiting for your response.
click here for bot help

@ansibot
Contributor Author

ansibot commented Sep 11, 2017

From @gerhard-tinned on 2016-08-25T19:13:13Z

Version: ansible 2.1.1.0
OS: Linux Mint (management) / CentOS (target)

I have a slightly different setup but I noticed the effect as well. I generate an SSH key on one host and upload it using authorized_key with delegate_to to a second host. While testing this playbook I noticed that the key is not added to the second host's authorized_keys file every time.

Without delegate_to, I have not noticed this issue so far.

@ansibot ansibot added the affects_2.0 This issue/PR affects Ansible v2.0 label Sep 11, 2017
@ansibot
Contributor Author

ansibot commented Sep 11, 2017

From @gerhard-tinned on 2016-08-25T19:13:13Z

+1

Is this the wrong place for this ticket? Is this a dead project? Anyone except the ansibot??

@ansibot
Contributor Author

ansibot commented Sep 11, 2017

From @gerhard-tinned on 2016-08-25T19:13:13Z

I thought Ansible is an active project ???

@ansibot
Contributor Author

ansibot commented Sep 11, 2017

From @gaddman on 2016-08-25T19:13:13Z

I've seen this same behaviour using connection: local rather than delegate_to. Ansible version 2.1.2.0.

@ansibot
Contributor Author

ansibot commented Sep 11, 2017

From @sivel on 2016-08-25T19:13:13Z

This could very well be related to having multiple instances of the authorized_key module attempting to update a single file at once.

If you have multiple processes updating a single file simultaneously, the results are likely to be unexpected.

You can probably get around this by running the authorized_key task in a play that sets serial to 1, so that only one host is processed at a time.
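
A play-level sketch of that suggestion (editorial illustration; it assumes id_rsa_pub was registered as in the original report):

```yaml
- hosts: hosts
  serial: 1   # process one host at a time so authorized_keys writes never overlap
  tasks:
    - name: Add authorized key to all hosts
      authorized_key:
        user: "{{ user }}"
        key: "{{ id_rsa_pub.stdout }}"
      delegate_to: "{{ item }}"
      with_items: "{{ groups['hosts'] }}"
      when: id_rsa_pub.stdout is defined
```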

@ansibot
Contributor Author

ansibot commented Sep 11, 2017

From @gerhard-tinned on 2016-08-25T19:13:13Z

Is there anyone working on this ... except the ansibot??

@ansibot
Contributor Author

ansibot commented Sep 11, 2017

From @gerhard-tinned on 2016-08-25T19:13:13Z

What is the new ticket number now after the merge?

@ansibot ansibot added bug_report module This issue/PR relates to a module. support:core This issue/PR relates to code supported by the Ansible Engineering Team. labels Sep 11, 2017
@gerhard-tinned

Is there a plan on when this will be fixed?

@ansibot ansibot added bug This issue/PR relates to a bug. and removed bug_report labels Mar 1, 2018
openstack-gerrit pushed a commit to openstack/openstack-ansible-os_keystone that referenced this issue Mar 14, 2018
When delegating with the authorized_key module, writes of multiple keys
against the same host's file can occur at the same time, leading to
missing keys.[0]

To avoid conflicting delegation between hosts, the registered
'keystone_pubkey' fact now contains a list of the SSH keys of all hosts in
the current batch of the play, rather than only the key of the current host.
The first host within each batch will handle distribution of that
batch's keys to all hosts within the play.

[0] ansible/ansible#29693

Change-Id: I386e84eba46aa164db22618b7a6ac53b86eeeaf0
openstack-gerrit pushed a commit to openstack/openstack-ansible-os_keystone that referenced this issue Mar 14, 2018
When delegating with the authorized_key module, writes of multiple keys
against the same host's file can occur at the same time, leading to
missing keys.[0]

To avoid conflicting delegation between hosts, the registered
'keystone_pubkey' fact now contains a list of the SSH keys of all hosts in
the current batch of the play, rather than only the key of the current host.
The first host within each batch will handle distribution of that
batch's keys to all hosts within the play.

[0] ansible/ansible#29693

Change-Id: I386e84eba46aa164db22618b7a6ac53b86eeeaf0
(cherry picked from commit 97428cb)
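
The pattern described in these commit messages can be sketched roughly as follows (an editorial illustration of the approach, not the actual openstack-ansible code; variable names reuse those from the original report). One host performs all of the writes sequentially, so no two forks ever touch the same authorized_keys file at once:

```yaml
# Collect every host's public key (runs on all hosts in the play)
- name: Collect public keys from all hosts
  command: cat "{{ user_home_dir }}/.ssh/id_rsa.pub"
  register: id_rsa_pub

# Distribute every collected key to every host, driven from a single
# fork (run_once), so the writes to each authorized_keys file are serialized
- name: Distribute all keys from one host only
  authorized_key:
    user: "{{ user }}"
    key: "{{ hostvars[item.0].id_rsa_pub.stdout }}"
  delegate_to: "{{ item.1 }}"
  with_nested:
    - "{{ ansible_play_hosts }}"
    - "{{ ansible_play_hosts }}"
  run_once: true
```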
@ikr0m
Contributor

ikr0m commented Aug 24, 2018

+1

@ansibot ansibot added the system System category label Feb 17, 2019
@DavidVentura
Contributor

any updates on this? I have the exact same issue

@sigio
Contributor

sigio commented Dec 2, 2019

Still an issue... multiple simultaneous updates overwrite each other. This needs some locking or support for concurrent updates.

@sigio
Contributor

sigio commented Dec 2, 2019

Looks like this will be fixed with 'throttle: 1' in ansible-latest... 2.9 ?
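
For reference, the task-level throttle keyword (Ansible 2.9 and later) limits how many forks run a task concurrently. Applied to the task from the original report, it might look like this (editorial sketch):

```yaml
- name: Add authorized key to all hosts (serialized)
  authorized_key:
    user: "{{ user }}"
    key: "{{ id_rsa_pub.stdout }}"
  delegate_to: "{{ item }}"
  with_items: "{{ groups['hosts'] }}"
  throttle: 1   # only one worker runs this task at a time (Ansible >= 2.9)
  when: id_rsa_pub.stdout is defined
```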

@ansibot ansibot added collection Related to Ansible Collections work collection:ansible.posix needs_collection_redirect https://github.com/ansible/ansibullbot/blob/master/docs/collection_migration.md support:community This issue/PR relates to code supported by the Ansible community. and removed support:core This issue/PR relates to code supported by the Ansible Engineering Team. labels Apr 29, 2020
@ansibot ansibot added the needs_triage Needs a first human triage before being processed. label May 16, 2020
@ansibot
Contributor Author

ansibot commented Aug 17, 2020

Thank you very much for your interest in Ansible. Ansible has migrated much of the content into separate repositories to allow for more rapid, independent development. We are closing this issue/PR because this content has been moved to one or more collection repositories.

For further information, please see:
https://github.com/ansible/ansibullbot/blob/master/docs/collection_migration.md

@ansibot ansibot closed this as completed Aug 17, 2020
@sivel sivel removed the needs_triage Needs a first human triage before being processed. label Aug 17, 2020
@ansible ansible locked and limited conversation to collaborators Sep 14, 2020