authorized_key with delegate_to not always adding keys to ~/.ssh/authorized_keys #29693
Comments
From @ansibot on 2016-08-25T19:13:13Z @ansible ping, this issue is waiting for your response.
From @gerhard-tinned on 2016-08-25T19:13:13Z Version: ansible 2.1.1.0. I have a slightly different setup, but I noticed the effect as well. I generate an ssh-key on one host and upload it using "authorized_key" with "delegate_to" to a second host. While testing this playbook I noticed that the ssh-key is not added to the second host's authorized_keys file every time. Without the delegate_to, I have not noticed this issue so far.
From @gerhard-tinned on 2016-08-25T19:13:13Z +1 Is this the wrong place for this ticket? Is this a dead project? Anyone except the ansibot??
From @gerhard-tinned on 2016-08-25T19:13:13Z I thought Ansible was an active project???
From @gaddman on 2016-08-25T19:13:13Z I've seen this same behaviour, using
From @sivel on 2016-08-25T19:13:13Z This could very well be related to having multiple instances of the `authorized_key` module attempting to update a single file at once. If you had multiple processes updating a single file simultaneously, the results are likely to be unexpected. You can probably get around this by running the
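The suggestion above is cut off; a common reading is to run the play serially. A minimal sketch under that assumption, reusing `user` and a per-host key file layout from the issue body below (not from the original comment):

```yaml
- hosts: hosts
  # With serial: 1 the play finishes on one host before starting the next,
  # so at most one delegated task edits any authorized_keys file at a time.
  serial: 1
  tasks:
    - name: Add this host's key to every host in the group
      authorized_key:
        user: "{{ user }}"
        key: "{{ lookup('file', 'keys/' + inventory_hostname + '.pub') }}"
      delegate_to: "{{ item }}"
      with_items: "{{ groups['hosts'] }}"
```

The trade-off is speed: the play no longer parallelizes across hosts, which is why later Ansible versions added the task-level `throttle` keyword mentioned further down.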
From @gerhard-tinned on 2016-08-25T19:13:13Z Is there anyone working on this... except the ansibot??
From @gerhard-tinned on 2016-08-25T19:13:13Z What is the new ticket number now after the merge?
Is there a plan for when this will be fixed?
When delegating with the authorized_key module, writes of multiple keys against the same host's file can occur at the same time, leading to missing keys.[0] To avoid conflicting delegation between hosts, the registered 'keystone_pubkey' fact now contains a list of the SSH keys of all hosts in the current batch of the play, rather than only the key of the current host. The first host within each batch will handle distribution of that batch's keys to all hosts within the play. [0] ansible/ansible#29693 Change-Id: I386e84eba46aa164db22618b7a6ac53b86eeeaf0 (cherry picked from commit 97428cb)
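Roughly, the pattern that commit message describes looks like the sketch below; the paths and task names are hypothetical stand-ins, not the actual keystone role code:

```yaml
- name: Register this host's public key as a fact
  slurp:
    src: /home/keystone/.ssh/id_rsa.pub   # hypothetical path
  register: keystone_pubkey

- name: First host of the batch distributes every collected key
  authorized_key:
    user: keystone
    key: "{{ hostvars[item.0].keystone_pubkey.content | b64decode }}"
  delegate_to: "{{ item.1 }}"
  with_nested:
    - "{{ ansible_play_batch }}"   # the keys gathered in this batch
    - "{{ ansible_play_hosts }}"   # delivered to every host in the play
  # Only one host performs the writes, so its loop serializes them and
  # no two processes touch the same authorized_keys file concurrently.
  when: inventory_hostname == ansible_play_batch[0]
```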
+1
Any updates on this? I have the exact same issue.
Still an issue... multiple simultaneous updates overwrite each other; this needs some locking or support for concurrent updates.
Looks like this will be fixed with `throttle: 1` in ansible-latest... 2.9?
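For reference, `throttle` is a task-level keyword introduced in Ansible 2.9. A sketch of how it would apply to the pattern in this issue (variable names taken from the report below):

```yaml
- name: Add this host's key to every host in the group
  authorized_key:
    user: "{{ user }}"
    key: "{{ lookup('file', 'keys/' + inventory_hostname + '.pub') }}"
  delegate_to: "{{ item }}"
  with_items: "{{ groups['hosts'] }}"
  # throttle: 1 caps the number of workers running this task at once,
  # so delegated writes to a given authorized_keys file cannot race.
  throttle: 1
```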
Thank you very much for your interest in Ansible. Ansible has migrated much of the content into separate repositories to allow for more rapid, independent development. We are closing this issue/PR because this content has been moved to one or more collection repositories.
For further information, please see:
From @zachradtka on 2016-08-25T19:13:13Z
ISSUE TYPE
Bug Report
COMPONENT NAME
authorized_key
ANSIBLE VERSION
CONFIGURATION
OS / ENVIRONMENT
N/A
SUMMARY
Utilizing `delegate_to` and `authorized_key` to implement passwordless SSH on a cluster does not work. Some, but not all, keys get added to `~/.ssh/authorized_keys`, while Ansible reports that all keys have been added.
STEPS TO REPRODUCE
I have a cluster of 4 machines. I am attempting to implement passwordless SSH on the entire cluster for a single user (e.g. the user hdfs on a Hadoop cluster). What I am trying to do is:

- Get the public key (`id_rsa.pub`) from each machine
- Add each key to every host's `~/.ssh/authorized_keys` file

An example of what I am trying to accomplish is below.
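The example playbook did not survive the migration of this issue; the sketch below reconstructs the described approach (module choices and key paths are assumptions, not the reporter's exact code):

```yaml
- hosts: hosts
  tasks:
    - name: Fetch each host's public key back to the controller
      fetch:
        src: "{{ user_home_dir }}/.ssh/id_rsa.pub"
        dest: "keys/{{ inventory_hostname }}.pub"
        flat: yes

    - name: Add this host's key to every host in the group
      authorized_key:
        user: "{{ user }}"
        key: "{{ lookup('file', 'keys/' + inventory_hostname + '.pub') }}"
      delegate_to: "{{ item }}"
      with_items: "{{ groups['hosts'] }}"
```

Because every host delegates its `authorized_key` writes to every other host in parallel, several processes can rewrite the same `~/.ssh/authorized_keys` at once, which is the race described in this issue.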
For clarity, the variables I am using are defined as:

- `user` is a user
- `user_home_dir` is the user's home directory
- `hosts` is a group named "hosts" in the inventory file

EXPECTED RESULTS
I expected all hosts to contain the public key of all other hosts. The easiest way to check this is to cat `.ssh/authorized_keys` on each host. If this did work, I would expect the following (assuming hosts: host_A, host_B, host_C, host_D).

The authorized keys on host_A:
The authorized keys on host_B:
The authorized keys on host_C:
The authorized keys on host_D:
ACTUAL RESULTS
The results vary every time the command is run. What is problematic is that Ansible reports that all hosts have added the public keys from all other hosts. The output of the command is below.
Looking at each host's `authorized_keys` file shows that only one host (host_B) has all of the other hosts' keys:

The authorized keys on host_A:
The authorized keys on host_B:
The authorized keys on host_C:
The authorized keys on host_D:
To further give evidence that there is an error: when I run the playbook again, the `authorized_key` module should report a status of `ok` for each action that was already completed. The result of a subsequent run shows that some of the statuses are `changed`.

For brevity, only checking host_A's authorized keys file reveals that after a second run, host_C's public key is still missing.
WORKAROUND
Because of this error, I have created the following workaround. Instead of using the `authorized_key` module, I just run a simple `shell` command.
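The actual command was also lost in migration; a sketch of an equivalent shell-based task (`pubkey` is a hypothetical stand-in for the key being distributed):

```yaml
- name: Append the key only if it is not already present
  shell: >
    grep -qxF '{{ pubkey }}' {{ user_home_dir }}/.ssh/authorized_keys
    || echo '{{ pubkey }}' >> {{ user_home_dir }}/.ssh/authorized_keys
  delegate_to: "{{ item }}"
  with_items: "{{ groups['hosts'] }}"
```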
Copied from original issue: ansible/ansible-modules-core#4542