
Cygwin SSH is failing with mux_client_request_session: read from master failed: Connection reset by peer #6

Closed
rbeesley opened this Issue Mar 19, 2014 · 7 comments


rbeesley commented Mar 19, 2014

I was following the guide on https://servercheck.in/blog/running-ansible-within-windows.

I may have misunderstood what was meant by "if you would like to use Ansible as a provisioner for Vagrant..." I thought that meant standing up a Vagrant guest OS, and since I was starting my own Vagrant guest, I didn't think it applied to me. I thought the only hurdle I was running into was that the line endings were CRLF rather than LF, which I changed in Git. I can SSH into the Vagrant guest, so I was surprised when running the playbook failed. With verbose logging (-vvvv) turned on, the error looks different from what you were describing in the blog post:

$ ansible-playbook -vvvv <playbook.yml> -i host_inventories/hosts.yml --tags "" --private-key=/cygdrive/c/Users//.vagrant.d/insecure_private_key -u vagrant

PLAY [Deploy Services] ***********************************************

GATHERING FACTS ***************************************************************
<127.0.0.1> ESTABLISH CONNECTION FOR USER: vagrant
<127.0.0.1> REMOTE_MODULE setup
<127.0.0.1> EXEC ['ssh', '-C', '-tt', '-vvv', '-o', 'ControlMaster=auto', '-o', 'ControlPersist=60s', '-o', 'ControlPath=/home//.ansible/cp/ansible-ssh-%h-%p-%r', '-o', 'Port=2222', '-o', 'IdentityFile=/cygdrive/c/Users//.vagrant.d/insecure_private_key', '-o', 'KbdInteractiveAuthentication=no', '-o', 'PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey', '-o', 'PasswordAuthentication=no', '-o', 'User=vagrant', '-o', 'ConnectTimeout=10', '127.0.0.1', "/bin/sh -c 'mkdir -p /tmp/ansible-tmp-1395190782.76-199795475576977 && chmod a+rx /tmp/ansible-tmp-1395190782.76-199795475576977 && echo /tmp/ansible-tmp-1395190782.76-199795475576977'"]
fatal: [localhost] => SSH encountered an unknown error. The output was:
OpenSSH_6.5, OpenSSL 1.0.1f 6 Jan 2014
debug1: auto-mux: Trying existing master
debug1: Control socket "/home//.ansible/cp/ansible-ssh-127.0.0.1-2222-vagrant" does not exist
debug2: ssh_connect: needpriv 0
debug1: Connecting to 127.0.0.1 [127.0.0.1] port 2222.
debug2: fd 3 setting O_NONBLOCK
debug1: fd 3 clearing O_NONBLOCK
debug1: Connection established.
debug3: timeout: 10000 ms remain after connect
debug3: Incorrect RSA1 identifier
debug3: Could not load "/cygdrive/c/Users//.vagrant.d/insecure_private_key" as a RSA1 public key
debug1: identity file /cygdrive/c/Users//.vagrant.d/insecure_private_key type -1
debug1: identity file /cygdrive/c/Users//.vagrant.d/insecure_private_key-cert type -1
debug1: Enabling compatibility mode for protocol 2.0
debug1: Local version string SSH-2.0-OpenSSH_6.5
debug1: Remote protocol version 2.0, remote software version OpenSSH_5.9p1 Debian-5ubuntu1
debug1: match: OpenSSH_5.9p1 Debian-5ubuntu1 pat OpenSSH_5* compat 0x0c000000
debug2: fd 3 setting O_NONBLOCK
debug3: put_host_port: [127.0.0.1]:2222
debug3: load_hostkeys: loading entries for host "[127.0.0.1]:2222" from file "/home//.ssh/known_hosts"
debug3: load_hostkeys: found key type ECDSA in file /home//.ssh/known_hosts:2
debug3: load_hostkeys: loaded 1 keys
debug3: order_hostkeyalgs: prefer hostkeyalgs: ecdsa-sha2-nistp256-cert-v01@openssh.com,ecdsa-sha2-nistp384-cert-v01@openssh.com,ecdsa-sha2-nistp521-cert-v01@openssh.com,ecdsa-sha2-nistp256,ecdsa-sha2-nistp384,ecdsa-sha2-nistp521
debug1: SSH2_MSG_KEXINIT sent
debug1: SSH2_MSG_KEXINIT received
debug2: kex_parse_kexinit: curve25519-sha256@libssh.org,ecdh-sha2-nistp256,ecdh-sha2-nistp384,ecdh-sha2-nistp521,diffie-hellman-group-exchange-sha256,diffie-hellman-group-exchange-sha1,diffie-hellman-group14-sha1,diffie-hellman-group1-sha1
debug2: kex_parse_kexinit: ecdsa-sha2-nistp256-cert-v01@openssh.com,ecdsa-sha2-nistp384-cert-v01@openssh.com,ecdsa-sha2-nistp521-cert-v01@openssh.com,ecdsa-sha2-nistp256,ecdsa-sha2-nistp384,ecdsa-sha2-nistp521,ssh-ed25519-cert-v01@openssh.com,ssh-rsa-cert-v01@openssh.com,ssh-dss-cert-v01@openssh.com,ssh-rsa-cert-v00@openssh.com,ssh-dss-cert-v00@openssh.com,ssh-ed25519,ssh-rsa,ssh-dss
debug2: kex_parse_kexinit: aes128-ctr,aes192-ctr,aes256-ctr,arcfour256,arcfour128,aes128-gcm@openssh.com,aes256-gcm@openssh.com,chacha20-poly1305@openssh.com,aes128-cbc,3des-cbc,blowfish-cbc,cast128-cbc,aes192-cbc,aes256-cbc,arcfour,rijndael-cbc@lysator.liu.se
debug2: kex_parse_kexinit: aes128-ctr,aes192-ctr,aes256-ctr,arcfour256,arcfour128,aes128-gcm@openssh.com,aes256-gcm@openssh.com,chacha20-poly1305@openssh.com,aes128-cbc,3des-cbc,blowfish-cbc,cast128-cbc,aes192-cbc,aes256-cbc,arcfour,rijndael-cbc@lysator.liu.se
debug2: kex_parse_kexinit: hmac-md5-etm@openssh.com,hmac-sha1-etm@openssh.com,umac-64-etm@openssh.com,umac-128-etm@openssh.com,hmac-sha2-256-etm@openssh.com,hmac-sha2-512-etm@openssh.com,hmac-ripemd160-etm@openssh.com,hmac-sha1-96-etm@openssh.com,hmac-md5-96-etm@openssh.com,hmac-md5,hmac-sha1,umac-64@openssh.com,umac-128@openssh.com,hmac-sha2-256,hmac-sha2-512,hmac-ripemd160,hmac-ripemd160@openssh.com,hmac-sha1-96,hmac-md5-96
debug2: kex_parse_kexinit: hmac-md5-etm@openssh.com,hmac-sha1-etm@openssh.com,umac-64-etm@openssh.com,umac-128-etm@openssh.com,hmac-sha2-256-etm@openssh.com,hmac-sha2-512-etm@openssh.com,hmac-ripemd160-etm@openssh.com,hmac-sha1-96-etm@openssh.com,hmac-md5-96-etm@openssh.com,hmac-md5,hmac-sha1,umac-64@openssh.com,umac-128@openssh.com,hmac-sha2-256,hmac-sha2-512,hmac-ripemd160,hmac-ripemd160@openssh.com,hmac-sha1-96,hmac-md5-96
debug2: kex_parse_kexinit: zlib@openssh.com,zlib,none
debug2: kex_parse_kexinit: zlib@openssh.com,zlib,none
debug2: kex_parse_kexinit:
debug2: kex_parse_kexinit:
debug2: kex_parse_kexinit: first_kex_follows 0
debug2: kex_parse_kexinit: reserved 0
debug2: kex_parse_kexinit: ecdh-sha2-nistp256,ecdh-sha2-nistp384,ecdh-sha2-nistp521,diffie-hellman-group-exchange-sha256,diffie-hellman-group-exchange-sha1,diffie-hellman-group14-sha1,diffie-hellman-group1-sha1
debug2: kex_parse_kexinit: ssh-rsa,ssh-dss,ecdsa-sha2-nistp256
debug2: kex_parse_kexinit: aes128-ctr,aes192-ctr,aes256-ctr,arcfour256,arcfour128,aes128-cbc,3des-cbc,blowfish-cbc,cast128-cbc,aes192-cbc,aes256-cbc,arcfour,rijndael-cbc@lysator.liu.se
debug2: kex_parse_kexinit: aes128-ctr,aes192-ctr,aes256-ctr,arcfour256,arcfour128,aes128-cbc,3des-cbc,blowfish-cbc,cast128-cbc,aes192-cbc,aes256-cbc,arcfour,rijndael-cbc@lysator.liu.se
debug2: kex_parse_kexinit: hmac-md5,hmac-sha1,umac-64@openssh.com,hmac-sha2-256,hmac-sha2-256-96,hmac-sha2-512,hmac-sha2-512-96,hmac-ripemd160,hmac-ripemd160@openssh.com,hmac-sha1-96,hmac-md5-96
debug2: kex_parse_kexinit: hmac-md5,hmac-sha1,umac-64@openssh.com,hmac-sha2-256,hmac-sha2-256-96,hmac-sha2-512,hmac-sha2-512-96,hmac-ripemd160,hmac-ripemd160@openssh.com,hmac-sha1-96,hmac-md5-96
debug2: kex_parse_kexinit: none,zlib@openssh.com
debug2: kex_parse_kexinit: none,zlib@openssh.com
debug2: kex_parse_kexinit:
debug2: kex_parse_kexinit:
debug2: kex_parse_kexinit: first_kex_follows 0
debug2: kex_parse_kexinit: reserved 0
debug2: mac_setup: found hmac-md5
debug1: kex: server->client aes128-ctr hmac-md5 zlib@openssh.com
debug2: mac_setup: found hmac-md5
debug1: kex: client->server aes128-ctr hmac-md5 zlib@openssh.com
debug1: sending SSH2_MSG_KEX_ECDH_INIT
debug1: expecting SSH2_MSG_KEX_ECDH_REPLY
debug1: Server host key: ECDSA XX:XX:XX:XX:XX:XX:XX:XX:XX:XX:XX:XX:XX:XX:XX:XX
debug3: put_host_port: [127.0.0.1]:2222
debug3: put_host_port: [127.0.0.1]:2222
debug3: load_hostkeys: loading entries for host "[127.0.0.1]:2222" from file "/home//.ssh/known_hosts"
debug3: load_hostkeys: found key type ECDSA in file /home//.ssh/known_hosts:2
debug3: load_hostkeys: loaded 1 keys
debug1: Host '[127.0.0.1]:2222' is known and matches the ECDSA host key.
debug1: Found key in /home//.ssh/known_hosts:2
debug1: ssh_ecdsa_verify: signature correct
debug2: kex_derive_keys
debug2: set_newkeys: mode 1
debug1: SSH2_MSG_NEWKEYS sent
debug1: expecting SSH2_MSG_NEWKEYS
debug2: set_newkeys: mode 0
debug1: SSH2_MSG_NEWKEYS received
debug1: Roaming not allowed by server
debug1: SSH2_MSG_SERVICE_REQUEST sent
debug2: service_accept: ssh-userauth
debug1: SSH2_MSG_SERVICE_ACCEPT received
debug2: key: /cygdrive/c/Users//.vagrant.d/insecure_private_key (0x0), explicit
debug1: Authentications that can continue: publickey,password
debug3: start over, passed a different list publickey,password
debug3: preferred gssapi-with-mic,gssapi-keyex,hostbased,publickey
debug3: authmethod_lookup publickey
debug3: remaining preferred: ,gssapi-keyex,hostbased,publickey
debug3: authmethod_is_enabled publickey
debug1: Next authentication method: publickey
debug1: Trying private key: /cygdrive/c/Users//.vagrant.d/insecure_private_key
debug1: key_parse_private2: missing begin marker
debug1: read PEM private key done: type RSA
debug3: sign_and_send_pubkey: RSA xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx:xx
debug2: we sent a publickey packet, wait for reply
debug1: Enabling compression at level 6.
debug1: Authentication succeeded (publickey).
Authenticated to 127.0.0.1 ([127.0.0.1]:2222).
debug1: setting up multiplex master socket
debug3: muxserver_listen: temporary control path /home//.ansible/cp/ansible-ssh-127.0.0.1-2222-vagrant.q4roHdSf4LGT5CHm
debug2: fd 4 setting O_NONBLOCK
debug3: fd 4 is O_NONBLOCK
debug3: fd 4 is O_NONBLOCK
debug1: channel 0: new [/home/beesleyr/.ansible/cp/ansible-ssh-127.0.0.1-2222-vagrant]
debug3: muxserver_listen: mux listener channel 0 fd 4
debug2: fd 3 setting TCP_NODELAY
debug3: packet_set_tos: set IP_TOS 0x08
debug1: control_persist_detach: backgrounding master process
debug2: control_persist_detach: background process is 13356
debug1: forking to background
debug1: Entering interactive session.
debug2: set_control_persist_exit_time: schedule exit in 60 seconds
debug1: multiplexing control connection
debug3: fd 5 is O_NONBLOCK
debug2: fd 4 setting O_NONBLOCK
debug3: fd 5 is O_NONBLOCK
debug1: channel 1: new [mux-control]
debug3: channel_post_mux_listener: new mux channel 1 fd 5
debug3: mux_master_read_cb: channel 1: hello sent
debug2: set_control_persist_exit_time: cancel scheduled exit
debug3: mux_master_read_cb: channel 1 packet type 0x00000001 len 4
debug2: process_mux_master_hello: channel 1 slave version 4
debug2: mux_client_hello_exchange: master version 4
debug3: mux_client_forwards: request forwardings: 0 local, 0 remote
debug3: mux_client_request_session: entering
debug3: mux_client_request_alive: entering
debug3: mux_master_read_cb: channel 1 packet type 0x10000004 len 4
debug2: process_mux_alive_check: channel 1: alive check
debug3: mux_client_request_alive: done pid = 14104
debug3: mux_master_read_cb: channel 1 packet type 0x10000002 len 225
debug2: process_mux_new_session: channel 1: request tty 1, X 0, agent 0, subsys 0, term "xterm", cmd "/bin/sh -c 'mkdir -p /tmp/ansible-tmp-1395190782.76-199795475576977 && chmod a+rx /tmp/ansible-tmp-1395190782.76-199795475576977 && echo /tmp/ansible-tmp-1395190782.76-199795475576977'", env 0
debug3: mux_client_request_session: session request sent
mm_receive_fd: no message header
process_mux_new_session: failed to receive fd 0 from slave
debug1: channel 1: mux_rcb failed
debug2: channel 1: zombie
debug2: channel 1: gc: notify user
debug3: mux_master_control_cleanup_cb: entering for channel 1
debug2: channel 1: gc: user detached
debug2: channel 1: zombie
debug2: channel 1: garbage collecting
debug1: channel 1: free: mux-control, nchannels 2
debug3: channel 1: status: The following connections are open:

mux_client_request_session: read from master failed: Connection reset by peer
Failed to connect to new control master
debug2: set_control_persist_exit_time: schedule exit in 60 seconds

TASK: [stop-services] *******************************************
FATAL: no hosts matched or all hosts have already failed -- aborting

PLAY RECAP ********************************************************************
to retry, use: --limit @/home//playbook.retry

localhost : ok=0 changed=0 unreachable=1 failed=0


Owner

geerlingguy commented Mar 23, 2014

This is one of the reasons I gave up on using Ansible provisioners in Windows and used this project (on GitHub) instead. See http://cygwin.com/ml/cygwin/2010-08/msg00088.html. Basically, Cygwin doesn't support the way Ansible does SSH. You may be able to work around this using a different connection method or something, but it's hard (if not impossible) to get Ansible working from within Windows/Cygwin.

It's better to create another VM with Vagrant, install Ansible on that, then use that VM to connect to and provision the original VM. Or you can use this script (JJG-Ansible-Windows) as a shell provisioner within Vagrant to launch Ansible from within your VM and run it through the shell (rather than directly within Cygwin/Windows).
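For reference, the shell-provisioner approach described above might look roughly like this in a Vagrantfile. This is a sketch, not taken from the JJG-Ansible-Windows docs: the script path and playbook location are illustrative assumptions.

```
Vagrant.configure("2") do |config|
  config.vm.box = "ubuntu/trusty64"

  # Run Ansible from inside the guest via a shell wrapper script,
  # instead of invoking ansible-playbook from Cygwin on the Windows host.
  # The script path and playbook path below are placeholders.
  config.vm.provision "shell" do |sh|
    sh.path = "JJG-Ansible-Windows/windows.sh"
    sh.args = "/vagrant/provisioning/playbook.yml"
  end
end
```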


joshspivey commented Aug 6, 2014

@rbeesley @geerlingguy The reason you are having this issue is that you have to use PowerShell, not Cygwin, and you have to add this to your provisioning shell script:

pip install http://github.com/diyan/pywinrm/archive/master.zip#egg=pywinrm

This will install the tools needed to communicate properly.
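For context, pywinrm enables Ansible's winrm connection plugin, which is how Ansible manages Windows hosts without SSH. A minimal inventory sketch follows; the host address and credentials are placeholders, not taken from this thread:

```
[windows]
winhost ansible_host=192.168.33.20

[windows:vars]
ansible_user=vagrant
ansible_password=vagrant
ansible_connection=winrm
ansible_winrm_server_cert_validation=ignore
```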


webloginwu commented May 5, 2017

My error message:
$ ansible hp -m raw -a "hostname"
hp | UNREACHABLE! => {
"changed": false,
"msg": "Failed to connect to the host via ssh: mux_client_request_session: read from master failed: Connection reset by peer\r\nFailed to connect to new control master\r\n",
"unreachable": true
}

According to this: http://everythingshouldbevirtual.com/ansible-using-ansible-on-windows-via-cygwin , it works.
The error above comes from using Cygwin and can easily be solved by creating an ansible.cfg file in your playbook folder with the following:
nano ansible.cfg
....
[ssh_connection]
ssh_args = -o ControlMaster=no


Secrole6789 commented Jun 7, 2017

Hi webloginwu, your suggestion:

"This error above is from using Cygwin and can easily be solved by creating an ansible.cfg file in your playbook folder with the following.
nano ansible.cfg
....
[ssh_connection]
ssh_args = -o ControlMaster=no"

helped me to solve my problem. Thank you so much.


Owner

geerlingguy commented Jun 7, 2017

@Secrole6789 - Awesome! Glad that @webloginwu helped. Note that this particular project is no longer maintained, but I'm going to keep it up so people can continue to use it for reference.


gavenkoa commented Jul 7, 2018

I succeeded with an alternative SSH connectivity option in the inventory file:

all:
  hosts:
    lighttpd:
      ansible_host: 192.168.33.10
      ansible_port: 22
      ansible_user: root
      ansible_connection: paramiko

Although a local ansible.cfg fixes connectivity too.

I don't understand why the following inventory options are not passed to ssh:

  ansible_ssh_common_args: "-o ControlMaster=no"
  ansible_ssh_extra_args: "-o ControlMaster=no"

gavenkoa commented Jul 7, 2018

Another easy fix is adding an env var:

export ANSIBLE_SSH_ARGS='-o ControlMaster=no'
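The variable applies to every Ansible run in the current shell session. A minimal sketch of setting and verifying it (nothing here invokes Ansible itself):

```shell
# Disable SSH connection multiplexing, which Cygwin's OpenSSH reportedly
# cannot support because it lacks file-descriptor passing over Unix sockets.
export ANSIBLE_SSH_ARGS='-o ControlMaster=no'

# Confirm the value Ansible will pick up.
echo "$ANSIBLE_SSH_ARGS"
# prints: -o ControlMaster=no
```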

sevenfourk added a commit to sevenfourk/dotfiles that referenced this issue Jul 31, 2018
