
Synchronize module fails with multiple items #4855

Closed
akuznecov opened this issue Nov 8, 2013 · 5 comments · Fixed by #4856
Labels: bug, needs_info

Comments

@akuznecov

The synchronize module adds an unwanted path fragment to the rsync destination when multiple items are used (with_items).

Part of a task using the synchronize action with a with_items directive:

- name: 'SOLR | Deploy cores configuration files for instances'
  synchronize: src=projects/{{project_name}}/core-conf/ dest=/opt/solr/home/cores/{{item.name}}/conf/ archive=yes delete=yes rsync_path="sudo rsync"
  register: solrxml
  when: item.solr is defined and item.solr == 'present'
  with_items: instances
  tags:
    - solr
    - solr-conf
    - search

Got output:

TASK: [solr | SOLR | Deploy cores configuration files for instances] **********
<127.0.0.1> EXEC ['/bin/sh', '-c', 'mkdir -p $HOME/.ansible/tmp/ansible-1383930775.88-164589547782291 && chmod a+rx $HOME/.ansible/tmp/ansible-1383930775.88-164589547782291 && echo $HOME/.ansible/tmp/ansible-1383930775.88-164589547782291']
<127.0.0.1> PUT /tmp/tmp6nU_52 TO /home/xt/.ansible/tmp/ansible-1383930775.88-164589547782291/synchronize
<127.0.0.1> EXEC ['/bin/sh', '-c', '/usr/bin/env python /home/xt/.ansible/tmp/ansible-1383930775.88-164589547782291/synchronize; rm -rf /home/xt/.ansible/tmp/ansible-1383930775.88-164589547782291/ >/dev/null 2>&1']
changed: [10.1.250.10] => (item={'autopull': 'present', 'cron': 'present', 'solr': 'present', 'name': 'dev', 'vhost': 'application', 'domains': 'dev.test.local', 'docroot': u'/var/www/dev', 'db': 'present', 'auth': 'present'}) => {"changed": true, "cmd": "rsync --delay-updates --compress --timeout=10 --delete-after --archive --rsh 'ssh -o StrictHostKeyChecking=no' --rsync-path 'sudo rsync' --out-format='<>%i %n%L' projects/test/core-conf/ vagrant@10.1.250.10:/opt/solr/home/cores/dev/conf/", "item": {"auth": "present", "autopull": "present", "cron": "present", "db": "present", "docroot": "/var/www/dev", "domains": "dev.test.local", "name": "dev", "solr": "present", "vhost": "application"}, "msg": ".d....og... ./\n.f....og... admin-extra.html\n.f....og... currency.xml\n.f....og... elevate.xml\n.f....og... mapping-FoldToASCII.txt\n.f....og... mapping-ISOLatin1Accent.txt\n.f....og... protwords.txt\n.f....og... schema.xml\n.f....og... schema_extra_fields.xml\n.f....og... schema_extra_types.xml\n.f....og... scripts.conf\n.f....og... solrconfig.xml\n.f....og... solrconfig_extra.xml\n.f....og... solrcore.properties\n.f....og... spellings.txt\n.f....og... stopwords.txt\n.f....og... synonyms.txt\n.d....og... lang/\n.f....og... lang/contractions_ca.txt\n.f....og... lang/contractions_fr.txt\n.f....og... lang/contractions_ga.txt\n.f....og... lang/contractions_it.txt\n.f....og... lang/hyphenations_ga.txt\n.f....og... lang/stemdict_nl.txt\n.f....og... lang/stoptags_ja.txt\n.f....og... lang/stopwords_ar.txt\n.f....og... lang/stopwords_bg.txt\n.f....og... lang/stopwords_ca.txt\n.f....og... lang/stopwords_cz.txt\n.f....og... lang/stopwords_da.txt\n.f....og... lang/stopwords_de.txt\n.f....og... lang/stopwords_el.txt\n.f....og... lang/stopwords_en.txt\n.f....og... lang/stopwords_es.txt\n.f....og... lang/stopwords_eu.txt\n.f....og... lang/stopwords_fa.txt\n.f....og... lang/stopwords_fi.txt\n.f....og... lang/stopwords_fr.txt\n.f....og... lang/stopwords_ga.txt\n.f....og... lang/stopwords_gl.txt\n.f....og... lang/stopwords_hi.txt\n.f....og... lang/stopwords_hu.txt\n.f....og... lang/stopwords_hy.txt\n.f....og... lang/stopwords_id.txt\n.f....og... lang/stopwords_it.txt\n.f....og... lang/stopwords_ja.txt\n.f....og... lang/stopwords_lv.txt\n.f....og... lang/stopwords_nl.txt\n.f....og... lang/stopwords_no.txt\n.f....og... lang/stopwords_pt.txt\n.f....og... lang/stopwords_ro.txt\n.f....og... lang/stopwords_ru.txt\n.f....og... lang/stopwords_sv.txt\n.f....og... lang/stopwords_th.txt\n.f....og... lang/stopwords_tr.txt\n.f....og... lang/userdict_ja.txt\n.d....og... velocity/\n.f....og... velocity/VM_global_library.vm\n.f....og... velocity/browse.vm\n.f....og... velocity/cluster.vm\n.f....og... velocity/clusterResults.vm\n.f....og... velocity/doc.vm\n.f....og... velocity/facet_fields.vm\n.f....og... velocity/facet_queries.vm\n.f....og... velocity/facet_ranges.vm\n.f....og... velocity/facets.vm\n.f....og... velocity/footer.vm\n.f....og... velocity/head.vm\n.f....og... velocity/header.vm\n.f....og... velocity/hit.vm\n.f....og... velocity/hitGrouped.vm\n.f....og... velocity/jquery.autocomplete.css\n.f....og... velocity/jquery.autocomplete.js\n.f....og... velocity/layout.vm\n.f....og... velocity/main.css\n.f....og... velocity/query.vm\n.f....og... velocity/querySpatial.vm\n.f....og... velocity/suggest.vm\n.f....og... velocity/tabs.vm\n.d....og... xslt/\n.f....og... xslt/example.xsl\n.f....og... xslt/example_atom.xsl\n.f....og... xslt/example_rss.xsl\n.f....og... xslt/luke.xsl\n.f....og... xslt/updateXml.xsl\n", "rc": 0}
<127.0.0.1> EXEC ['/bin/sh', '-c', 'mkdir -p $HOME/.ansible/tmp/ansible-1383930776.06-136953182056861 && chmod a+rx $HOME/.ansible/tmp/ansible-1383930776.06-136953182056861 && echo $HOME/.ansible/tmp/ansible-1383930776.06-136953182056861']
<127.0.0.1> PUT /tmp/tmpKJBXPa TO /home/xt/.ansible/tmp/ansible-1383930776.06-136953182056861/synchronize
<127.0.0.1> EXEC ['/bin/sh', '-c', '/usr/bin/env python /home/xt/.ansible/tmp/ansible-1383930776.06-136953182056861/synchronize; rm -rf /home/xt/.ansible/tmp/ansible-1383930776.06-136953182056861/ >/dev/null 2>&1']
failed: [10.1.250.10] => (item={'autopull': 'absent', 'cron': 'absent', 'solr': 'present', 'name': 'stage', 'vhost': 'static', 'domains': 'stage.test.local stg.test.local', 'docroot': u'/var/www/stage/current', 'db': 'present', 'auth': 'present'}) => {"cmd": "rsync --delay-updates --compress --timeout=10 --delete-after --archive --rsh 'ssh -o StrictHostKeyChecking=no' --rsync-path 'sudo rsync' --out-format='<>%i %n%L' projects/test/core-conf/ vagrant@10.1.250.10:vagrant@10.1.250.10:/opt/solr/home/cores/dev/conf/", "failed": true, "item": {"auth": "present", "autopull": "absent", "cron": "absent", "db": "present", "docroot": "/var/www/stage/current", "domains": "stage.test.local stg.test.local", "name": "stage", "solr": "present", "vhost": "static"}, "rc": 12}
msg: rsync: mkdir "/home/vagrant/vagrant@10.1.250.10:/opt/solr/home/cores/dev/conf" failed: No such file or directory (2)
rsync error: error in file IO (code 11) at main.c(605) [Receiver=3.0.9]
rsync: connection unexpectedly closed (197 bytes received so far) [sender]
rsync error: error in rsync protocol data stream (code 12) at io.c(226) [sender=3.1.0]

Some of the options get duplicated for the second item: the destination already carries the vagrant@10.1.250.10: prefix from the first run, so it is prepended a second time.

Ansible: v1.4 (devel)
Python: 2.7
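
For illustration, a minimal Python sketch of the failure mode the log suggests: the already-templated options dict is shared across with_items iterations and mutated in place, so the user@host: prefix accumulates. Note that the second item in the log above even keeps the first item's path (/opt/solr/home/cores/dev/conf/ instead of .../stage/...), which also points at a shared dict. The names here (process_dest, shared_args) are hypothetical, not the actual Ansible 1.4 runner code.

def process_dest(module_args, user, host):
    # BUG: mutates the shared dict instead of a per-item copy,
    # so every iteration prepends the prefix again
    module_args["dest"] = "%s@%s:%s" % (user, host, module_args["dest"])
    return module_args

shared_args = {"src": "foo", "dest": "/tmp/a"}
for item in ("a", "b", "c"):
    args = process_dest(shared_args, "vagrant", "10.1.250.10")
    print(args["dest"])

# vagrant@10.1.250.10:/tmp/a
# vagrant@10.1.250.10:vagrant@10.1.250.10:/tmp/a
# vagrant@10.1.250.10:vagrant@10.1.250.10:vagrant@10.1.250.10:/tmp/a

The sketch reproduces the doubled (and, on later items, tripled) destination seen in the failed runs above.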

@jctanner (Contributor)

@akuznecov what does "instances" look like in your test case? I cannot reproduce the issue with this simple playbook:

- hosts: localhost
  connection: local
  gather_facts: False
  vars:
    words: 
        - name: "a"
          somekey: "someval"
        - name: "b"
          somekey: "someval"
        - name: "c"
          somekey: "someval"
  tasks:
    - debug: var=words
    - shell: if [ ! -d /tmp/foo ]; then  mkdir /tmp/foo; fi
    - shell: touch /tmp/foo/{1,2,3,4,5}

    - synchronize: src=/tmp/foo dest=/tmp/{{ item.name }}
      with_items: words

    - synchronize: src=/tmp/foo dest=/tmp/{{ item.name }}
      when: item != 'three'
      with_items: words

    - synchronize: src=/tmp/foo dest=/tmp/{{item.name}} rsync_path="/usr/bin/rsync"
      with_items: words

    - synchronize: src=/tmp/foo dest=root@localhost:/tmp/{{item.name}} rsync_path="/usr/bin/rsync"
      with_items: words

@akuznecov (Author)

@jctanner here is the relevant part of my playbook:

- name: 'Apply recipes to VM'
  hosts: vagrant
  user: vagrant
  sudo: yes

  vars:
    project_user: 'project'
    project_name: 'test'

    webserver_root: '/var/www'
    webserver_user: 'nginx'

    instances:
      - name: "dev"
        domains: "dev.test.local"
        docroot: "{{webserver_root}}/dev"
        vhost: application
        cron: present
        autopull: present
        auth: present
        solr: present
        db: present
      - name: "stage"
        domains: "stage.test.local stg.test.local"
        docroot: "{{webserver_root}}/stage/current"
        vhost: static
        cron: absent
        autopull: absent
        auth: present
        solr: present
        db: present

I tried your playbook (with only one task) against localhost and it works for me too, but not against a remote machine:

- name: 'Apply recipes to VM'
  hosts: vagrant
  user: vagrant
  vars:
    words: 
        - name: "a"
          somekey: "someval"
        - name: "b"
          somekey: "someval"
        - name: "c"
          somekey: "someval"
  tasks:
    - debug: var=words

    - synchronize: src=foo dest=/tmp/{{item.name}} rsync_path="/usr/bin/rsync"
      with_items: words

PLAY [Apply recipes to VM] ****************************************************

GATHERING FACTS ***************************************************************
ok: [10.1.250.10]

TASK: [debug var=words] *******************************************************
ok: [10.1.250.10] => {
    "words": [
        {
            "name": "a",
            "somekey": "someval"
        },
        {
            "name": "b",
            "somekey": "someval"
        },
        {
            "name": "c",
            "somekey": "someval"
        }
    ]
}

TASK: [synchronize src=foo dest=/tmp/{{item.name}} rsync_path="/usr/bin/rsync"] ***
ok: [10.1.250.10] => (item={'somekey': 'someval', 'name': 'a'}) => {"changed": false, "cmd": "rsync --delay-updates --compress --timeout=10 --archive --rsh 'ssh -o StrictHostKeyChecking=no' --rsync-path '/usr/bin/rsync' --out-format='<>%i %n%L' foo vagrant@10.1.250.10:/tmp/a", "item": {"name": "a", "somekey": "someval"}, "msg": "", "rc": 0}
failed: [10.1.250.10] => (item={'somekey': 'someval', 'name': 'b'}) => {"cmd": "rsync --delay-updates --compress --timeout=10 --archive --rsh 'ssh -o StrictHostKeyChecking=no' --rsync-path '/usr/bin/rsync' --out-format='<>%i %n%L' foo vagrant@10.1.250.10:vagrant@10.1.250.10:/tmp/a", "failed": true, "item": {"name": "b", "somekey": "someval"}, "rc": 12}
msg: rsync: mkdir "/home/vagrant/vagrant@10.1.250.10:/tmp/a" failed: No such file or directory (2)
rsync error: error in file IO (code 11) at main.c(605) [Receiver=3.0.9]
rsync: connection unexpectedly closed (174 bytes received so far) [sender]
rsync error: error in rsync protocol data stream (code 12) at io.c(226) [sender=3.1.0]

failed: [10.1.250.10] => (item={'somekey': 'someval', 'name': 'c'}) => {"cmd": "rsync --delay-updates --compress --timeout=10 --archive --rsh 'ssh -o StrictHostKeyChecking=no' --rsync-path '/usr/bin/rsync' --out-format='<>%i %n%L' foo vagrant@10.1.250.10:vagrant@10.1.250.10:vagrant@10.1.250.10:/tmp/a", "failed": true, "item": {"name": "c", "somekey": "someval"}, "rc": 12}
msg: rsync: mkdir "/home/vagrant/vagrant@10.1.250.10:vagrant@10.1.250.10:/tmp/a" failed: No such file or directory (2)
rsync error: error in file IO (code 11) at main.c(605) [Receiver=3.0.9]
rsync: connection unexpectedly closed (194 bytes received so far) [sender]
rsync error: error in rsync protocol data stream (code 12) at io.c(226) [sender=3.1.0]

FATAL: all hosts have already failed -- aborting

@jctanner (Contributor)

@akuznecov Ah, I see now ....

failed: [el6.lab.net] => (item={'somekey': 'someval', 'name': 'c'}) => {"cmd": "rsync --delay-updates --compress --timeout=10 --archive --rsh 'ssh  -o StrictHostKeyChecking=no' --out-format='<<CHANGED>>%i %n%L' /tmp/foo root@el6.lab.net:root@el6.lab.net:root@el6.lab.net:/tmp/a", "failed": true, "item": {"name": "c", "somekey": "someval"}, "rc": 12}

jctanner added a commit that referenced this issue Nov 12, 2013
Resolves #4855 issue with synchronize module failing on multiple items
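
A hedged sketch of the shape such a fix would take (illustrative only, not the actual #4856 patch): build the destination from a per-item copy of the arguments, and make the prefixing idempotent.

import copy

def process_dest_fixed(module_args, user, host):
    # Work on a per-item copy so no state leaks between iterations
    args = copy.deepcopy(module_args)
    prefix = "%s@%s:" % (user, host)
    # Only prepend the prefix if it is not already there
    if not args["dest"].startswith(prefix):
        args["dest"] = prefix + args["dest"]
    return args

Either the per-item copy or the idempotence check alone would stop the prefix from accumulating; doing both also guards against other shared-state leaks between loop iterations.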
@renard commented Feb 11, 2014

I believe this issue is still present; I'm hitting something similar.

I tried both v1.4.4 (45dde5b) and devel (acae162). Note: when I switch from one branch to another, I clean up all .pyc files.

The idea is to synchronize some roles from one machine to another. Here is a small playbook (run-sync.yml):

- hosts: all
  gather_facts: no

  tasks:
    - name: Copy ansible roles to remote host
      synchronize:
        dest=/srv/ansible/roles/{{item}}
        src=roles/{{item}}
      with_items:
        - locale
        - ntp

The inventory file is:

sa01 ansible_ssh_host=172.19.90.148

Here is the output:


ansible-playbook -vvvvv -i inventory/sa run-sync.yml

PLAY [all] ********************************************************************

TASK: [sync | Copy ansible roles to remote host] ******************************
<127.0.0.1> EXEC ['/bin/sh', '-c', 'mkdir -p $HOME/.ansible/tmp/ansible-tmp-1392129141.02-89847347795019 && echo $HOME/.ansible/tmp/ansible-tmp-1392129141.02-89847347795019']
<127.0.0.1> PUT /tmp/tmpeu304K TO /root/.ansible/tmp/ansible-tmp-1392129141.02-89847347795019/synchronize
<127.0.0.1> EXEC ['/bin/sh', '-c', '/usr/bin/python /root/.ansible/tmp/ansible-tmp-1392129141.02-89847347795019/synchronize; rm -rf /root/.ansible/tmp/ansible-tmp-1392129141.02-89847347795019/ >/dev/null 2>&1']
ok: [sa01] => (item=locale) => {"changed": false, "cmd": "rsync --delay-updates -FF --compress --timeout=10 --archive --rsh 'ssh  -o StrictHostKeyChecking=no' --out-format='<<CHANGED>>%i %n%L' roles/locale root@172.19.90.148:/srv/ansible/roles/locale", "item": "locale", "msg": "", "rc": 0}
<127.0.0.1> EXEC ['/bin/sh', '-c', 'mkdir -p $HOME/.ansible/tmp/ansible-tmp-1392129141.42-201712564729970 && echo $HOME/.ansible/tmp/ansible-tmp-1392129141.42-201712564729970']
<127.0.0.1> PUT /tmp/tmpYk0Xw5 TO /root/.ansible/tmp/ansible-tmp-1392129141.42-201712564729970/synchronize
<127.0.0.1> EXEC ['/bin/sh', '-c', '/usr/bin/python /root/.ansible/tmp/ansible-tmp-1392129141.42-201712564729970/synchronize; rm -rf /root/.ansible/tmp/ansible-tmp-1392129141.42-201712564729970/ >/dev/null 2>&1']
failed: [sa01] => (item=ntp) => {"cmd": "rsync --delay-updates -FF --compress --timeout=10 --archive --rsh 'ssh  -o StrictHostKeyChecking=no' --out-format='<<CHANGED>>%i %n%L' roles/ntp /srv/ansible/roles/ntp", "failed": true, "item": "ntp", "rc": 12}
msg: rsync: mkdir "/srv/ansible/roles/ntp" failed: No such file or directory (2)
rsync error: error in file IO (code 11) at main.c(605) [Receiver=3.0.9]
rsync: connection unexpectedly closed (9 bytes received so far) [sender]
rsync error: error in rsync protocol data stream (code 12) at io.c(605) [sender=3.0.9]


FATAL: all hosts have already failed -- aborting

PLAY RECAP ********************************************************************
           to retry, use: --limit @/root/run-sync.retry

sa01                       : ok=0    changed=0    unreachable=0    failed=1

The first round always succeeds. The second one (and any that follow) always fails, because the remote host prefix is not set on the dest:
root@172.19.90.148:/srv/ansible/roles/locale vs. /srv/ansible/roles/ntp
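
One plausible mechanism, consistent with this log but not confirmed against the source, is again per-task state leaking between iterations: this time a flag that marks the destination as already prefixed after the first item, so later items lose the root@host: prefix instead of duplicating it. A purely illustrative sketch, with hypothetical names:

class TaskState(object):
    def __init__(self):
        self.dest_prefixed = False

    def build_dest(self, host, path):
        if self.dest_prefixed:
            return path                    # prefix silently dropped
        self.dest_prefixed = True          # BUG: stale flag persists across items
        return "root@%s:%s" % (host, path)

state = TaskState()
for role in ("locale", "ntp"):
    print(state.build_dest("172.19.90.148", "/srv/ansible/roles/" + role))

# root@172.19.90.148:/srv/ansible/roles/locale
# /srv/ansible/roles/ntp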

@jctanner (Contributor)

@renard please open a new bug and I'll take a look.
