Handlers in included files don't execute #13485

Closed
galiev opened this issue Dec 9, 2015 · 38 comments
Labels: bug (This issue/PR relates to a bug.)
Milestone: v2

Comments

@galiev

galiev commented Dec 9, 2015

When I include a file under handlers:, the handlers defined in the included file are not executed in Ansible 2.0.
Playbook redis.yml:

---
- hosts: redis
  gather_facts: true
  vars:
    os_firewall_use_firewalld: false
    port_open: "{{ redis_port|default(6379) }}"
  tasks:
    - include: roles/firewall/tasks/firewall_open_port.yml
  handlers:
    - include: roles/firewall/handlers/main.yml

roles/firewall/handlers/main.yml

- name: save iptables
  shell: service iptables save

roles/firewall/tasks/firewall_open_port.yml

- include: iptables/iptables_open_port.yml
  when: not os_firewall_use_firewalld

For ansible 1.9.4:
roles/firewall/tasks/iptables/iptables_open_port.yml

- name: Open port
  shell: iptables -A INPUT -m conntrack --ctstate NEW -p tcp --dport "{{ port_open }}" -j ACCEPT
  notify:
    - save iptables

For ansible 2.0:
roles/firewall/tasks/iptables/iptables_open_port.yml

- name: Open port 
  iptables:
    chain: INPUT
    ctstate: NEW
    protocol: tcp
    destination_port: "{{ port_open }}"
    jump: ACCEPT
  notify:
    - save iptables

Result:

ansible --version
ansible 1.9.4
  configured module search path = None
PLAY [redis] ****************************************************************** 

GATHERING FACTS *************************************************************** 
ok: [redis-01-transport.shire.local]
ok: [redis-02-transport.shire.local]

TASK: [Open port] ********************************************* 
changed: [redis-01-transport.shire.local]
changed: [redis-02-transport.shire.local]

NOTIFIED: [save iptables] ***************************************************** 
changed: [redis-01-transport.shire.local]
changed: [redis-02-transport.shire.local]

PLAY RECAP ******************************************************************** 
redis-01-transport.shire.local : ok=3    changed=2    unreachable=0    failed=0   
redis-02-transport.shire.local : ok=3    changed=2    unreachable=0    failed=0   

ansible --version
ansible 2.0.0
  config file = /home/rino/ansible/ansible.cfg
  configured module search path = Default w/o overrides
PLAY ***************************************************************************

TASK [setup] *******************************************************************
ok: [redis-01-transport.shire.local]
ok: [redis-02-transport.shire.local]

TASK [include] *****************************************************************
included: /home/rino/ansible/roles/firewall/tasks/firewall_open_port.yml for redis-01-transport.shire.local, redis-02-transport.shire.local

TASK [include] *****************************************************************
included: /home/rino/ansible/playbooks/../roles/firewall/tasks/iptables/iptables_open_port.yml for redis-02-transport.shire.local, redis-01-transport.shire.local

TASK [Open port] **********************************************************
changed: [redis-01-transport.shire.local]
changed: [redis-02-transport.shire.local]

PLAY RECAP *********************************************************************
redis-01-transport.shire.local : ok=4    changed=1    unreachable=0    failed=0   
redis-02-transport.shire.local : ok=4    changed=1    unreachable=0    failed=0

When I replace this in the playbook:

  handlers:
    - include: roles/firewall/handlers/main.yml

to

handlers:
  - name: save iptables
    shell: /usr/libexec/iptables/iptables.init save

it works well. Is this a bug or my mistake?

@bcoca bcoca added this to the v2 milestone Dec 9, 2015
@mindfreakthemon

Can confirm the issue with Ansible 2.1.0 on CentOS 6.6.

[vagrant@platform ~]$ cat /etc/redhat-release
CentOS release 6.6 (Final)
[vagrant@platform ~]$ ansible --version
ansible 2.1.0
  config file = /etc/ansible/ansible.cfg
  configured module search path = Default w/o overrides

File roles/x/handlers/main.yml:

- include: mysql.yml

File roles/x/handlers/mysql.yml:

- name: import sql dump
  mysql_db: name={{mysql_db}} target={{db_dump_path}} state=import

File roles/x/tasks/main.yml:

- name: add mysql db
  mysql_db: name={{mysql_db}} encoding={{mysql_encoding}} collation={{mysql_collation}} state=present
  notify:
    - import sql dump

Not only is the handler not triggered, Ansible does not throw any error if the handler does not exist.

@aaronk6

aaronk6 commented Dec 14, 2015

We're also seeing this after upgrading from Ansible 1.9.4 to 2.0.0-0.7.rc2 on OS X 10.11.

@jimi-c
Member

jimi-c commented Dec 17, 2015

Can you test devel or the stable-2.0 branch? This should be resolved.

@galiev
Author

galiev commented Dec 18, 2015

Thank you for the reply.
Tested version:

Name        : ansible
Version     : 2.1.0
Release     : 0.git201512180409.a391d6f.stable20.el7.centos

Playbook test.yml:

---
- hosts: lb
  gather_facts: false
  remote_user: root
  tasks:
    - include: roles/test/tasks/main.yml
  handlers:
    - include: roles/test/handlers/main.yml
    - name: restart chrony
      service:
        name: chronyd
        state: restarted

roles/test/tasks/main.yml

- name: Debug
  command: /bin/true
  notify:
    - sshd restart
    - restart chrony

roles/test/handlers/main.yml

- name: sshd restart
  service:
    name: sshd
    state: restarted

Result:

$ ansible-playbook test.yml -vv
Using /home/rino/ansible/ansible.cfg as config file
1 plays in test.yml

PLAY ***************************************************************************

TASK [include] *****************************************************************
included: /home/rino/ansible/roles/test/tasks/main.yml for lb-transport.shire.local

TASK [Debug] *******************************************************************
NOTIFIED HANDLER sshd restart
NOTIFIED HANDLER restart chrony
changed: [lb-transport.shire.local] => {"changed": true, "cmd": ["/bin/true"], "delta": "0:00:00.003088", "end": "2015-12-18 11:21:55.788825", "rc": 0, "start": "2015-12-18 11:21:55.785737", "stderr": "", "stdout": "", "stdout_lines": [], "warnings": []}

RUNNING HANDLER [restart chrony] ***********************************************
changed: [lb-transport.shire.local] => {"changed": true, "name": "chronyd", "state": "started"}

PLAY RECAP *********************************************************************
lb-transport.shire.local   : ok=3    changed=2    unreachable=0    failed=0    

It didn't work: only the inline handler (restart chrony) ran; the included handler (sshd restart) never did.

@mindfreakthemon

Checked it on the devel branch: it still does not work if the handler is in another file that is included from the role's handlers/main.yml:

Name        : ansible
Arch        : noarch
Version     : 2.1.0
Release     : 0.git201512180409.a391d6f.devel.el6

File playbook.yml:

- name: apply common configuration to all nodes
  hosts: 127.0.0.1
  connection: local
  remote_user: vagrant
  roles:
    - common

File roles/common/handlers/main.yml:

- include: test.yml

File roles/common/handlers/test.yml:

- name: test handler
  debug: msg="Yay"

File roles/common/tasks/main.yml:

- name: test task
  command: /bin/true
  notify: test handler

Ansible output:

[root@platform ansible]# ansible-playbook /provisioning/playbook.yml -vvv
Using /etc/ansible/ansible.cfg as config file
 [WARNING]: provided hosts list is empty, only localhost is available

1 plays in /provisioning/playbook.yml

PLAY [apply common configuration to all nodes] *********************************

TASK [setup] *******************************************************************
ESTABLISH LOCAL CONNECTION FOR USER: root
127.0.0.1 EXEC ( umask 22 && mkdir -p "$( echo $HOME/.ansible/tmp/ansible-tmp-1450427408.28-142856526343450 )" && echo "$( echo $HOME/.ansible/tmp/ansible-tmp-1450427408.28-142856526343450 )" )
127.0.0.1 PUT /tmp/tmpZl9tcv TO /root/.ansible/tmp/ansible-tmp-1450427408.28-142856526343450/setup
127.0.0.1 EXEC LANG=en_US.UTF-8 LC_ALL=en_US.UTF-8 LC_MESSAGES=en_US.UTF-8 /usr/bin/python /root/.ansible/tmp/ansible-tmp-1450427408.28-142856526343450/setup; rm -rf "/root/.ansible/tmp/ansible-tmp-1450427408.28-142856526343450/" > /dev/null 2>&1
ok: [127.0.0.1]

TASK [common : test task] ******************************************************
task path: /provisioning/roles/common/tasks/main.yml:1
ESTABLISH LOCAL CONNECTION FOR USER: root
127.0.0.1 EXEC ( umask 22 && mkdir -p "$( echo $HOME/.ansible/tmp/ansible-tmp-1450427408.66-77735003354168 )" && echo "$( echo $HOME/.ansible/tmp/ansible-tmp-1450427408.66-77735003354168 )" )
127.0.0.1 PUT /tmp/tmpZl9tcv TO /root/.ansible/tmp/ansible-tmp-1450427408.66-77735003354168/command
127.0.0.1 EXEC LANG=en_US.UTF-8 LC_ALL=en_US.UTF-8 LC_MESSAGES=en_US.UTF-8 /usr/bin/python /root/.ansible/tmp/ansible-tmp-1450427408.66-77735003354168/command; rm -rf "/root/.ansible/tmp/ansible-tmp-1450427408.66-77735003354168/" > /dev/null 2>&1
NOTIFIED HANDLER test handler
changed: [127.0.0.1] => {"changed": true, "cmd": ["/bin/true"], "delta": "0:00:00.001711", "end": "2015-12-18 10:30:08.718368", "invocation": {"module_args": {"_raw_params": "/bin/true"}, "module_name": "command"}, "rc": 0, "start": "2015-12-18 10:30:08.716657", "stderr": "", "stdout": "", "stdout_lines": [], "warnings": []}

PLAY RECAP *********************************************************************
127.0.0.1                  : ok=2    changed=1    unreachable=0    failed=0

As you can see, Ansible does notify the handler "test handler", but the handler itself never gets included. If I move the handler from test.yml directly into main.yml, it is triggered successfully.

@jimi-c jimi-c added the P2 (Priority 2 - Issue Blocks Release) label and removed the pending_action label Dec 18, 2015
@booch

booch commented Jan 6, 2016

I'm running into the same problem in v2.0.0-0.8.rc3. I was expecting to be able to break handlers into separate files, as I do with tasks.

If I put the handler in handlers/main.yml (within the role directory), it works fine. But if I try moving the handler to another file and including it from handlers/main.yml, the handler does not run. I get no error message about the include not working.

@jimi-c
Member

jimi-c commented Jan 6, 2016

This is currently documented in the 2.0 known issues. This is a consequence of the fact that includes are now dynamic and not processed ahead of time, so Ansible doesn't know about tasks inside an include it hasn't processed yet.

@jimi-c jimi-c removed the P2 (Priority 2 - Issue Blocks Release) label Jan 8, 2016
@juodumas

@jimi-c: could you please point to the document containing the known issues for 2.0?

Such usage (handlers/main.yml including other files) should be quite common with non-atomic roles, but maybe it is not, because I could not find many examples in the wild. Anyway, here are a few random roles that are now broken with 2.0:

If handlers are not executed after all tasks complete, services will not be restarted and will keep running with the old configuration. This can be a security risk, and Ansible won't even warn about it.

@stemid
Contributor

stemid commented Jan 25, 2016

Is there a workaround for this? I have opted to keep all my handlers in a central location and include them where needed, both in playbooks and in roles.

But now the only workaround seems to be redefining every handler inline in every playbook that needs it. Or is there another workaround besides downgrading back to 1.9?

@bcoca
Member

bcoca commented Jan 25, 2016

No workaround yet; we are working on a fix for v2.

d3matt pushed a commit to d3matt/ansible that referenced this issue Feb 24, 2016
 * if a Handler.get_name() is 'include', load and parse the file
 * Fixes ansible#13485
@raxip

raxip commented Feb 25, 2016

When is this issue going to be fixed? It seems fairly major for people who have separated handlers from their playbooks.

davidfischer-ch added a commit to davidfischer-ch/ansible-roles that referenced this issue Mar 2, 2016
@axos88
Contributor

axos88 commented Mar 4, 2016

I was very happy that includes are finally dynamic, and then I stumbled into this.

Btw, wouldn't it be better to have an "include" that works as it used to in 1.9, and a "run" that would be evaluated dynamically?

@jvervlied

I've noticed that this works in the 2.0.0.1 branch.

@dejayc

dejayc commented Mar 17, 2016

I'm gonna have to stick with Salt until this mess is sorted out.

Trying to achieve any sense of composability with Ansible remains a nightmare.

@mengelmann

Any updates on this?

@d3matt
Contributor

d3matt commented Mar 18, 2016

I haven't seen any comments on my pull request that should fix this: #14650

@mengelmann

Just for reference: as @jimi-c wrote in #14650, that PR won't be merged because there is an idea for fixing the handler issue in a different way, as discussed in this thread: https://groups.google.com/forum/#!topic/ansible-devel/9aJaoVeRdOg

@alexmarkley

For the project I'm working on, we really need the structured includes. I'm working around this issue for the time being in our codebase by moving the original handlers/main.yml to handlers/rightMain.yml and then using a script to generate the handlers/main.yml file:

#!/bin/bash

# Working around https://github.com/ansible/ansible/issues/13485 for the time being.
# See rightMain.yml for details.

echo "---" >main.yml
echo "### AUTOGENERATED BY generateMain.sh ... DO NOT EDIT DIRECTLY ###" >>main.yml
echo "### SEE generateMain.sh AND rightMain.yml FOR DETAILS ###" >>main.yml
echo "" >>main.yml
for INCLUDE in $(awk '$2=="include:" { print $3; }' rightMain.yml); do
    echo "### INCLUDED ${INCLUDE} ###" >>main.yml
    grep -v -E '^---$' "${INCLUDE}" >>main.yml
    echo "" >>main.yml
done

Obviously this is a works-for-me, your-mileage-may-vary scenario, but I've attached it here in the hope that it will help someone.

@BinaryBeard

👍

@mengelmann

@alexmarkley yep, we're doing exactly the same with includes in our playbooks :(

@jimi-c
Member

jimi-c commented Apr 14, 2016

The static includes feature was merged in, and should resolve this for all use-cases (unless you're doing includes in a handler that does a loop). Is anyone still seeing this as an issue on the devel branch?

@d3matt
Contributor

d3matt commented Apr 19, 2016

it appears to work for me...

@jimi-c can you add an integration test to prevent regression? I can open a pull request with the tests I had written if you want them.

@jimi-c
Member

jimi-c commented May 11, 2016

@d3matt yes please. I'll go ahead and close this out now though.

Thanks!

@jimi-c jimi-c closed this as completed May 11, 2016
@jimi-c jimi-c removed the needs_info (This issue requires further information.) and pending_action labels May 11, 2016
@d3matt
Contributor

d3matt commented May 12, 2016

This appears to be broken again...

bisected to:
$ git bisect bad
438ed70 is the first bad commit
commit 438ed70
Author: Martin Matuska martin.matuska@axelspringer.de
Date: Mon May 2 17:50:42 2016 +0200

Restore Ansible 2.0 compatibility for includes

:040000 040000 7094f976b8efdef066f8b960a8604fdd5774c1a5 49e2fc6154f45d8d57ad196b34eb5b37c7617ca1 M lib

@xrobau

xrobau commented May 16, 2016

Confirmed this is broken again in 2.0.2.0

Edit: This is visible because EPEL has upgraded their Ansible version from 1.9.x to 2.0.2.0, which has this issue. If you're coming here because your playbooks have suddenly stopped working, the resolution (if you're on CentOS or RHEL) is this:

yum remove ansible
yum install ansible1.9 

I've created a bugzilla ticket - https://bugzilla.redhat.com/show_bug.cgi?id=1336266 - that links here to reduce future confusion.

@jimi-c
Member

jimi-c commented May 16, 2016

This is not broken again, it was fixed in devel and included in the stable-2.1 branch, so it will be included in the 2.1 release.

@d3matt
Contributor

d3matt commented May 16, 2016

100% sure it's currently broken on devel and stable-2.1.

I see NOTIFIED HANDLER, but it never runs.

Matt


@nirik

nirik commented May 16, 2016

Did you add "static: yes" per https://groups.google.com/d/msg/ansible-devel/9aJaoVeRdOg/B4TvRTLgCAAJ ?
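
For reference, a minimal sketch of what that looks like, assuming the static: yes keyword discussed in that thread (available from Ansible 2.1) and reusing @mindfreakthemon's layout from above, in roles/common/handlers/main.yml:

- include: test.yml
  static: yes

With static: yes the include is expanded at parse time rather than at run time, so the handler names defined in test.yml are already known when a task's notify is resolved.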

@d3matt
Contributor

d3matt commented May 16, 2016

Interesting... until 438ed70, you didn't need to add static: yes to get included handlers to work...

@Shookit

Shookit commented Jun 30, 2016

Should this be documented if static: yes is actually required? I got burned by this as well; I'm currently on the 2.1 release.

@cfoutstd

cfoutstd commented Oct 11, 2016

Is this broken in 2.1.1.0?

So if I have a some_role/tasks/main.yml like this:

- include: roles/some_other_role

- name: Do something
  template: src=some_template.j2 dest=/opt/some_dir/conf
  notify: handler_in_included_role

What should I do?

@darkweaver87

darkweaver87 commented Oct 19, 2016

Hello,

FYI, installing 2.1.2.0 broke pretty much all of my handlers notified from included files.
For instance:

tasks/main.yml:

- include: foo.yml

tasks/foo.yml

- debug: msg="Calling bar"
  notify: bar

handlers/main.yml

- name: bar
  shell: echo bar

I get the following error: ERROR! The requested handler 'bar' was not found in the main handlers list

Rémi

@br0ziliy

I have exactly the same problem as @darkweaver87 with ansible 2.3.0 (stable-2.2 54c5ea29bb) last updated 2016/11/28 17:50:23 (GMT +200)
A task running from an included file does not "see" the handlers.

@abcfy2

abcfy2 commented Jan 22, 2017

I have the same issue in Ansible 2.2.1.0. Handlers are not working within an include.

Without includes, in one big playbook, the handlers work.

cat roles/dgate-deploy/tasks/main.yml:

---
- include: tasks/create_service_user.yml
- include: tasks/create_service_dirs.yml
  with_items:
    - "{{service_install_dir}}"
    - "/etc/{{service_name}}"
- include: tasks/copy_files.yml
  with_items:
    - src: "../dgate-{{service_version}}-fat.jar"
      dst: "{{service_install_dir}}"
    - src: "../{{dgate_conf}}"
      dst: "/etc/{{service_name}}/"
    - src: "../dgate.jceks"
      dst: "{{service_install_dir}}"
  notify:
    - "restart {{service_name}}"
- include: tasks/create_service_script.yml
  notify:
    - "restart {{service_name}}"
- include: tasks/create_service_default.yml
  notify:
    - "restart {{service_name}}"
- stat:
    path: "/etc/init/{{service_name}}.conf"
  register: service_script
- include: tasks/start_service.yml
  when:
    service_script.stat.exists == True

cat roles/dgate-deploy/handlers/main.yml:

- include: handlers/restart_service.yml
  when:
    service_script.stat.exists == True

cat handlers/restart_service.yml:

- name: "restart {{service_name}}"
  service:
    name: "{{service_name}}"
    state: restarted

ansible-playbook -i hosts -C -D dgate-deploy.yml

PLAY [deploy dgate] ***************************************************************

TASK [dgate-deploy : create user: dgate] *******************************************
ok: [10.25.67.174]

TASK [dgate-deploy : include] **************************************************
included: /var/lib/jenkins/workspace/dgate-deploy@script/ansible/tasks/create_service_dirs.yml for 10.25.67.174
included: /var/lib/jenkins/workspace/dgate-deploy@script/ansible/tasks/create_service_dirs.yml for 10.25.67.174

TASK [dgate-deploy : create dir: /usr/local/dgate] ***********************************
ok: [10.25.67.174]

TASK [dgate-deploy : create dir: /etc/dgate] *****************************************
ok: [10.25.67.174]

TASK [dgate-deploy : include] **************************************************
included: /var/lib/jenkins/workspace/dgate-deploy@script/ansible/tasks/copy_files.yml for 10.25.67.174
included: /var/lib/jenkins/workspace/dgate-deploy@script/ansible/tasks/copy_files.yml for 10.25.67.174
included: /var/lib/jenkins/workspace/dgate-deploy@script/ansible/tasks/copy_files.yml for 10.25.67.174

TASK [dgate-deploy : copy file: src=../dgate-0.0.1-fat.jar dst=/usr/local/dgate] ***
ok: [10.25.67.174]

TASK [dgate-deploy : copy file: src=../prod.conf dst=/etc/dgate/] **************
ok: [10.25.67.174]

TASK [dgate-deploy : copy file: src=../dgate.jceks dst=/usr/local/dgate] *******
diff skipped: destination file appears to be binary
diff skipped: source file appears to be binary
changed: [10.25.67.174]

TASK [dgate-deploy : copy service script: /etc/init/dgate.conf] ***************************
ok: [10.25.67.174]

TASK [dgate-deploy : copy service default: /etc/default/dgate] *****************************
ok: [10.25.67.174]

TASK [dgate-deploy : stat] *****************************************************
ok: [10.25.67.174]

TASK [dgate-deploy : start dgate] ************************************************
ok: [10.25.67.174]

PLAY RECAP *********************************************************************
10.25.67.174               : ok=15   changed=1    unreachable=0    failed=0  

No handlers were triggered.

@nirik

nirik commented Jan 23, 2017

Do you have "handler_includes_static = True" in your ansible.cfg?

@abcfy2

abcfy2 commented Jan 23, 2017

@nirik Thanks, but I can't find this option in the configuration documentation.

But it's still not working.

I've already set

[defaults]
handler_includes_static = True

in ansible.cfg, but I still don't see the handler trigger.

Also, I've tried modifying roles/dgate-deploy/handlers/main.yml without any include, but it's still not working.

$ cat roles/dgate-deploy/handlers/main.yml

---
#- include: handlers/restart_service.yml
- name: restart dgate
  service:
    name: dgate
    state: restarted
  when:
    service_script.stat.exists == True

@abcfy2

abcfy2 commented Jan 23, 2017

Sorry, it's my fault. I found the reason: #6094

notify does not work on an include task, so I have to add the notify to the tasks inside the included file.
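
As a sketch of that change (the actual contents of tasks/copy_files.yml are not shown above, so the copy task here is illustrative), the notify moves off the include line and onto the task inside the included file:

- name: "copy file: src={{item.src}} dst={{item.dst}}"
  copy:
    src: "{{item.src}}"
    dest: "{{item.dst}}"
  notify:
    - "restart {{service_name}}"

The - include: tasks/copy_files.yml entry in tasks/main.yml then drops its notify block.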

mbarcia added a commit to mbarcia/drupsible-deploy that referenced this issue Feb 17, 2017
ansible/ansible#13485. Moved to main.yml.
Removed unused handlers and its include files. 

Signed-off-by: Mariano Barcia <mariano.barcia@gmail.com>
@vhosakot

Including handlers works beautifully for me in Ansible 2.2.1.0. Make sure you have - meta: flush_handlers in the right place so that handlers that have already been notified get flushed. I also have force_handlers=True, and I do not have handler_includes_static = True, in my ansible.cfg.
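
For context, meta: flush_handlers runs any handlers that have been notified up to that point in the play, instead of waiting until the end of the play. A minimal sketch with hypothetical task and handler names:

- name: update app config
  template:
    src: app.conf.j2
    dest: /etc/app/app.conf
  notify: restart app

# run the already-notified "restart app" handler right here
- meta: flush_handlers

- name: task that needs the restarted service
  command: /bin/true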

@ansibot ansibot added the bug (This issue/PR relates to a bug.) label and removed the bug_report label Mar 7, 2018
@ansible ansible locked and limited conversation to collaborators Apr 25, 2019