
Beacon only fires to single random master in HA mode #49663

doesitblend opened this issue Sep 14, 2018 · 4 comments


commented Sep 14, 2018

Description of Issue/Question

We are seeing an issue where certain beacons fire to only a single, random master. In particular, this happens with the inotify and process beacons. The problem occurs when a beacon fires while at least one master is down in an HA environment.

@DmitryKuzmenko is already aware of this issue and some of the details.

We expect the beacon to fire to all alive masters.


Setup

Configure 3-4 masters and connect at least one minion in the environment. The minion should be connected in HA mode to all masters.

Steps to Reproduce Issue

  1. Configure the inotify beacon on the minion to watch any file
  2. Stop the first master configured in the minion
  3. Watch the master event bus on the remaining alive masters
  4. Update the watched file on the minion to trigger the beacon
  5. You should see the event come in only once to a random master
  6. Re-triggering the beacon may send the event to a different master
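
The setup above can be sketched as a minion config like the following (the hostnames and watched path are example values, not from the report; the list-style beacon syntax matches the 2018.3 release shown in the versions report):

```yaml
# /etc/salt/minion -- multimaster HA plus an inotify beacon (example values)
master:
  - master1.example.com
  - master2.example.com
  - master3.example.com

beacons:
  inotify:
    - files:
        /tmp/important_file:
          mask:
            - modify
```

For step 3, `salt-run state.event pretty=True` on each remaining master is one way to watch the event bus; after modifying `/tmp/important_file`, a beacon event with a tag like `salt/beacon/<minion_id>/inotify//tmp/important_file` should appear on every alive master, but per this report it shows up on only one.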

Versions Report

#  Minion Version
    Salt Version:
               Salt: 2018.3.2
    Dependency Versions:
               cffi: Not Installed
           cherrypy: Not Installed
           dateutil: Not Installed
          docker-py: Not Installed
              gitdb: Not Installed
          gitpython: Not Installed
              ioflo: Not Installed
             Jinja2: 2.7.2
            libgit2: Not Installed
            libnacl: Not Installed
           M2Crypto: Not Installed
               Mako: Not Installed
       msgpack-pure: Not Installed
     msgpack-python: 0.5.6
       mysql-python: Not Installed
          pycparser: Not Installed
           pycrypto: 2.6.1
       pycryptodome: Not Installed
             pygit2: Not Installed
             Python: 2.7.5 (default, Jul 13 2018, 13:06:57)
       python-gnupg: Not Installed
             PyYAML: 3.11
              PyZMQ: 15.3.0
               RAET: Not Installed
              smmap: Not Installed
            timelib: Not Installed
            Tornado: 4.2.1
                ZMQ: 4.1.4
    System Versions:
               dist: centos 7.5.1804 Core
             locale: UTF-8
            machine: x86_64
            release: 4.9.93-linuxkit-aufs
             system: Linux
            version: CentOS Linux 7.5.1804 Core

commented Sep 16, 2018

I believe the way it currently works is by design. If you have the beacon fire to all masters, then all of the masters may potentially react to the event, which could result in duplicate work.
@thatch45 Would you mind weighing in on this one?


commented Sep 18, 2018

What's wrong with the current beacon behavior is that, in a multimaster environment, beacon handling is executed in each Minion instance, which it doesn't look designed for: each Minion instance does the same work at the same time. Beacons should be handled once per loop_interval, independent of how many masters the minion is connected to. So, given the minion design, beacons should either be a singleton like the schedule, or be managed by MinionManager instead of Minion. The second option looks better to me.

BTW @doesitblend, I've checked this and can confirm that beacons work as I described above. In the example where we used the inotify and ps beacons, I see that inotify is executed and sent to a single random master once per event, while ps is sent to all masters.
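
The difference described above can be illustrated with a toy sketch (plain Python, not Salt source; the function names and the fake beacon are stand-ins). A stateful beacon like inotify consumes its pending event when polled, so when each per-master instance polls independently, only one instance (effectively a random one, depending on whose loop runs first) gets the event; polling once and fanning out, as a manager-level design would, reaches every master.

```python
def consuming_beacon(queue):
    """A stateful beacon like inotify: polling consumes the pending event."""
    def poll():
        return queue.pop(0) if queue else None
    return poll

def per_instance(masters, beacon):
    """Current behavior (sketch): each master's Minion instance polls the
    beacon independently, so a one-shot event reaches only whichever
    instance happens to poll first."""
    return [(m, e) for m in masters for e in [beacon()] if e is not None]

def via_manager(masters, beacon):
    """Proposed behavior (sketch): poll the beacon once per loop_interval
    at the manager level, then fan the event out to every alive master."""
    event = beacon()
    return [(m, event) for m in masters] if event is not None else []

masters = ["master1", "master2", "master3"]
print(per_instance(masters, consuming_beacon(["file_modified"])))  # one master
print(via_manager(masters, consuming_beacon(["file_modified"])))   # all masters
```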


commented Oct 6, 2019

Fixed by #54247
