grains: jinja2.exceptions.UndefinedError: 'dict object' has no attribute 'os' when rendering Pillar #59205
Comments
@saltstack/team-macos FYI, any thoughts on this one?
This looks very related to #59015 -- we're only setting custom grains (in /etc/salt/minion) on our macOS minions
I'm seeing this exact problem. Edit: on Linux. The problem does not exist in the older version; it appeared after I upgraded. Then I remove the grains from my minion config file:
and it works again.
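For context, custom grains set directly in the minion config look like this; the actual grain names used by the reporters in this thread aren't shown, so the keys below are purely illustrative:

```yaml
# /etc/salt/minion (or a drop-in under /etc/salt/minion.d/)
# Hypothetical static custom grains; any key/value pairs work here.
grains:
  roles:
    - webserver
  datacenter: us-east-1
```

Several reports in this thread share the detail that the failure only shows up on minions that define grains this way.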
We just ran into a similar problem on an Ubuntu 18.04 instance running the 3002.2 minion:
The --versions-report for the affected minion is below.
In our case, the master was able to load the grains correctly from the minion when running
but acted like the grain was undefined when I ran orchestrations. We tried lots of things to cajole it: restarting the master and minion services, running saltutil.sync_all on master and minion, and removing the key from the master and re-adding it. The pillar .sls attempting to use the grain is straightforward. Interestingly, the error occurs for us on the
As with the original post, saltutil.refresh_grains + saltutil.refresh_pillar fixed the condition for us; thanks for that. The only additional detail I can offer is that the failing instance had recently been cloned onto a different VM instance/architecture by our IT team, so perhaps the Salt master's grains cache became confused by some detail of the updated architecture on the minion machine. Our minion config files use custom grains (which hadn't changed after the machine was cloned).
Looks like we are facing the same issue with Salt. Here is a versions report:
There are several errors in the master logs as well, which look like Jinja errors, to be honest:
Does anyone have a reliable way to reproduce this, or is it random? If it is reproducible, I would like to bisect the Salt code to narrow down what change caused this regression. We know it used to work, and we know it now works unreliably at best in some cases. If anyone has a scenario where they can reliably reproduce the failure, please include any specific details; I will then try to reproduce it here and narrow it down. Include any details related to the setup; the simpler the reproduction setup, the better. Thanks in advance.
I'm having the same problem with Salt 3002.6+ds-1 on Ubuntu 20.04.2 LTS. I have a macro that contains
and every time it is used it returns an error:
This is a blocker.
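For illustration only (the commenter's actual macro isn't shown above), a hypothetical macro of this shape, reading grains['os'], fails in exactly this way whenever the os grain is missing from the grains the master has cached:

```jinja
{# Hypothetical macro: any read of grains['os'] raises
   "'dict object' has no attribute 'os'" when the grain is absent. #}
{% macro os_label() -%}
{{ grains['os'] }} {{ grains.get('osrelease', '') }}
{%- endmacro %}
```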
I dove into this and found the following: I have some grains defined in
And I found that the
This behaviour was different in 3002.1+ds-1.
Hello, could anybody fix this bug, please? Because of this we cannot install the critical security patches provided by 3002.6.
If anyone has any information on how to reproduce this issue reliably, please post. We will need to attempt to reproduce this issue before we can fix it. If it can't be reliably reproduced, we will at least need to get an environment set up where it can be reproduced randomly, and then try to track it down from there.
@danielrobbins, in my case it was a minion started with Salt Cloud in the AWS public cloud (simple EC2 machines). Both master and minion were running version 3002 on Ubuntu 20.04 with the default Python 3 interpreter. Unfortunately, I cannot disclose concrete details about the setup, e.g. config files, states, etc. Hopefully this information helps. This bug doesn't occur with versions 3000 and 3003 (both master and minion should be 3003). So whoever encounters this bug as well can try upgrading their masters and minions to 3003.
FWIW, I don't think that original error was due to the os grains not being defined; at least
More generally,
I simplified my setup as much as I could. With that, you can reproduce the bug with the following simple files:
Then install the packages and the error pops up immediately:
With saltstack
This zip contains the config files listed above:
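The attached zip isn't reproduced in this copy of the thread, but a minimal file set of the same shape (paths and contents here are assumptions, not rasstef's exact files) would look like:

```yaml
# /srv/pillar/top.sls
base:
  '*':
    - common
```

```yaml
# /srv/pillar/common.sls -- rendering this raises UndefinedError on the
# master whenever its cached grains for the minion lack the 'os' key.
{% if grains['os'] == 'Ubuntu' %}
pkg_manager: apt
{% endif %}
```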
@rasstef Using the zipfile you provided in an Ubuntu container, I haven't seen the error that you were seeing. Are you running a particular state or Salt command when you see the issue? Thanks!
OK, I wasn't explaining it at full length:
I haven't checked the intermediate versions (
Ah, this reactor is already sufficient to trigger the error:
Just restart the minion and there we go...
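The reactor itself isn't quoted above; as a sketch (the event tag and SLS path are assumptions), any reactor bound to the minion-start event that forces a pillar render on the master is enough to exercise this code path:

```yaml
# /etc/salt/master.d/reactor.conf
reactor:
  - 'salt/minion/*/start':
    - /srv/reactor/on_start.sls
```

```yaml
# /srv/reactor/on_start.sls -- applying state to the freshly started
# minion makes the master render its pillar from the cached grains.
apply_state:
  local.state.apply:
    - tgt: {{ data['id'] }}
```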
I could simplify the code that causes the exception even more. Please find it here:
Just restart the minion and have a look into
It all depends on the grain defined in the first line of
is raised for grains defined in
@sagetherage When will the version with the bugfix be available? Could you provide a patch once you have it, please? We need this urgently...
Well, I can reliably reproduce the symptom, but not the cause. Specifically, I just added
Given it's an intermittent issue that can be solved with a
After some further research, I was able to get this to happen by blocklisting grains. If you add these lines to your minion config:
And then run
I'm not sure how or why this is happening for your current setup, but so far
If you encounter this again naturally, what is the output of
If this solves the problem, great! But if not, aside from the
Let us know, thanks!
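waynew's exact config lines are not preserved in this copy; assuming "blocklisting" refers to the grains_blacklist minion option, a sketch that hides the core grains from the master would be:

```yaml
# /etc/salt/minion -- assumption: blocklisting the core grains that the
# pillar depends on reproduces the same UndefinedError on the master.
grains_blacklist:
  - os
  - os_family
  - osfullname
```

After restarting the minion and refreshing pillar, any pillar template that reads grains['os'] should then fail the same way.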
Dear waynew,
Thanks a lot for that @waynew, but I don't see how this could be related to the problem we have.
(here:
All other grains would cause a
To summarize: according to
I am facing a similar issue in my project. Error log:
I would appreciate it if you could help me resolve this error coming up during the Salt deployment.
@rasstef have you tried using the
@miraznawaz that error looks like it's from a different problem; you may find some help in one of our other community resources: scroll down on https://saltproject.io/
Description
Having just upgraded to 3002.2, we are seeing random instances where Pillar fails to render due to core grains (specifically 'os') not being defined.
The exception in the master log is as follows:
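The full traceback isn't preserved in this copy of the report, but the key line, per the issue title, is:

```
jinja2.exceptions.UndefinedError: 'dict object' has no attribute 'os'
```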
Setup
The Pillar SLS being rendered, which fails with the above, is quite straightforward:
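The SLS itself isn't included in this copy of the report; a hypothetical pillar file of this shape fails with the exception above whenever the os grain is missing (file path and contents are illustrative only, not the reporter's actual file):

```yaml
# /srv/pillar/example.sls
{% if grains['os'] == 'MacOS' %}
managed_by: salt
{% endif %}
```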
Steps to Reproduce the behavior
Unfortunately I've not found a way to reliably reproduce this. We have a few hundred minions, and following the upgrade (we were previously at 2017.7.8) we're observing this at random: most of the time rendering happens correctly, but when it doesn't, forcing a saltutil.refresh_grains followed by a saltutil.refresh_pillar does the trick. The problem can crop up again after an indeterminate amount of time.
I can attest that the 'os' grain is the only grain that's failing to be defined, and this is only observed for macOS minions.
Expected behavior
grains['os'] to reliably evaluate within Pillar SLS definitions
Versions Report
salt --versions-report
master:
minions: