gitfs regression with authenticated repos #23013

Closed
ghost opened this issue Apr 24, 2015 · 16 comments

@ghost

ghost commented Apr 24, 2015

I just upgraded my Fedora 21 install from salt-{,minion-}2014.7.1-1.fc21.noarch to 2014.7.5-1.fc21.noarch. Now, my setup does not work anymore.

I have python-pygit2 version 0.21.4 installed.

# cat /etc/salt/minion
file_client: local
fileserver_backend:
  - git
gitfs_remotes:
  - git@git.example.com:saltstack/bar:
    - pubkey: /root/.ssh/saltstack.pub
    - privkey: /root/.ssh/saltstack

All I get is this:

# salt-call state.highstate -l debug
[DEBUG   ] Reading configuration from /etc/salt/minion
[DEBUG   ] Using cached minion ID from /etc/salt/minion_id: bar
[DEBUG   ] Configuration file path: /etc/salt/minion
[DEBUG   ] Reading configuration from /etc/salt/minion
[DEBUG   ] The `dmidecode` binary is not available on the system. GPU grains will not be available.
[DEBUG   ] Mako not available
[DEBUG   ] pygit2 gitfs_provider enabled
[DEBUG   ] Updating git fileserver cache
[DEBUG   ] Set lock for git@git.example.com:saltstack/bar
[DEBUG   ] gitfs is fetching from git@git.example.com:saltstack/bar
[DEBUG   ] gitfs received 0 objects for remote git@git.example.com:saltstack/bar
[DEBUG   ] gitfs cleaned the following stale refs: ['refs/remotes/origin/master']
[DEBUG   ] Removed lock for git@git.example.com:saltstack/bar
[DEBUG   ] Reading configuration from /etc/salt/minion
[DEBUG   ] The `dmidecode` binary is not available on the system. GPU grains will not be available.
[INFO    ] Loading fresh modules for state activity
[DEBUG   ] Fetching file from saltenv 'base', ** attempting ** 'salt://top.sls'
local:
----------
          ID: states
    Function: no.None
      Result: False
     Comment: No Top file or external nodes data matches found
     Started: 
    Duration: 
     Changes:   

Summary
------------
Succeeded: 0
Failed:    1
------------
Total states run:     1

The top.sls is really simple for this repo:

base:
  'bar':
    - packages.bash
    - packages.gitolite
    - packages.mypkgs
    - packages.postfix
    - packages.salt-minion
    - packages.sshd
    - packages.sudo
    - packages.yum-cron
    - users

The directory /var/cache/salt/minion/files/base/ did contain the proper files previously. However, after removing /var/cache/salt/minion/{gitfs,files} for testing purposes, it stays empty after a call to state.highstate:

# rm -rf /var/cache/salt/minion/{files,gitfs}
# salt-call state.highstate 
[INFO    ] Wrote new gitfs_remote map to /var/cache/salt/minion/gitfs/remote_map.txt
[INFO    ] Loading fresh modules for state activity
local:
----------
          ID: states
    Function: no.None
      Result: False
     Comment: No Top file or external nodes data matches found
     Started: 
    Duration: 
     Changes:   

Summary
------------
Succeeded: 0
Failed:    1
------------
Total states run:     1
# tree -a /var/cache/salt/minion/files/base/
/var/cache/salt/minion/files/base/

0 directories, 0 files

Is this expected behavior? I can't find anything about this in the Salt 2014.7.5 Release Notes.

@ghost
Author

ghost commented Apr 24, 2015

I just upgraded a CentOS 7 machine to 2014.7.5-1.el7 and that machine keeps working.

salt-minion is installed from EPEL.

@ghost
Author

ghost commented Apr 24, 2015

I upgraded another Fedora machine and it fails as described initially, too.

@ghost
Author

ghost commented Apr 24, 2015

Comparing the debug output of the CentOS and Fedora machines, I see the following only on the Fedora (i.e., non-working) machines:

gitfs cleaned the following stale refs: ['refs/remotes/origin/master']

@ghost
Author

ghost commented Apr 24, 2015

This has nothing to do with CentOS vs Fedora.

After adding a few debug lines to gitfs.py, it was clear that the SSH key is not being used. Creating /root/.ssh/config with the following content works around the problem.

Host git.example.com
    IdentityFile ~/.ssh/saltstack
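
The pubkey/privkey options above are meant to be handed to pygit2 as SSH credentials when gitfs fetches. A minimal standalone sketch of that wiring, assuming pygit2's newer RemoteCallbacks API (the credential plumbing differs in the 0.21.x series used here) and an illustrative repository path, looks roughly like this:

import pygit2

# Keypair mirrors the pubkey/privkey settings from the minion config above.
keypair = pygit2.Keypair(
    "git",                       # SSH user taken from the remote URL
    "/root/.ssh/saltstack.pub",  # pubkey
    "/root/.ssh/saltstack",      # privkey
    "",                          # no passphrase
)
callbacks = pygit2.RemoteCallbacks(credentials=keypair)

# Illustrative path only; gitfs keeps its real checkouts under /var/cache/salt.
repo = pygit2.Repository("/var/cache/salt/minion/gitfs/example")
repo.remotes["origin"].fetch(callbacks=callbacks)

As it turns out further down in the thread, the fetch credentials were not the actual culprit; the separate stale-ref cleanup step was.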

@ghost ghost changed the title from "salt-call stopped working with gitfs in 2014.7.5 (standalone minion or salted master)" to "gitfs ignores pubkey/privkey in 2014.7.5" on Apr 24, 2015
@rallytime
Contributor

Thanks for this report @markusr815. When you upgraded Salt, did you also upgrade python-pygit2? Or is it the same version as before you upgraded Salt on your minion? I am trying to reproduce this error right now.

@rallytime
Contributor

Ok, I can confirm this bug. Thanks very much for this excellent report - it made it easy to reproduce once I realized what was going on. The https way of authenticating explained here works just fine, but the ssh method is indeed broken.

Steps to reproduce on a Fedora 21 VM:
Configure a salt-minion running 2014.7.5 to use gitfs with your GitHub SSH key:

# cat /etc/salt/minion
file_client: local
fileserver_backend:
  - git
gitfs_remotes:
  - git@github.com:rallytime/salt-jenkins.git:
    - pubkey: /root/.ssh/<your-key>.pub
    - privkey: /root/.ssh/<your-key>

Install pygit2:

yum install python-pygit2

Then try to run a state from the gitfs repository listed in the minion config:

# salt-call --local state.sls git.salt -l debug
[DEBUG   ] Reading configuration from /etc/salt/minion
[DEBUG   ] Using cached minion ID from /etc/salt/minion_id: 45.33.113.250
[DEBUG   ] Configuration file path: /etc/salt/minion
[DEBUG   ] Reading configuration from /etc/salt/minion
[DEBUG   ] The `lspci` binary is not available on the system. GPU grains will not be available.
[DEBUG   ] The `dmidecode` binary is not available on the system. GPU grains will not be available.
[DEBUG   ] Mako not available
[DEBUG   ] Registered VCS backend: git
[DEBUG   ] Registered VCS backend: hg
[DEBUG   ] Registered VCS backend: svn
[DEBUG   ] Registered VCS backend: bzr
[DEBUG   ] pygit2 gitfs_provider enabled
[DEBUG   ] Updating git fileserver cache
[DEBUG   ] Set lock for git@github.com:rallytime/salt-jenkins.git
[DEBUG   ] gitfs is fetching from git@github.com:rallytime/salt-jenkins.git
[DEBUG   ] gitfs received 0 objects for remote git@github.com:rallytime/salt-jenkins.git
[DEBUG   ] gitfs cleaned the following stale refs: ['refs/remotes/origin/master']
[DEBUG   ] Removed lock for git@github.com:rallytime/salt-jenkins.git
[DEBUG   ] Reading configuration from /etc/salt/minion
[DEBUG   ] The `lspci` binary is not available on the system. GPU grains will not be available.
[DEBUG   ] The `dmidecode` binary is not available on the system. GPU grains will not be available.
[INFO    ] Loading fresh modules for state activity
[DEBUG   ] Fetching file from saltenv 'base', ** attempting ** 'salt://git/salt.sls'
[DEBUG   ] Fetching file from saltenv 'base', ** attempting ** 'salt://git/salt/init.sls'
local:
    Data failed to compile:
----------
    No matching sls found for 'git.salt' in env 'base'

ping @terminalmage

@rallytime rallytime added the Bug, severity-medium, Confirmed, P2, and File Servers labels on Apr 24, 2015
@rallytime rallytime added this to the Approved milestone on Apr 24, 2015
@rallytime rallytime added the Regression label on Apr 24, 2015
@deuscapturus
Contributor

Fixed this by replacing /usr/lib/python2.6/site-packages/salt/fileserver/gitfs.py with the one from https://github.com/saltstack/salt/releases/download/v2014.7.1/salt-2014.7.1.tar.gz

@rallytime
Contributor

@deuscapturus Do you think this is a duplicate of the other bug you filed a couple of days ago? If so, I know we usually keep the "earliest" bugs open and close new duplicates, but since I've outlined reproduction steps and narrowed this down here, and you've provided some helpful info as well, I'm inclined to keep this one open and close the other. What do you think?

@terminalmage
Contributor

This was caused by a bugfix for an issue where branches/tags removed from the remote repo were still showing up as Salt fileserver environments. pygit2 doesn't know how to detect remote refs, so I added code that uses the git CLI to run git ls-remote to detect this, and then removes the stale branches/tags from the local checkout used by gitfs. This doesn't work, however, if the repository is authenticated; in that case every branch/tag is detected as a stale ref and thus removed.

I've disabled stale ref cleaning for authenticated repos in #23094, and opened libgit2/pygit2#524 to request proper support for detecting stale refs in pygit2.
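
A rough sketch of the check described above (a hypothetical stand-in for illustration, not the actual _clean_stale() code in gitfs.py) shows why authenticated remotes break it: if git ls-remote cannot authenticate, it reports no refs, so every local remote-tracking ref looks stale:

import subprocess
import pygit2

def find_stale_refs(repo_path, url):
    # Hypothetical helper for illustration only.
    # Ask the git CLI which refs the remote currently has.
    proc = subprocess.run(
        ["git", "ls-remote", url],
        stdout=subprocess.PIPE,
        stderr=subprocess.DEVNULL,
    )
    remote_refs = set()
    for line in proc.stdout.decode().splitlines():
        _, _, ref = line.partition("\t")
        remote_refs.add(ref)

    repo = pygit2.Repository(repo_path)
    stale = []
    for ref in repo.listall_references():
        # Only remote-tracking branches are cleanup candidates in this sketch.
        if not ref.startswith("refs/remotes/origin/"):
            continue
        branch = ref.replace("refs/remotes/origin/", "refs/heads/")
        if branch not in remote_refs:
            stale.append(ref)
    # If ls-remote failed to authenticate, remote_refs is empty, so every
    # remote-tracking ref ends up in this list; that is the regression seen above.
    return stale

The fix in #23094 simply skips this cleanup when the remote requires authentication.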

@terminalmage terminalmage self-assigned this on Apr 27, 2015
@terminalmage terminalmage changed the title from "gitfs ignores pubkey/privkey in 2014.7.5" to "gitfs regression with authenticated repos" on Apr 27, 2015
@terminalmage
Contributor

I renamed this issue to reflect the actual problem.

@ryepup
Contributor

ryepup commented Apr 27, 2015

Is there a suggested workaround until the next release?

@terminalmage
Contributor

Commenting out this line and adding a cleaned = [] line will give you a temporary workaround. For example:

#cleaned = _clean_stale(repo['repo'], refs_post)
cleaned = []

@ryepup
Contributor

ryepup commented Apr 27, 2015

@terminalmage is that in my master, minion, or both?

@ryepup
Contributor

ryepup commented Apr 27, 2015

looks like it only needs to change on the master. Thanks @terminalmage!

@terminalmage
Contributor

Yes, just the master. Sorry, should have been more specific.

@deuscapturus
Contributor

FYI: applying the cleaned = [] workaround will fix most gitfs issues, but file.recurse will still fail with the error "Recurse failed: none of the specified sources were found". For a complete fix, follow the workaround I specified above.

@jfindlay jfindlay added the Platform Relates to OS, containers, platform-based utilities like FS, system based apps label May 26, 2015