This repository has been archived by the owner on Jul 7, 2020. It is now read-only.

metalink error on vagrant+ansible up.sh #60

Closed
screeley44 opened this issue Dec 8, 2016 · 7 comments · Fixed by #61

Comments


screeley44 commented Dec 8, 2016

The following error, although it moves between nodes on subsequent runs, has shown up consistently on at least one node for every up.sh run. This time it showed on node0. Is there something we can do about this? I'm going to try removing epel.repo prior to the yum installs, or possibly running yum clean metadata after the repo is installed.

fatal: [node0]: FAILED! => {"changed": true, "cmd": ["yum", "-y", "install", "wget", "screen", "git", "vim", "glusterfs-client", "heketi-client", "iptables", "iptables-utils", "iptables-services", "docker", "kubeadm"], "delta": "0:00:16.419865", "end": "2016-12-08 14:39:52.070970", "failed": true, "rc": 1, "start": "2016-12-08 14:39:35.651105", "stderr": "http://mirror.math.princeton.edu/pub/epel/7/x86_64/repodata/repomd.xml: [Errno -1] repomd.xml does not match metalink for epel\nTrying other mirror.\nhttps://mirror.chpc.utah.edu/pub/epel/7/x86_64/repodata/repomd.xml: [Errno -1] repomd.xml does not match metalink for epel\nTrying other mirror.\nhttp://mirror.symnds.com/distributions/fedora-epel/7/x86_64/repodata/repomd.xml: [Errno -1] repomd.xml does not match metalink for epel\nTrying other mirror.\nhttp://mirror.us.leaseweb.net/epel/7/x86_64/repodata/repomd.xml: [Errno -1] repomd.xml does not match metalink for epel\nTrying other mirror.\nhttp://mirror.sfo12.us.leaseweb.net/epel/7/x86_64/repodata/repomd.xml: [Errno -1] repomd.xml does not match metalink for epel\nTrying other mirror.\nhttp://mirrors.syringanetworks.net/fedora-epel/7/x86_64/repodata/repomd.xml: [Errno -1] repomd.xml does not match metalink for epel\nTrying other mirror.\nhttp://mirrors.mit.edu/epel/7/x86_64/repodata/repomd.xml: [Errno -1] repomd.xml does not match metalink for epel\nTrying other mirror.\nhttp://mirror.cs.pitt.edu/epel/7/x86_64/repodata/repomd.xml: [Errno -1] repomd.xml does not match metalink for epel\nTrying other mirror.\nhttps://pubmirror1.math.uh.edu/fedora-buffet/epel/7/x86_64/repodata/repomd.xml: [Errno -1] repomd.xml does not match metalink for epel\nTrying other mirror.\nhttp://fedora-epel.mirror.lstn.net/7/x86_64/repodata/repomd.xml: [Errno -1] repomd.xml does not match metalink for epel\nTrying other mirror.\nhttp://ftp.osuosl.org/pub/fedora-epel/7/x86_64/repodata/repomd.xml: [Errno -1] repomd.xml does not match metalink for epel\nTrying other 
mirror.\nhttp://mirror.nexcess.net/epel/7/x86_64/repodata/repomd.xml: [Errno -1] repomd.xml does not match metalink for epel\nTrying other mirror.\nhttp://mirror.sjc02.svwh.net/fedora-epel/7/x86_64/repodata/repomd.xml: [Errno -1] repomd.xml does not match metalink for epel\nTrying other mirror.\nhttp://mirror.oss.ou.edu/epel/7/x86_64/repodata/repomd.xml: [Errno -1] repomd.xml does not match metalink for
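The yum clean metadata idea floated above could be expressed as an Ansible task along these lines. This is only a sketch: the task name and its placement (immediately after the epel-release install) are assumptions, not part of any existing playbook here.

```yaml
# Sketch: force yum to discard its cached repomd.xml/metalink data for EPEL
# so the next install re-fetches metadata from the mirror list. The "epel"
# repo id is an assumption.
- name: clean cached yum metadata for epel (workaround sketch)
  command: yum clean metadata --disablerepo="*" --enablerepo=epel
```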
jarrpa (Contributor) commented Dec 8, 2016

Moving the conversation from #58 here:

@obnoxxx, I think you were running into this issue. Could you weigh in on this? I'm not sure this is a bug in the vagrant setup; I think this is some sort of networking/DNS issue.

@ravishivt (Contributor)

The fix in #61 does not fully resolve it. I'm still getting repomd.xml does not match metalink for epel errors on at least one random node during the vagrant provision. I can't get a successful provision after 4 attempts.

@ravishivt (Contributor)

I managed to fix this by adding retry logic to the "install base packages" task. It isn't pretty, but I suspect there are some underlying network issues with the VMs it creates; I occasionally see the docker pull tasks fail too. Or maybe it's just my network.

- name: install base packages
  command: yum -y install {{ install_pkgs }}
  register: task_result
  until: task_result.rc == 0
  retries: 5
  delay: 1
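For reference, the same retries/until pattern can be written with Ansible's yum module instead of a raw command, which avoids shelling out. This is a sketch only: it assumes install_pkgs is a list variable and a recent enough Ansible for the "is succeeded" test.

```yaml
# Sketch: idiomatic variant of the retry logic using the yum module.
# retries/until behave the same on any task.
- name: install base packages
  yum:
    name: "{{ install_pkgs }}"
    state: present
  register: task_result
  until: task_result is succeeded
  retries: 5
  delay: 1
```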

@screeley44 (Author)

@ravishivt - yeah, I was testing some retry logic as well on Friday but I couldn't get mine to work for some reason. I think it's a good idea to have retry logic in some key spots; I will add this in a PR. Thanks!

@ravishivt (Contributor)

@screeley44 Sounds great. IMHO, the fixes in #61 should be reverted. They add complexity without improving the outcome: in my tests, behavior was the same with or without the additional logic from #61.

@screeley44 (Author)

@ravishivt - agreed, I will also revert those #61 changes.

jarrpa (Contributor) commented Dec 12, 2016

Just chiming in to say I like the more elegant retries solution. @screeley44, just a request to make sure the revert is its own commit. :)

3 participants