dhcp active on private network with ip option #312

Closed
obnoxxx opened this issue Feb 21, 2015 · 18 comments

obnoxxx (Contributor) commented Feb 21, 2015

Hi,

this has been discussed before (e.g. issue #103), but that was quite some time
ago and those issues have been closed. I am facing this issue with the latest vagrant
and the latest vagrant-libvirt, using purpleidea's fedora21 base box:

I have a libvirt network internal1 with the subnet 172.20.10.0/24.
DHCP is enabled on that network for the upper half of the range. In my Vagrantfile
I specify a private network like this:

config.vm.define 'hostname' do |node|
...
node.vm.network :private_network, :ip => '172.20.10.10'
...
end

When I initially create the VM with 'vagrant up', it finds the
network internal1 and assigns the address. It even drops
a file /etc/sysconfig/network-scripts/ifcfg-eth1 with the following
content:

#VAGRANT-BEGIN
# The contents below are automatically generated by Vagrant. Do not modify.
NM_CONTROLLED=no
BOOTPROTO=none
ONBOOT=yes
IPADDR=172.20.10.10
NETMASK=255.255.255.0
DEVICE=eth1
PEERDNS=no
#VAGRANT-END

But dhclient is nevertheless also active on the interface,
so I end up with two IPs, which is not what I want.

So the first question is: how do I disable DHCP on a private
network with a configured IP?
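
vagrant-libvirt appears to expose a libvirt__dhcp_enabled option on private networks (it comes up again later in this thread). A minimal sketch of what I would expect to turn DHCP off, assuming that option behaves as its name suggests, is something like the following; the box name is only a placeholder:

Vagrant.configure("2") do |config|
  config.vm.define 'hostname' do |node|
    # placeholder box name, only for illustration
    node.vm.box = 'fedora21'
    node.vm.network :private_network,
        :ip => '172.20.10.10',
        # assumption: ask vagrant-libvirt not to run a DHCP server on this
        # network; as shown later in this thread, this errors out if the
        # network already exists with DHCP enabled.
        :libvirt__dhcp_enabled => false
  end
end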

Now things get even stranger:
When I vagrant halt and then vagrant up the box, the
interface is not brought up at all; at least there are
no IPs associated with it. Doing ifup in the VM
does not help either.

The vagrant-libvirt management interface works fine.
Vagrant-libvirt created it for me attached to bridge virbr5.
This is using dhcp as desired, and I can vagrant ssh into
the machine.

Any help would be appreciated!!!

Cheers - Michael

obnoxxx (Contributor, Author) commented Feb 21, 2015

OK... this seems to be at least somehow related to the specific fedora box I am using...
the centos7 box works fine. :-o

obnoxxx (Contributor, Author) commented Feb 23, 2015

Really strange:

obnoxxx (Contributor, Author) commented Feb 23, 2015

Here is the relevant extract of the debug output from the initial boot of the fedora21 box:

DEBUG guest: Searching for cap: network_scripts_dir
DEBUG guest: Checking in: redhat
DEBUG guest: Found cap: network_scripts_dir in redhat
 INFO guest: Execute capability: network_scripts_dir [#<Vagrant::Machine: default (VagrantPlugins::ProviderLibvirt::Pro
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute: /sbin/ifdown eth1 2> /dev/null (sudo=true)
DEBUG ssh: Exit status: 1
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute: touch /etc/sysconfig/network-scripts/ifcfg-eth1 (sudo=true)
DEBUG ssh: Exit status: 0
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute: sed -e '/^#VAGRANT-BEGIN/,/^#VAGRANT-END/ d' /etc/sysconfig/network-scripts/ifcfg-eth1 > /tmp/vagra
DEBUG ssh: Exit status: 0
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute: cat /tmp/vagrant-ifcfg-eth1 > /etc/sysconfig/network-scripts/ifcfg-eth1 (sudo=true)
DEBUG ssh: Exit status: 0
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute: rm -f /tmp/vagrant-ifcfg-eth1 (sudo=true)
DEBUG ssh: Exit status: 0
DEBUG ssh: Uploading: /tmp/vagrant20150223-16460-mrjpu to /tmp/vagrant-network-entry_1
DEBUG ssh: Re-using SSH connection.
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute: /sbin/ifdown eth1 2> /dev/null (sudo=true)
DEBUG ssh: stdout: ERROR    : [ipv6_test_device_status] Missing parameter 'device' (arg 1)

DEBUG ssh: Exit status: 0
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute: cat /tmp/vagrant-network-entry_1 >> /etc/sysconfig/network-scripts/ifcfg-eth1 (sudo=true)
DEBUG ssh: Exit status: 0
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute: ARPCHECK=no /sbin/ifup eth1 2> /dev/null (sudo=true)
DEBUG ssh: Exit status: 0
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute: rm -f /tmp/vagrant-network-entry_1 (sudo=true)
DEBUG ssh: Exit status: 0

This sounds very similar to what was reported and discussed in Vagrant a long
time ago: hashicorp/vagrant#921

obnoxxx (Contributor, Author) commented Feb 23, 2015

More info:

Strangely, if I ifdown eth1, systemctl restart NetworkManager, and then ifup eth1, the dhclient disappears and the network interface is configured as desired.

When I vagrant halt ; vagrant up, the eth1 interface is not started. I have to manually ifup eth1 in the VM for it to get an IP address.
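
In other words, the manual workaround inside the guest boils down to the three steps described above:

# run inside the guest; releases the dhclient-assigned address and
# re-applies the static configuration from ifcfg-eth1
sudo ifdown eth1
sudo systemctl restart NetworkManager
sudo ifup eth1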

purpleidea (Contributor) commented:

A few issues:

You need to specify which versions of things you're using:

  • vagrant-libvirt version (and if fedora RPM or not)
  • vagrant version (and which RPM)
  • os version

and, as we discussed offline, whether you've been able to reproduce the issue using a stock https://github.com/purpleidea/oh-my-vagrant configuration.

obnoxxx (Contributor, Author) commented Feb 24, 2015

As written in the initial comment:

  • latest vagrant (fedora RPM)
  • latest vagrant-libvirt (fedora RPM)
  • fedora 21

I tested with omv, but then I realized why I had not tested it before:
as described above, my libvirt host-only network has DHCP enabled for
the upper half of the address range, but I am specifying an IP address
in my Vagrantfile (in your case through omv.yaml).
Here is my omv.yaml:

---
:domain: example.com
:network: 172.20.10.40/24
:image: local-fedora-21.2
:boxurlprefix: ''
:sync: rsync
:folder: ''
:extern: []
:puppet: false
:classes: []
:docker: false
:cachier: false
:vms: []
:namespace: omv
:count: 1
:username: ''
:password: ''
:poolid: []
:repos: []

Just like your omv does, I tried to disable DHCP in the Vagrantfile for the private network by setting
dhcp_enabled to false. But then vagrant refuses to start the VM:

$ vagrant up
Warning: Can't to load 'xdg' gem - check local gem path
/home/obnox/vagrant/oh-my-vagrant/vagrant/gems/xdg/lib
Bringing machine 'omv1' up with 'libvirt' provider...
==> omv1: Creating image (snapshot of base box volume).
==> omv1: Creating domain with the following settings...
==> omv1:  -- Name:              omv_omv1
==> omv1:  -- Domain type:       kvm
==> omv1:  -- Cpus:              1
==> omv1:  -- Memory:            512M
==> omv1:  -- Base box:          local-fedora-21.2
==> omv1:  -- Storage pool:      default
==> omv1:  -- Image:             /var/lib/libvirt/images/omv_omv1.img
==> omv1:  -- Volume Cache:      default
==> omv1:  -- Kernel:            
==> omv1:  -- Initrd:            
==> omv1:  -- Graphics Type:     vnc
==> omv1:  -- Graphics Port:     5900
==> omv1:  -- Graphics IP:       127.0.0.1
==> omv1:  -- Graphics Password: Not defined
==> omv1:  -- Video Type:        cirrus
==> omv1:  -- Video VRAM:        9216
==> omv1:  -- Command line : 
Network internal1 exists but does not have dhcp disabled.
Please fix your configuration and run vagrant again.

So I don't even get to the point where the machine is started.

I also think that running with a completely minimal Vagrantfile is a better test
for bugs than running with omv.

According to my analysis above, vagrant-libvirt tries to do the
right thing with ifcfg-eth1, but the ifdown that is done before
the modified config is copied into place fails, and the subsequent
ifup leaves dhclient running. Restarting NetworkManager helps.
Even more puzzling, a vagrant up after a vagrant halt does not
bring the interface up at all.

purpleidea (Contributor) commented:

This might be related to the bug I'm seeing in: https://bugzilla.redhat.com/show_bug.cgi?id=1221006

Momus commented Jun 3, 2015

Having a similar problem, except I don't see any evidence that Vagrant is even trying to configure my network from the Vagrantfile. No matter how I specify the network settings, they all seem to be ignored when I 'vagrant up', and DHCP is used to configure a single eth0 interface. I've tried this with public and private networks.
vagrant-libvirt is installed as a Vagrant plugin; Vagrant is installed from the Vagrant site, not from the Fedora repo.
Installed Version: 1.7.2
Latest Version: 1.7.2

My Vagrantfile:
http://pastebin.com/NARrMGTi
The long debug output from 'vagrant up' (this was from a run where I did not have the synced folders disabled, hence the timeout on the NFS mount):
http://pastebin.com/DjPzA8AX

purpleidea (Contributor) commented:

@Momus Small unrelated note:

You have config.vm.synced_folder nested inside the vm block. This will parse later, but run first. Think of it as a strange global. Better to switch to namespacing inside your vm block, or to move it outside, so you don't go crazy debugging things.

HTH. Maybe you can figure out our networking issues :)

Momus commented Jun 3, 2015

@purpleidea Thanks for catching that. Unfortunately, that wound up being placed there when I was trying to clean up the file for Pastebin, and fixing it had no effect (I just checked). OTOH, 'vagrant up' with libvirt is really fast when it doesn't have anything to hang it.

uvsmtid commented Sep 27, 2015

I experienced the same issue using:

  • host machine: Fedora 22
  • guest machine: CentOS 7
  • vagrant: 1.7.2
  • vagrant-libvirt: 0.0.26

After extensive testing I realized that the weird behaviour (double IP addresses when DHCP is on) exists only immediately after the first vagrant up. After turning the VM off and starting it again, I get the single required IP.

It is not ideal, but it may be acceptable as a workaround.

dnsmasq/DHCP residual leases

When you don't get the required IP at all...

If the virtual networks have already been created, dnsmasq keeps the IP lease status in per-interface files:

/var/lib/libvirt/dnsmasq/*.status

Therefore, when the Vagrantfile is reconfigured with a new IP address, a newly created VM will still get its old IP from dnsmasq over DHCP.

To avoid issues with residual IP leases, you have to remove the virtual networks in one of two ways:

  • Manually: Virtual Machine Manager => menu Edit => Connection Details => tab Virtual Networks (or with virsh; see the sketch below).
  • Automatically: Vagrant automatically removes the networks it created once all Vagrant-created VMs are destroyed.
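
If you prefer the command line to the Virtual Machine Manager GUI, a rough equivalent of the manual cleanup might look like this; virsh is assumed to be available, and vagrant_internal_net stands in for whatever your network is called (it is the name used in the Vagrantfile later in this thread):

# inspect residual dnsmasq leases kept per libvirt network
sudo sh -c 'cat /var/lib/libvirt/dnsmasq/*.status'

# stop and remove the stale network so it is recreated on the next vagrant up
sudo virsh net-destroy vagrant_internal_net
sudo virsh net-undefine vagrant_internal_net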

Notes for troubleshooting

  • Graphical utilities related to NetworkManager do not show all IP addresses (Vagrant disables NM for its interfaces) - use command-line tools (e.g. ip addr) to see the truth.
  • Vagrant configures virtual networks only when they do not exist. Remove any existing ones before vagrant up is run for the first time (see above).
  • The network is configured in each VM (multiple times if there are multiple VMs) - test with a single VM first, and also destroy powered-off VMs, even those not created by Vagrant, to make sure they don't keep previously configured networks from being destroyed as "used". Otherwise keeping all network configurations consistent is difficult.
  • I also use the :mac option of vagrant-libvirt - this makes interface identities explicit rather than relying on the (unreliable) listing order of interfaces on host and guest.

infernix (Member) commented:

AFAIK there were some changes in vagrant 1.8.1 that affect the config snippets that Vagrant uses to configure interfaces in guests. Unfortunately there is no fully working Vagrantfile here (I could not find that opscode freebsd 10.1 libvirt box anywhere).

If this issue still persists, please retest under vagrant 1.8.1 and vagrant-libvirt 0.0.33, and if it still persists please post:

  • Full Vagrantfile with a valid box (either provide a URL or, preferably, use an Atlas box)
  • Complete output of VAGRANT_LOG=debug vagrant up (for example as shown below)
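
One possible way to capture that output: Vagrant writes its debug log to stderr, which is why the report below has separate STDOUT and STDERR sections. The file names here are only examples:

VAGRANT_LOG=debug vagrant up > vagrant-up.stdout.log 2> vagrant-up.stderr.log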

uvsmtid commented Jun 22, 2016

As requested in the previous post, I reproduced the issue on the following system:

  • host machine: Fedora 23 4.5.5-201.fc23.x86_64
  • vagrant: 1.8.1
  • vagrant-libvirt: 0.0.32 [version 0.0.33 tested separately]
  • guest: CentOS-7.1-1503 [see Vagrantfile below]

The RPMs above are from the default Fedora 23 repository.

To reiterate, the problem is that running vagrant up does not set the guest's fixed IP address on the private network as specified in the Vagrantfile. The IP address appears to be randomly selected when vagrant up is run on a specific host machine. However, if multiple vagrant up/vagrant destroy/vagrant up/... cycles are executed, the same guest IP is re-used (not sure where it is "cached").

See below:

  • Problematic Vagrantfile
  • Host network interfaces
  • Guest network interfaces
  • vagrant up STDOUT
  • vagrant up STDERR

Problematic Vagrantfile

# Vagrant file to test this issue:
#   https://github.com/vagrant-libvirt/vagrant-libvirt/issues/312

VAGRANTFILE_API_VERSION = "2"

Vagrant.configure(VAGRANTFILE_API_VERSION) do |config|

  config.vm.provider "libvirt"
  config.vm.provider :libvirt do |libvirt|

      libvirt.video_type = 'qxl'

  end

  config.vm.box_download_insecure = true

  config.vm.define "observer_client_1" do |observer_client_1|

    observer_client_1.vm.box = "uvsmtid/centos-7.1-1503-gnome"

    # See libvirt configuration:
    #   https://github.com/pradels/vagrant-libvirt
    observer_client_1.vm.provider :libvirt do |observer_client_1_domain|
        observer_client_1_domain.memory = 2048
        observer_client_1_domain.cpus = 2
    end

    observer_client_1.vm.provision "shell", inline: "true"

    # Based on the Vagrant explanation, in the future they may support a
    # provider per VM. At the moment, it can only be configured for the
    # whole set of VMs (outside of the individual configuration).
    #observer_client_1.vm.provider = "libvirt"

    observer_client_1.vm.network :private_network,
        :ip => '192.168.1.3',
        :libvirt__network_name => 'vagrant_internal_net',
        :mac => 'FA:16:3E:3D:C8:77',
        :libvirt__netmask => '255.255.255.0',
        :libvirt__forward_mode => 'nat',

        # Use DHCP to offer addresses, to avoid overly long initialization
        # of network interfaces during the first boot (before the static IP
        # is configured by Vagrant) on some OSes.
        # NOTE: At the time of writing, the IP range for the DHCP server was
        #       not configurable, so we hope that there will be no conflicts
        #       with the statically assigned IP addresses.
        :libvirt__dhcp_enabled => true,

        # A syntactic dummy so that every option above can end with a comma `,`.
        :whatever => true

  end

end

Host network interfaces

1: lo: <LOOPBACK,UP,LOWER_UP> mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1
    link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00
    inet 127.0.0.1/8 scope host lo
       valid_lft forever preferred_lft forever
    inet6 ::1/128 scope host 
       valid_lft forever preferred_lft forever
2: eno1: <BROADCAST,MULTICAST,UP,LOWER_UP> mtu 1500 qdisc fq_codel state UP group default qlen 1000
    link/ether a0:48:1c:92:83:61 brd ff:ff:ff:ff:ff:ff
    inet 10.77.4.116/24 brd 10.77.4.255 scope global dynamic eno1
       valid_lft 27253sec preferred_lft 27253sec
    inet6 fe80::a248:1cff:fe92:8361/64 scope link 
       valid_lft forever preferred_lft forever
65: virbr0: <BROADCAST,MULTICAST,UP,LOWER_UP> mtu 1500 qdisc noqueue state UP group default qlen 1000
    link/ether 52:54:00:4e:80:a1 brd ff:ff:ff:ff:ff:ff
    inet 192.168.121.1/24 brd 192.168.121.255 scope global virbr0
       valid_lft forever preferred_lft forever
66: virbr0-nic: <BROADCAST,MULTICAST> mtu 1500 qdisc fq_codel master virbr0 state DOWN group default qlen 1000
    link/ether 52:54:00:4e:80:a1 brd ff:ff:ff:ff:ff:ff
67: virbr1: <BROADCAST,MULTICAST,UP,LOWER_UP> mtu 1500 qdisc noqueue state UP group default qlen 1000
    link/ether 52:54:00:a0:e1:7b brd ff:ff:ff:ff:ff:ff
    inet 192.168.1.1/24 brd 192.168.1.255 scope global virbr1
       valid_lft forever preferred_lft forever
68: virbr1-nic: <BROADCAST,MULTICAST> mtu 1500 qdisc fq_codel master virbr1 state DOWN group default qlen 1000
    link/ether 52:54:00:a0:e1:7b brd ff:ff:ff:ff:ff:ff
69: vnet0: <BROADCAST,MULTICAST,UP,LOWER_UP> mtu 1500 qdisc fq_codel master virbr0 state UNKNOWN group default qlen 1000
    link/ether fe:54:00:22:48:a1 brd ff:ff:ff:ff:ff:ff
    inet6 fe80::fc54:ff:fe22:48a1/64 scope link 
       valid_lft forever preferred_lft forever
70: vnet1: <BROADCAST,MULTICAST,UP,LOWER_UP> mtu 1500 qdisc fq_codel master virbr1 state UNKNOWN group default qlen 1000
    link/ether fe:16:3e:3d:c8:77 brd ff:ff:ff:ff:ff:ff
    inet6 fe80::fc16:3eff:fe3d:c877/64 scope link 
       valid_lft forever preferred_lft forever

Guest network interfaces

1: lo: <LOOPBACK,UP,LOWER_UP> mtu 65536 qdisc noqueue state UNKNOWN 
    link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00
    inet 127.0.0.1/8 scope host lo
       valid_lft forever preferred_lft forever
    inet6 ::1/128 scope host 
       valid_lft forever preferred_lft forever
2: eth0: <BROADCAST,MULTICAST,UP,LOWER_UP> mtu 1500 qdisc pfifo_fast state UP qlen 1000
    link/ether 52:54:00:22:48:a1 brd ff:ff:ff:ff:ff:ff
    inet 192.168.121.155/24 brd 192.168.121.255 scope global dynamic eth0
       valid_lft 3574sec preferred_lft 3574sec
    inet6 fe80::5054:ff:fe22:48a1/64 scope link 
       valid_lft forever preferred_lft forever
3: eth1: <BROADCAST,MULTICAST,UP,LOWER_UP> mtu 1500 qdisc pfifo_fast state UP qlen 1000
    link/ether fa:16:3e:3d:c8:77 brd ff:ff:ff:ff:ff:ff
    inet 192.168.1.68/24 brd 192.168.1.255 scope global dynamic eth1
       valid_lft 3577sec preferred_lft 3577sec
    inet6 fe80::f816:3eff:fe3d:c877/64 scope link 
       valid_lft forever preferred_lft forever

vagrant up STDOUT

Bringing machine 'observer_client_1' up with 'libvirt' provider...
==> observer_client_1: Creating image (snapshot of base box volume).
==> observer_client_1: Creating domain with the following settings...
==> observer_client_1:  -- Name:              vagrantissuedir_observer_client_1
==> observer_client_1:  -- Domain type:       kvm
==> observer_client_1:  -- Cpus:              2
==> observer_client_1:  -- Memory:            2048M
==> observer_client_1:  -- Management MAC:    
==> observer_client_1:  -- Loader:            
==> observer_client_1:  -- Base box:          uvsmtid/centos-7.1-1503-gnome
==> observer_client_1:  -- Storage pool:      default
==> observer_client_1:  -- Image:             /var/lib/libvirt/images/vagrantissuedir_observer_client_1.img (138G)
==> observer_client_1:  -- Volume Cache:      default
==> observer_client_1:  -- Kernel:            
==> observer_client_1:  -- Initrd:            
==> observer_client_1:  -- Graphics Type:     vnc
==> observer_client_1:  -- Graphics Port:     5900
==> observer_client_1:  -- Graphics IP:       127.0.0.1
==> observer_client_1:  -- Graphics Password: Not defined
==> observer_client_1:  -- Video Type:        qxl
==> observer_client_1:  -- Video VRAM:        9216
==> observer_client_1:  -- Keymap:            en-us
==> observer_client_1:  -- INPUT:             type=mouse, bus=ps2
==> observer_client_1:  -- Command line : 
==> observer_client_1: Creating shared folders metadata...
==> observer_client_1: Starting domain.
==> observer_client_1: Waiting for domain to get an IP address...
==> observer_client_1: Waiting for SSH to become available...
    observer_client_1: 
    observer_client_1: Vagrant insecure key detected. Vagrant will automatically replace
    observer_client_1: this with a newly generated keypair for better security.
    observer_client_1: 
    observer_client_1: Inserting generated public key within guest...
    observer_client_1: Removing insecure key from the guest if it's present...
    observer_client_1: Key inserted! Disconnecting and reconnecting using new SSH key...
==> observer_client_1: Configuring and enabling network interfaces...
==> observer_client_1: Rsyncing folder: /mnt/backup/home/uvsmtid/vagrant.issue.dir/ => /vagrant
==> observer_client_1: Running provisioner: shell...
    observer_client_1: Running: inline script

vagrant up STDERR

 INFO global: Vagrant version: 1.8.1
 INFO global: Ruby version: 2.2.5
 INFO global: RubyGems version: 2.4.8
 INFO global: VAGRANT_DEFAULT_PROVIDER="libvirt"
 INFO global: VAGRANT_EXECUTABLE="/usr/share/vagrant/bin/vagrant"
 INFO global: VAGRANT_LOG="debug"
 INFO global: VAGRANT_INSTALLER_EMBEDDED_DIR="/var/lib/vagrant"
 INFO global: VAGRANT_INSTALLER_VERSION="2"
 INFO global: VAGRANT_INTERNAL_BUNDLERIZED="1"
 INFO global: VAGRANT_DETECTED_OS="Linux"
 INFO global: VAGRANT_INSTALLER_ENV="1"
 INFO global: Plugins:
 INFO global:   - builder = 3.2.2
 INFO global:   - bundler = 1.7.8
 INFO global:   - excon = 0.45.1
 INFO global:   - formatador = 0.2.4
 INFO global:   - mime-types = 1.25.1
 INFO global:   - net-ssh = 2.9.1
 INFO global:   - net-scp = 1.2.1
 INFO global:   - fog-core = 1.29.0
 INFO global:   - multi_json = 1.10.1
 INFO global:   - fog-json = 1.0.0
 INFO global:   - nokogiri = 1.6.7.2
 INFO global:   - fog-xml = 0.1.1
 INFO global:   - json = 1.8.3
 INFO global:   - ruby-libvirt = 0.6.0
 INFO global:   - fog-libvirt = 0.0.3
 INFO global:   - vagrant-libvirt = 0.0.32
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/pushes/ftp/plugin.rb
 INFO manager: Registered plugin: ftp
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/pushes/atlas/plugin.rb
 INFO manager: Registered plugin: atlas
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/pushes/noop/plugin.rb
 INFO manager: Registered plugin: noop
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/pushes/local-exec/plugin.rb
 INFO manager: Registered plugin: local-exec
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/pushes/heroku/plugin.rb
 INFO manager: Registered plugin: heroku
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/providers/docker/plugin.rb
 INFO manager: Registered plugin: docker-provider
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/providers/virtualbox/plugin.rb
 INFO manager: Registered plugin: VirtualBox provider
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/providers/hyperv/plugin.rb
 INFO manager: Registered plugin: Hyper-V provider
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/kernel_v1/plugin.rb
 INFO manager: Registered plugin: kernel
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/kernel_v2/plugin.rb
 INFO manager: Registered plugin: kernel
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/provisioners/puppet/plugin.rb
 INFO manager: Registered plugin: puppet
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/provisioners/chef/plugin.rb
 INFO manager: Registered plugin: chef
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/provisioners/file/plugin.rb
 INFO manager: Registered plugin: file
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/provisioners/salt/plugin.rb
 INFO manager: Registered plugin: salt
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/provisioners/ansible/plugin.rb
 INFO manager: Registered plugin: ansible
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/provisioners/shell/plugin.rb
 INFO manager: Registered plugin: shell
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/provisioners/docker/plugin.rb
 INFO manager: Registered plugin: docker
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/provisioners/cfengine/plugin.rb
 INFO manager: Registered plugin: CFEngine Provisioner
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/communicators/winrm/plugin.rb
 INFO manager: Registered plugin: winrm communicator
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/communicators/ssh/plugin.rb
 INFO manager: Registered plugin: ssh communicator
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/guests/openbsd/plugin.rb
 INFO manager: Registered plugin: OpenBSD guest
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/guests/tinycore/plugin.rb
 INFO manager: Registered plugin: TinyCore Linux guest.
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/guests/gentoo/plugin.rb
 INFO manager: Registered plugin: Gentoo guest
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/guests/arch/plugin.rb
 INFO manager: Registered plugin: Arch guest
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/guests/coreos/plugin.rb
 INFO manager: Registered plugin: CoreOS guest
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/guests/nixos/plugin.rb
 INFO manager: Registered plugin: NixOS guest
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/guests/esxi/plugin.rb
 INFO manager: Registered plugin: ESXi guest.
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/guests/linux/plugin.rb
 INFO manager: Registered plugin: Linux guest.
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/guests/solaris/plugin.rb
 INFO manager: Registered plugin: Solaris guest.
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/guests/solaris11/plugin.rb
 INFO manager: Registered plugin: Solaris 11 guest.
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/guests/debian/plugin.rb
 INFO manager: Registered plugin: Debian guest
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/guests/netbsd/plugin.rb
 INFO manager: Registered plugin: NetBSD guest
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/guests/omnios/plugin.rb
 INFO manager: Registered plugin: OmniOS guest.
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/guests/ubuntu/plugin.rb
 INFO manager: Registered plugin: Ubuntu guest
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/guests/mint/plugin.rb
 INFO manager: Registered plugin: Mint guest
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/guests/darwin/plugin.rb
 INFO manager: Registered plugin: Darwin guest
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/guests/photon/plugin.rb
 INFO manager: Registered plugin: VMware Photon guest
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/guests/atomic/plugin.rb
 INFO manager: Registered plugin: Atomic Host guest
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/guests/pld/plugin.rb
 INFO manager: Registered plugin: PLD Linux guest
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/guests/smartos/plugin.rb
 INFO manager: Registered plugin: SmartOS guest.
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/guests/redhat/plugin.rb
 INFO manager: Registered plugin: Red Hat Enterprise Linux guest
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/guests/freebsd/plugin.rb
 INFO manager: Registered plugin: FreeBSD guest
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/guests/slackware/plugin.rb
 INFO manager: Registered plugin: Slackware guest
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/guests/fedora/plugin.rb
 INFO manager: Registered plugin: Fedora guest
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/guests/funtoo/plugin.rb
 INFO manager: Registered plugin: Funtoo guest
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/guests/suse/plugin.rb
 INFO manager: Registered plugin: SUSE guest
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/guests/windows/plugin.rb
 INFO manager: Registered plugin: Windows guest.
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/commands/suspend/plugin.rb
 INFO manager: Registered plugin: suspend command
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/commands/package/plugin.rb
 INFO manager: Registered plugin: package command
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/commands/push/plugin.rb
 INFO manager: Registered plugin: push command
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/commands/status/plugin.rb
 INFO manager: Registered plugin: status command
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/commands/port/plugin.rb
 INFO manager: Registered plugin: port command
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/commands/box/plugin.rb
 INFO manager: Registered plugin: box command
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/commands/login/plugin.rb
 INFO manager: Registered plugin: vagrant-login
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/commands/resume/plugin.rb
 INFO manager: Registered plugin: resume command
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/commands/destroy/plugin.rb
 INFO manager: Registered plugin: destroy command
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/commands/powershell/plugin.rb
 INFO manager: Registered plugin: powershell command
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/commands/ssh_config/plugin.rb
 INFO manager: Registered plugin: ssh-config command
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/commands/ssh/plugin.rb
 INFO manager: Registered plugin: ssh command
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/commands/rdp/plugin.rb
 INFO manager: Registered plugin: rdp command
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/commands/cap/plugin.rb
 INFO manager: Registered plugin: cap command
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/commands/halt/plugin.rb
 INFO manager: Registered plugin: halt command
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/commands/list-commands/plugin.rb
 INFO manager: Registered plugin: list-commands command
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/commands/provision/plugin.rb
 INFO manager: Registered plugin: provision command
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/commands/version/plugin.rb
 INFO manager: Registered plugin: version command
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/commands/up/plugin.rb
 INFO manager: Registered plugin: up command
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/commands/reload/plugin.rb
 INFO manager: Registered plugin: reload command
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/commands/init/plugin.rb
 INFO manager: Registered plugin: init command
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/commands/provider/plugin.rb
 INFO manager: Registered plugin: provider command
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/commands/global-status/plugin.rb
 INFO manager: Registered plugin: global-status command
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/commands/snapshot/plugin.rb
 INFO manager: Registered plugin: snapshot command
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/commands/help/plugin.rb
 INFO manager: Registered plugin: help command
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/commands/plugin/plugin.rb
 INFO manager: Registered plugin: plugin command
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/hosts/bsd/plugin.rb
 INFO manager: Registered plugin: BSD host
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/hosts/gentoo/plugin.rb
 INFO manager: Registered plugin: Gentoo host
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/hosts/arch/plugin.rb
 INFO manager: Registered plugin: Arch host
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/hosts/linux/plugin.rb
 INFO manager: Registered plugin: Linux host
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/hosts/null/plugin.rb
 INFO manager: Registered plugin: null host
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/hosts/darwin/plugin.rb
 INFO manager: Registered plugin: Mac OS X host
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/hosts/redhat/plugin.rb
 INFO manager: Registered plugin: Red Hat Enterprise Linux host
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/hosts/freebsd/plugin.rb
 INFO manager: Registered plugin: FreeBSD host
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/hosts/slackware/plugin.rb
 INFO manager: Registered plugin: Slackware host
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/hosts/suse/plugin.rb
 INFO manager: Registered plugin: SUSE host
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/hosts/windows/plugin.rb
 INFO manager: Registered plugin: Windows host
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/synced_folders/rsync/plugin.rb
 INFO manager: Registered plugin: RSync synced folders
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/synced_folders/smb/plugin.rb
 INFO manager: Registered plugin: SMB synced folders
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/synced_folders/nfs/plugin.rb
 INFO manager: Registered plugin: NFS synced folders
 INFO global: Loading plugins!
 INFO manager: Registered plugin: libvirt
 INFO vagrant: `vagrant` invoked: ["up"]
DEBUG vagrant: Creating Vagrant environment
 INFO environment: Environment initialized (#<Vagrant::Environment:0x00562ce142d880>)
 INFO environment:   - cwd: /mnt/backup/home/uvsmtid/vagrant.issue.dir
 INFO environment: Home path: /home/uvsmtid/.vagrant.d
 INFO environment: Local data path: /mnt/backup/home/uvsmtid/vagrant.issue.dir/.vagrant
DEBUG environment: Creating: /mnt/backup/home/uvsmtid/vagrant.issue.dir/.vagrant
 INFO environment: Running hook: environment_plugins_loaded
 INFO runner: Preparing hooks for middleware sequence...
 INFO runner: 2 hooks defined.
 INFO runner: Running action: environment_plugins_loaded #<Vagrant::Action::Builder:0x00562ce12cfa88>
 INFO environment: Running hook: environment_load
 INFO runner: Preparing hooks for middleware sequence...
 INFO runner: 2 hooks defined.
 INFO runner: Running action: environment_load #<Vagrant::Action::Builder:0x00562ce0ba7580>
 INFO cli: CLI: [] "up" []
DEBUG cli: Invoking command class: VagrantPlugins::CommandUp::Command []
DEBUG command: 'Up' each target VM...
 INFO loader: Set :root = ["#<Pathname:/mnt/backup/home/uvsmtid/vagrant.issue.dir/Vagrantfile>"]
DEBUG loader: Populating proc cache for #<Pathname:/mnt/backup/home/uvsmtid/vagrant.issue.dir/Vagrantfile>
DEBUG loader: Load procs for pathname: /mnt/backup/home/uvsmtid/vagrant.issue.dir/Vagrantfile
 INFO loader: Loading configuration in order: [:home, :root]
DEBUG loader: Loading from: root (evaluating)
DEBUG loader: Configuration loaded successfully, finalizing and returning
DEBUG push: finalizing
 INFO host: Autodetecting host type for [#<Vagrant::Environment: /mnt/backup/home/uvsmtid/vagrant.issue.dir>]
DEBUG host: Trying: gentoo
DEBUG host: Trying: arch
DEBUG host: Trying: darwin
DEBUG host: Trying: redhat
 INFO host: Detected: redhat!
DEBUG host: Searching for cap: provider_install_libvirt
DEBUG host: Checking in: redhat
DEBUG host: Checking in: linux
DEBUG command: Getting target VMs for command. Arguments:
DEBUG command:  -- names: ["observer_client_1"]
DEBUG command:  -- options: {:provider=>nil}
DEBUG command: Finding machine that match name: observer_client_1
 INFO environment: Getting machine: observer_client_1 (libvirt)
 INFO environment: Uncached load of machine.
 INFO loader: Set "47375378870040_machine_observer_client_1" = ["[\"2\", #<Proc:0x00562ce12ddf98@/mnt/backup/home/uvsmtid/vagrant.issue.dir/Vagrantfile:17>]"]
DEBUG loader: Populating proc cache for ["2", #<Proc:0x00562ce12ddf98@/mnt/backup/home/uvsmtid/vagrant.issue.dir/Vagrantfile:17>]
 INFO loader: Loading configuration in order: [:home, :root, "47375378870040_machine_observer_client_1"]
DEBUG loader: Loading from: root (cache)
DEBUG loader: Loading from: 47375378870040_machine_observer_client_1 (evaluating)
DEBUG provisioner: Provisioner defined: 
DEBUG loader: Configuration loaded successfully, finalizing and returning
DEBUG push: finalizing
 INFO box_collection: Box found: uvsmtid/centos-7.1-1503-gnome (libvirt)
 INFO environment: Running hook: authenticate_box_url
 INFO runner: Preparing hooks for middleware sequence...
 INFO runner: 3 hooks defined.
 INFO runner: Running action: authenticate_box_url #<Vagrant::Action::Builder:0x00562ce10959e8>
 INFO warden: Calling IN action: #<VagrantPlugins::LoginCommand::AddAuthentication:0x00562ce1954570>
DEBUG client: No authentication token in environment or /home/uvsmtid/.vagrant.d/data/vagrant_login_token
 INFO warden: Calling OUT action: #<VagrantPlugins::LoginCommand::AddAuthentication:0x00562ce1954570>
 INFO loader: Set :"47375369586780_uvsmtid/centos-7.1-1503-gnome_libvirt" = ["#<Pathname:/home/uvsmtid/.vagrant.d/boxes/uvsmtid-VAGRANTSLASH-centos-7.1-1503-gnome/1.0.1/libvirt/Vagrantfile>"]
DEBUG loader: Populating proc cache for #<Pathname:/home/uvsmtid/.vagrant.d/boxes/uvsmtid-VAGRANTSLASH-centos-7.1-1503-gnome/1.0.1/libvirt/Vagrantfile>
DEBUG loader: Load procs for pathname: /home/uvsmtid/.vagrant.d/boxes/uvsmtid-VAGRANTSLASH-centos-7.1-1503-gnome/1.0.1/libvirt/Vagrantfile
 INFO loader: Loading configuration in order: [:"47375369586780_uvsmtid/centos-7.1-1503-gnome_libvirt", :home, :root, "47375378870040_machine_observer_client_1"]
DEBUG loader: Loading from: 47375369586780_uvsmtid/centos-7.1-1503-gnome_libvirt (evaluating)
DEBUG loader: Loading from: root (cache)
DEBUG loader: Loading from: 47375378870040_machine_observer_client_1 (cache)
DEBUG loader: Configuration loaded successfully, finalizing and returning
DEBUG push: finalizing
 INFO machine: Initializing machine: observer_client_1
 INFO machine:   - Provider: VagrantPlugins::ProviderLibvirt::Provider
 INFO machine:   - Box: #<Vagrant::Box:0x00562ce1350610>
 INFO machine:   - Data dir: /mnt/backup/home/uvsmtid/vagrant.issue.dir/.vagrant/machines/observer_client_1/libvirt
 INFO machine: New machine ID: nil
 INFO interface: Machine: metadata ["provider", :libvirt, {:target=>:observer_client_1}]
 INFO command: With machine: observer_client_1 (#<VagrantPlugins::ProviderLibvirt::Provider:0x00562ce1c83d90 @machine=#<Vagrant::Machine: observer_client_1 (VagrantPlugins::ProviderLibvirt::Provider)>, @cap_logger=#<Log4r::Logger:0x00562ce1c83750 @fullname="vagrant::capability_host::vagrantplugins::providerlibvirt::provider", @outputters=[], @additive=true, @name="provider", @path="vagrant::capability_host::vagrantplugins::providerlibvirt", @parent=#<Log4r::Logger:0x00562ce0a8c448 @fullname="vagrant", @outputters=[#<Log4r::StderrOutputter:0x00562ce0a0af38 @mon_owner=nil, @mon_count=0, @mon_mutex=#<Mutex:0x00562ce0a0ad30>, @name="stderr", @level=0, @formatter=#<Log4r::DefaultFormatter:0x00562ce0a06000 @depth=7>, @out=#<IO:<STDERR>>>], @additive=true, @name="vagrant", @path="", @parent=#<Log4r::RootLogger:0x00562ce0a8c218 @level=0, @outputters=[]>, @level=1, @trace=false>, @level=1, @trace=false>, @cap_host_chain=[[:libvirt, #<#<Class:0x00562ce1c83d18>:0x00562ce1c66290>]], @cap_args=[#<Vagrant::Machine: observer_client_1 (VagrantPlugins::ProviderLibvirt::Provider)>], @cap_caps={:docker=>#<Vagrant::Registry:0x00562ce1c83bb0 @items={:public_address=>#<Proc:0x00562ce108a7f0@/usr/share/vagrant/plugins/providers/docker/plugin.rb:54>, :proxy_machine=>#<Proc:0x00562ce108a6b0@/usr/share/vagrant/plugins/providers/docker/plugin.rb:59>}, @results_cache={}>, :virtualbox=>#<Vagrant::Registry:0x00562ce1c83ac0 @items={:forwarded_ports=>#<Proc:0x00562ce1094b38@/usr/share/vagrant/plugins/providers/virtualbox/plugin.rb:27>, :nic_mac_addresses=>#<Proc:0x00562ce1094a98@/usr/share/vagrant/plugins/providers/virtualbox/plugin.rb:32>, :public_address=>#<Proc:0x00562ce1094a70@/usr/share/vagrant/plugins/providers/virtualbox/plugin.rb:37>, :snapshot_list=>#<Proc:0x00562ce1094a48@/usr/share/vagrant/plugins/providers/virtualbox/plugin.rb:42>}, @results_cache={}>, :hyperv=>#<Vagrant::Registry:0x00562ce1c83a20 @items={:public_address=>#<Proc:0x00562ce109c400@/usr/share/vagrant/plugins/providers/hyperv/plugin.rb:25>}, @results_cache={}>, :libvirt=>#<Vagrant::Registry:0x00562ce1c83958 @items={:nic_mac_addresses=>#<Proc:0x00562ce13382e0@/usr/share/vagrant/gems/gems/vagrant-libvirt-0.0.32/lib/vagrant-libvirt/plugin.rb:41>}, @results_cache={}>}>)
 INFO interface: info: Bringing machine 'observer_client_1' up with 'libvirt' provider...
 INFO batch_action: Enabling parallelization by default.
 INFO batch_action: Disabling parallelization because only executing one action
 INFO batch_action: Batch action will parallelize: false
 INFO batch_action: Starting action: #<Vagrant::Machine:0x00562ce1cc69d8> up {:destroy_on_error=>true, :install_provider=>true, :parallel=>true, :provision_ignore_sentinel=>false, :provision_types=>nil}
 INFO machine: Calling action: up on provider Libvirt (new)
DEBUG environment: Attempting to acquire process-lock: machine-action-3f82de25c284b1371becb8a838898b3c
DEBUG environment: Attempting to acquire process-lock: dotlock
 INFO environment: Acquired process lock: dotlock
 INFO environment: Released process lock: dotlock
 INFO environment: Acquired process lock: machine-action-3f82de25c284b1371becb8a838898b3c
 INFO interface: Machine: action ["up", "start", {:target=>:observer_client_1}]
 INFO runner: Preparing hooks for middleware sequence...
 INFO runner: 2 hooks defined.
 INFO runner: Running action: machine_action_up #<Vagrant::Action::Builder:0x00562ce1b891b0>
 INFO warden: Calling IN action: #<Vagrant::Action::Builtin::ConfigValidate:0x00562ce1b6c808>
 INFO warden: Calling IN action: #<Vagrant::Action::Builtin::Call:0x00562ce1b6c7e0>
 INFO runner: Preparing hooks for middleware sequence...
 INFO runner: 2 hooks defined.
 INFO runner: Running action: machine_action_up #<Vagrant::Action::Builder:0x00562ce1af93d0>
 INFO warden: Calling IN action: #<VagrantPlugins::ProviderLibvirt::Action::IsCreated:0x00562ce1aed9e0>
 INFO warden: Calling OUT action: #<VagrantPlugins::ProviderLibvirt::Action::IsCreated:0x00562ce1aed9e0>
 INFO driver: Connecting to Libvirt (qemu:///system?no_verify=1&keyfile=/home/uvsmtid/.ssh/id_rsa) ...
 INFO runner: Preparing hooks for middleware sequence...
 INFO runner: 2 hooks defined.
 INFO runner: Running action: machine_action_up #<Vagrant::Action::Warden:0x00562ce136fdf8>
 INFO warden: Calling IN action: #<Proc:0x00562ce19aebb0@/usr/share/vagrant/lib/vagrant/action/warden.rb:94 (lambda)>
 INFO warden: Calling IN action: #<VagrantPlugins::ProviderLibvirt::Action::SetNameOfDomain:0x00562ce136fd08>
 INFO set_name_of_domain: Looking for domain vagrantissuedir_observer_client_1 through list [  <Fog::Compute::Libvirt::Server
    id="2229aace-8821-414c-a5dc-0fc0361f8f8d",
    cpus=1,
    cputime=0,
    os_type="hvm",
    memory_size=1048576,
    max_memory_size=1048576,
    name="openstack-client",
    arch="x86_64",
    persistent=true,
    domain_type="kvm",
    uuid="2229aace-8821-414c-a5dc-0fc0361f8f8d",
    autostart=false,
    nics=[    <Fog::Compute::Libvirt::Nic
      mac="52:54:00:41:3a:7c",
      id=nil,
      type="network",
      network="default",
      bridge=nil,
      model="virtio"
    >],
    volumes=[    <Fog::Compute::Libvirt::Volume
      id="/mnt/virt_storage/openstack-client.qcow2",
      pool_name="virt_storage",
      key="/mnt/virt_storage/openstack-client.qcow2",
      name="openstack-client.qcow2",
      path="/mnt/virt_storage/openstack-client.qcow2",
      capacity=20,
      allocation=3,
      format_type="qcow2",
      backing_volume=nil
    >],
    active=false,
    boot_order=["hd"],
    display={:type=>"spice"},
    state="shutoff"
  >,   <Fog::Compute::Libvirt::Server
    id="65cc88f0-d701-4312-8964-dc4c8132f9d4",
    cpus=1,
    cputime=0,
    os_type="hvm",
    memory_size=1048576,
    max_memory_size=1048576,
    name="skggws1a",
    arch="x86_64",
    persistent=true,
    domain_type="kvm",
    uuid="65cc88f0-d701-4312-8964-dc4c8132f9d4",
    autostart=false,
    nics=[    <Fog::Compute::Libvirt::Nic
      mac="08:00:27:6e:3a:32",
      id=nil,
      type="network",
      network="virt",
      bridge=nil,
      model="virtio"
    >],
    volumes=[    <Fog::Compute::Libvirt::Volume
      id="/mnt/virt_storage/skggws1a.qcow2",
      pool_name="virt_storage",
      key="/mnt/virt_storage/skggws1a.qcow2",
      name="skggws1a.qcow2",
      path="/mnt/virt_storage/skggws1a.qcow2",
      capacity=50,
      allocation=30,
      format_type="qcow2",
      backing_volume=nil
    >],
    active=false,
    boot_order=["hd"],
    display={:type=>"spice"},
    state="shutoff"
  >,   <Fog::Compute::Libvirt::Server
    id="9feb1199-5f81-40ab-ab33-015ef30c14c3",
    cpus=2,
    cputime=0,
    os_type="hvm",
    memory_size=4194304,
    max_memory_size=4194304,
    name="vagrantdir_observer_client_1",
    arch="x86_64",
    persistent=true,
    domain_type="kvm",
    uuid="9feb1199-5f81-40ab-ab33-015ef30c14c3",
    autostart=false,
    nics=[    <Fog::Compute::Libvirt::Nic
      mac="52:54:00:0d:1b:92",
      id=nil,
      type="network",
      network="vagrant-libvirt",
      bridge=nil,
      model="virtio"
    >,     <Fog::Compute::Libvirt::Nic
      mac="fa:16:3e:3d:c8:77",
      id=nil,
      type="network",
      network="vagrant_internal_net",
      bridge=nil,
      model="virtio"
    >],
    volumes=[    <Fog::Compute::Libvirt::Volume
      id="/var/lib/libvirt/images/vagrantdir_observer_client_1.img",
      pool_name="default",
      key="/var/lib/libvirt/images/vagrantdir_observer_client_1.img",
      name="vagrantdir_observer_client_1.img",
      path="/var/lib/libvirt/images/vagrantdir_observer_client_1.img",
      capacity=138,
      allocation=14,
      format_type="qcow2",
      backing_volume=nil
    >],
    active=false,
    boot_order=["hd"],
    display={:type=>"vnc", :port=>"-1", :listen=>"127.0.0.1"},
    state="shutoff"
  >,   <Fog::Compute::Libvirt::Server
    id="586cb11d-e9b3-489e-a77d-e9730f4b3551",
    cpus=1,
    cputime=0,
    os_type="hvm",
    memory_size=1048576,
    max_memory_size=1048576,
    name="occgws1a",
    arch="x86_64",
    persistent=true,
    domain_type="kvm",
    uuid="586cb11d-e9b3-489e-a77d-e9730f4b3551",
    autostart=false,
    nics=[    <Fog::Compute::Libvirt::Nic
      mac="08:00:27:6e:3a:31",
      id=nil,
      type="network",
      network="virt",
      bridge=nil,
      model="virtio"
    >],
    volumes=[    <Fog::Compute::Libvirt::Volume
      id="/mnt/virt_storage/Microsoft.Visual.Studio.Image_140613_1615.iso",
      pool_name="virt_storage",
      key="/mnt/virt_storage/Microsoft.Visual.Studio.Image_140613_1615.iso",
      name="Microsoft.Visual.Studio.Image_140613_1615.iso",
      path="/mnt/virt_storage/Microsoft.Visual.Studio.Image_140613_1615.iso",
      capacity=2,
      allocation=2,
      format_type="iso",
      backing_volume=nil
    >,     <Fog::Compute::Libvirt::Volume
      id="/mnt/virt_storage/occgws1a.qcow2",
      pool_name="virt_storage",
      key="/mnt/virt_storage/occgws1a.qcow2",
      name="occgws1a.qcow2",
      path="/mnt/virt_storage/occgws1a.qcow2",
      capacity=50,
      allocation=33,
      format_type="qcow2",
      backing_volume=nil
    >],
    active=false,
    boot_order=["hd"],
    display={:type=>"spice"},
    state="shutoff"
  >,   <Fog::Compute::Libvirt::Server
    id="5a9fc4a9-7519-492e-86d5-1052a6b0084a",
    cpus=2,
    cputime=0,
    os_type="hvm",
    memory_size=4120576,
    max_memory_size=4120576,
    name="nelskg1a",
    arch="x86_64",
    persistent=true,
    domain_type="kvm",
    uuid="5a9fc4a9-7519-492e-86d5-1052a6b0084a",
    autostart=false,
    nics=[    <Fog::Compute::Libvirt::Nic
      mac="08:00:27:6e:3a:11",
      id=nil,
      type="network",
      network="virt",
      bridge=nil,
      model="virtio"
    >],
    volumes=[    <Fog::Compute::Libvirt::Volume
      id="/mnt/virt_swap/nelskg1a_swap.qcow2",
      pool_name="virt_swap",
      key="/mnt/virt_swap/nelskg1a_swap.qcow2",
      name="nelskg1a_swap.qcow2",
      path="/mnt/virt_swap/nelskg1a_swap.qcow2",
      capacity=16,
      allocation=2,
      format_type="qcow2",
      backing_volume=nil
    >,     <Fog::Compute::Libvirt::Volume
      id="/mnt/virt_storage/nelskg1a.qcow2",
      pool_name="virt_storage",
      key="/mnt/virt_storage/nelskg1a.qcow2",
      name="nelskg1a.qcow2",
      path="/mnt/virt_storage/nelskg1a.qcow2",
      capacity=50,
      allocation=41,
      format_type="qcow2",
      backing_volume=nil
    >],
    active=false,
    boot_order=[],
    display={:type=>"spice"},
    state="shutoff"
  >,   <Fog::Compute::Libvirt::Server
    id="4776d9b1-f0cd-4dde-913c-baf14c3a69ef",
    cpus=1,
    cputime=0,
    os_type="hvm",
    memory_size=1048576,
    max_memory_size=1048576,
    name="observer-client",
    arch="x86_64",
    persistent=true,
    domain_type="kvm",
    uuid="4776d9b1-f0cd-4dde-913c-baf14c3a69ef",
    autostart=false,
    nics=[    <Fog::Compute::Libvirt::Nic
      mac="52:54:00:1a:36:ea",
      id=nil,
      type="network",
      network="internal_net",
      bridge=nil,
      model="virtio"
    >],
    volumes=[    <Fog::Compute::Libvirt::Volume
      id="/var/lib/libvirt/images/observer-client.qcow2",
      pool_name="default",
      key="/var/lib/libvirt/images/observer-client.qcow2",
      name="observer-client.qcow2",
      path="/var/lib/libvirt/images/observer-client.qcow2",
      capacity=24,
      allocation=18,
      format_type="qcow2",
      backing_volume=nil
    >],
    active=false,
    boot_order=["hd"],
    display={:type=>"spice"},
    state="shutoff"
  >,   <Fog::Compute::Libvirt::Server
    id="5d1ca606-2815-42ce-a560-d075f65a6c6a",
    cpus=1,
    cputime=0,
    os_type="hvm",
    memory_size=4194304,
    max_memory_size=4194304,
    name="observer_client_3",
    arch="x86_64",
    persistent=true,
    domain_type="kvm",
    uuid="5d1ca606-2815-42ce-a560-d075f65a6c6a",
    autostart=false,
    nics=[    <Fog::Compute::Libvirt::Nic
      mac="52:54:00:10:a0:60",
      id=nil,
      type="network",
      network="vagrant_internal_net",
      bridge=nil,
      model="virtio"
    >],
    volumes=[    <Fog::Compute::Libvirt::Volume
      id="/mnt/virt_storage/observer_client_3.qcow2",
      pool_name="virt_storage",
      key="/mnt/virt_storage/observer_client_3.qcow2",
      name="observer_client_3.qcow2",
      path="/mnt/virt_storage/observer_client_3.qcow2",
      capacity=50,
      allocation=25,
      format_type="qcow2",
      backing_volume=nil
    >],
    active=false,
    boot_order=["hd"],
    display={:type=>"spice"},
    state="shutoff"
  >,   <Fog::Compute::Libvirt::Server
    id="4b2bac97-020f-42fa-8312-190dc4f2c441",
    cpus=1,
    cputime=0,
    os_type="hvm",
    memory_size=1048576,
    max_memory_size=1048576,
    name="observer-server",
    arch="x86_64",
    persistent=true,
    domain_type="kvm",
    uuid="4b2bac97-020f-42fa-8312-190dc4f2c441",
    autostart=false,
    nics=[    <Fog::Compute::Libvirt::Nic
      mac="52:54:00:b8:38:8d",
      id=nil,
      type="network",
      network="vagrant-libvirt",
      bridge=nil,
      model="virtio"
    >,     <Fog::Compute::Libvirt::Nic
      mac="52:54:00:d9:96:24",
      id=nil,
      type="network",
      network="primary_vagrant_private_net",
      bridge=nil,
      model="virtio"
    >],
    volumes=[    <Fog::Compute::Libvirt::Volume
      id="/var/lib/libvirt/images/observer-server.img",
      pool_name="default",
      key="/var/lib/libvirt/images/observer-server.img",
      name="observer-server.img",
      path="/var/lib/libvirt/images/observer-server.img",
      capacity=138,
      allocation=9,
      format_type="qcow2",
      backing_volume=nil
    >],
    active=false,
    boot_order=["hd"],
    display={:type=>"vnc", :port=>"-1", :listen=>"127.0.0.1"},
    state="shutoff"
  >,   <Fog::Compute::Libvirt::Server
    id="2b2a8012-4e4a-4dd2-8d43-204113ec3dd1",
    cpus=1,
    cputime=0,
    os_type="hvm",
    memory_size=1048576,
    max_memory_size=1048576,
    name="nedgws1a",
    arch="x86_64",
    persistent=true,
    domain_type="kvm",
    uuid="2b2a8012-4e4a-4dd2-8d43-204113ec3dd1",
    autostart=false,
    nics=[    <Fog::Compute::Libvirt::Nic
      mac="08:00:27:6e:3a:33",
      id=nil,
      type="network",
      network="virt",
      bridge=nil,
      model="virtio"
    >],
    volumes=[    <Fog::Compute::Libvirt::Volume
      id="/mnt/virt_storage/Microsoft.Visual.Studio.Image_140613_1615.iso",
      pool_name="virt_storage",
      key="/mnt/virt_storage/Microsoft.Visual.Studio.Image_140613_1615.iso",
      name="Microsoft.Visual.Studio.Image_140613_1615.iso",
      path="/mnt/virt_storage/Microsoft.Visual.Studio.Image_140613_1615.iso",
      capacity=2,
      allocation=2,
      format_type="iso",
      backing_volume=nil
    >,     <Fog::Compute::Libvirt::Volume
      id="/mnt/virt_storage/nedgws1a.qcow2",
      pool_name="virt_storage",
      key="/mnt/virt_storage/nedgws1a.qcow2",
      name="nedgws1a.qcow2",
      path="/mnt/virt_storage/nedgws1a.qcow2",
      capacity=50,
      allocation=30,
      format_type="qcow2",
      backing_volume=nil
    >],
    active=false,
    boot_order=["hd"],
    display={:type=>"spice"},
    state="shutoff"
  >,   <Fog::Compute::Libvirt::Server
    id="cc078341-947e-4151-9c7c-43af74fe6454",
    cpus=1,
    cputime=0,
    os_type="hvm",
    memory_size=1048576,
    max_memory_size=1048576,
    name="nelskg2a",
    arch="x86_64",
    persistent=true,
    domain_type="kvm",
    uuid="cc078341-947e-4151-9c7c-43af74fe6454",
    autostart=false,
    nics=[    <Fog::Compute::Libvirt::Nic
      mac="08:00:27:6e:3a:12",
      id=nil,
      type="network",
      network="virt",
      bridge=nil,
      model="virtio"
    >],
    volumes=[    <Fog::Compute::Libvirt::Volume
      id="/mnt/virt_storage/nelskg2a.qcow2",
      pool_name="virt_storage",
      key="/mnt/virt_storage/nelskg2a.qcow2",
      name="nelskg2a.qcow2",
      path="/mnt/virt_storage/nelskg2a.qcow2",
      capacity=50,
      allocation=30,
      format_type="qcow2",
      backing_volume=nil
    >],
    active=false,
    boot_order=["hd"],
    display={:type=>"spice"},
    state="shutoff"
  >,   <Fog::Compute::Libvirt::Server
    id="7fc8b5b1-0baa-4ed8-acd0-132517e35a5e",
    cpus=1,
    cputime=0,
    os_type="hvm",
    memory_size=1048576,
    max_memory_size=1048576,
    name="virtual-water-way-OMSA",
    arch="x86_64",
    persistent=true,
    domain_type="kvm",
    uuid="7fc8b5b1-0baa-4ed8-acd0-132517e35a5e",
    autostart=false,
    nics=[    <Fog::Compute::Libvirt::Nic
      mac="52:54:00:48:b7:59",
      id=nil,
      type="network",
      network="vagrant_internal_net",
      bridge=nil,
      model="rtl8139"
    >],
    volumes=[    <Fog::Compute::Libvirt::Volume
      id="/mnt/virt_storage/virtual-water-way-OMSA.qcow2",
      pool_name="virt_storage",
      key="/mnt/virt_storage/virtual-water-way-OMSA.qcow2",
      name="virtual-water-way-OMSA.qcow2",
      path="/mnt/virt_storage/virtual-water-way-OMSA.qcow2",
      capacity=50,
      allocation=7,
      format_type="qcow2",
      backing_volume=nil
    >],
    active=false,
    boot_order=["hd"],
    display={:type=>"spice"},
    state="shutoff"
  >,   <Fog::Compute::Libvirt::Server
    id="86a42be6-42da-446c-97d1-9bddbcd9a4c7",
    cpus=2,
    cputime=0,
    os_type="hvm",
    memory_size=2072576,
    max_memory_size=2072576,
    name="vagrantdir_rhel7_minion",
    arch="x86_64",
    persistent=true,
    domain_type="kvm",
    uuid="86a42be6-42da-446c-97d1-9bddbcd9a4c7",
    autostart=false,
    nics=[],
    volumes=[    <Fog::Compute::Libvirt::Volume
      id="/var/lib/libvirt/images/vagrantdir_rhel7_minion.img",
      pool_name="default",
      key="/var/lib/libvirt/images/vagrantdir_rhel7_minion.img",
      name="vagrantdir_rhel7_minion.img",
      path="/var/lib/libvirt/images/vagrantdir_rhel7_minion.img",
      capacity=138,
      allocation=0,
      format_type="qcow2",
      backing_volume=nil
    >],
    active=false,
    boot_order=["hd"],
    display={:type=>"vnc", :port=>"-1", :listen=>"127.0.0.1"},
    state="shutoff"
  >,   <Fog::Compute::Libvirt::Server
    id="6bbdca3e-8cbe-4bed-88ea-850f924fef8a",
    cpus=1,
    cputime=0,
    os_type="hvm",
    memory_size=1048576,
    max_memory_size=1048576,
    name="virtual-water-way-clone",
    arch="x86_64",
    persistent=true,
    domain_type="kvm",
    uuid="6bbdca3e-8cbe-4bed-88ea-850f924fef8a",
    autostart=false,
    nics=[    <Fog::Compute::Libvirt::Nic
      mac="52:54:00:78:ab:9f",
      id=nil,
      type="network",
      network="internal_net",
      bridge=nil,
      model="virtio"
    >],
    volumes=[    <Fog::Compute::Libvirt::Volume
      id=nil,
      pool_name="default",
      key=nil,
      name="fog-939916696468693",
      path=nil,
      capacity="10G",
      allocation="1G",
      format_type="raw",
      backing_volume=nil
    >],
    active=false,
    boot_order=["hd"],
    display={:type=>"spice"},
    state="shutoff"
  >,   <Fog::Compute::Libvirt::Server
    id="073639b5-9d07-451b-bece-3a6f048c6817",
    cpus=1,
    cputime=0,
    os_type="hvm",
    memory_size=4194304,
    max_memory_size=4194304,
    name="SAVED-SQUEEZED-observer_client_3",
    arch="x86_64",
    persistent=true,
    domain_type="kvm",
    uuid="073639b5-9d07-451b-bece-3a6f048c6817",
    autostart=false,
    nics=[    <Fog::Compute::Libvirt::Nic
      mac="52:54:00:2a:d6:ca",
      id=nil,
      type="network",
      network="primary_vagrant_private_net",
      bridge=nil,
      model="virtio"
    >],
    volumes=[    <Fog::Compute::Libvirt::Volume
      id="/mnt/virt_storage/SAVED-SQUEEZED-observer_client_3.qcow2",
      pool_name="virt_storage",
      key="/mnt/virt_storage/SAVED-SQUEEZED-observer_client_3.qcow2",
      name="SAVED-SQUEEZED-observer_client_3.qcow2",
      path="/mnt/virt_storage/SAVED-SQUEEZED-observer_client_3.qcow2",
      capacity=50,
      allocation=24,
      format_type="qcow2",
      backing_volume=nil
    >],
    active=false,
    boot_order=["hd"],
    display={:type=>"spice"},
    state="shutoff"
  >,   <Fog::Compute::Libvirt::Server
    id="aa259751-00ee-4fb4-996e-ba2b30493410",
    cpus=1,
    cputime=0,
    os_type="hvm",
    memory_size=1048576,
    max_memory_size=1048576,
    name="virtual-water-way",
    arch="x86_64",
    persistent=true,
    domain_type="kvm",
    uuid="aa259751-00ee-4fb4-996e-ba2b30493410",
    autostart=false,
    nics=[    <Fog::Compute::Libvirt::Nic
      mac="52:54:00:83:f0:2e",
      id=nil,
      type="network",
      network="internal_net",
      bridge=nil,
      model="virtio"
    >],
    volumes=[    <Fog::Compute::Libvirt::Volume
      id="/mnt/virt_storage/virtual-water-way.qcow2",
      pool_name="virt_storage",
      key="/mnt/virt_storage/virtual-water-way.qcow2",
      name="virtual-water-way.qcow2",
      path="/mnt/virt_storage/virtual-water-way.qcow2",
      capacity=50,
      allocation=6,
      format_type="qcow2",
      backing_volume=nil
    >],
    active=false,
    boot_order=["hd"],
    display={:type=>"spice"},
    state="shutoff"
  >,   <Fog::Compute::Libvirt::Server
    id="1939194c-0d57-4d27-82ad-779b4b6fc128",
    cpus=2,
    cputime=0,
    os_type="hvm",
    memory_size=4194304,
    max_memory_size=4194304,
    name="vagrantdir_observer_server_1",
    arch="x86_64",
    persistent=true,
    domain_type="kvm",
    uuid="1939194c-0d57-4d27-82ad-779b4b6fc128",
    autostart=false,
    nics=[    <Fog::Compute::Libvirt::Nic
      mac="52:54:00:94:ae:c0",
      id=nil,
      type="network",
      network="vagrant-libvirt",
      bridge=nil,
      model="virtio"
    >,     <Fog::Compute::Libvirt::Nic
      mac="52:54:00:f6:ec:96",
      id=nil,
      type="network",
      network="vagrant_internal_net",
      bridge=nil,
      model="virtio"
    >],
    volumes=[    <Fog::Compute::Libvirt::Volume
      id="/var/lib/libvirt/images/vagrantdir_observer_server_1.img",
      pool_name="default",
      key="/var/lib/libvirt/images/vagrantdir_observer_server_1.img",
      name="vagrantdir_observer_server_1.img",
      path="/var/lib/libvirt/images/vagrantdir_observer_server_1.img",
      capacity=138,
      allocation=10,
      format_type="qcow2",
      backing_volume=nil
    >],
    active=false,
    boot_order=["hd"],
    display={:type=>"vnc", :port=>"-1", :listen=>"127.0.0.1"},
    state="shutoff"
  >]
 INFO set_name_of_domain: Looking for domain vagrantissuedir_observer_client_1
 INFO warden: Calling IN action: #<VagrantPlugins::ProviderLibvirt::Action::HandleStoragePool:0x00562ce12b37e8>
 INFO warden: Calling IN action: #<Vagrant::Action::Builtin::HandleBox:0x00562ce120df28>
 INFO handle_box: Machine already has box. HandleBox will not run.
 INFO warden: Calling IN action: #<VagrantPlugins::ProviderLibvirt::Action::HandleBoxImage:0x00562ce1168cd0>
 INFO warden: Calling IN action: #<VagrantPlugins::ProviderLibvirt::Action::CreateDomainVolume:0x00562ce10ba090>
 INFO interface: info: Creating image (snapshot of base box volume).
 INFO interface: info: ==> observer_client_1: Creating image (snapshot of base box volume).
 INFO warden: Calling IN action: #<VagrantPlugins::ProviderLibvirt::Action::CreateDomain:0x00562ce0b478b0>
 INFO interface: info: Creating domain with the following settings...
 INFO interface: info: ==> observer_client_1: Creating domain with the following settings...
 INFO interface: info:  -- Name:              vagrantissuedir_observer_client_1
 INFO interface: info: ==> observer_client_1:  -- Name:              vagrantissuedir_observer_client_1
 INFO interface: info:  -- Domain type:       kvm
 INFO interface: info: ==> observer_client_1:  -- Domain type:       kvm
 INFO interface: info:  -- Cpus:              2
 INFO interface: info: ==> observer_client_1:  -- Cpus:              2
 INFO interface: info:  -- Memory:            2048M
 INFO interface: info: ==> observer_client_1:  -- Memory:            2048M
 INFO interface: info:  -- Management MAC:    
 INFO interface: info: ==> observer_client_1:  -- Management MAC:    
 INFO interface: info:  -- Loader:            
 INFO interface: info: ==> observer_client_1:  -- Loader:            
 INFO interface: info:  -- Base box:          uvsmtid/centos-7.1-1503-gnome
 INFO interface: info: ==> observer_client_1:  -- Base box:          uvsmtid/centos-7.1-1503-gnome
 INFO interface: info:  -- Storage pool:      default
 INFO interface: info: ==> observer_client_1:  -- Storage pool:      default
 INFO interface: info:  -- Image:             /var/lib/libvirt/images/vagrantissuedir_observer_client_1.img (138G)
 INFO interface: info: ==> observer_client_1:  -- Image:             /var/lib/libvirt/images/vagrantissuedir_observer_client_1.img (138G)
 INFO interface: info:  -- Volume Cache:      default
 INFO interface: info: ==> observer_client_1:  -- Volume Cache:      default
 INFO interface: info:  -- Kernel:            
 INFO interface: info: ==> observer_client_1:  -- Kernel:            
 INFO interface: info:  -- Initrd:            
 INFO interface: info: ==> observer_client_1:  -- Initrd:            
 INFO interface: info:  -- Graphics Type:     vnc
 INFO interface: info: ==> observer_client_1:  -- Graphics Type:     vnc
 INFO interface: info:  -- Graphics Port:     5900
 INFO interface: info: ==> observer_client_1:  -- Graphics Port:     5900
 INFO interface: info:  -- Graphics IP:       127.0.0.1
 INFO interface: info: ==> observer_client_1:  -- Graphics IP:       127.0.0.1
 INFO interface: info:  -- Graphics Password: Not defined
 INFO interface: info: ==> observer_client_1:  -- Graphics Password: Not defined
 INFO interface: info:  -- Video Type:        qxl
 INFO interface: info: ==> observer_client_1:  -- Video Type:        qxl
 INFO interface: info:  -- Video VRAM:        9216
 INFO interface: info: ==> observer_client_1:  -- Video VRAM:        9216
 INFO interface: info:  -- Keymap:            en-us
 INFO interface: info: ==> observer_client_1:  -- Keymap:            en-us
 INFO interface: info:  -- INPUT:             type=mouse, bus=ps2
 INFO interface: info: ==> observer_client_1:  -- INPUT:             type=mouse, bus=ps2
 INFO interface: info:  -- Command line : 
 INFO interface: info: ==> observer_client_1:  -- Command line : 
 INFO machine: New machine ID: "7454b2f1-eca9-4856-b8a4-2327dbaf62b3"
 INFO warden: Calling IN action: #<Vagrant::Action::Builtin::Provision:0x00562ce0a096d8>
 INFO provision: Checking provisioner sentinel file...
 INFO warden: Calling IN action: #<VagrantPlugins::ProviderLibvirt::Action::PrepareNFSValidIds:0x00562ce1d858b0>
 INFO warden: Calling IN action: #<VagrantPlugins::SyncedFolderNFS::ActionCleanup:0x00562ce1d61488>
DEBUG host: Searching for cap: nfs_prune
DEBUG host: Checking in: redhat
DEBUG host: Checking in: linux
DEBUG host: Found cap: nfs_prune in linux
 INFO nfs: NFS pruning. Valid IDs: ["2229aace-8821-414c-a5dc-0fc0361f8f8d", "65cc88f0-d701-4312-8964-dc4c8132f9d4", "9feb1199-5f81-40ab-ab33-015ef30c14c3", "586cb11d-e9b3-489e-a77d-e9730f4b3551", "5a9fc4a9-7519-492e-86d5-1052a6b0084a", "4776d9b1-f0cd-4dde-913c-baf14c3a69ef", "5d1ca606-2815-42ce-a560-d075f65a6c6a", "4b2bac97-020f-42fa-8312-190dc4f2c441", "7454b2f1-eca9-4856-b8a4-2327dbaf62b3", "2b2a8012-4e4a-4dd2-8d43-204113ec3dd1", "cc078341-947e-4151-9c7c-43af74fe6454", "7fc8b5b1-0baa-4ed8-acd0-132517e35a5e", "86a42be6-42da-446c-97d1-9bddbcd9a4c7", "6bbdca3e-8cbe-4bed-88ea-850f924fef8a", "073639b5-9d07-451b-bece-3a6f048c6817", "aa259751-00ee-4fb4-996e-ba2b30493410", "1939194c-0d57-4d27-82ad-779b4b6fc128"]
DEBUG host: Searching for cap: nfs_prune
DEBUG host: Checking in: redhat
DEBUG host: Checking in: linux
DEBUG host: Found cap: nfs_prune in linux
 INFO host: Execute capability: nfs_prune [#<Vagrant::Environment: /mnt/backup/home/uvsmtid/vagrant.issue.dir>, #<Vagrant::UI::Prefixed:0x00562ce1ca23d0 @logger=#<Log4r::Logger:0x00562ce1ca2358 @fullname="vagrant::ui::interface", @outputters=[], @additive=true, @name="interface", @path="vagrant::ui", @parent=#<Log4r::Logger:0x00562ce0a8c448 @fullname="vagrant", @outputters=[#<Log4r::StderrOutputter:0x00562ce0a0af38 @mon_owner=nil, @mon_count=0, @mon_mutex=#<Mutex:0x00562ce0a0ad30>, @name="stderr", @level=0, @formatter=#<Log4r::DefaultFormatter:0x00562ce0a06000 @depth=7>, @out=#<IO:<STDERR>>>], @additive=true, @name="vagrant", @path="", @parent=#<Log4r::RootLogger:0x00562ce0a8c218 @level=0, @outputters=[]>, @level=1, @trace=false>, @level=1, @trace=false>, @opts={}, @stdin=#<IO:<STDIN>>, @stdout=#<IO:<STDOUT>>, @stderr=#<IO:<STDERR>>, @prefix=:observer_client_1, @ui=#<Vagrant::UI::Basic:0x00562ce142d650 @logger=#<Log4r::Logger:0x00562ce142d5d8 @fullname="vagrant::ui::interface", @outputters=[], @additive=true, @name="interface", @path="vagrant::ui", @parent=#<Log4r::Logger:0x00562ce0a8c448 @fullname="vagrant", @outputters=[#<Log4r::StderrOutputter:0x00562ce0a0af38 @mon_owner=nil, @mon_count=0, @mon_mutex=#<Mutex:0x00562ce0a0ad30>, @name="stderr", @level=0, @formatter=#<Log4r::DefaultFormatter:0x00562ce0a06000 @depth=7>, @out=#<IO:<STDERR>>>], @additive=true, @name="vagrant", @path="", @parent=#<Log4r::RootLogger:0x00562ce0a8c218 @level=0, @outputters=[]>, @level=1, @trace=false>, @level=1, @trace=false>, @opts={:color=>:default}, @stdin=#<IO:<STDIN>>, @stdout=#<IO:<STDOUT>>, @stderr=#<IO:<STDERR>>, @lock=#<Mutex:0x00562ce13e3050>>>, ["2229aace-8821-414c-a5dc-0fc0361f8f8d", "65cc88f0-d701-4312-8964-dc4c8132f9d4", "9feb1199-5f81-40ab-ab33-015ef30c14c3", "586cb11d-e9b3-489e-a77d-e9730f4b3551", "5a9fc4a9-7519-492e-86d5-1052a6b0084a", "4776d9b1-f0cd-4dde-913c-baf14c3a69ef", "5d1ca606-2815-42ce-a560-d075f65a6c6a", "4b2bac97-020f-42fa-8312-190dc4f2c441", "7454b2f1-eca9-4856-b8a4-2327dbaf62b3", "2b2a8012-4e4a-4dd2-8d43-204113ec3dd1", "cc078341-947e-4151-9c7c-43af74fe6454", "7fc8b5b1-0baa-4ed8-acd0-132517e35a5e", "86a42be6-42da-446c-97d1-9bddbcd9a4c7", "6bbdca3e-8cbe-4bed-88ea-850f924fef8a", "073639b5-9d07-451b-bece-3a6f048c6817", "aa259751-00ee-4fb4-996e-ba2b30493410", "1939194c-0d57-4d27-82ad-779b4b6fc128"]] (redhat)
 INFO linux: Pruning invalid NFS entries...
 INFO warden: Calling IN action: #<Vagrant::Action::Builtin::SyncedFolderCleanup:0x00562ce1d3cca0>
 INFO synced_folder_cleanup: Invoking synced folder cleanup for: rsync
 INFO warden: Calling IN action: #<Vagrant::Action::Builtin::SyncedFolders:0x00562ce1d126d0>
 INFO synced_folders: SyncedFolders loading from cache: false
 INFO synced_folders: Synced Folder Implementation: rsync
 INFO synced_folders:   - /vagrant: . => /vagrant
 INFO synced_folders: Invoking synced folder prepare for: rsync
 INFO warden: Calling IN action: #<VagrantPlugins::ProviderLibvirt::Action::PrepareNFSSettings:0x00562ce1ce5dd8>
 INFO warden: Calling IN action: #<VagrantPlugins::ProviderLibvirt::Action::ShareFolders:0x00562ce1cc1140>
 INFO interface: info: Creating shared folders metadata...
 INFO interface: info: ==> observer_client_1: Creating shared folders metadata...
 INFO warden: Calling IN action: #<VagrantPlugins::ProviderLibvirt::Action::CreateNetworks:0x00562ce1c9ca48>
 INFO create_networks: Using vagrant-libvirt at 192.168.121.0/24 as the management network nat is the mode
DEBUG create_networks: In config found network type private_network options {:ip=>"192.168.1.3", :libvirt__network_name=>"vagrant_internal_net", :mac=>"FA:16:3E:3D:C8:77", :libvirt__netmask=>"255.255.255.0", :libvirt__forward_mode=>"nat", :libvirt__dhcp_enabled=>true, :whatever=>true, :protocol=>"tcp", :id=>"177a6891-0670-4a5b-ab8e-8f2b3eccd91b"}
DEBUG create_networks: In config found network type forwarded_port options {:guest=>22, :host=>2222, :host_ip=>"127.0.0.1", :id=>"ssh", :auto_correct=>true, :protocol=>"tcp"}
DEBUG create_networks: Searching for network with options {:iface_type=>:private_network, :network_name=>"vagrant-libvirt", :ip=>"192.168.121.0", :netmask=>"255.255.255.0", :dhcp_enabled=>true, :forward_mode=>"nat"}
DEBUG create_networks: looking up network with ip == 192.168.121.0
DEBUG create_networks: Checking that network name does not clash with ip
DEBUG create_networks: looking up network named vagrant-libvirt
DEBUG create_networks: generating name for bridge
DEBUG create_networks: looking up bridge named virbr0
DEBUG create_networks: found available bridge name virbr0
DEBUG create_networks: created network
 INFO create_networks: Saving information about created network vagrant-libvirt, UUID=35c8c4ef-0412-4049-bdcf-75e5096ec957 to file /mnt/backup/home/uvsmtid/vagrant.issue.dir/.vagrant/machines/observer_client_1/libvirt/created_networks.
DEBUG create_networks: Searching for network with options {:iface_type=>:private_network, :netmask=>"255.255.255.0", :dhcp_enabled=>true, :forward_mode=>"nat", :ip=>"192.168.1.3", :libvirt__network_name=>"vagrant_internal_net", :mac=>"FA:16:3E:3D:C8:77", :libvirt__netmask=>"255.255.255.0", :libvirt__forward_mode=>"nat", :libvirt__dhcp_enabled=>true, :whatever=>true, :protocol=>"tcp", :id=>"177a6891-0670-4a5b-ab8e-8f2b3eccd91b", :network_name=>"vagrant_internal_net"}
DEBUG create_networks: looking up network with ip == 192.168.1.0
DEBUG create_networks: Checking that network name does not clash with ip
DEBUG create_networks: looking up network named vagrant_internal_net
DEBUG create_networks: generating name for bridge
DEBUG create_networks: looking up bridge named virbr0
DEBUG create_networks: looking up bridge named virbr1
DEBUG create_networks: found available bridge name virbr1
DEBUG create_networks: created network
 INFO create_networks: Saving information about created network vagrant_internal_net, UUID=da103e8b-f7c5-4671-a7cf-718814db73bd to file /mnt/backup/home/uvsmtid/vagrant.issue.dir/.vagrant/machines/observer_client_1/libvirt/created_networks.
 INFO warden: Calling IN action: #<VagrantPlugins::ProviderLibvirt::Action::CreateNetworkInterfaces:0x00562ce1b99d80>
 INFO create_network_interfaces: Using vagrant-libvirt at 192.168.121.0/24 as the management network nat is the mode
DEBUG create_network_interfaces: In config found network type private_network options {:ip=>"192.168.1.3", :libvirt__network_name=>"vagrant_internal_net", :mac=>"FA:16:3E:3D:C8:77", :libvirt__netmask=>"255.255.255.0", :libvirt__forward_mode=>"nat", :libvirt__dhcp_enabled=>true, :whatever=>true, :protocol=>"tcp", :id=>"177a6891-0670-4a5b-ab8e-8f2b3eccd91b"}
DEBUG create_network_interfaces: In config found network type forwarded_port options {:guest=>22, :host=>2222, :host_ip=>"127.0.0.1", :id=>"ssh", :auto_correct=>true, :protocol=>"tcp"}
DEBUG create_network_interfaces: Adapter not specified so found slot 0
DEBUG create_network_interfaces: Found network by name
DEBUG create_network_interfaces: Adapter not specified so found slot 1
DEBUG create_network_interfaces: Found network by name
 INFO create_network_interfaces: Creating network interface eth0 connected to network vagrant-libvirt.
 INFO create_network_interfaces: Creating network interface eth1 connected to network vagrant_internal_net. Using MAC address: FA:16:3E:3D:C8:77
 INFO warden: Calling IN action: #<VagrantPlugins::ProviderLibvirt::Action::SetBootOrder:0x00562ce1b4d4a8>
 INFO warden: Calling IN action: #<VagrantPlugins::ProviderLibvirt::Action::StartDomain:0x00562ce1b04848>
 INFO interface: info: Starting domain.
 INFO interface: info: ==> observer_client_1: Starting domain.
 INFO warden: Calling IN action: #<VagrantPlugins::ProviderLibvirt::Action::WaitTillUp:0x00562ce1aae268>
 INFO interface: info: Waiting for domain to get an IP address...
 INFO interface: info: ==> observer_client_1: Waiting for domain to get an IP address...
 INFO retryable: Retryable exception raised: #<Fog::Errors::TimeoutError: The specified wait_for timeout (2 seconds) was exceeded>
 INFO retryable: Retryable exception raised: #<Fog::Errors::TimeoutError: The specified wait_for timeout (2 seconds) was exceeded>
 INFO retryable: Retryable exception raised: #<Fog::Errors::TimeoutError: The specified wait_for timeout (2 seconds) was exceeded>
 INFO wait_till_up: Got IP address 192.168.121.155
 INFO wait_till_up: Time for getting IP: 11.118990659713745
 INFO interface: info: Waiting for SSH to become available...
 INFO interface: info: ==> observer_client_1: Waiting for SSH to become available...
DEBUG ssh: Checking whether SSH is ready...
DEBUG ssh: Checking key permissions: /home/uvsmtid/.vagrant.d/insecure_private_key
 INFO ssh: Attempting SSH connection...
 INFO ssh: Attempting to connect to SSH...
 INFO ssh:   - Host: 192.168.121.155
 INFO ssh:   - Port: 22
 INFO ssh:   - Username: vagrant
 INFO ssh:   - Password? false
 INFO ssh:   - Key Path: ["/home/uvsmtid/.vagrant.d/insecure_private_key"]
DEBUG ssh: == Net-SSH connection debug-level log START ==
DEBUG ssh: D, [2016-06-22T11:31:22.283869 #15135] DEBUG -- net.ssh.transport.session[2b1670d0e4ac]: establishing connection to 192.168.121.155:22
D, [2016-06-22T11:31:22.284432 #15135] DEBUG -- net.ssh.transport.session[2b1670d0e4ac]: connection established
I, [2016-06-22T11:31:22.284516 #15135]  INFO -- net.ssh.transport.server_version[2b1670d0d444]: negotiating protocol version
D, [2016-06-22T11:31:22.388018 #15135] DEBUG -- net.ssh.transport.server_version[2b1670d0d444]: remote is `SSH-2.0-OpenSSH_6.6.1'
D, [2016-06-22T11:31:22.388092 #15135] DEBUG -- net.ssh.transport.server_version[2b1670d0d444]: local is `SSH-2.0-Ruby/Net::SSH_2.9.1 x86_64-linux'
D, [2016-06-22T11:31:22.602101 #15135] DEBUG -- tcpsocket[2b1670d0dca0]: read 1640 bytes
D, [2016-06-22T11:31:22.602248 #15135] DEBUG -- tcpsocket[2b1670d0dca0]: received packet nr 0 type 20 len 1636
I, [2016-06-22T11:31:22.602311 #15135]  INFO -- net.ssh.transport.algorithms[2b1670d0c65c]: got KEXINIT from server
I, [2016-06-22T11:31:22.602408 #15135]  INFO -- net.ssh.transport.algorithms[2b1670d0c65c]: sending KEXINIT
D, [2016-06-22T11:31:22.602520 #15135] DEBUG -- tcpsocket[2b1670d0dca0]: queueing packet nr 0 type 20 len 2020
D, [2016-06-22T11:31:22.602581 #15135] DEBUG -- tcpsocket[2b1670d0dca0]: sent 2024 bytes
I, [2016-06-22T11:31:22.602610 #15135]  INFO -- net.ssh.transport.algorithms[2b1670d0c65c]: negotiating algorithms
D, [2016-06-22T11:31:22.602710 #15135] DEBUG -- net.ssh.transport.algorithms[2b1670d0c65c]: negotiated:
* kex: diffie-hellman-group-exchange-sha1
* host_key: ssh-rsa
* encryption_server: aes128-cbc
* encryption_client: aes128-cbc
* hmac_client: hmac-sha1
* hmac_server: hmac-sha1
* compression_client: none
* compression_server: none
* language_client: 
* language_server: 
D, [2016-06-22T11:31:22.602725 #15135] DEBUG -- net.ssh.transport.algorithms[2b1670d0c65c]: exchanging keys
D, [2016-06-22T11:31:22.602873 #15135] DEBUG -- tcpsocket[2b1670d0dca0]: queueing packet nr 1 type 34 len 20
D, [2016-06-22T11:31:22.602907 #15135] DEBUG -- tcpsocket[2b1670d0dca0]: sent 24 bytes
D, [2016-06-22T11:31:22.615162 #15135] DEBUG -- tcpsocket[2b1670d0dca0]: read 152 bytes
D, [2016-06-22T11:31:22.615391 #15135] DEBUG -- tcpsocket[2b1670d0dca0]: received packet nr 1 type 31 len 148
D, [2016-06-22T11:31:22.616555 #15135] DEBUG -- tcpsocket[2b1670d0dca0]: queueing packet nr 2 type 32 len 140
D, [2016-06-22T11:31:22.616638 #15135] DEBUG -- tcpsocket[2b1670d0dca0]: sent 144 bytes
D, [2016-06-22T11:31:22.622911 #15135] DEBUG -- tcpsocket[2b1670d0dca0]: read 720 bytes
D, [2016-06-22T11:31:22.623005 #15135] DEBUG -- tcpsocket[2b1670d0dca0]: received packet nr 2 type 33 len 700
D, [2016-06-22T11:31:22.623645 #15135] DEBUG -- tcpsocket[2b1670d0dca0]: queueing packet nr 3 type 21 len 20
D, [2016-06-22T11:31:22.623708 #15135] DEBUG -- tcpsocket[2b1670d0dca0]: sent 24 bytes
D, [2016-06-22T11:31:22.623797 #15135] DEBUG -- tcpsocket[2b1670d0dca0]: received packet nr 3 type 21 len 12
D, [2016-06-22T11:31:22.624004 #15135] DEBUG -- net.ssh.authentication.session[2b1670cde52c]: beginning authentication of `vagrant'
D, [2016-06-22T11:31:22.624075 #15135] DEBUG -- tcpsocket[2b1670d0dca0]: queueing packet nr 4 type 5 len 28
D, [2016-06-22T11:31:22.624104 #15135] DEBUG -- tcpsocket[2b1670d0dca0]: sent 52 bytes
D, [2016-06-22T11:31:22.666812 #15135] DEBUG -- tcpsocket[2b1670d0dca0]: read 52 bytes
D, [2016-06-22T11:31:22.666965 #15135] DEBUG -- tcpsocket[2b1670d0dca0]: received packet nr 4 type 6 len 28
D, [2016-06-22T11:31:22.667062 #15135] DEBUG -- net.ssh.authentication.session[2b1670cde52c]: trying none
D, [2016-06-22T11:31:22.667212 #15135] DEBUG -- tcpsocket[2b1670d0dca0]: queueing packet nr 5 type 50 len 44
D, [2016-06-22T11:31:22.667274 #15135] DEBUG -- tcpsocket[2b1670d0dca0]: sent 68 bytes
D, [2016-06-22T11:31:22.670251 #15135] DEBUG -- tcpsocket[2b1670d0dca0]: read 84 bytes
D, [2016-06-22T11:31:22.670373 #15135] DEBUG -- tcpsocket[2b1670d0dca0]: received packet nr 5 type 51 len 60
D, [2016-06-22T11:31:22.670488 #15135] DEBUG -- net.ssh.authentication.session[2b1670cde52c]: allowed methods: publickey,gssapi-keyex,gssapi-with-mic,password
D, [2016-06-22T11:31:22.670548 #15135] DEBUG -- net.ssh.authentication.methods.none[2b1670cd7754]: none failed
D, [2016-06-22T11:31:22.670610 #15135] DEBUG -- net.ssh.authentication.session[2b1670cde52c]: trying publickey
D, [2016-06-22T11:31:22.670805 #15135] DEBUG -- net.ssh.authentication.agent[2b1670cd2858]: connecting to ssh-agent
D, [2016-06-22T11:31:22.671451 #15135] DEBUG -- net.ssh.authentication.agent[2b1670cd2858]: sending agent request 1 len 44
D, [2016-06-22T11:31:22.671751 #15135] DEBUG -- net.ssh.authentication.agent[2b1670cd2858]: received agent packet 2 len 5
D, [2016-06-22T11:31:22.671786 #15135] DEBUG -- net.ssh.authentication.agent[2b1670cd2858]: sending agent request 11 len 0
D, [2016-06-22T11:31:22.672545 #15135] DEBUG -- net.ssh.authentication.agent[2b1670cd2858]: received agent packet 12 len 624
D, [2016-06-22T11:31:22.672793 #15135] DEBUG -- net.ssh.authentication.methods.publickey[2b1670cd2b50]: trying publickey (dd:3b:b8:2e:85:04:06:e9:ab:ff:a8:0a:c0:04:6e:d6)
D, [2016-06-22T11:31:22.672891 #15135] DEBUG -- tcpsocket[2b1670d0dca0]: queueing packet nr 6 type 50 len 348
D, [2016-06-22T11:31:22.672963 #15135] DEBUG -- tcpsocket[2b1670d0dca0]: sent 372 bytes
D, [2016-06-22T11:31:22.695004 #15135] DEBUG -- tcpsocket[2b1670d0dca0]: read 324 bytes
D, [2016-06-22T11:31:22.695160 #15135] DEBUG -- tcpsocket[2b1670d0dca0]: received packet nr 6 type 60 len 300
D, [2016-06-22T11:31:22.695358 #15135] DEBUG -- net.ssh.authentication.agent[2b1670cd2858]: sending agent request 13 len 649
D, [2016-06-22T11:31:22.698770 #15135] DEBUG -- net.ssh.authentication.agent[2b1670cd2858]: received agent packet 14 len 276
D, [2016-06-22T11:31:22.698886 #15135] DEBUG -- tcpsocket[2b1670d0dca0]: queueing packet nr 7 type 50 len 620
D, [2016-06-22T11:31:22.698943 #15135] DEBUG -- tcpsocket[2b1670d0dca0]: sent 644 bytes
D, [2016-06-22T11:31:22.712679 #15135] DEBUG -- tcpsocket[2b1670d0dca0]: read 36 bytes
D, [2016-06-22T11:31:22.712895 #15135] DEBUG -- tcpsocket[2b1670d0dca0]: received packet nr 7 type 52 len 12
D, [2016-06-22T11:31:22.712956 #15135] DEBUG -- net.ssh.authentication.methods.publickey[2b1670cd2b50]: publickey succeeded (dd:3b:b8:2e:85:04:06:e9:ab:ff:a8:0a:c0:04:6e:d6)

DEBUG ssh: == Net-SSH connection debug-level log END ==
 INFO ssh: SSH is ready!
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute:  (sudo=false)
DEBUG ssh: Exit status: 0
DEBUG ssh: Checking key permissions: /home/uvsmtid/.vagrant.d/insecure_private_key
 INFO interface: detail: 
Vagrant insecure key detected. Vagrant will automatically replace
this with a newly generated keypair for better security.
 INFO interface: detail:     observer_client_1: 
    observer_client_1: Vagrant insecure key detected. Vagrant will automatically replace
    observer_client_1: this with a newly generated keypair for better security.
DEBUG ssh: Checking whether SSH is ready...
DEBUG ssh: Re-using SSH connection.
 INFO ssh: SSH is ready!
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute:  (sudo=false)
DEBUG ssh: Exit status: 0
 INFO guest: Autodetecting host type for [#<Vagrant::Machine: observer_client_1 (VagrantPlugins::ProviderLibvirt::Provider)>]
DEBUG guest: Trying: mint
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute: cat /etc/issue | grep 'Linux Mint' (sudo=false)
DEBUG ssh: Exit status: 1
DEBUG guest: Trying: atomic
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute: grep 'ostree=' /proc/cmdline (sudo=false)
DEBUG ssh: Exit status: 1
DEBUG guest: Trying: ubuntu
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute: [ -x /usr/bin/lsb_release ] && /usr/bin/lsb_release -i 2>/dev/null | grep Ubuntu (sudo=false)
DEBUG ssh: Exit status: 1
DEBUG guest: Trying: pld
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute: cat /etc/pld-release (sudo=false)
DEBUG ssh: stderr: cat: /etc/pld-release: No such file or directory

DEBUG ssh: Exit status: 1
DEBUG guest: Trying: fedora
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute: grep 'Fedora release' /etc/redhat-release (sudo=false)
DEBUG ssh: Exit status: 1
DEBUG guest: Trying: tinycore
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute: cat /etc/issue | grep 'Core Linux' (sudo=false)
DEBUG ssh: Exit status: 1
DEBUG guest: Trying: gentoo
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute: grep Gentoo /etc/gentoo-release (sudo=false)
DEBUG ssh: stderr: grep: /etc/gentoo-release
DEBUG ssh: stderr: : No such file or directory

DEBUG ssh: Exit status: 2
DEBUG guest: Trying: arch
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute: cat /etc/arch-release (sudo=false)
DEBUG ssh: stderr: cat: /etc/arch-release
DEBUG ssh: stderr: : No such file or directory

DEBUG ssh: Exit status: 1
DEBUG guest: Trying: coreos
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute: cat /etc/os-release | grep ID=coreos (sudo=false)
DEBUG ssh: Exit status: 1
DEBUG guest: Trying: nixos
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute: test -f /run/current-system/nixos-version (sudo=false)
DEBUG ssh: Exit status: 1
DEBUG guest: Trying: debian
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute: cat /etc/issue | grep 'Debian' (sudo=false)
DEBUG ssh: Exit status: 1
DEBUG guest: Trying: omnios
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute: cat /etc/release | grep -i OmniOS (sudo=false)
DEBUG ssh: stderr: cat: /etc/release: No such file or directory

DEBUG ssh: Exit status: 1
DEBUG guest: Trying: photon
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute: cat /etc/photon-release | grep 'VMware Photon Linux' (sudo=false)
DEBUG ssh: stderr: cat: /etc/photon-release: No such file or directory

DEBUG ssh: Exit status: 1
DEBUG guest: Trying: redhat
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute: cat /etc/redhat-release (sudo=false)
DEBUG ssh: stdout: CentOS Linux release 7.1.1503 (Core) 

DEBUG ssh: Exit status: 0
 INFO guest: Detected: redhat!
DEBUG guest: Searching for cap: insert_public_key
DEBUG guest: Checking in: redhat
DEBUG guest: Checking in: linux
DEBUG guest: Found cap: insert_public_key in linux
DEBUG ssh: Checking whether SSH is ready...
DEBUG ssh: Re-using SSH connection.
 INFO ssh: SSH is ready!
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute:  (sudo=false)
DEBUG ssh: Exit status: 0
DEBUG guest: Searching for cap: remove_public_key
DEBUG guest: Checking in: redhat
DEBUG guest: Checking in: linux
DEBUG guest: Found cap: remove_public_key in linux
 INFO ssh: Inserting key to avoid password: ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDQTamTNkW9fYNXDX+zqkFizv8GKvof4ba4YmLkcvsojv48TOauQ4J+yu82x7+CyqCkmGGlswVsYKY0zMZQE3TuRGWmigShwd95CaYivZVECv3nJXCBJejxDSnN4jB/4f6XtETizhToD80KHc0Eb1SNTbSGp7At3bQczYXquFAwqgiQjo7Ww1xU8Rdwj22LjiDLYKuJ3BPV3AHFucPpYtlsEGmnxSPH/Kyv52gKnzi4pwpKBrSn5+Rtn1ScWTShGz7MTtnk30TIKg+QTwoZUiBSqRUkDPC2pGOkWaFQwwiu839Dgc8FCTrDXi2akQiztlRnLKqsp8lEJ2Sb+cgLvRG1 vagrant
 INFO interface: detail: 
Inserting generated public key within guest...
 INFO interface: detail:     observer_client_1: 
    observer_client_1: Inserting generated public key within guest...
DEBUG ssh: Checking whether SSH is ready...
DEBUG ssh: Re-using SSH connection.
 INFO ssh: SSH is ready!
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute:  (sudo=false)
DEBUG ssh: Exit status: 0
DEBUG guest: Searching for cap: insert_public_key
DEBUG guest: Checking in: redhat
DEBUG guest: Checking in: linux
DEBUG guest: Found cap: insert_public_key in linux
 INFO guest: Execute capability: insert_public_key [#<Vagrant::Machine: observer_client_1 (VagrantPlugins::ProviderLibvirt::Provider)>, "ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDQTamTNkW9fYNXDX+zqkFizv8GKvof4ba4YmLkcvsojv48TOauQ4J+yu82x7+CyqCkmGGlswVsYKY0zMZQE3TuRGWmigShwd95CaYivZVECv3nJXCBJejxDSnN4jB/4f6XtETizhToD80KHc0Eb1SNTbSGp7At3bQczYXquFAwqgiQjo7Ww1xU8Rdwj22LjiDLYKuJ3BPV3AHFucPpYtlsEGmnxSPH/Kyv52gKnzi4pwpKBrSn5+Rtn1ScWTShGz7MTtnk30TIKg+QTwoZUiBSqRUkDPC2pGOkWaFQwwiu839Dgc8FCTrDXi2akQiztlRnLKqsp8lEJ2Sb+cgLvRG1 vagrant"] (redhat)
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute: mkdir -p ~/.ssh (sudo=false)
DEBUG ssh: Exit status: 0
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute: chmod 0700 ~/.ssh (sudo=false)
DEBUG ssh: Exit status: 0
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute: printf 'ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDQTamTNkW9fYNXDX+zqkFizv8GKvof4ba4YmLkcvsojv48TOauQ4J+yu82x7+CyqCkmGGlswVsYKY0zMZQE3TuRGWmigShwd95CaYivZVECv3nJXCBJejxDSnN4jB/4f6XtETizhToD80KHc0Eb1SNTbSGp7At3bQczYXquFAwqgiQjo7Ww1xU8Rdwj22LjiDLYKuJ3BPV3AHFucPpYtlsEGmnxSPH/Kyv52gKnzi4pwpKBrSn5+Rtn1ScWTShGz7MTtnk30TIKg+QTwoZUiBSqRUkDPC2pGOkWaFQwwiu839Dgc8FCTrDXi2akQiztlRnLKqsp8lEJ2Sb+cgLvRG1 vagrant\n' >> ~/.ssh/authorized_keys (sudo=false)
DEBUG ssh: Exit status: 0
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute: chmod 0600 ~/.ssh/authorized_keys (sudo=false)
DEBUG ssh: Exit status: 0
 INFO interface: detail: Removing insecure key from the guest if it's present...
 INFO interface: detail:     observer_client_1: Removing insecure key from the guest if it's present...
DEBUG ssh: Checking whether SSH is ready...
DEBUG ssh: Re-using SSH connection.
 INFO ssh: SSH is ready!
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute:  (sudo=false)
DEBUG ssh: Exit status: 0
DEBUG guest: Searching for cap: remove_public_key
DEBUG guest: Checking in: redhat
DEBUG guest: Checking in: linux
DEBUG guest: Found cap: remove_public_key in linux
 INFO guest: Execute capability: remove_public_key [#<Vagrant::Machine: observer_client_1 (VagrantPlugins::ProviderLibvirt::Provider)>, "ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA6NF8iallvQVp22WDkTkyrtvp9eWW6A8YVr+kz4TjGYe7gHzIw+niNltGEFHzD8+v1I2YJ6oXevct1YeS0o9HZyN1Q9qgCgzUFtdOKLv6IedplqoPkcmF0aYet2PkEDo3MlTBckFXPITAMzF8dJSIFo9D8HfdOV0IAdx4O7PtixWKn5y2hMNG0zQPyUecp4pzC6kivAIhyfHilFR61RGL+GPXQ2MWZWFYbAGjyiYJnAmCP3NOTd0jMZEnDkbUvxhMmBYSdETk1rRgm+R4LOzFUGaHqHDLKLX+FIPKcF96hrucXzcWyLbIbEgE98OHlnVYCzRdK8jlqm8tehUc9c9WhQ== vagrant insecure public key"] (redhat)
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute: test -f ~/.ssh/authorized_keys (sudo=false)
DEBUG ssh: Exit status: 0
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute: sed -e '/^.*ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA6NF8iallvQVp22WDkTkyrtvp9eWW6A8YVr+kz4TjGYe7gHzIw+niNltGEFHzD8+v1I2YJ6oXevct1YeS0o9HZyN1Q9qgCgzUFtdOKLv6IedplqoPkcmF0aYet2PkEDo3MlTBckFXPITAMzF8dJSIFo9D8HfdOV0IAdx4O7PtixWKn5y2hMNG0zQPyUecp4pzC6kivAIhyfHilFR61RGL+GPXQ2MWZWFYbAGjyiYJnAmCP3NOTd0jMZEnDkbUvxhMmBYSdETk1rRgm+R4LOzFUGaHqHDLKLX+FIPKcF96hrucXzcWyLbIbEgE98OHlnVYCzRdK8jlqm8tehUc9c9WhQ== vagrant insecure public key.*$/d' ~/.ssh/authorized_keys > ~/.ssh/authorized_keys.new
mv ~/.ssh/authorized_keys.new ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys
 (sudo=false)
DEBUG ssh: Exit status: 0
 INFO interface: detail: Key inserted! Disconnecting and reconnecting using new SSH key...
 INFO interface: detail:     observer_client_1: Key inserted! Disconnecting and reconnecting using new SSH key...
DEBUG ssh: Checking whether SSH is ready...
DEBUG ssh: Checking key permissions: /mnt/backup/home/uvsmtid/vagrant.issue.dir/.vagrant/machines/observer_client_1/libvirt/private_key
 INFO ssh: Attempting to correct key permissions to 0600
 INFO ssh: Attempting SSH connection...
 INFO ssh: Attempting to connect to SSH...
 INFO ssh:   - Host: 192.168.121.155
 INFO ssh:   - Port: 22
 INFO ssh:   - Username: vagrant
 INFO ssh:   - Password? false
 INFO ssh:   - Key Path: ["/mnt/backup/home/uvsmtid/vagrant.issue.dir/.vagrant/machines/observer_client_1/libvirt/private_key"]
DEBUG ssh: == Net-SSH connection debug-level log START ==
DEBUG ssh: D, [2016-06-22T11:31:23.938898 #15135] DEBUG -- net.ssh.transport.session[2b1670eb321c]: establishing connection to 192.168.121.155:22
D, [2016-06-22T11:31:23.939286 #15135] DEBUG -- net.ssh.transport.session[2b1670eb321c]: connection established
I, [2016-06-22T11:31:23.939355 #15135]  INFO -- net.ssh.transport.server_version[2b1670eb25d8]: negotiating protocol version
D, [2016-06-22T11:31:23.944747 #15135] DEBUG -- net.ssh.transport.server_version[2b1670eb25d8]: remote is `SSH-2.0-OpenSSH_6.6.1'
D, [2016-06-22T11:31:23.944782 #15135] DEBUG -- net.ssh.transport.server_version[2b1670eb25d8]: local is `SSH-2.0-Ruby/Net::SSH_2.9.1 x86_64-linux'
D, [2016-06-22T11:31:23.946996 #15135] DEBUG -- tcpsocket[2b1670eb2d94]: read 1640 bytes
D, [2016-06-22T11:31:23.947060 #15135] DEBUG -- tcpsocket[2b1670eb2d94]: received packet nr 0 type 20 len 1636
I, [2016-06-22T11:31:23.947096 #15135]  INFO -- net.ssh.transport.algorithms[2b1670eb19bc]: got KEXINIT from server
I, [2016-06-22T11:31:23.947155 #15135]  INFO -- net.ssh.transport.algorithms[2b1670eb19bc]: sending KEXINIT
D, [2016-06-22T11:31:23.947328 #15135] DEBUG -- tcpsocket[2b1670eb2d94]: queueing packet nr 0 type 20 len 2020
D, [2016-06-22T11:31:23.947382 #15135] DEBUG -- tcpsocket[2b1670eb2d94]: sent 2024 bytes
I, [2016-06-22T11:31:23.947409 #15135]  INFO -- net.ssh.transport.algorithms[2b1670eb19bc]: negotiating algorithms
D, [2016-06-22T11:31:23.947493 #15135] DEBUG -- net.ssh.transport.algorithms[2b1670eb19bc]: negotiated:
* kex: diffie-hellman-group-exchange-sha1
* host_key: ssh-rsa
* encryption_server: aes128-cbc
* encryption_client: aes128-cbc
* hmac_client: hmac-sha1
* hmac_server: hmac-sha1
* compression_client: none
* compression_server: none
* language_client: 
* language_server: 
D, [2016-06-22T11:31:23.947505 #15135] DEBUG -- net.ssh.transport.algorithms[2b1670eb19bc]: exchanging keys
D, [2016-06-22T11:31:23.947616 #15135] DEBUG -- tcpsocket[2b1670eb2d94]: queueing packet nr 1 type 34 len 20
D, [2016-06-22T11:31:23.947643 #15135] DEBUG -- tcpsocket[2b1670eb2d94]: sent 24 bytes
D, [2016-06-22T11:31:23.948676 #15135] DEBUG -- tcpsocket[2b1670eb2d94]: read 152 bytes
D, [2016-06-22T11:31:23.948722 #15135] DEBUG -- tcpsocket[2b1670eb2d94]: received packet nr 1 type 31 len 148
D, [2016-06-22T11:31:23.949581 #15135] DEBUG -- tcpsocket[2b1670eb2d94]: queueing packet nr 2 type 32 len 140
D, [2016-06-22T11:31:23.949634 #15135] DEBUG -- tcpsocket[2b1670eb2d94]: sent 144 bytes
D, [2016-06-22T11:31:23.951169 #15135] DEBUG -- tcpsocket[2b1670eb2d94]: read 720 bytes
D, [2016-06-22T11:31:23.951226 #15135] DEBUG -- tcpsocket[2b1670eb2d94]: received packet nr 2 type 33 len 700
D, [2016-06-22T11:31:23.951708 #15135] DEBUG -- tcpsocket[2b1670eb2d94]: queueing packet nr 3 type 21 len 20
D, [2016-06-22T11:31:23.951740 #15135] DEBUG -- tcpsocket[2b1670eb2d94]: sent 24 bytes
D, [2016-06-22T11:31:23.951789 #15135] DEBUG -- tcpsocket[2b1670eb2d94]: received packet nr 3 type 21 len 12
D, [2016-06-22T11:31:23.951951 #15135] DEBUG -- net.ssh.authentication.session[2b1670e6c6a0]: beginning authentication of `vagrant'
D, [2016-06-22T11:31:23.952013 #15135] DEBUG -- tcpsocket[2b1670eb2d94]: queueing packet nr 4 type 5 len 28
D, [2016-06-22T11:31:23.952031 #15135] DEBUG -- tcpsocket[2b1670eb2d94]: sent 52 bytes
D, [2016-06-22T11:31:23.991587 #15135] DEBUG -- tcpsocket[2b1670eb2d94]: read 52 bytes
D, [2016-06-22T11:31:23.991722 #15135] DEBUG -- tcpsocket[2b1670eb2d94]: received packet nr 4 type 6 len 28
D, [2016-06-22T11:31:23.991802 #15135] DEBUG -- net.ssh.authentication.session[2b1670e6c6a0]: trying none
D, [2016-06-22T11:31:23.991892 #15135] DEBUG -- tcpsocket[2b1670eb2d94]: queueing packet nr 5 type 50 len 44
D, [2016-06-22T11:31:23.991931 #15135] DEBUG -- tcpsocket[2b1670eb2d94]: sent 68 bytes
D, [2016-06-22T11:31:23.993029 #15135] DEBUG -- tcpsocket[2b1670eb2d94]: read 84 bytes
D, [2016-06-22T11:31:23.993092 #15135] DEBUG -- tcpsocket[2b1670eb2d94]: received packet nr 5 type 51 len 60
D, [2016-06-22T11:31:23.993149 #15135] DEBUG -- net.ssh.authentication.session[2b1670e6c6a0]: allowed methods: publickey,gssapi-keyex,gssapi-with-mic,password
D, [2016-06-22T11:31:23.993181 #15135] DEBUG -- net.ssh.authentication.methods.none[2b1670e6b28c]: none failed
D, [2016-06-22T11:31:23.993237 #15135] DEBUG -- net.ssh.authentication.session[2b1670e6c6a0]: trying publickey
D, [2016-06-22T11:31:23.993393 #15135] DEBUG -- net.ssh.authentication.agent[2b1670e693c4]: connecting to ssh-agent
D, [2016-06-22T11:31:23.993483 #15135] DEBUG -- net.ssh.authentication.agent[2b1670e693c4]: sending agent request 1 len 44
D, [2016-06-22T11:31:23.993722 #15135] DEBUG -- net.ssh.authentication.agent[2b1670e693c4]: received agent packet 2 len 5
D, [2016-06-22T11:31:23.993750 #15135] DEBUG -- net.ssh.authentication.agent[2b1670e693c4]: sending agent request 11 len 0
D, [2016-06-22T11:31:23.994330 #15135] DEBUG -- net.ssh.authentication.agent[2b1670e693c4]: received agent packet 12 len 624
D, [2016-06-22T11:31:23.994514 #15135] DEBUG -- net.ssh.authentication.methods.publickey[2b1670e69798]: trying publickey (39:6b:5d:28:13:a3:70:bd:13:9f:ff:be:70:89:f6:f4)
D, [2016-06-22T11:31:23.994579 #15135] DEBUG -- tcpsocket[2b1670eb2d94]: queueing packet nr 6 type 50 len 348
D, [2016-06-22T11:31:23.994609 #15135] DEBUG -- tcpsocket[2b1670eb2d94]: sent 372 bytes
D, [2016-06-22T11:31:23.995494 #15135] DEBUG -- tcpsocket[2b1670eb2d94]: read 324 bytes
D, [2016-06-22T11:31:23.995567 #15135] DEBUG -- tcpsocket[2b1670eb2d94]: received packet nr 6 type 60 len 300
D, [2016-06-22T11:31:23.996819 #15135] DEBUG -- tcpsocket[2b1670eb2d94]: queueing packet nr 7 type 50 len 620
D, [2016-06-22T11:31:23.996857 #15135] DEBUG -- tcpsocket[2b1670eb2d94]: sent 644 bytes
D, [2016-06-22T11:31:23.999834 #15135] DEBUG -- tcpsocket[2b1670eb2d94]: read 36 bytes
D, [2016-06-22T11:31:23.999887 #15135] DEBUG -- tcpsocket[2b1670eb2d94]: received packet nr 7 type 52 len 12
D, [2016-06-22T11:31:23.999924 #15135] DEBUG -- net.ssh.authentication.methods.publickey[2b1670e69798]: publickey succeeded (39:6b:5d:28:13:a3:70:bd:13:9f:ff:be:70:89:f6:f4)

DEBUG ssh: == Net-SSH connection debug-level log END ==
 INFO ssh: SSH is ready!
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute:  (sudo=false)
DEBUG ssh: Exit status: 0
 INFO wait_till_up: Time for SSH ready: 1.8732407093048096
 INFO warden: Calling IN action: #<VagrantPlugins::ProviderLibvirt::Action::ForwardPorts:0x00562ce1a5b590>
 INFO warden: Calling IN action: #<Vagrant::Action::Builtin::SetHostname:0x00562ce1a08020>
 INFO warden: Calling IN action: #<Proc:0x00562ce1a08318@/usr/share/vagrant/lib/vagrant/action/warden.rb:94 (lambda)>
 INFO warden: Calling OUT action: #<Proc:0x00562ce1a08318@/usr/share/vagrant/lib/vagrant/action/warden.rb:94 (lambda)>
 INFO warden: Calling OUT action: #<Vagrant::Action::Builtin::SetHostname:0x00562ce1a08020>
 INFO warden: Calling OUT action: #<VagrantPlugins::ProviderLibvirt::Action::ForwardPorts:0x00562ce1a5b590>
 INFO warden: Calling OUT action: #<VagrantPlugins::ProviderLibvirt::Action::WaitTillUp:0x00562ce1aae268>
 INFO warden: Calling OUT action: #<VagrantPlugins::ProviderLibvirt::Action::StartDomain:0x00562ce1b04848>
 INFO warden: Calling OUT action: #<VagrantPlugins::ProviderLibvirt::Action::SetBootOrder:0x00562ce1b4d4a8>
DEBUG create_network_interfaces: Configuring interface slot_number 1 options {:iface_type=>:private_network, :netmask=>"255.255.255.0", :dhcp_enabled=>true, :forward_mode=>"nat", :ip=>"192.168.1.3", :libvirt__network_name=>"vagrant_internal_net", :mac=>"FA:16:3E:3D:C8:77", :libvirt__netmask=>"255.255.255.0", :libvirt__forward_mode=>"nat", :libvirt__dhcp_enabled=>true, :whatever=>true, :protocol=>"tcp", :id=>"177a6891-0670-4a5b-ab8e-8f2b3eccd91b", :network_name=>"vagrant_internal_net"}
 INFO interface: info: Configuring and enabling network interfaces...
 INFO interface: info: ==> observer_client_1: Configuring and enabling network interfaces...
DEBUG ssh: Checking whether SSH is ready...
DEBUG ssh: Re-using SSH connection.
 INFO ssh: SSH is ready!
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute:  (sudo=false)
DEBUG ssh: Exit status: 0
DEBUG guest: Searching for cap: configure_networks
DEBUG guest: Checking in: redhat
DEBUG guest: Found cap: configure_networks in redhat
 INFO guest: Execute capability: configure_networks [#<Vagrant::Machine: observer_client_1 (VagrantPlugins::ProviderLibvirt::Provider)>, [{:type=>:static, :ip=>"192.168.1.3", :netmask=>"255.255.255.0", :interface=>1, :use_dhcp_assigned_default_route=>nil, :mac_address=>"FA:16:3E:3D:C8:77"}]] (redhat)
DEBUG ssh: Checking whether SSH is ready...
DEBUG ssh: Re-using SSH connection.
 INFO ssh: SSH is ready!
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute:  (sudo=false)
DEBUG ssh: Exit status: 0
DEBUG guest: Searching for cap: flavor
DEBUG guest: Checking in: redhat
DEBUG guest: Found cap: flavor in redhat
 INFO guest: Execute capability: flavor [#<Vagrant::Machine: observer_client_1 (VagrantPlugins::ProviderLibvirt::Provider)>] (redhat)
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute: cat /etc/redhat-release (sudo=true)
DEBUG ssh: stdout: CentOS Linux release 7.1.1503 (Core) 

DEBUG ssh: Exit status: 0
DEBUG ssh: Checking whether SSH is ready...
DEBUG ssh: Re-using SSH connection.
 INFO ssh: SSH is ready!
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute:  (sudo=false)
DEBUG ssh: Exit status: 0
DEBUG guest: Searching for cap: network_scripts_dir
DEBUG guest: Checking in: redhat
DEBUG guest: Found cap: network_scripts_dir in redhat
 INFO guest: Execute capability: network_scripts_dir [#<Vagrant::Machine: observer_client_1 (VagrantPlugins::ProviderLibvirt::Provider)>] (redhat)
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute: /usr/sbin/biosdevname &>/dev/null; echo $? (sudo=true)
DEBUG ssh: stdout: 4

DEBUG ssh: Exit status: 0
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute: ls /sys/class/net | egrep -v lo\|docker (sudo=true)
DEBUG ssh: stdout: eth0
eth1

DEBUG ssh: Exit status: 0
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute: cat /sys/class/net/eth0/address (sudo=true)
DEBUG ssh: stdout: 52:54:00:22:48:a1

DEBUG ssh: Exit status: 0
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute: cat /sys/class/net/eth1/address (sudo=true)
DEBUG ssh: stdout: fa:16:3e:3d:c8:77

DEBUG ssh: Exit status: 0
 INFO warden: Calling OUT action: #<VagrantPlugins::ProviderLibvirt::Action::CreateNetworkInterfaces:0x00562ce1b99d80>
 INFO warden: Calling OUT action: #<VagrantPlugins::ProviderLibvirt::Action::CreateNetworks:0x00562ce1c9ca48>
 INFO warden: Calling OUT action: #<VagrantPlugins::ProviderLibvirt::Action::ShareFolders:0x00562ce1cc1140>
 INFO warden: Calling OUT action: #<VagrantPlugins::ProviderLibvirt::Action::PrepareNFSSettings:0x00562ce1ce5dd8>
 INFO synced_folders: Invoking synced folder enable: rsync
DEBUG ssh: Checking whether SSH is ready...
DEBUG ssh: Re-using SSH connection.
 INFO ssh: SSH is ready!
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute:  (sudo=false)
DEBUG ssh: Exit status: 0
DEBUG guest: Searching for cap: rsync_installed
DEBUG guest: Checking in: redhat
DEBUG guest: Checking in: linux
DEBUG guest: Found cap: rsync_installed in linux
DEBUG ssh: Checking whether SSH is ready...
DEBUG ssh: Re-using SSH connection.
 INFO ssh: SSH is ready!
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute:  (sudo=false)
DEBUG ssh: Exit status: 0
DEBUG guest: Searching for cap: rsync_installed
DEBUG guest: Checking in: redhat
DEBUG guest: Checking in: linux
DEBUG guest: Found cap: rsync_installed in linux
 INFO guest: Execute capability: rsync_installed [#<Vagrant::Machine: observer_client_1 (VagrantPlugins::ProviderLibvirt::Provider)>] (redhat)
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute: which rsync (sudo=false)
DEBUG ssh: stdout: /usr/bin/rsync

DEBUG ssh: Exit status: 0
DEBUG ssh: Checking key permissions: /mnt/backup/home/uvsmtid/vagrant.issue.dir/.vagrant/machines/observer_client_1/libvirt/private_key
DEBUG ssh: Checking whether SSH is ready...
DEBUG ssh: Re-using SSH connection.
 INFO ssh: SSH is ready!
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute:  (sudo=false)
DEBUG ssh: Exit status: 0
DEBUG guest: Searching for cap: rsync_scrub_guestpath
DEBUG guest: Checking in: redhat
DEBUG guest: Checking in: linux
DEBUG ssh: Checking whether SSH is ready...
DEBUG ssh: Re-using SSH connection.
 INFO ssh: SSH is ready!
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute:  (sudo=false)
DEBUG ssh: Exit status: 0
DEBUG guest: Searching for cap: rsync_command
DEBUG guest: Checking in: redhat
DEBUG guest: Checking in: linux
DEBUG guest: Found cap: rsync_command in linux
DEBUG ssh: Checking whether SSH is ready...
DEBUG ssh: Re-using SSH connection.
 INFO ssh: SSH is ready!
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute:  (sudo=false)
DEBUG ssh: Exit status: 0
DEBUG guest: Searching for cap: rsync_command
DEBUG guest: Checking in: redhat
DEBUG guest: Checking in: linux
DEBUG guest: Found cap: rsync_command in linux
 INFO guest: Execute capability: rsync_command [#<Vagrant::Machine: observer_client_1 (VagrantPlugins::ProviderLibvirt::Provider)>] (redhat)
 INFO interface: info: Rsyncing folder: /mnt/backup/home/uvsmtid/vagrant.issue.dir/ => /vagrant
 INFO interface: info: ==> observer_client_1: Rsyncing folder: /mnt/backup/home/uvsmtid/vagrant.issue.dir/ => /vagrant
DEBUG ssh: Checking whether SSH is ready...
DEBUG ssh: Re-using SSH connection.
 INFO ssh: SSH is ready!
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute:  (sudo=false)
DEBUG ssh: Exit status: 0
DEBUG guest: Searching for cap: rsync_pre
DEBUG guest: Checking in: redhat
DEBUG guest: Checking in: linux
DEBUG guest: Found cap: rsync_pre in linux
DEBUG ssh: Checking whether SSH is ready...
DEBUG ssh: Re-using SSH connection.
 INFO ssh: SSH is ready!
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute:  (sudo=false)
DEBUG ssh: Exit status: 0
DEBUG guest: Searching for cap: rsync_pre
DEBUG guest: Checking in: redhat
DEBUG guest: Checking in: linux
DEBUG guest: Found cap: rsync_pre in linux
 INFO guest: Execute capability: rsync_pre [#<Vagrant::Machine: observer_client_1 (VagrantPlugins::ProviderLibvirt::Provider)>, {:guestpath=>"/vagrant", :hostpath=>"/mnt/backup/home/uvsmtid/vagrant.issue.dir", :disabled=>false, :__vagrantfile=>true, :owner=>"vagrant", :group=>"vagrant"}] (redhat)
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute: mkdir -p '/vagrant' (sudo=true)
DEBUG ssh: Exit status: 0
 INFO subprocess: Starting process: ["/usr/bin/rsync", "--verbose", "--archive", "--delete", "-z", "--copy-links", "--no-owner", "--no-group", "--rsync-path", "sudo rsync", "-e", "ssh -p 22 -o ControlMaster=auto -o ControlPath=/tmp/ssh.25 -o ControlPersist=10m -o StrictHostKeyChecking=no -o IdentitiesOnly=true -o UserKnownHostsFile=/dev/null -i '/mnt/backup/home/uvsmtid/vagrant.issue.dir/.vagrant/machines/observer_client_1/libvirt/private_key'", "--exclude", ".vagrant/", "/mnt/backup/home/uvsmtid/vagrant.issue.dir/", "vagrant@192.168.121.155:/vagrant"]
 INFO subprocess: Command not in installer, restoring original environment...
DEBUG subprocess: Selecting on IO
DEBUG subprocess: stderr: Warning: Permanently added '192.168.121.155' (ECDSA) to the list of known hosts.
DEBUG subprocess: stdout: sending incremental file list
DEBUG subprocess: stdout: ./
DEBUG subprocess: stdout: Vagrantfile
DEBUG subprocess: stdout: ip.addr.guest.txt
DEBUG subprocess: stdout: ip.addr.host.txt
vagrant.up.stderr
DEBUG subprocess: stdout: vagrant.up.stdout
DEBUG subprocess: stdout: dummy/
DEBUG subprocess: stdout: 
sent 15,263 bytes  received 131 bytes  30,788.00 bytes/sec
total size is 92,737  speedup is 6.02
DEBUG subprocess: Waiting for process to exit. Remaining to timeout: 32000
DEBUG subprocess: Exit status: 0
DEBUG ssh: Checking whether SSH is ready...
DEBUG ssh: Re-using SSH connection.
 INFO ssh: SSH is ready!
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute:  (sudo=false)
DEBUG ssh: Exit status: 0
DEBUG guest: Searching for cap: rsync_post
DEBUG guest: Checking in: redhat
DEBUG guest: Checking in: linux
DEBUG guest: Found cap: rsync_post in linux
DEBUG ssh: Checking whether SSH is ready...
DEBUG ssh: Re-using SSH connection.
 INFO ssh: SSH is ready!
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute:  (sudo=false)
DEBUG ssh: Exit status: 0
DEBUG guest: Searching for cap: rsync_post
DEBUG guest: Checking in: redhat
DEBUG guest: Checking in: linux
DEBUG guest: Found cap: rsync_post in linux
 INFO guest: Execute capability: rsync_post [#<Vagrant::Machine: observer_client_1 (VagrantPlugins::ProviderLibvirt::Provider)>, {:guestpath=>"/vagrant", :hostpath=>"/mnt/backup/home/uvsmtid/vagrant.issue.dir", :disabled=>false, :__vagrantfile=>true, :owner=>"vagrant", :group=>"vagrant"}] (redhat)
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute: find '/vagrant' '!' -type l -a '(' ! -user vagrant -or ! -group vagrant ')' -print0 | xargs -0 -r chown vagrant:vagrant (sudo=true)
DEBUG ssh: Exit status: 0
 INFO warden: Calling OUT action: #<Vagrant::Action::Builtin::SyncedFolders:0x00562ce1d126d0>
 INFO warden: Calling OUT action: #<Vagrant::Action::Builtin::SyncedFolderCleanup:0x00562ce1d3cca0>
 INFO warden: Calling OUT action: #<VagrantPlugins::SyncedFolderNFS::ActionCleanup:0x00562ce1d61488>
 INFO warden: Calling OUT action: #<VagrantPlugins::ProviderLibvirt::Action::PrepareNFSValidIds:0x00562ce1d858b0>
 INFO provision: Writing provisioning sentinel so we don't provision again
 INFO interface: info: Running provisioner: shell...
 INFO interface: info: ==> observer_client_1: Running provisioner: shell...
 INFO environment: Running hook: provisioner_run
 INFO runner: Preparing hooks for middleware sequence...
 INFO runner: 2 hooks defined.
 INFO runner: Running action: provisioner_run #<Method: Vagrant::Action::Builtin::Provision#run_provisioner>
 INFO warden: Calling IN action: #<Proc:0x00562ce09fb718@/usr/share/vagrant/lib/vagrant/action/warden.rb:94 (lambda)>
DEBUG ssh: Checking key permissions: /mnt/backup/home/uvsmtid/vagrant.issue.dir/.vagrant/machines/observer_client_1/libvirt/private_key
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute: chown -R vagrant /tmp/vagrant-shell (sudo=true)
DEBUG ssh: stderr: chown: cannot access ‘/tmp/vagrant-shell’: No such file or directory

DEBUG ssh: Exit status: 1
DEBUG ssh: Uploading: /tmp/vagrant-shell20160622-15135-1rlz9u7.ps1 to /tmp/vagrant-shell
DEBUG ssh: Re-using SSH connection.
 INFO interface: detail: Running: inline script
 INFO interface: detail:     observer_client_1: Running: inline script
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute: chmod +x '/tmp/vagrant-shell' && /tmp/vagrant-shell (sudo=true)
DEBUG ssh: Exit status: 0
 INFO warden: Calling OUT action: #<Proc:0x00562ce09fb718@/usr/share/vagrant/lib/vagrant/action/warden.rb:94 (lambda)>
 INFO warden: Calling OUT action: #<Vagrant::Action::Builtin::Provision:0x00562ce0a096d8>
 INFO warden: Calling OUT action: #<VagrantPlugins::ProviderLibvirt::Action::CreateDomain:0x00562ce0b478b0>
 INFO warden: Calling OUT action: #<VagrantPlugins::ProviderLibvirt::Action::CreateDomainVolume:0x00562ce10ba090>
 INFO warden: Calling OUT action: #<VagrantPlugins::ProviderLibvirt::Action::HandleBoxImage:0x00562ce1168cd0>
 INFO warden: Calling OUT action: #<Vagrant::Action::Builtin::HandleBox:0x00562ce120df28>
 INFO warden: Calling OUT action: #<VagrantPlugins::ProviderLibvirt::Action::HandleStoragePool:0x00562ce12b37e8>
 INFO warden: Calling OUT action: #<VagrantPlugins::ProviderLibvirt::Action::SetNameOfDomain:0x00562ce136fd08>
 INFO warden: Calling OUT action: #<Proc:0x00562ce19aebb0@/usr/share/vagrant/lib/vagrant/action/warden.rb:94 (lambda)>
 INFO warden: Calling OUT action: #<Vagrant::Action::Builtin::Call:0x00562ce1b6c7e0>
 INFO warden: Calling OUT action: #<Vagrant::Action::Builtin::ConfigValidate:0x00562ce1b6c808>
 INFO interface: Machine: action ["up", "end", {:target=>:observer_client_1}]
 INFO environment: Released process lock: machine-action-3f82de25c284b1371becb8a838898b3c
DEBUG environment: Attempting to acquire process-lock: dotlock
 INFO environment: Acquired process lock: dotlock
 INFO environment: Released process lock: dotlock
 INFO environment: Running hook: environment_unload
 INFO runner: Preparing hooks for middleware sequence...
 INFO runner: 2 hooks defined.
 INFO runner: Running action: environment_unload #<Vagrant::Action::Builder:0x00562ce1a1ede8>

@uvsmtid

uvsmtid commented Jun 22, 2016

These inputs and results are identical to the previous post above, with the only difference that vagrant-libvirt is version 0.0.33 (not 0.0.32). Setting the required IP address in the guest VM still fails.
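
For reference, here is a minimal sketch of the Vagrantfile network definition, reconstructed from the private_network options visible in the debug output above (the actual Vagrantfile may differ; the box name, MAC, and addresses are simply the ones appearing in this log):

Vagrant.configure(2) do |config|
  config.vm.define 'observer_client_1' do |node|
    node.vm.box = "uvsmtid/centos-7.1-1503-gnome"
    # A static IP is requested here, yet the log shows the created
    # vagrant_internal_net network carrying dhcp_enabled=>true as well.
    node.vm.network :private_network,
      :ip => "192.168.1.3",
      :mac => "FA:16:3E:3D:C8:77",
      :libvirt__network_name => "vagrant_internal_net",
      :libvirt__netmask => "255.255.255.0",
      :libvirt__forward_mode => "nat",
      # true in the failing runs shown above; flipping this to false is
      # presumably the knob for keeping DHCP off the created network
      :libvirt__dhcp_enabled => true
  end
end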

The tests were done on the same host machine (some output paths, IPs, MACs, and other host-specific values may differ) - again, see the previous post above.

In order to install vagrant-libvirt version 0.0.33, I had to run the following (standard) command rather than relying on the default Fedora 23 RPM package repositories (because they do not provide this version):

vagrant plugin install vagrant-libvirt
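
To pin the plugin to exactly 0.0.33 rather than whatever release happens to be current, the version can presumably be passed explicitly (assuming vagrant's standard --plugin-version option applies here):

vagrant plugin install vagrant-libvirt --plugin-version 0.0.33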

vagrant up STDOUT with vagrant-libvirt version 0.0.33

Bringing machine 'observer_client_1' up with 'libvirt' provider...
==> observer_client_1: Creating image (snapshot of base box volume).
==> observer_client_1: Creating domain with the following settings...
==> observer_client_1:  -- Name:              vagrant.issue.dir_observer_client_1
==> observer_client_1:  -- Domain type:       kvm
==> observer_client_1:  -- Cpus:              2
==> observer_client_1:  -- Memory:            2048M
==> observer_client_1:  -- Management MAC:    
==> observer_client_1:  -- Loader:            
==> observer_client_1:  -- Base box:          uvsmtid/centos-7.1-1503-gnome
==> observer_client_1:  -- Storage pool:      default
==> observer_client_1:  -- Image:             /var/lib/libvirt/images/vagrant.issue.dir_observer_client_1.img (138G)
==> observer_client_1:  -- Volume Cache:      default
==> observer_client_1:  -- Kernel:            
==> observer_client_1:  -- Initrd:            
==> observer_client_1:  -- Graphics Type:     vnc
==> observer_client_1:  -- Graphics Port:     5900
==> observer_client_1:  -- Graphics IP:       127.0.0.1
==> observer_client_1:  -- Graphics Password: Not defined
==> observer_client_1:  -- Video Type:        qxl
==> observer_client_1:  -- Video VRAM:        9216
==> observer_client_1:  -- Keymap:            en-us
==> observer_client_1:  -- TPM Path:          
==> observer_client_1:  -- INPUT:             type=mouse, bus=ps2
==> observer_client_1:  -- Command line : 
==> observer_client_1: Creating shared folders metadata...
==> observer_client_1: Starting domain.
==> observer_client_1: Waiting for domain to get an IP address...
==> observer_client_1: Waiting for SSH to become available...
    observer_client_1: 
    observer_client_1: Vagrant insecure key detected. Vagrant will automatically replace
    observer_client_1: this with a newly generated keypair for better security.
    observer_client_1: 
    observer_client_1: Inserting generated public key within guest...
    observer_client_1: Removing insecure key from the guest if it's present...
    observer_client_1: Key inserted! Disconnecting and reconnecting using new SSH key...
==> observer_client_1: Configuring and enabling network interfaces...
==> observer_client_1: Rsyncing folder: /home/uvsmtid/vagrant.issue.dir/ => /vagrant
==> observer_client_1: Running provisioner: shell...
    observer_client_1: Running: inline script

vagrant up STDERR with vagrant-libvirt version 0.0.33

 INFO global: Vagrant version: 1.8.1
 INFO global: Ruby version: 2.2.5
 INFO global: RubyGems version: 2.4.8
 INFO global: VAGRANT_DEFAULT_PROVIDER="libvirt"
 INFO global: VAGRANT_EXECUTABLE="/usr/share/vagrant/bin/vagrant"
 INFO global: VAGRANT_LOG="debug"
 INFO global: VAGRANT_INSTALLER_EMBEDDED_DIR="/var/lib/vagrant"
 INFO global: VAGRANT_INSTALLER_VERSION="2"
 INFO global: VAGRANT_INTERNAL_BUNDLERIZED="1"
 INFO global: VAGRANT_DETECTED_OS="Linux"
 INFO global: VAGRANT_INSTALLER_ENV="1"
 INFO global: Plugins:
 INFO global:   - builder = 3.2.2
 INFO global:   - bundler = 1.7.8
 INFO global:   - excon = 0.49.0
 INFO global:   - formatador = 0.2.5
 INFO global:   - fog-core = 1.40.0
 INFO global:   - multi_json = 1.12.1
 INFO global:   - fog-json = 1.0.2
 INFO global:   - mini_portile2 = 2.1.0
 INFO global:   - pkg-config = 1.1.7
 INFO global:   - nokogiri = 1.6.8
 INFO global:   - fog-xml = 0.1.2
 INFO global:   - json = 1.8.3
 INFO global:   - ruby-libvirt = 0.6.0
 INFO global:   - fog-libvirt = 0.0.3
 INFO global:   - vagrant-libvirt = 0.0.33
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/communicators/ssh/plugin.rb
 INFO manager: Registered plugin: ssh communicator
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/communicators/winrm/plugin.rb
 INFO manager: Registered plugin: winrm communicator
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/guests/solaris11/plugin.rb
 INFO manager: Registered plugin: Solaris 11 guest.
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/guests/windows/plugin.rb
 INFO manager: Registered plugin: Windows guest.
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/guests/linux/plugin.rb
 INFO manager: Registered plugin: Linux guest.
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/guests/netbsd/plugin.rb
 INFO manager: Registered plugin: NetBSD guest
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/guests/darwin/plugin.rb
 INFO manager: Registered plugin: Darwin guest
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/guests/slackware/plugin.rb
 INFO manager: Registered plugin: Slackware guest
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/guests/suse/plugin.rb
 INFO manager: Registered plugin: SUSE guest
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/guests/redhat/plugin.rb
 INFO manager: Registered plugin: Red Hat Enterprise Linux guest
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/guests/solaris/plugin.rb
 INFO manager: Registered plugin: Solaris guest.
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/guests/omnios/plugin.rb
 INFO manager: Registered plugin: OmniOS guest.
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/guests/nixos/plugin.rb
 INFO manager: Registered plugin: NixOS guest
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/guests/freebsd/plugin.rb
 INFO manager: Registered plugin: FreeBSD guest
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/guests/fedora/plugin.rb
 INFO manager: Registered plugin: Fedora guest
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/guests/mint/plugin.rb
 INFO manager: Registered plugin: Mint guest
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/guests/pld/plugin.rb
 INFO manager: Registered plugin: PLD Linux guest
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/guests/atomic/plugin.rb
 INFO manager: Registered plugin: Atomic Host guest
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/guests/smartos/plugin.rb
 INFO manager: Registered plugin: SmartOS guest.
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/guests/gentoo/plugin.rb
 INFO manager: Registered plugin: Gentoo guest
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/guests/tinycore/plugin.rb
 INFO manager: Registered plugin: TinyCore Linux guest.
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/guests/debian/plugin.rb
 INFO manager: Registered plugin: Debian guest
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/guests/esxi/plugin.rb
 INFO manager: Registered plugin: ESXi guest.
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/guests/arch/plugin.rb
 INFO manager: Registered plugin: Arch guest
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/guests/funtoo/plugin.rb
 INFO manager: Registered plugin: Funtoo guest
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/guests/photon/plugin.rb
 INFO manager: Registered plugin: VMware Photon guest
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/guests/ubuntu/plugin.rb
 INFO manager: Registered plugin: Ubuntu guest
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/guests/openbsd/plugin.rb
 INFO manager: Registered plugin: OpenBSD guest
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/guests/coreos/plugin.rb
 INFO manager: Registered plugin: CoreOS guest
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/provisioners/salt/plugin.rb
 INFO manager: Registered plugin: salt
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/provisioners/docker/plugin.rb
 INFO manager: Registered plugin: docker
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/provisioners/puppet/plugin.rb
 INFO manager: Registered plugin: puppet
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/provisioners/chef/plugin.rb
 INFO manager: Registered plugin: chef
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/provisioners/file/plugin.rb
 INFO manager: Registered plugin: file
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/provisioners/cfengine/plugin.rb
 INFO manager: Registered plugin: CFEngine Provisioner
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/provisioners/ansible/plugin.rb
 INFO manager: Registered plugin: ansible
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/provisioners/shell/plugin.rb
 INFO manager: Registered plugin: shell
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/hosts/windows/plugin.rb
 INFO manager: Registered plugin: Windows host
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/hosts/linux/plugin.rb
 INFO manager: Registered plugin: Linux host
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/hosts/darwin/plugin.rb
 INFO manager: Registered plugin: Mac OS X host
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/hosts/slackware/plugin.rb
 INFO manager: Registered plugin: Slackware host
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/hosts/suse/plugin.rb
 INFO manager: Registered plugin: SUSE host
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/hosts/redhat/plugin.rb
 INFO manager: Registered plugin: Red Hat Enterprise Linux host
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/hosts/freebsd/plugin.rb
 INFO manager: Registered plugin: FreeBSD host
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/hosts/gentoo/plugin.rb
 INFO manager: Registered plugin: Gentoo host
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/hosts/arch/plugin.rb
 INFO manager: Registered plugin: Arch host
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/hosts/bsd/plugin.rb
 INFO manager: Registered plugin: BSD host
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/hosts/null/plugin.rb
 INFO manager: Registered plugin: null host
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/synced_folders/rsync/plugin.rb
 INFO manager: Registered plugin: RSync synced folders
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/synced_folders/nfs/plugin.rb
 INFO manager: Registered plugin: NFS synced folders
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/synced_folders/smb/plugin.rb
 INFO manager: Registered plugin: SMB synced folders
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/commands/cap/plugin.rb
 INFO manager: Registered plugin: cap command
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/commands/status/plugin.rb
 INFO manager: Registered plugin: status command
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/commands/ssh/plugin.rb
 INFO manager: Registered plugin: ssh command
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/commands/list-commands/plugin.rb
 INFO manager: Registered plugin: list-commands command
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/commands/rdp/plugin.rb
 INFO manager: Registered plugin: rdp command
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/commands/box/plugin.rb
 INFO manager: Registered plugin: box command
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/commands/reload/plugin.rb
 INFO manager: Registered plugin: reload command
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/commands/package/plugin.rb
 INFO manager: Registered plugin: package command
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/commands/resume/plugin.rb
 INFO manager: Registered plugin: resume command
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/commands/port/plugin.rb
 INFO manager: Registered plugin: port command
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/commands/snapshot/plugin.rb
 INFO manager: Registered plugin: snapshot command
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/commands/provider/plugin.rb
 INFO manager: Registered plugin: provider command
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/commands/powershell/plugin.rb
 INFO manager: Registered plugin: powershell command
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/commands/plugin/plugin.rb
 INFO manager: Registered plugin: plugin command
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/commands/init/plugin.rb
 INFO manager: Registered plugin: init command
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/commands/push/plugin.rb
 INFO manager: Registered plugin: push command
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/commands/halt/plugin.rb
 INFO manager: Registered plugin: halt command
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/commands/provision/plugin.rb
 INFO manager: Registered plugin: provision command
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/commands/up/plugin.rb
 INFO manager: Registered plugin: up command
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/commands/help/plugin.rb
 INFO manager: Registered plugin: help command
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/commands/suspend/plugin.rb
 INFO manager: Registered plugin: suspend command
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/commands/global-status/plugin.rb
 INFO manager: Registered plugin: global-status command
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/commands/ssh_config/plugin.rb
 INFO manager: Registered plugin: ssh-config command
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/commands/login/plugin.rb
 INFO manager: Registered plugin: vagrant-login
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/commands/destroy/plugin.rb
 INFO manager: Registered plugin: destroy command
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/commands/version/plugin.rb
 INFO manager: Registered plugin: version command
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/kernel_v1/plugin.rb
 INFO manager: Registered plugin: kernel
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/providers/docker/plugin.rb
 INFO manager: Registered plugin: docker-provider
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/providers/hyperv/plugin.rb
 INFO manager: Registered plugin: Hyper-V provider
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/providers/virtualbox/plugin.rb
 INFO manager: Registered plugin: VirtualBox provider
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/pushes/noop/plugin.rb
 INFO manager: Registered plugin: noop
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/pushes/ftp/plugin.rb
 INFO manager: Registered plugin: ftp
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/pushes/heroku/plugin.rb
 INFO manager: Registered plugin: heroku
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/pushes/local-exec/plugin.rb
 INFO manager: Registered plugin: local-exec
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/pushes/atlas/plugin.rb
 INFO manager: Registered plugin: atlas
DEBUG global: Loading core plugin: /usr/share/vagrant/plugins/kernel_v2/plugin.rb
 INFO manager: Registered plugin: kernel
 INFO global: Loading plugins!
 INFO manager: Registered plugin: libvirt
 INFO vagrant: `vagrant` invoked: ["up"]
DEBUG vagrant: Creating Vagrant environment
 INFO environment: Environment initialized (#<Vagrant::Environment:0x00560c49a3a538>)
 INFO environment:   - cwd: /home/uvsmtid/vagrant.issue.dir
 INFO environment: Home path: /home/uvsmtid/.vagrant.d
 INFO environment: Local data path: /home/uvsmtid/vagrant.issue.dir/.vagrant
DEBUG environment: Creating: /home/uvsmtid/vagrant.issue.dir/.vagrant
 INFO environment: Running hook: environment_plugins_loaded
 INFO runner: Preparing hooks for middleware sequence...
 INFO runner: 2 hooks defined.
 INFO runner: Running action: environment_plugins_loaded #<Vagrant::Action::Builder:0x00560c49925508>
 INFO environment: Running hook: environment_load
 INFO runner: Preparing hooks for middleware sequence...
 INFO runner: 2 hooks defined.
 INFO runner: Running action: environment_load #<Vagrant::Action::Builder:0x00560c48e62cb0>
 INFO cli: CLI: [] "up" []
DEBUG cli: Invoking command class: VagrantPlugins::CommandUp::Command []
DEBUG command: 'Up' each target VM...
 INFO loader: Set :root = ["#<Pathname:/home/uvsmtid/vagrant.issue.dir/Vagrantfile>"]
DEBUG loader: Populating proc cache for #<Pathname:/home/uvsmtid/vagrant.issue.dir/Vagrantfile>
DEBUG loader: Load procs for pathname: /home/uvsmtid/vagrant.issue.dir/Vagrantfile
 INFO loader: Loading configuration in order: [:home, :root]
DEBUG loader: Loading from: root (evaluating)
DEBUG loader: Configuration loaded successfully, finalizing and returning
DEBUG push: finalizing
 INFO host: Autodetecting host type for [#<Vagrant::Environment: /home/uvsmtid/vagrant.issue.dir>]
DEBUG host: Trying: darwin
DEBUG host: Trying: slackware
DEBUG host: Trying: suse
DEBUG host: Trying: redhat
 INFO host: Detected: redhat!
DEBUG host: Searching for cap: provider_install_libvirt
DEBUG host: Checking in: redhat
DEBUG host: Checking in: linux
DEBUG command: Getting target VMs for command. Arguments:
DEBUG command:  -- names: ["observer_client_1"]
DEBUG command:  -- options: {:provider=>nil}
DEBUG command: Finding machine that match name: observer_client_1
 INFO environment: Getting machine: observer_client_1 (libvirt)
 INFO environment: Uncached load of machine.
 INFO loader: Set "47305388005380_machine_observer_client_1" = ["[\"2\", #<Proc:0x00560c499d0778@/home/uvsmtid/vagrant.issue.dir/Vagrantfile:17>]"]
DEBUG loader: Populating proc cache for ["2", #<Proc:0x00560c499d0778@/home/uvsmtid/vagrant.issue.dir/Vagrantfile:17>]
 INFO loader: Loading configuration in order: [:home, :root, "47305388005380_machine_observer_client_1"]
DEBUG loader: Loading from: root (cache)
DEBUG loader: Loading from: 47305388005380_machine_observer_client_1 (evaluating)
DEBUG provisioner: Provisioner defined: 
DEBUG loader: Configuration loaded successfully, finalizing and returning
DEBUG push: finalizing
 INFO box_collection: Box found: uvsmtid/centos-7.1-1503-gnome (libvirt)
 INFO environment: Running hook: authenticate_box_url
 INFO runner: Preparing hooks for middleware sequence...
 INFO runner: 3 hooks defined.
 INFO runner: Running action: authenticate_box_url #<Vagrant::Action::Builder:0x00560c48e4eb20>
 INFO warden: Calling IN action: #<VagrantPlugins::LoginCommand::AddAuthentication:0x00560c4ac8c718>
DEBUG client: No authentication token in environment or /home/uvsmtid/.vagrant.d/data/vagrant_login_token
 INFO warden: Calling OUT action: #<VagrantPlugins::LoginCommand::AddAuthentication:0x00560c4ac8c718>
 INFO loader: Set :"47305388220480_uvsmtid/centos-7.1-1503-gnome_libvirt" = ["#<Pathname:/home/uvsmtid/.vagrant.d/boxes/uvsmtid-VAGRANTSLASH-centos-7.1-1503-gnome/1.0.1/libvirt/Vagrantfile>"]
DEBUG loader: Populating proc cache for #<Pathname:/home/uvsmtid/.vagrant.d/boxes/uvsmtid-VAGRANTSLASH-centos-7.1-1503-gnome/1.0.1/libvirt/Vagrantfile>
DEBUG loader: Load procs for pathname: /home/uvsmtid/.vagrant.d/boxes/uvsmtid-VAGRANTSLASH-centos-7.1-1503-gnome/1.0.1/libvirt/Vagrantfile
 INFO loader: Loading configuration in order: [:"47305388220480_uvsmtid/centos-7.1-1503-gnome_libvirt", :home, :root, "47305388005380_machine_observer_client_1"]
DEBUG loader: Loading from: 47305388220480_uvsmtid/centos-7.1-1503-gnome_libvirt (evaluating)
DEBUG loader: Loading from: root (cache)
DEBUG loader: Loading from: 47305388005380_machine_observer_client_1 (cache)
DEBUG loader: Configuration loaded successfully, finalizing and returning
DEBUG push: finalizing
 INFO machine: Initializing machine: observer_client_1
 INFO machine:   - Provider: VagrantPlugins::ProviderLibvirt::Provider
 INFO machine:   - Box: #<Vagrant::Box:0x00560c4acd18e0>
 INFO machine:   - Data dir: /home/uvsmtid/vagrant.issue.dir/.vagrant/machines/observer_client_1/libvirt
 INFO machine: New machine ID: nil
 INFO interface: Machine: metadata ["provider", :libvirt, {:target=>:observer_client_1}]
 INFO command: With machine: observer_client_1 (#<VagrantPlugins::ProviderLibvirt::Provider:0x00560c4b001948 @machine=#<Vagrant::Machine: observer_client_1 (VagrantPlugins::ProviderLibvirt::Provider)>, @cap_logger=#<Log4r::Logger:0x00560c4b001448 @fullname="vagrant::capability_host::vagrantplugins::providerlibvirt::provider", @outputters=[], @additive=true, @name="provider", @path="vagrant::capability_host::vagrantplugins::providerlibvirt", @parent=#<Log4r::Logger:0x00560c48a68768 @fullname="vagrant", @outputters=[#<Log4r::StderrOutputter:0x00560c4909a488 @mon_owner=nil, @mon_count=0, @mon_mutex=#<Mutex:0x00560c4909a410>, @name="stderr", @level=0, @formatter=#<Log4r::DefaultFormatter:0x00560c4906cdf8 @depth=7>, @out=#<IO:<STDERR>>>], @additive=true, @name="vagrant", @path="", @parent=#<Log4r::RootLogger:0x00560c48a685d8 @level=0, @outputters=[]>, @level=1, @trace=false>, @level=1, @trace=false>, @cap_host_chain=[[:libvirt, #<#<Class:0x00560c4b0018f8>:0x00560c4b03e0c8>]], @cap_args=[#<Vagrant::Machine: observer_client_1 (VagrantPlugins::ProviderLibvirt::Provider)>], @cap_caps={:docker=>#<Vagrant::Registry:0x00560c4b0017b8 @items={:public_address=>#<Proc:0x00560c49b80e38@/usr/share/vagrant/plugins/providers/docker/plugin.rb:54>, :proxy_machine=>#<Proc:0x00560c49b80cf8@/usr/share/vagrant/plugins/providers/docker/plugin.rb:59>}, @results_cache={}>, :hyperv=>#<Vagrant::Registry:0x00560c4b001718 @items={:public_address=>#<Proc:0x00560c49b8e4e8@/usr/share/vagrant/plugins/providers/hyperv/plugin.rb:25>}, @results_cache={}>, :virtualbox=>#<Vagrant::Registry:0x00560c4b001678 @items={:forwarded_ports=>#<Proc:0x00560c49b94d48@/usr/share/vagrant/plugins/providers/virtualbox/plugin.rb:27>, :nic_mac_addresses=>#<Proc:0x00560c49b94ca8@/usr/share/vagrant/plugins/providers/virtualbox/plugin.rb:32>, :public_address=>#<Proc:0x00560c49b94c80@/usr/share/vagrant/plugins/providers/virtualbox/plugin.rb:37>, :snapshot_list=>#<Proc:0x00560c49b94c58@/usr/share/vagrant/plugins/providers/virtualbox/plugin.rb:42>}, @results_cache={}>, :libvirt=>#<Vagrant::Registry:0x00560c4b0015d8 @items={:nic_mac_addresses=>#<Proc:0x00560c49bd4b00@/home/uvsmtid/.vagrant.d/gems/gems/vagrant-libvirt-0.0.33/lib/vagrant-libvirt/plugin.rb:44>}, @results_cache={}>}>)
 INFO interface: info: Bringing machine 'observer_client_1' up with 'libvirt' provider...
 INFO batch_action: Enabling parallelization by default.
 INFO batch_action: Disabling parallelization because only executing one action
 INFO batch_action: Batch action will parallelize: false
 INFO batch_action: Starting action: #<Vagrant::Machine:0x00560c4af7e930> up {:destroy_on_error=>true, :install_provider=>true, :parallel=>true, :provision_ignore_sentinel=>false, :provision_types=>nil}
 INFO machine: Calling action: up on provider Libvirt (new)
DEBUG environment: Attempting to acquire process-lock: machine-action-651ef229d320eb50ea4995976765e4dc
DEBUG environment: Attempting to acquire process-lock: dotlock
 INFO environment: Acquired process lock: dotlock
 INFO environment: Released process lock: dotlock
 INFO environment: Acquired process lock: machine-action-651ef229d320eb50ea4995976765e4dc
 INFO interface: Machine: action ["up", "start", {:target=>:observer_client_1}]
 INFO runner: Preparing hooks for middleware sequence...
 INFO runner: 2 hooks defined.
 INFO runner: Running action: machine_action_up #<Vagrant::Action::Builder:0x00560c4b1889b0>
 INFO warden: Calling IN action: #<Vagrant::Action::Builtin::ConfigValidate:0x007f657c01eb48>
 INFO warden: Calling IN action: #<Vagrant::Action::Builtin::Call:0x007f657c01eb20>
 INFO runner: Preparing hooks for middleware sequence...
 INFO runner: 2 hooks defined.
 INFO runner: Running action: machine_action_up #<Vagrant::Action::Builder:0x007f657c092638>
 INFO warden: Calling IN action: #<VagrantPlugins::ProviderLibvirt::Action::IsCreated:0x00560c4a3b77f0>
 INFO warden: Calling OUT action: #<VagrantPlugins::ProviderLibvirt::Action::IsCreated:0x00560c4a3b77f0>
 INFO driver: Connecting to Libvirt (qemu:///system?no_verify=1&keyfile=/home/uvsmtid/.ssh/id_rsa) ...
 INFO runner: Preparing hooks for middleware sequence...
 INFO runner: 2 hooks defined.
 INFO runner: Running action: machine_action_up #<Vagrant::Action::Warden:0x007f657c375648>
 INFO warden: Calling IN action: #<Proc:0x00560c4ace3810@/usr/share/vagrant/lib/vagrant/action/warden.rb:94 (lambda)>
 INFO warden: Calling IN action: #<VagrantPlugins::ProviderLibvirt::Action::SetNameOfDomain:0x007f657c375580>
 INFO set_name_of_domain: Looking for domain vagrant.issue.dir_observer_client_1 through list [  <Fog::Compute::Libvirt::Server
    id="f123329f-ee39-4a6e-a8e7-56c0e4c7d635",
    cpus=2,
    cputime=0,
    os_type="hvm",
    memory_size=2097152,
    max_memory_size=2097152,
    name="vagrantdir_rhel7_minion",
    arch="x86_64",
    persistent=true,
    domain_type="kvm",
    uuid="f123329f-ee39-4a6e-a8e7-56c0e4c7d635",
    autostart=false,
    nics=[    <Fog::Compute::Libvirt::Nic
      mac="52:54:00:eb:33:1f",
      id=nil,
      type="network",
      network="vagrant-libvirt",
      bridge=nil,
      model="virtio"
    >,     <Fog::Compute::Libvirt::Nic
      mac="52:54:00:27:6c:cb",
      id=nil,
      type="network",
      network="vagrant_internal_net_B",
      bridge=nil,
      model="virtio"
    >,     <Fog::Compute::Libvirt::Nic
      mac="52:54:00:31:72:ec",
      id=nil,
      type="network",
      network="vagrant_internal_net_A",
      bridge=nil,
      model="virtio"
    >,     <Fog::Compute::Libvirt::Nic
      mac="52:54:00:88:1c:4b",
      id=nil,
      type="network",
      network="vagrant_primary_net",
      bridge=nil,
      model="virtio"
    >],
    volumes=[    <Fog::Compute::Libvirt::Volume
      id="/var/lib/libvirt/images/vagrantdir_rhel7_minion.img",
      pool_name="default",
      key="/var/lib/libvirt/images/vagrantdir_rhel7_minion.img",
      name="vagrantdir_rhel7_minion.img",
      path="/var/lib/libvirt/images/vagrantdir_rhel7_minion.img",
      capacity=138,
      allocation=1,
      format_type="qcow2",
      backing_volume=nil
    >],
    active=false,
    boot_order=["hd"],
    display={:type=>"vnc", :port=>"-1", :listen=>"127.0.0.1"},
    state="shutoff"
  >,   <Fog::Compute::Libvirt::Server
    id="cf4de170-67b9-4648-b9b2-4fd8863731f3",
    cpus=1,
    cputime=0,
    os_type="hvm",
    memory_size=1048576,
    max_memory_size=1048576,
    name="virtual-water-way",
    arch="x86_64",
    persistent=true,
    domain_type="kvm",
    uuid="cf4de170-67b9-4648-b9b2-4fd8863731f3",
    autostart=false,
    nics=[    <Fog::Compute::Libvirt::Nic
      mac="52:54:00:83:f0:2e",
      id=nil,
      type="network",
      network="internal_net",
      bridge=nil,
      model="virtio"
    >],
    volumes=[    <Fog::Compute::Libvirt::Volume
      id="/var/lib/libvirt/images/virtual-water-way-clone.qcow2",
      pool_name="default",
      key="/var/lib/libvirt/images/virtual-water-way-clone.qcow2",
      name="virtual-water-way-clone.qcow2",
      path="/var/lib/libvirt/images/virtual-water-way-clone.qcow2",
      capacity=50,
      allocation=6,
      format_type="qcow2",
      backing_volume=nil
    >],
    active=false,
    boot_order=["hd"],
    display={:type=>"spice"},
    state="shutoff"
  >,   <Fog::Compute::Libvirt::Server
    id="dbef3a95-8f4a-41fe-a4f7-d2bcb1b0fd62",
    cpus=2,
    cputime=0,
    os_type="hvm",
    memory_size=2097152,
    max_memory_size=2097152,
    name="vagrantdir_rhel5_minion",
    arch="x86_64",
    persistent=true,
    domain_type="kvm",
    uuid="dbef3a95-8f4a-41fe-a4f7-d2bcb1b0fd62",
    autostart=false,
    nics=[    <Fog::Compute::Libvirt::Nic
      mac="52:54:00:7d:2f:fb",
      id=nil,
      type="network",
      network="vagrant-libvirt",
      bridge=nil,
      model="virtio"
    >,     <Fog::Compute::Libvirt::Nic
      mac="52:54:00:98:e2:78",
      id=nil,
      type="network",
      network="vagrant_internal_net_B",
      bridge=nil,
      model="virtio"
    >,     <Fog::Compute::Libvirt::Nic
      mac="52:54:00:6f:b4:75",
      id=nil,
      type="network",
      network="vagrant_internal_net_A",
      bridge=nil,
      model="virtio"
    >,     <Fog::Compute::Libvirt::Nic
      mac="52:54:00:52:2e:75",
      id=nil,
      type="network",
      network="vagrant_primary_net",
      bridge=nil,
      model="virtio"
    >],
    volumes=[    <Fog::Compute::Libvirt::Volume
      id="/var/lib/libvirt/images/vagrantdir_rhel5_minion.img",
      pool_name="default",
      key="/var/lib/libvirt/images/vagrantdir_rhel5_minion.img",
      name="vagrantdir_rhel5_minion.img",
      path="/var/lib/libvirt/images/vagrantdir_rhel5_minion.img",
      capacity=138,
      allocation=1,
      format_type="qcow2",
      backing_volume=nil
    >],
    active=false,
    boot_order=["hd"],
    display={:type=>"vnc", :port=>"-1", :listen=>"127.0.0.1"},
    state="shutoff"
  >]
 INFO set_name_of_domain: Looking for domain vagrant.issue.dir_observer_client_1
 INFO warden: Calling IN action: #<VagrantPlugins::ProviderLibvirt::Action::HandleStoragePool:0x007f657c321ac0>
 INFO warden: Calling IN action: #<Vagrant::Action::Builtin::HandleBox:0x007f657c2e6e48>
 INFO handle_box: Machine already has box. HandleBox will not run.
 INFO warden: Calling IN action: #<VagrantPlugins::ProviderLibvirt::Action::HandleBoxImage:0x007f657c2ac748>
 INFO warden: Calling IN action: #<VagrantPlugins::ProviderLibvirt::Action::CreateDomainVolume:0x007f657c25ce28>
 INFO interface: info: Creating image (snapshot of base box volume).
 INFO interface: info: ==> observer_client_1: Creating image (snapshot of base box volume).
 INFO warden: Calling IN action: #<VagrantPlugins::ProviderLibvirt::Action::CreateDomain:0x007f657c221aa8>
 INFO interface: info: Creating domain with the following settings...
 INFO interface: info: ==> observer_client_1: Creating domain with the following settings...
 INFO interface: info:  -- Name:              vagrant.issue.dir_observer_client_1
 INFO interface: info: ==> observer_client_1:  -- Name:              vagrant.issue.dir_observer_client_1
 INFO interface: info:  -- Domain type:       kvm
 INFO interface: info: ==> observer_client_1:  -- Domain type:       kvm
 INFO interface: info:  -- Cpus:              2
 INFO interface: info: ==> observer_client_1:  -- Cpus:              2
 INFO interface: info:  -- Memory:            2048M
 INFO interface: info: ==> observer_client_1:  -- Memory:            2048M
 INFO interface: info:  -- Management MAC:    
 INFO interface: info: ==> observer_client_1:  -- Management MAC:    
 INFO interface: info:  -- Loader:            
 INFO interface: info: ==> observer_client_1:  -- Loader:            
 INFO interface: info:  -- Base box:          uvsmtid/centos-7.1-1503-gnome
 INFO interface: info: ==> observer_client_1:  -- Base box:          uvsmtid/centos-7.1-1503-gnome
 INFO interface: info:  -- Storage pool:      default
 INFO interface: info: ==> observer_client_1:  -- Storage pool:      default
 INFO interface: info:  -- Image:             /var/lib/libvirt/images/vagrant.issue.dir_observer_client_1.img (138G)
 INFO interface: info: ==> observer_client_1:  -- Image:             /var/lib/libvirt/images/vagrant.issue.dir_observer_client_1.img (138G)
 INFO interface: info:  -- Volume Cache:      default
 INFO interface: info: ==> observer_client_1:  -- Volume Cache:      default
 INFO interface: info:  -- Kernel:            
 INFO interface: info: ==> observer_client_1:  -- Kernel:            
 INFO interface: info:  -- Initrd:            
 INFO interface: info: ==> observer_client_1:  -- Initrd:            
 INFO interface: info:  -- Graphics Type:     vnc
 INFO interface: info: ==> observer_client_1:  -- Graphics Type:     vnc
 INFO interface: info:  -- Graphics Port:     5900
 INFO interface: info: ==> observer_client_1:  -- Graphics Port:     5900
 INFO interface: info:  -- Graphics IP:       127.0.0.1
 INFO interface: info: ==> observer_client_1:  -- Graphics IP:       127.0.0.1
 INFO interface: info:  -- Graphics Password: Not defined
 INFO interface: info: ==> observer_client_1:  -- Graphics Password: Not defined
 INFO interface: info:  -- Video Type:        qxl
 INFO interface: info: ==> observer_client_1:  -- Video Type:        qxl
 INFO interface: info:  -- Video VRAM:        9216
 INFO interface: info: ==> observer_client_1:  -- Video VRAM:        9216
 INFO interface: info:  -- Keymap:            en-us
 INFO interface: info: ==> observer_client_1:  -- Keymap:            en-us
 INFO interface: info:  -- TPM Path:          
 INFO interface: info: ==> observer_client_1:  -- TPM Path:          
 INFO interface: info:  -- INPUT:             type=mouse, bus=ps2
 INFO interface: info: ==> observer_client_1:  -- INPUT:             type=mouse, bus=ps2
 INFO interface: info:  -- Command line : 
 INFO interface: info: ==> observer_client_1:  -- Command line : 
 INFO machine: New machine ID: "ad17a07c-a3c7-45fb-85ff-556500b3c9f6"
 INFO warden: Calling IN action: #<Vagrant::Action::Builtin::Provision:0x007f657c1d9c80>
 INFO provision: Checking provisioner sentinel file...
 INFO warden: Calling IN action: #<VagrantPlugins::ProviderLibvirt::Action::PrepareNFSValidIds:0x007f657c177b70>
 INFO warden: Calling IN action: #<VagrantPlugins::SyncedFolderNFS::ActionCleanup:0x007f657c14cc40>
DEBUG host: Searching for cap: nfs_prune
DEBUG host: Checking in: redhat
DEBUG host: Checking in: linux
DEBUG host: Found cap: nfs_prune in linux
 INFO nfs: NFS pruning. Valid IDs: ["f123329f-ee39-4a6e-a8e7-56c0e4c7d635", "cf4de170-67b9-4648-b9b2-4fd8863731f3", "ad17a07c-a3c7-45fb-85ff-556500b3c9f6", "dbef3a95-8f4a-41fe-a4f7-d2bcb1b0fd62"]
DEBUG host: Searching for cap: nfs_prune
DEBUG host: Checking in: redhat
DEBUG host: Checking in: linux
DEBUG host: Found cap: nfs_prune in linux
 INFO host: Execute capability: nfs_prune [#<Vagrant::Environment: /home/uvsmtid/vagrant.issue.dir>, #<Vagrant::UI::Prefixed:0x00560c4afc1d48 @logger=#<Log4r::Logger:0x00560c4afc1cf8 @fullname="vagrant::ui::interface", @outputters=[], @additive=true, @name="interface", @path="vagrant::ui", @parent=#<Log4r::Logger:0x00560c48a68768 @fullname="vagrant", @outputters=[#<Log4r::StderrOutputter:0x00560c4909a488 @mon_owner=nil, @mon_count=0, @mon_mutex=#<Mutex:0x00560c4909a410>, @name="stderr", @level=0, @formatter=#<Log4r::DefaultFormatter:0x00560c4906cdf8 @depth=7>, @out=#<IO:<STDERR>>>], @additive=true, @name="vagrant", @path="", @parent=#<Log4r::RootLogger:0x00560c48a685d8 @level=0, @outputters=[]>, @level=1, @trace=false>, @level=1, @trace=false>, @opts={}, @stdin=#<IO:<STDIN>>, @stdout=#<IO:<STDOUT>>, @stderr=#<IO:<STDERR>>, @prefix=:observer_client_1, @ui=#<Vagrant::UI::Basic:0x00560c49a3a2b8 @logger=#<Log4r::Logger:0x00560c49a3a268 @fullname="vagrant::ui::interface", @outputters=[], @additive=true, @name="interface", @path="vagrant::ui", @parent=#<Log4r::Logger:0x00560c48a68768 @fullname="vagrant", @outputters=[#<Log4r::StderrOutputter:0x00560c4909a488 @mon_owner=nil, @mon_count=0, @mon_mutex=#<Mutex:0x00560c4909a410>, @name="stderr", @level=0, @formatter=#<Log4r::DefaultFormatter:0x00560c4906cdf8 @depth=7>, @out=#<IO:<STDERR>>>], @additive=true, @name="vagrant", @path="", @parent=#<Log4r::RootLogger:0x00560c48a685d8 @level=0, @outputters=[]>, @level=1, @trace=false>, @level=1, @trace=false>, @opts={:color=>:default}, @stdin=#<IO:<STDIN>>, @stdout=#<IO:<STDOUT>>, @stderr=#<IO:<STDERR>>, @lock=#<Mutex:0x00560c49a01288>>>, ["f123329f-ee39-4a6e-a8e7-56c0e4c7d635", "cf4de170-67b9-4648-b9b2-4fd8863731f3", "ad17a07c-a3c7-45fb-85ff-556500b3c9f6", "dbef3a95-8f4a-41fe-a4f7-d2bcb1b0fd62"]] (redhat)
 INFO linux: Pruning invalid NFS entries...
 INFO warden: Calling IN action: #<Vagrant::Action::Builtin::SyncedFolderCleanup:0x007f657c101628>
 INFO synced_folder_cleanup: Invoking synced folder cleanup for: rsync
 INFO warden: Calling IN action: #<Vagrant::Action::Builtin::SyncedFolders:0x007f657c0b5d18>
 INFO synced_folders: SyncedFolders loading from cache: false
 INFO synced_folders: Synced Folder Implementation: rsync
 INFO synced_folders:   - /vagrant: . => /vagrant
 INFO synced_folders: Invoking synced folder prepare for: rsync
 INFO warden: Calling IN action: #<VagrantPlugins::ProviderLibvirt::Action::PrepareNFSSettings:0x007f657c04ab30>
 INFO warden: Calling IN action: #<VagrantPlugins::ProviderLibvirt::Action::ShareFolders:0x00560c4b18a288>
 INFO interface: info: Creating shared folders metadata...
 INFO interface: info: ==> observer_client_1: Creating shared folders metadata...
 INFO warden: Calling IN action: #<VagrantPlugins::ProviderLibvirt::Action::CreateNetworks:0x00560c4b10ca40>
 INFO create_networks: Using vagrant-libvirt at 192.168.121.0/24 as the management network nat is the mode
DEBUG create_networks: In config found network type private_network options {:ip=>"192.168.1.3", :libvirt__network_name=>"vagrant_internal_net", :mac=>"FA:16:3E:3D:C8:77", :libvirt__netmask=>"255.255.255.0", :whatever=>true, :protocol=>"tcp", :id=>"ecb1d77c-5f92-4ae3-bc51-7d75b6f10464"}
DEBUG create_networks: In config found network type forwarded_port options {:guest=>22, :host=>2222, :host_ip=>"127.0.0.1", :id=>"ssh", :auto_correct=>true, :protocol=>"tcp"}
DEBUG create_networks: Searching for network with options {:iface_type=>:private_network, :network_name=>"vagrant-libvirt", :ip=>"192.168.121.0", :netmask=>"255.255.255.0", :dhcp_enabled=>true, :forward_mode=>"nat", :guest_ipv6=>"yes"}
DEBUG create_networks: looking up network with ip == 192.168.121.0
DEBUG create_networks: Checking that network name does not clash with ip
DEBUG create_networks: Searching for network with options {:iface_type=>:private_network, :netmask=>"255.255.255.0", :dhcp_enabled=>true, :forward_mode=>"nat", :ip=>"192.168.1.3", :libvirt__network_name=>"vagrant_internal_net", :mac=>"FA:16:3E:3D:C8:77", :libvirt__netmask=>"255.255.255.0", :whatever=>true, :protocol=>"tcp", :id=>"ecb1d77c-5f92-4ae3-bc51-7d75b6f10464", :network_name=>"vagrant_internal_net"}
DEBUG create_networks: looking up network with ip == 192.168.1.0
DEBUG create_networks: Checking that network name does not clash with ip
DEBUG create_networks: looking up network named vagrant_internal_net
DEBUG create_networks: generating name for bridge
DEBUG create_networks: looking up bridge named virbr0
DEBUG create_networks: looking up bridge named virbr1
DEBUG create_networks: looking up bridge named virbr2
DEBUG create_networks: looking up bridge named virbr3
DEBUG create_networks: looking up bridge named virbr4
DEBUG create_networks: found available bridge name virbr4
DEBUG create_networks: created network
 INFO create_networks: Saving information about created network vagrant_internal_net, UUID=546f4728-c103-4d34-939c-83fb94211d31 to file /home/uvsmtid/vagrant.issue.dir/.vagrant/machines/observer_client_1/libvirt/created_networks.
 INFO warden: Calling IN action: #<VagrantPlugins::ProviderLibvirt::Action::CreateNetworkInterfaces:0x00560c49aa9140>
 INFO create_network_interfaces: Using vagrant-libvirt at 192.168.121.0/24 as the management network nat is the mode
DEBUG create_network_interfaces: In config found network type private_network options {:ip=>"192.168.1.3", :libvirt__network_name=>"vagrant_internal_net", :mac=>"FA:16:3E:3D:C8:77", :libvirt__netmask=>"255.255.255.0", :whatever=>true, :protocol=>"tcp", :id=>"ecb1d77c-5f92-4ae3-bc51-7d75b6f10464"}
DEBUG create_network_interfaces: In config found network type forwarded_port options {:guest=>22, :host=>2222, :host_ip=>"127.0.0.1", :id=>"ssh", :auto_correct=>true, :protocol=>"tcp"}
DEBUG create_network_interfaces: Adapter not specified so found slot 0
DEBUG create_network_interfaces: Found network by name
DEBUG create_network_interfaces: Adapter not specified so found slot 1
DEBUG create_network_interfaces: Found network by name
 INFO create_network_interfaces: Creating network interface eth0 connected to network vagrant-libvirt.
 INFO create_network_interfaces: Creating network interface eth1 connected to network vagrant_internal_net. Using MAC address: FA:16:3E:3D:C8:77
 INFO warden: Calling IN action: #<VagrantPlugins::ProviderLibvirt::Action::SetBootOrder:0x00560c499d9170>
 INFO warden: Calling IN action: #<VagrantPlugins::ProviderLibvirt::Action::StartDomain:0x00560c499320c8>
 INFO interface: info: Starting domain.
 INFO interface: info: ==> observer_client_1: Starting domain.
 INFO warden: Calling IN action: #<VagrantPlugins::ProviderLibvirt::Action::WaitTillUp:0x00560c4985d3a0>
DEBUG wait_till_up: Searching for IP for MAC address: 52:54:00:c4:47:06
 INFO interface: info: Waiting for domain to get an IP address...
 INFO interface: info: ==> observer_client_1: Waiting for domain to get an IP address...
 INFO retryable: Retryable exception raised: #<Fog::Errors::TimeoutError: The specified wait_for timeout (2 seconds) was exceeded>
 INFO retryable: Retryable exception raised: #<Fog::Errors::TimeoutError: The specified wait_for timeout (2 seconds) was exceeded>
 INFO retryable: Retryable exception raised: #<Fog::Errors::TimeoutError: The specified wait_for timeout (2 seconds) was exceeded>
 INFO retryable: Retryable exception raised: #<Fog::Errors::TimeoutError: The specified wait_for timeout (2 seconds) was exceeded>
 INFO wait_till_up: Got IP address 192.168.121.109
 INFO wait_till_up: Time for getting IP: 13.12329387664795
 INFO interface: info: Waiting for SSH to become available...
 INFO interface: info: ==> observer_client_1: Waiting for SSH to become available...
DEBUG ssh: Checking whether SSH is ready...
DEBUG ssh: Checking key permissions: /home/uvsmtid/.vagrant.d/insecure_private_key
 INFO ssh: Attempting SSH connection...
 INFO ssh: Attempting to connect to SSH...
 INFO ssh:   - Host: 192.168.121.109
 INFO ssh:   - Port: 22
 INFO ssh:   - Username: vagrant
 INFO ssh:   - Password? false
 INFO ssh:   - Key Path: ["/home/uvsmtid/.vagrant.d/insecure_private_key"]
DEBUG ssh: == Net-SSH connection debug-level log START ==
DEBUG ssh: D, [2016-06-22T13:38:43.446420 #11433] DEBUG -- net.ssh.transport.session[2b0624806a24]: establishing connection to 192.168.121.109:22
D, [2016-06-22T13:38:43.446987 #11433] DEBUG -- net.ssh.transport.session[2b0624806a24]: connection established
I, [2016-06-22T13:38:43.447097 #11433]  INFO -- net.ssh.transport.server_version[2b06247f58c8]: negotiating protocol version
D, [2016-06-22T13:38:43.447128 #11433] DEBUG -- net.ssh.transport.server_version[2b06247f58c8]: local is `SSH-2.0-Ruby/Net::SSH_3.2.0 x86_64-linux'
D, [2016-06-22T13:38:43.460616 #11433] DEBUG -- net.ssh.transport.server_version[2b06247f58c8]: remote is `SSH-2.0-OpenSSH_6.6.1'
I, [2016-06-22T13:38:43.461021 #11433]  INFO -- net.ssh.transport.algorithms[2b06247d9f60]: sending KEXINIT
D, [2016-06-22T13:38:43.461323 #11433] DEBUG -- socket[2b0624805908]: queueing packet nr 0 type 20 len 1684
D, [2016-06-22T13:38:43.461438 #11433] DEBUG -- socket[2b0624805908]: sent 1688 bytes
D, [2016-06-22T13:38:43.467916 #11433] DEBUG -- socket[2b0624805908]: read 1640 bytes
D, [2016-06-22T13:38:43.468091 #11433] DEBUG -- socket[2b0624805908]: received packet nr 0 type 20 len 1636
I, [2016-06-22T13:38:43.468160 #11433]  INFO -- net.ssh.transport.algorithms[2b06247d9f60]: got KEXINIT from server
I, [2016-06-22T13:38:43.468321 #11433]  INFO -- net.ssh.transport.algorithms[2b06247d9f60]: negotiating algorithms
D, [2016-06-22T13:38:43.468464 #11433] DEBUG -- net.ssh.transport.algorithms[2b06247d9f60]: negotiated:
* kex: diffie-hellman-group-exchange-sha1
* host_key: ssh-rsa
* encryption_server: aes128-cbc
* encryption_client: aes128-cbc
* hmac_client: hmac-sha1
* hmac_server: hmac-sha1
* compression_client: none
* compression_server: none
* language_client: 
* language_server: 
D, [2016-06-22T13:38:43.468499 #11433] DEBUG -- net.ssh.transport.algorithms[2b06247d9f60]: exchanging keys
D, [2016-06-22T13:38:43.468708 #11433] DEBUG -- socket[2b0624805908]: queueing packet nr 1 type 34 len 20
D, [2016-06-22T13:38:43.468794 #11433] DEBUG -- socket[2b0624805908]: sent 24 bytes
D, [2016-06-22T13:38:43.486679 #11433] DEBUG -- socket[2b0624805908]: read 152 bytes
D, [2016-06-22T13:38:43.486820 #11433] DEBUG -- socket[2b0624805908]: received packet nr 1 type 31 len 148
D, [2016-06-22T13:38:43.488683 #11433] DEBUG -- socket[2b0624805908]: queueing packet nr 2 type 32 len 140
D, [2016-06-22T13:38:43.488813 #11433] DEBUG -- socket[2b0624805908]: sent 144 bytes
D, [2016-06-22T13:38:43.491129 #11433] DEBUG -- socket[2b0624805908]: read 720 bytes
D, [2016-06-22T13:38:43.491259 #11433] DEBUG -- socket[2b0624805908]: received packet nr 2 type 33 len 700
D, [2016-06-22T13:38:43.492514 #11433] DEBUG -- socket[2b0624805908]: queueing packet nr 3 type 21 len 20
D, [2016-06-22T13:38:43.492632 #11433] DEBUG -- socket[2b0624805908]: sent 24 bytes
D, [2016-06-22T13:38:43.492753 #11433] DEBUG -- socket[2b0624805908]: received packet nr 3 type 21 len 12
D, [2016-06-22T13:38:43.493153 #11433] DEBUG -- net.ssh.authentication.session[2b06257015c4]: beginning authentication of `vagrant'
D, [2016-06-22T13:38:43.493310 #11433] DEBUG -- socket[2b0624805908]: queueing packet nr 4 type 5 len 28
D, [2016-06-22T13:38:43.493360 #11433] DEBUG -- socket[2b0624805908]: sent 52 bytes
D, [2016-06-22T13:38:43.533019 #11433] DEBUG -- socket[2b0624805908]: read 52 bytes
D, [2016-06-22T13:38:43.533232 #11433] DEBUG -- socket[2b0624805908]: received packet nr 4 type 6 len 28
D, [2016-06-22T13:38:43.533372 #11433] DEBUG -- net.ssh.authentication.session[2b06257015c4]: trying none
D, [2016-06-22T13:38:43.533522 #11433] DEBUG -- socket[2b0624805908]: queueing packet nr 5 type 50 len 44
D, [2016-06-22T13:38:43.533588 #11433] DEBUG -- socket[2b0624805908]: sent 68 bytes
D, [2016-06-22T13:38:43.548261 #11433] DEBUG -- socket[2b0624805908]: read 84 bytes
D, [2016-06-22T13:38:43.548439 #11433] DEBUG -- socket[2b0624805908]: received packet nr 5 type 51 len 60
D, [2016-06-22T13:38:43.548535 #11433] DEBUG -- net.ssh.authentication.session[2b06257015c4]: allowed methods: publickey,gssapi-keyex,gssapi-with-mic,password
D, [2016-06-22T13:38:43.548609 #11433] DEBUG -- net.ssh.authentication.methods.none[2b062570064c]: none failed
D, [2016-06-22T13:38:43.548678 #11433] DEBUG -- net.ssh.authentication.session[2b06257015c4]: trying publickey
D, [2016-06-22T13:38:43.548951 #11433] DEBUG -- net.ssh.authentication.agent[2b062571725c]: connecting to ssh-agent
D, [2016-06-22T13:38:43.549550 #11433] DEBUG -- net.ssh.authentication.agent[2b062571725c]: sending agent request 1 len 44
D, [2016-06-22T13:38:43.549870 #11433] DEBUG -- net.ssh.authentication.agent[2b062571725c]: received agent packet 2 len 5
D, [2016-06-22T13:38:43.549924 #11433] DEBUG -- net.ssh.authentication.agent[2b062571725c]: sending agent request 11 len 0
D, [2016-06-22T13:38:43.550984 #11433] DEBUG -- net.ssh.authentication.agent[2b062571725c]: received agent packet 12 len 624
D, [2016-06-22T13:38:43.551352 #11433] DEBUG -- net.ssh.authentication.methods.publickey[2b0625717478]: trying publickey (dd:3b:b8:2e:85:04:06:e9:ab:ff:a8:0a:c0:04:6e:d6)
D, [2016-06-22T13:38:43.551495 #11433] DEBUG -- socket[2b0624805908]: queueing packet nr 6 type 50 len 348
D, [2016-06-22T13:38:43.551578 #11433] DEBUG -- socket[2b0624805908]: sent 372 bytes
D, [2016-06-22T13:38:43.592541 #11433] DEBUG -- socket[2b0624805908]: read 324 bytes
D, [2016-06-22T13:38:43.592699 #11433] DEBUG -- socket[2b0624805908]: received packet nr 6 type 60 len 300
D, [2016-06-22T13:38:43.592873 #11433] DEBUG -- net.ssh.authentication.agent[2b062571725c]: sending agent request 13 len 649
D, [2016-06-22T13:38:43.597149 #11433] DEBUG -- net.ssh.authentication.agent[2b062571725c]: received agent packet 14 len 276
D, [2016-06-22T13:38:43.597332 #11433] DEBUG -- socket[2b0624805908]: queueing packet nr 7 type 50 len 620
D, [2016-06-22T13:38:43.597428 #11433] DEBUG -- socket[2b0624805908]: sent 644 bytes
D, [2016-06-22T13:38:43.610056 #11433] DEBUG -- socket[2b0624805908]: read 36 bytes
D, [2016-06-22T13:38:43.610252 #11433] DEBUG -- socket[2b0624805908]: received packet nr 7 type 52 len 12
D, [2016-06-22T13:38:43.610313 #11433] DEBUG -- net.ssh.authentication.methods.publickey[2b0625717478]: publickey succeeded (dd:3b:b8:2e:85:04:06:e9:ab:ff:a8:0a:c0:04:6e:d6)

DEBUG ssh: == Net-SSH connection debug-level log END ==
 INFO ssh: SSH is ready!
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute:  (sudo=false)
DEBUG ssh: Exit status: 0
DEBUG ssh: Checking key permissions: /home/uvsmtid/.vagrant.d/insecure_private_key
 INFO interface: detail: 
Vagrant insecure key detected. Vagrant will automatically replace
this with a newly generated keypair for better security.
 INFO interface: detail:     observer_client_1: 
    observer_client_1: Vagrant insecure key detected. Vagrant will automatically replace
    observer_client_1: this with a newly generated keypair for better security.
DEBUG ssh: Checking whether SSH is ready...
DEBUG ssh: Re-using SSH connection.
 INFO ssh: SSH is ready!
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute:  (sudo=false)
DEBUG ssh: Exit status: 0
 INFO guest: Autodetecting host type for [#<Vagrant::Machine: observer_client_1 (VagrantPlugins::ProviderLibvirt::Provider)>]
DEBUG guest: Trying: mint
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute: cat /etc/issue | grep 'Linux Mint' (sudo=false)
DEBUG ssh: Exit status: 1
DEBUG guest: Trying: atomic
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute: grep 'ostree=' /proc/cmdline (sudo=false)
DEBUG ssh: Exit status: 1
DEBUG guest: Trying: fedora
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute: grep 'Fedora release' /etc/redhat-release (sudo=false)
DEBUG ssh: Exit status: 1
DEBUG guest: Trying: pld
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute: cat /etc/pld-release (sudo=false)
DEBUG ssh: stderr: cat: /etc/pld-release
DEBUG ssh: stderr: : No such file or directory

DEBUG ssh: Exit status: 1
DEBUG guest: Trying: ubuntu
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute: [ -x /usr/bin/lsb_release ] && /usr/bin/lsb_release -i 2>/dev/null | grep Ubuntu (sudo=false)
DEBUG ssh: Exit status: 1
DEBUG guest: Trying: slackware
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute: cat /etc/slackware-version (sudo=false)
DEBUG ssh: stderr: cat: /etc/slackware-version
DEBUG ssh: stderr: : No such file or directory

DEBUG ssh: Exit status: 1
DEBUG guest: Trying: suse
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute: test -f /etc/SuSE-release || grep -q SUSE /etc/os-release (sudo=false)
DEBUG ssh: Exit status: 1
DEBUG guest: Trying: redhat
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute: cat /etc/redhat-release (sudo=false)
DEBUG ssh: stdout: CentOS Linux release 7.1.1503 (Core) 

DEBUG ssh: Exit status: 0
 INFO guest: Detected: redhat!
DEBUG guest: Searching for cap: insert_public_key
DEBUG guest: Checking in: redhat
DEBUG guest: Checking in: linux
DEBUG guest: Found cap: insert_public_key in linux
DEBUG ssh: Checking whether SSH is ready...
DEBUG ssh: Re-using SSH connection.
 INFO ssh: SSH is ready!
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute:  (sudo=false)
DEBUG ssh: Exit status: 0
DEBUG guest: Searching for cap: remove_public_key
DEBUG guest: Checking in: redhat
DEBUG guest: Checking in: linux
DEBUG guest: Found cap: remove_public_key in linux
 INFO ssh: Inserting key to avoid password: ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC82+rH0ZT6/ca72Jzjmq/DLcpcOca5KH40GLg5OM3y+jpKP4IpNVAdaiwnhCz/jXH6NZRAiHE/pAM7B+LRFJMAYDju3MmAlWTC3IgtQR/yOr1fN0qit48dFipnT/SgHg5BnwVhoVgr68a5NlZG7QgbnWTlyBQJBdQQCmbzyRoU216rHXJxdbSGycLkGhz48gGf9GGckyAasTVNrRp/R02Y7J2cR4LhrnU4fIsB6pzquyKoXey+Q270a47B7bJ1M8wNQ8kWAJPkmJFThaTLEubD/Zhee75QObAGgY9kEg6wjw2igg0cqio+Uf0C6LIntXPRhjgKcsKKaODFpO3WdvBx vagrant
 INFO interface: detail: 
Inserting generated public key within guest...
 INFO interface: detail:     observer_client_1: 
    observer_client_1: Inserting generated public key within guest...
DEBUG ssh: Checking whether SSH is ready...
DEBUG ssh: Re-using SSH connection.
 INFO ssh: SSH is ready!
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute:  (sudo=false)
DEBUG ssh: Exit status: 0
DEBUG guest: Searching for cap: insert_public_key
DEBUG guest: Checking in: redhat
DEBUG guest: Checking in: linux
DEBUG guest: Found cap: insert_public_key in linux
 INFO guest: Execute capability: insert_public_key [#<Vagrant::Machine: observer_client_1 (VagrantPlugins::ProviderLibvirt::Provider)>, "ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC82+rH0ZT6/ca72Jzjmq/DLcpcOca5KH40GLg5OM3y+jpKP4IpNVAdaiwnhCz/jXH6NZRAiHE/pAM7B+LRFJMAYDju3MmAlWTC3IgtQR/yOr1fN0qit48dFipnT/SgHg5BnwVhoVgr68a5NlZG7QgbnWTlyBQJBdQQCmbzyRoU216rHXJxdbSGycLkGhz48gGf9GGckyAasTVNrRp/R02Y7J2cR4LhrnU4fIsB6pzquyKoXey+Q270a47B7bJ1M8wNQ8kWAJPkmJFThaTLEubD/Zhee75QObAGgY9kEg6wjw2igg0cqio+Uf0C6LIntXPRhjgKcsKKaODFpO3WdvBx vagrant"] (redhat)
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute: mkdir -p ~/.ssh (sudo=false)
DEBUG ssh: Exit status: 0
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute: chmod 0700 ~/.ssh (sudo=false)
DEBUG ssh: Exit status: 0
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute: printf 'ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC82+rH0ZT6/ca72Jzjmq/DLcpcOca5KH40GLg5OM3y+jpKP4IpNVAdaiwnhCz/jXH6NZRAiHE/pAM7B+LRFJMAYDju3MmAlWTC3IgtQR/yOr1fN0qit48dFipnT/SgHg5BnwVhoVgr68a5NlZG7QgbnWTlyBQJBdQQCmbzyRoU216rHXJxdbSGycLkGhz48gGf9GGckyAasTVNrRp/R02Y7J2cR4LhrnU4fIsB6pzquyKoXey+Q270a47B7bJ1M8wNQ8kWAJPkmJFThaTLEubD/Zhee75QObAGgY9kEg6wjw2igg0cqio+Uf0C6LIntXPRhjgKcsKKaODFpO3WdvBx vagrant\n' >> ~/.ssh/authorized_keys (sudo=false)
DEBUG ssh: Exit status: 0
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute: chmod 0600 ~/.ssh/authorized_keys (sudo=false)
DEBUG ssh: Exit status: 0
 INFO interface: detail: Removing insecure key from the guest if it's present...
 INFO interface: detail:     observer_client_1: Removing insecure key from the guest if it's present...
DEBUG ssh: Checking whether SSH is ready...
DEBUG ssh: Re-using SSH connection.
 INFO ssh: SSH is ready!
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute:  (sudo=false)
DEBUG ssh: Exit status: 0
DEBUG guest: Searching for cap: remove_public_key
DEBUG guest: Checking in: redhat
DEBUG guest: Checking in: linux
DEBUG guest: Found cap: remove_public_key in linux
 INFO guest: Execute capability: remove_public_key [#<Vagrant::Machine: observer_client_1 (VagrantPlugins::ProviderLibvirt::Provider)>, "ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA6NF8iallvQVp22WDkTkyrtvp9eWW6A8YVr+kz4TjGYe7gHzIw+niNltGEFHzD8+v1I2YJ6oXevct1YeS0o9HZyN1Q9qgCgzUFtdOKLv6IedplqoPkcmF0aYet2PkEDo3MlTBckFXPITAMzF8dJSIFo9D8HfdOV0IAdx4O7PtixWKn5y2hMNG0zQPyUecp4pzC6kivAIhyfHilFR61RGL+GPXQ2MWZWFYbAGjyiYJnAmCP3NOTd0jMZEnDkbUvxhMmBYSdETk1rRgm+R4LOzFUGaHqHDLKLX+FIPKcF96hrucXzcWyLbIbEgE98OHlnVYCzRdK8jlqm8tehUc9c9WhQ== vagrant insecure public key"] (redhat)
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute: test -f ~/.ssh/authorized_keys (sudo=false)
DEBUG ssh: Exit status: 0
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute: sed -e '/^.*ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA6NF8iallvQVp22WDkTkyrtvp9eWW6A8YVr+kz4TjGYe7gHzIw+niNltGEFHzD8+v1I2YJ6oXevct1YeS0o9HZyN1Q9qgCgzUFtdOKLv6IedplqoPkcmF0aYet2PkEDo3MlTBckFXPITAMzF8dJSIFo9D8HfdOV0IAdx4O7PtixWKn5y2hMNG0zQPyUecp4pzC6kivAIhyfHilFR61RGL+GPXQ2MWZWFYbAGjyiYJnAmCP3NOTd0jMZEnDkbUvxhMmBYSdETk1rRgm+R4LOzFUGaHqHDLKLX+FIPKcF96hrucXzcWyLbIbEgE98OHlnVYCzRdK8jlqm8tehUc9c9WhQ== vagrant insecure public key.*$/d' ~/.ssh/authorized_keys > ~/.ssh/authorized_keys.new
mv ~/.ssh/authorized_keys.new ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys
 (sudo=false)
DEBUG ssh: Exit status: 0
 INFO interface: detail: Key inserted! Disconnecting and reconnecting using new SSH key...
 INFO interface: detail:     observer_client_1: Key inserted! Disconnecting and reconnecting using new SSH key...
DEBUG ssh: Checking whether SSH is ready...
DEBUG ssh: Checking key permissions: /home/uvsmtid/vagrant.issue.dir/.vagrant/machines/observer_client_1/libvirt/private_key
 INFO ssh: Attempting to correct key permissions to 0600
 INFO ssh: Attempting SSH connection...
 INFO ssh: Attempting to connect to SSH...
 INFO ssh:   - Host: 192.168.121.109
 INFO ssh:   - Port: 22
 INFO ssh:   - Username: vagrant
 INFO ssh:   - Password? false
 INFO ssh:   - Key Path: ["/home/uvsmtid/vagrant.issue.dir/.vagrant/machines/observer_client_1/libvirt/private_key"]
DEBUG ssh: == Net-SSH connection debug-level log START ==
DEBUG ssh: D, [2016-06-22T13:38:45.565569 #11433] DEBUG -- net.ssh.transport.session[2b0625387d94]: establishing connection to 192.168.121.109:22
D, [2016-06-22T13:38:45.566114 #11433] DEBUG -- net.ssh.transport.session[2b0625387d94]: connection established
I, [2016-06-22T13:38:45.566199 #11433]  INFO -- net.ssh.transport.server_version[2b06253867f0]: negotiating protocol version
D, [2016-06-22T13:38:45.566239 #11433] DEBUG -- net.ssh.transport.server_version[2b06253867f0]: local is `SSH-2.0-Ruby/Net::SSH_3.2.0 x86_64-linux'
D, [2016-06-22T13:38:45.580171 #11433] DEBUG -- net.ssh.transport.server_version[2b06253867f0]: remote is `SSH-2.0-OpenSSH_6.6.1'
I, [2016-06-22T13:38:45.580430 #11433]  INFO -- net.ssh.transport.algorithms[2b062537b4cc]: sending KEXINIT
D, [2016-06-22T13:38:45.580584 #11433] DEBUG -- socket[2b06253872f4]: queueing packet nr 0 type 20 len 1684
D, [2016-06-22T13:38:45.580658 #11433] DEBUG -- socket[2b06253872f4]: sent 1688 bytes
D, [2016-06-22T13:38:45.583348 #11433] DEBUG -- socket[2b06253872f4]: read 1640 bytes
D, [2016-06-22T13:38:45.583472 #11433] DEBUG -- socket[2b06253872f4]: received packet nr 0 type 20 len 1636
I, [2016-06-22T13:38:45.583532 #11433]  INFO -- net.ssh.transport.algorithms[2b062537b4cc]: got KEXINIT from server
I, [2016-06-22T13:38:45.583665 #11433]  INFO -- net.ssh.transport.algorithms[2b062537b4cc]: negotiating algorithms
D, [2016-06-22T13:38:45.583769 #11433] DEBUG -- net.ssh.transport.algorithms[2b062537b4cc]: negotiated:
* kex: diffie-hellman-group-exchange-sha1
* host_key: ssh-rsa
* encryption_server: aes128-cbc
* encryption_client: aes128-cbc
* hmac_client: hmac-sha1
* hmac_server: hmac-sha1
* compression_client: none
* compression_server: none
* language_client: 
* language_server: 
D, [2016-06-22T13:38:45.583790 #11433] DEBUG -- net.ssh.transport.algorithms[2b062537b4cc]: exchanging keys
D, [2016-06-22T13:38:45.583917 #11433] DEBUG -- socket[2b06253872f4]: queueing packet nr 1 type 34 len 20
D, [2016-06-22T13:38:45.583975 #11433] DEBUG -- socket[2b06253872f4]: sent 24 bytes
D, [2016-06-22T13:38:45.585167 #11433] DEBUG -- socket[2b06253872f4]: read 152 bytes
D, [2016-06-22T13:38:45.585234 #11433] DEBUG -- socket[2b06253872f4]: received packet nr 1 type 31 len 148
D, [2016-06-22T13:38:45.586571 #11433] DEBUG -- socket[2b06253872f4]: queueing packet nr 2 type 32 len 140
D, [2016-06-22T13:38:45.586653 #11433] DEBUG -- socket[2b06253872f4]: sent 144 bytes
D, [2016-06-22T13:38:45.589040 #11433] DEBUG -- socket[2b06253872f4]: read 720 bytes
D, [2016-06-22T13:38:45.589171 #11433] DEBUG -- socket[2b06253872f4]: received packet nr 2 type 33 len 700
D, [2016-06-22T13:38:45.590056 #11433] DEBUG -- socket[2b06253872f4]: queueing packet nr 3 type 21 len 20
D, [2016-06-22T13:38:45.590148 #11433] DEBUG -- socket[2b06253872f4]: sent 24 bytes
D, [2016-06-22T13:38:45.590240 #11433] DEBUG -- socket[2b06253872f4]: received packet nr 3 type 21 len 12
D, [2016-06-22T13:38:45.590451 #11433] DEBUG -- net.ssh.authentication.session[2b06251f62b4]: beginning authentication of `vagrant'
D, [2016-06-22T13:38:45.590537 #11433] DEBUG -- socket[2b06253872f4]: queueing packet nr 4 type 5 len 28
D, [2016-06-22T13:38:45.590560 #11433] DEBUG -- socket[2b06253872f4]: sent 52 bytes
D, [2016-06-22T13:38:45.629766 #11433] DEBUG -- socket[2b06253872f4]: read 52 bytes
D, [2016-06-22T13:38:45.629932 #11433] DEBUG -- socket[2b06253872f4]: received packet nr 4 type 6 len 28
D, [2016-06-22T13:38:45.630029 #11433] DEBUG -- net.ssh.authentication.session[2b06251f62b4]: trying none
D, [2016-06-22T13:38:45.630223 #11433] DEBUG -- socket[2b06253872f4]: queueing packet nr 5 type 50 len 44
D, [2016-06-22T13:38:45.630293 #11433] DEBUG -- socket[2b06253872f4]: sent 68 bytes
D, [2016-06-22T13:38:45.631640 #11433] DEBUG -- socket[2b06253872f4]: read 84 bytes
D, [2016-06-22T13:38:45.631796 #11433] DEBUG -- socket[2b06253872f4]: received packet nr 5 type 51 len 60
D, [2016-06-22T13:38:45.631888 #11433] DEBUG -- net.ssh.authentication.session[2b06251f62b4]: allowed methods: publickey,gssapi-keyex,gssapi-with-mic,password
D, [2016-06-22T13:38:45.631977 #11433] DEBUG -- net.ssh.authentication.methods.none[2b06251ee0b4]: none failed
D, [2016-06-22T13:38:45.632045 #11433] DEBUG -- net.ssh.authentication.session[2b06251f62b4]: trying publickey
D, [2016-06-22T13:38:45.632327 #11433] DEBUG -- net.ssh.authentication.agent[2b06251db7fc]: connecting to ssh-agent
D, [2016-06-22T13:38:45.632478 #11433] DEBUG -- net.ssh.authentication.agent[2b06251db7fc]: sending agent request 1 len 44
D, [2016-06-22T13:38:45.632781 #11433] DEBUG -- net.ssh.authentication.agent[2b06251db7fc]: received agent packet 2 len 5
D, [2016-06-22T13:38:45.632834 #11433] DEBUG -- net.ssh.authentication.agent[2b06251db7fc]: sending agent request 11 len 0
D, [2016-06-22T13:38:45.633513 #11433] DEBUG -- net.ssh.authentication.agent[2b06251db7fc]: received agent packet 12 len 624
D, [2016-06-22T13:38:45.633797 #11433] DEBUG -- net.ssh.authentication.methods.publickey[2b06251dbba8]: trying publickey (ca:4d:ea:51:89:75:dd:2e:56:6b:99:d7:1d:09:99:02)
D, [2016-06-22T13:38:45.633900 #11433] DEBUG -- socket[2b06253872f4]: queueing packet nr 6 type 50 len 348
D, [2016-06-22T13:38:45.633966 #11433] DEBUG -- socket[2b06253872f4]: sent 372 bytes
D, [2016-06-22T13:38:45.635170 #11433] DEBUG -- socket[2b06253872f4]: read 324 bytes
D, [2016-06-22T13:38:45.635272 #11433] DEBUG -- socket[2b06253872f4]: received packet nr 6 type 60 len 300
D, [2016-06-22T13:38:45.636886 #11433] DEBUG -- socket[2b06253872f4]: queueing packet nr 7 type 50 len 620
D, [2016-06-22T13:38:45.636946 #11433] DEBUG -- socket[2b06253872f4]: sent 644 bytes
D, [2016-06-22T13:38:45.643253 #11433] DEBUG -- socket[2b06253872f4]: read 36 bytes
D, [2016-06-22T13:38:45.643402 #11433] DEBUG -- socket[2b06253872f4]: received packet nr 7 type 52 len 12
D, [2016-06-22T13:38:45.643469 #11433] DEBUG -- net.ssh.authentication.methods.publickey[2b06251dbba8]: publickey succeeded (ca:4d:ea:51:89:75:dd:2e:56:6b:99:d7:1d:09:99:02)

DEBUG ssh: == Net-SSH connection debug-level log END ==
 INFO ssh: SSH is ready!
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute:  (sudo=false)
DEBUG ssh: Exit status: 0
 INFO wait_till_up: Time for SSH ready: 2.3943850994110107
 INFO warden: Calling IN action: #<VagrantPlugins::ProviderLibvirt::Action::ForwardPorts:0x00560c48e72368>
 INFO warden: Calling IN action: #<Vagrant::Action::Builtin::SetHostname:0x00560c48c8fb40>
 INFO warden: Calling IN action: #<Proc:0x00560c48c8f668@/usr/share/vagrant/lib/vagrant/action/warden.rb:94 (lambda)>
 INFO warden: Calling OUT action: #<Proc:0x00560c48c8f668@/usr/share/vagrant/lib/vagrant/action/warden.rb:94 (lambda)>
 INFO warden: Calling OUT action: #<Vagrant::Action::Builtin::SetHostname:0x00560c48c8fb40>
 INFO warden: Calling OUT action: #<VagrantPlugins::ProviderLibvirt::Action::ForwardPorts:0x00560c48e72368>
 INFO warden: Calling OUT action: #<VagrantPlugins::ProviderLibvirt::Action::WaitTillUp:0x00560c4985d3a0>
 INFO warden: Calling OUT action: #<VagrantPlugins::ProviderLibvirt::Action::StartDomain:0x00560c499320c8>
 INFO warden: Calling OUT action: #<VagrantPlugins::ProviderLibvirt::Action::SetBootOrder:0x00560c499d9170>
DEBUG create_network_interfaces: Configuring interface slot_number 1 options {:iface_type=>:private_network, :netmask=>"255.255.255.0", :dhcp_enabled=>true, :forward_mode=>"nat", :ip=>"192.168.1.3", :libvirt__network_name=>"vagrant_internal_net", :mac=>"FA:16:3E:3D:C8:77", :libvirt__netmask=>"255.255.255.0", :whatever=>true, :protocol=>"tcp", :id=>"ecb1d77c-5f92-4ae3-bc51-7d75b6f10464", :network_name=>"vagrant_internal_net"}
 INFO interface: info: Configuring and enabling network interfaces...
 INFO interface: info: ==> observer_client_1: Configuring and enabling network interfaces...
DEBUG ssh: Checking whether SSH is ready...
DEBUG ssh: Re-using SSH connection.
 INFO ssh: SSH is ready!
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute:  (sudo=false)
DEBUG ssh: Exit status: 0
DEBUG guest: Searching for cap: configure_networks
DEBUG guest: Checking in: redhat
DEBUG guest: Found cap: configure_networks in redhat
 INFO guest: Execute capability: configure_networks [#<Vagrant::Machine: observer_client_1 (VagrantPlugins::ProviderLibvirt::Provider)>, [{:type=>:static, :ip=>"192.168.1.3", :netmask=>"255.255.255.0", :interface=>1, :use_dhcp_assigned_default_route=>nil, :mac_address=>"FA:16:3E:3D:C8:77"}]] (redhat)
DEBUG ssh: Checking whether SSH is ready...
DEBUG ssh: Re-using SSH connection.
 INFO ssh: SSH is ready!
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute:  (sudo=false)
DEBUG ssh: Exit status: 0
DEBUG guest: Searching for cap: flavor
DEBUG guest: Checking in: redhat
DEBUG guest: Found cap: flavor in redhat
 INFO guest: Execute capability: flavor [#<Vagrant::Machine: observer_client_1 (VagrantPlugins::ProviderLibvirt::Provider)>] (redhat)
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute: cat /etc/redhat-release (sudo=true)
DEBUG ssh: stdout: CentOS Linux release 7.1.1503 (Core) 

DEBUG ssh: Exit status: 0
DEBUG ssh: Checking whether SSH is ready...
DEBUG ssh: Re-using SSH connection.
 INFO ssh: SSH is ready!
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute:  (sudo=false)
DEBUG ssh: Exit status: 0
DEBUG guest: Searching for cap: network_scripts_dir
DEBUG guest: Checking in: redhat
DEBUG guest: Found cap: network_scripts_dir in redhat
 INFO guest: Execute capability: network_scripts_dir [#<Vagrant::Machine: observer_client_1 (VagrantPlugins::ProviderLibvirt::Provider)>] (redhat)
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute: /usr/sbin/biosdevname &>/dev/null; echo $? (sudo=true)
DEBUG ssh: stdout: 4

DEBUG ssh: Exit status: 0
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute: ls /sys/class/net | egrep -v lo\|docker (sudo=true)
DEBUG ssh: stdout: eth0
eth1

DEBUG ssh: Exit status: 0
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute: cat /sys/class/net/eth0/address (sudo=true)
DEBUG ssh: stdout: 52:54:00:c4:47:06

DEBUG ssh: Exit status: 0
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute: cat /sys/class/net/eth1/address (sudo=true)
DEBUG ssh: stdout: fa:16:3e:3d:c8:77

DEBUG ssh: Exit status: 0
 INFO warden: Calling OUT action: #<VagrantPlugins::ProviderLibvirt::Action::CreateNetworkInterfaces:0x00560c49aa9140>
 INFO warden: Calling OUT action: #<VagrantPlugins::ProviderLibvirt::Action::CreateNetworks:0x00560c4b10ca40>
 INFO warden: Calling OUT action: #<VagrantPlugins::ProviderLibvirt::Action::ShareFolders:0x00560c4b18a288>
 INFO warden: Calling OUT action: #<VagrantPlugins::ProviderLibvirt::Action::PrepareNFSSettings:0x007f657c04ab30>
 INFO synced_folders: Invoking synced folder enable: rsync
DEBUG ssh: Checking whether SSH is ready...
DEBUG ssh: Re-using SSH connection.
 INFO ssh: SSH is ready!
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute:  (sudo=false)
DEBUG ssh: Exit status: 0
DEBUG guest: Searching for cap: rsync_installed
DEBUG guest: Checking in: redhat
DEBUG guest: Checking in: linux
DEBUG guest: Found cap: rsync_installed in linux
DEBUG ssh: Checking whether SSH is ready...
DEBUG ssh: Re-using SSH connection.
 INFO ssh: SSH is ready!
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute:  (sudo=false)
DEBUG ssh: Exit status: 0
DEBUG guest: Searching for cap: rsync_installed
DEBUG guest: Checking in: redhat
DEBUG guest: Checking in: linux
DEBUG guest: Found cap: rsync_installed in linux
 INFO guest: Execute capability: rsync_installed [#<Vagrant::Machine: observer_client_1 (VagrantPlugins::ProviderLibvirt::Provider)>] (redhat)
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute: which rsync (sudo=false)
DEBUG ssh: stdout: /usr/bin/rsync

DEBUG ssh: Exit status: 0
DEBUG ssh: Checking key permissions: /home/uvsmtid/vagrant.issue.dir/.vagrant/machines/observer_client_1/libvirt/private_key
DEBUG ssh: Checking whether SSH is ready...
DEBUG ssh: Re-using SSH connection.
 INFO ssh: SSH is ready!
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute:  (sudo=false)
DEBUG ssh: Exit status: 0
DEBUG guest: Searching for cap: rsync_scrub_guestpath
DEBUG guest: Checking in: redhat
DEBUG guest: Checking in: linux
DEBUG ssh: Checking whether SSH is ready...
DEBUG ssh: Re-using SSH connection.
 INFO ssh: SSH is ready!
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute:  (sudo=false)
DEBUG ssh: Exit status: 0
DEBUG guest: Searching for cap: rsync_command
DEBUG guest: Checking in: redhat
DEBUG guest: Checking in: linux
DEBUG guest: Found cap: rsync_command in linux
DEBUG ssh: Checking whether SSH is ready...
DEBUG ssh: Re-using SSH connection.
 INFO ssh: SSH is ready!
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute:  (sudo=false)
DEBUG ssh: Exit status: 0
DEBUG guest: Searching for cap: rsync_command
DEBUG guest: Checking in: redhat
DEBUG guest: Checking in: linux
DEBUG guest: Found cap: rsync_command in linux
 INFO guest: Execute capability: rsync_command [#<Vagrant::Machine: observer_client_1 (VagrantPlugins::ProviderLibvirt::Provider)>] (redhat)
 INFO interface: info: Rsyncing folder: /home/uvsmtid/vagrant.issue.dir/ => /vagrant
 INFO interface: info: ==> observer_client_1: Rsyncing folder: /home/uvsmtid/vagrant.issue.dir/ => /vagrant
DEBUG ssh: Checking whether SSH is ready...
DEBUG ssh: Re-using SSH connection.
 INFO ssh: SSH is ready!
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute:  (sudo=false)
DEBUG ssh: Exit status: 0
DEBUG guest: Searching for cap: rsync_pre
DEBUG guest: Checking in: redhat
DEBUG guest: Checking in: linux
DEBUG guest: Found cap: rsync_pre in linux
DEBUG ssh: Checking whether SSH is ready...
DEBUG ssh: Re-using SSH connection.
 INFO ssh: SSH is ready!
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute:  (sudo=false)
DEBUG ssh: Exit status: 0
DEBUG guest: Searching for cap: rsync_pre
DEBUG guest: Checking in: redhat
DEBUG guest: Checking in: linux
DEBUG guest: Found cap: rsync_pre in linux
 INFO guest: Execute capability: rsync_pre [#<Vagrant::Machine: observer_client_1 (VagrantPlugins::ProviderLibvirt::Provider)>, {:guestpath=>"/vagrant", :hostpath=>"/home/uvsmtid/vagrant.issue.dir", :disabled=>false, :__vagrantfile=>true, :owner=>"vagrant", :group=>"vagrant"}] (redhat)
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute: mkdir -p '/vagrant' (sudo=true)
DEBUG ssh: Exit status: 0
 INFO subprocess: Starting process: ["/usr/bin/rsync", "--verbose", "--archive", "--delete", "-z", "--copy-links", "--no-owner", "--no-group", "--rsync-path", "sudo rsync", "-e", "ssh -p 22 -o ControlMaster=auto -o ControlPath=/tmp/ssh.806 -o ControlPersist=10m -o StrictHostKeyChecking=no -o IdentitiesOnly=true -o UserKnownHostsFile=/dev/null -i '/home/uvsmtid/vagrant.issue.dir/.vagrant/machines/observer_client_1/libvirt/private_key'", "--exclude", ".vagrant/", "/home/uvsmtid/vagrant.issue.dir/", "vagrant@192.168.121.109:/vagrant"]
 INFO subprocess: Command not in installer, restoring original environment...
DEBUG subprocess: Selecting on IO
DEBUG subprocess: stderr: Warning: Permanently added '192.168.121.109' (ECDSA) to the list of known hosts.
DEBUG subprocess: stdout: sending incremental file list
DEBUG subprocess: stdout: ./
DEBUG subprocess: stdout: Vagrantfile
DEBUG subprocess: stdout: vagrant.stderr.txt
DEBUG subprocess: stdout: vagrant.stdout.txt
DEBUG subprocess: stdout: 
sent 12,739 bytes  received 85 bytes  8,549.33 bytes/sec
total size is 73,370  speedup is 5.72
DEBUG subprocess: Waiting for process to exit. Remaining to timeout: 31999
DEBUG subprocess: Exit status: 0
DEBUG ssh: Checking whether SSH is ready...
DEBUG ssh: Re-using SSH connection.
 INFO ssh: SSH is ready!
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute:  (sudo=false)
DEBUG ssh: Exit status: 0
DEBUG guest: Searching for cap: rsync_post
DEBUG guest: Checking in: redhat
DEBUG guest: Checking in: linux
DEBUG guest: Found cap: rsync_post in linux
DEBUG ssh: Checking whether SSH is ready...
DEBUG ssh: Re-using SSH connection.
 INFO ssh: SSH is ready!
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute:  (sudo=false)
DEBUG ssh: Exit status: 0
DEBUG guest: Searching for cap: rsync_post
DEBUG guest: Checking in: redhat
DEBUG guest: Checking in: linux
DEBUG guest: Found cap: rsync_post in linux
 INFO guest: Execute capability: rsync_post [#<Vagrant::Machine: observer_client_1 (VagrantPlugins::ProviderLibvirt::Provider)>, {:guestpath=>"/vagrant", :hostpath=>"/home/uvsmtid/vagrant.issue.dir", :disabled=>false, :__vagrantfile=>true, :owner=>"vagrant", :group=>"vagrant"}] (redhat)
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute: find '/vagrant' '!' -type l -a '(' ! -user vagrant -or ! -group vagrant ')' -print0 | xargs -0 -r chown vagrant:vagrant (sudo=true)
DEBUG ssh: Exit status: 0
 INFO warden: Calling OUT action: #<Vagrant::Action::Builtin::SyncedFolders:0x007f657c0b5d18>
 INFO warden: Calling OUT action: #<Vagrant::Action::Builtin::SyncedFolderCleanup:0x007f657c101628>
 INFO warden: Calling OUT action: #<VagrantPlugins::SyncedFolderNFS::ActionCleanup:0x007f657c14cc40>
 INFO warden: Calling OUT action: #<VagrantPlugins::ProviderLibvirt::Action::PrepareNFSValidIds:0x007f657c177b70>
 INFO provision: Writing provisioning sentinel so we don't provision again
 INFO interface: info: Running provisioner: shell...
 INFO interface: info: ==> observer_client_1: Running provisioner: shell...
 INFO environment: Running hook: provisioner_run
 INFO runner: Preparing hooks for middleware sequence...
 INFO runner: 2 hooks defined.
 INFO runner: Running action: provisioner_run #<Method: Vagrant::Action::Builtin::Provision#run_provisioner>
 INFO warden: Calling IN action: #<Proc:0x007f657c520a10@/usr/share/vagrant/lib/vagrant/action/warden.rb:94 (lambda)>
DEBUG ssh: Checking key permissions: /home/uvsmtid/vagrant.issue.dir/.vagrant/machines/observer_client_1/libvirt/private_key
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute: chown -R vagrant /tmp/vagrant-shell (sudo=true)
DEBUG ssh: stderr: chown: cannot access ‘/tmp/vagrant-shell’: No such file or directory

DEBUG ssh: Exit status: 1
DEBUG ssh: Uploading: /tmp/vagrant-shell20160622-11433-1r92kic.ps1 to /tmp/vagrant-shell
DEBUG ssh: Re-using SSH connection.
 INFO interface: detail: Running: inline script
 INFO interface: detail:     observer_client_1: Running: inline script
DEBUG ssh: Re-using SSH connection.
 INFO ssh: Execute: chmod +x '/tmp/vagrant-shell' && /tmp/vagrant-shell (sudo=true)
DEBUG ssh: Exit status: 0
 INFO warden: Calling OUT action: #<Proc:0x007f657c520a10@/usr/share/vagrant/lib/vagrant/action/warden.rb:94 (lambda)>
 INFO warden: Calling OUT action: #<Vagrant::Action::Builtin::Provision:0x007f657c1d9c80>
 INFO warden: Calling OUT action: #<VagrantPlugins::ProviderLibvirt::Action::CreateDomain:0x007f657c221aa8>
 INFO warden: Calling OUT action: #<VagrantPlugins::ProviderLibvirt::Action::CreateDomainVolume:0x007f657c25ce28>
 INFO warden: Calling OUT action: #<VagrantPlugins::ProviderLibvirt::Action::HandleBoxImage:0x007f657c2ac748>
 INFO warden: Calling OUT action: #<Vagrant::Action::Builtin::HandleBox:0x007f657c2e6e48>
 INFO warden: Calling OUT action: #<VagrantPlugins::ProviderLibvirt::Action::HandleStoragePool:0x007f657c321ac0>
 INFO warden: Calling OUT action: #<VagrantPlugins::ProviderLibvirt::Action::SetNameOfDomain:0x007f657c375580>
 INFO warden: Calling OUT action: #<Proc:0x00560c4ace3810@/usr/share/vagrant/lib/vagrant/action/warden.rb:94 (lambda)>
 INFO warden: Calling OUT action: #<Vagrant::Action::Builtin::Call:0x007f657c01eb20>
 INFO warden: Calling OUT action: #<Vagrant::Action::Builtin::ConfigValidate:0x007f657c01eb48>
 INFO interface: Machine: action ["up", "end", {:target=>:observer_client_1}]
 INFO environment: Released process lock: machine-action-651ef229d320eb50ea4995976765e4dc
DEBUG environment: Attempting to acquire process-lock: dotlock
 INFO environment: Acquired process lock: dotlock
 INFO environment: Released process lock: dotlock
 INFO environment: Running hook: environment_unload
 INFO runner: Preparing hooks for middleware sequence...
 INFO runner: 2 hooks defined.
 INFO runner: Running action: environment_unload #<Vagrant::Action::Builder:0x00560c4ae9ac58>

@uvsmtid

uvsmtid commented Jun 22, 2016

Ultimately

There is no known way to have a specific fixed IP address assigned by Vagrant on a private network.

This is consistently reproducible, at least with CentOS 7.1 and the software versions used in the tests above.

NOTE: For example, if you switch the box from CentOS 7.1 (uvsmtid/centos-7.1-1503-gnome) to CentOS 5.5 (uvsmtid/centos-5.5-minimal) in the Vagrantfile from the previous post, everything works perfectly:

 diff -u Vagrantfile.orig Vagrantfile
--- Vagrantfile.orig    2016-06-22 15:42:00.591950245 +0800
+++ Vagrantfile 2016-06-22 16:15:36.512209916 +0800
@@ -16,7 +16,7 @@

   config.vm.define "observer_client_1" do |observer_client_1|

-    observer_client_1.vm.box = "uvsmtid/centos-7.1-1503-gnome"
+    observer_client_1.vm.box = "uvsmtid/centos-5.5-minimal"

     # See libvirt configuration:
     #   https://github.com/pradels/vagrant-libvirt

UPDATE: As it turns out below, the "known way" is to use lowercase MAC addresses. There is just one issue left to be fixed: refresh of the network state by Vagrant based on the newly added MAC=>IP mapping.
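
For reference, a minimal Vagrantfile sketch of the lowercase-MAC variant, reusing the box, network name, and addresses already shown in this thread; it only illustrates the casing and is not a complete configuration:

# Minimal sketch only: the point is that :mac is written in lowercase so it
# matches what the guest later reports in /sys/class/net/<iface>/address.
Vagrant.configure("2") do |config|
  config.vm.define "observer_client_1" do |node|
    node.vm.box = "uvsmtid/centos-7.1-1503-gnome"
    node.vm.network :private_network,
      :ip => "192.168.1.3",
      :netmask => "255.255.255.0",
      :libvirt__network_name => "vagrant_internal_net",
      :mac => "fa:16:3e:3d:c8:77"    # lowercase, not "FA:16:3E:3D:C8:77"
  end
end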

@uvsmtid

uvsmtid commented Jul 1, 2016

I think I have a clue to solve the problem.

Cause

I re-tested the example above on a recent F24 x86_64 host and the issue persists.
So I decided to play with the virsh command and see how the creation of the networks (together with the Vagrant boxes) and their configuration as seen by libvirt are affected. I noticed that the XML output (immediately after creating the Vagrant boxes) on a fresh host OS does not contain any of the predefined MAC to IP address mappings listed in the Vagrantfile:

virsh net-dumpxml vagrant_internal_net

For example, this is the difference between (L) the initial XML output and (R) the XML output after the modifications (see below in this post); the host tag is missing:

 diff -u network.initial.xml network.modified.xml
--- network.initial.xml 2016-07-01 21:50:23.527342699 +0800
+++ network.modified.xml        2016-07-01 21:50:04.296402869 +0800
@@ -11,6 +11,7 @@
   <ip address='192.168.1.1' netmask='255.255.255.0'>
     <dhcp>
       <range start='192.168.1.1' end='192.168.1.254'/>
+      <host mac='fa:16:3e:3d:c8:77' ip='192.168.1.3'/>
     </dhcp>
   </ip>
 </network>

Workaround

To work around this issue (automated in our continuous integration), the following script can be used:

# Start Vagrant boxes (and create all necessary networks).

vagrant up

# Show XML configuration of the required network.
# Normally, after initial creation via Vagrant,
# it does not have `host` tags which map
# MAC to IP addresses.

virsh net-dumpxml [network_name]

# Add necessary MAC to IP mapping.
# Use `net-update` subcommand to avoid messing
# with XML config in `bash` scripts directly.
# See official example: http://wiki.libvirt.org/page/Networking#virsh_net-update
# - Command `add` may fail if specified MAC address
#   already exists (in the XML output above).
#   It is OK to ignore non-zero exit status.
# - Command `modify` should not fail as the mapping
#   is supposed to be already added by now.

virsh net-update vagrant_internal_net add ip-dhcp-host "<host mac='XX:XX:XX:XX:XX:XX' ip='NNN.NNN.NNN.NNN' />" --current
virsh net-update vagrant_internal_net modify ip-dhcp-host "<host mac='XX:XX:XX:XX:XX:XX' ip='NNN.NNN.NNN.NNN' />" --current

# Show XML configuration again to
# demonstrate the changes.

virsh net-dumpxml [network_name]

# Restart Vagrant boxes (to get required IPs from DHCP).

vagrant reload

Solution

I guess the host tags propagating the MAC to IP address mapping from the Vagrantfile get lost somewhere on the way to the libvirt network configuration.

@infernix
Member

infernix commented Jul 1, 2016

So for the problem @uvsmtid is describing, the bug is not in vagrant-libvirt but in Vagrant:

In Vagrant's plugins/guests/fedora/cap/configure_networks.rb:

          # Read interface MAC addresses for later matching
          mac_addresses = Array.new(interface_names.length)
          interface_names.each_with_index do |ifname, index|
            machine.communicate.sudo("cat /sys/class/net/#{ifname}/address") do |_, result|
              mac_addresses[index] = result.strip
            end
          end

This reads a lowercase MAC address, e.g. fa:16:3e:3d:c8:77, but your Vagrantfile specifies it in uppercase, and the test further down in that code doesn't account for that:

     # as what interfaces we're actually configuring since we use that later.
          interfaces = Set.new
          networks.each do |network|
            interface = nil
            if network[:mac_address]
              found_idx = mac_addresses.find_index(network[:mac_address]) # <--- right here
              # Ignore network if requested MAC address could not be found
              next if found_idx.nil?
              interface = interface_names[found_idx]
            else
              ifname_by_slot = interface_names_by_slot[network[:interface]-1]
              # Don't overwrite if interface was already matched via MAC address
              next if interfaces.include?(ifname_by_slot)
              interface = ifname_by_slot
            end

So @uvsmtid, that bug should be reported to Vagrant, not vagrant-libvirt. I will update the documentation to say that the MAC address should be specified in lowercase, but that is really a workaround for this problem in Vagrant itself.
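
For illustration only (this is not the actual Vagrant code or a tested patch), the lookup could be made case-insensitive along these lines:

          # Sketch: normalize both sides before comparing, so an uppercase MAC
          # in the Vagrantfile still matches the lowercase value read from
          # /sys/class/net/<iface>/address.
          requested_mac = network[:mac_address].to_s.downcase
          found_idx = mac_addresses.find_index { |mac| mac.downcase == requested_mac }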

The initial issue mentioned above appears to be a box issue: the box starts a DHCP client by default on every interface, and when Vagrant (not vagrant-libvirt) reconfigures the interface, it does not stop the already running DHCP client. That too is not a vagrant-libvirt specific issue.

I am therefore closing this as invalid, but feel free to reopen it if there is a concrete issue in vagrant-libvirt that can be reproduced.

@uvsmtid

uvsmtid commented Jul 11, 2016

Cross-reference to the newly created Vagrant issue: hashicorp/vagrant#7566


Forcing MAC-to-IP mapping into dnsmasq

Vagrant will (hopefully) fix the "uppercase MAC problem" (which is trivial to avoid if you know about it, but very difficult to spot if you are not expecting it).

Nevertheless, I noticed that there may still be a problem with forcing the MAC-to-IP mapping into dnsmasq. I won't specify the exact conditions yet (they are hard to isolate and I lost the incentive after finding a workaround); instead, I'll just post the workaround itself ("the dance with a tambourine around libvirt") as used in a script from our continuous integration platform (the highlighted part is a Jinja template, which is still relatively readable): https://github.com/uvsmtid/common-salt-states/blob/e35511242b674c02b0d54656293ae0f679e60afd/states/common/jenkins/configure_jobs_ext/deploy_pipeline.instantiate_vagrant_hosts.xml#L88-L126

The script uses virsh net-update [name] add ... followed by virsh net-update [name] modify ... to force the MAC-to-IP DHCP mapping on a live network managed by dnsmasq.
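
For anyone who cannot reuse that Jinja-templated job, a simplified standalone sketch of the same idea (plain Ruby shelling out to virsh; the network name, MAC, and IP are the example values from this thread and need to be adjusted to your Vagrantfile):

#!/usr/bin/env ruby
# Simplified sketch of the workaround above, not the actual CI script.
network = "vagrant_internal_net"
mappings = {
  "fa:16:3e:3d:c8:77" => "192.168.1.3",
}

mappings.each do |mac, ip|
  host_xml = "<host mac='#{mac}' ip='#{ip}'/>"
  # `add` may fail if the mapping already exists; ignoring that is fine
  # because the subsequent `modify` enforces the desired entry anyway.
  system("virsh", "net-update", network, "add", "ip-dhcp-host", host_xml, "--current")
  system("virsh", "net-update", network, "modify", "ip-dhcp-host", host_xml, "--current")
end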
