Creating new LV in existing VG requires specifying disk #96

Closed · tabowling opened this issue May 29, 2020 · 4 comments · Fixed by #201

@tabowling (Contributor)

I want to create a new LV in an existing VG. I would like the role to assume the existing setup so that I only have to specify what is being added.

Today I do this with a simple lvcreate command; to create a new LV in an existing VG I only need to provide the VG name, LV name, and size:
lvcreate -L1G -s -n /dev/virtual-machines/ha1-snapshot /dev/virtual-machines/ha1

However, this does not seem to work with the following playbook, which fails with an error suggesting it cannot look up the disks.

- hosts: all
  remote_user: root

#  become: yes
#  become_method: sudo
#  become_user: root

  vars:

  tasks:
    - name: create some test storage
      include_role:
        name: linux-system-roles.storage
      vars:
        storage_pools:
          - name: fedora_alderaan
            # type: lvm
            state: present
            volumes:
              - name: test
                size: "1G"    
                # type: lvm
                # fs_type: xfs
                fs_label: "test"
                mount_point: '/mnt/test'
@tabowling (Contributor, Author)

Actual output with errors:

$ ansible-playbook -l alderaan storage-test.yml -vvv
[sudo] password for tbowling: 
ansible-playbook 2.9.9
  config file = /etc/ansible/ansible.cfg
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python3.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 3.7.7 (default, Mar 13 2020, 10:23:39) [GCC 9.2.1 20190827 (Red Hat 9.2.1-1)]
Using /etc/ansible/ansible.cfg as config file
host_list declined parsing /etc/ansible/hosts as it did not pass its verify_file() method
script declined parsing /etc/ansible/hosts as it did not pass its verify_file() method
auto declined parsing /etc/ansible/hosts as it did not pass its verify_file() method
Parsed /etc/ansible/hosts inventory source with ini plugin

PLAYBOOK: storage-test.yml *************************************************************************************************
1 plays in storage-test.yml

PLAY [all] *****************************************************************************************************************

TASK [Gathering Facts] *****************************************************************************************************
task path: /home/tbowling/src/ansible-schtuff/storage-test.yml:1
<alderaan> ESTABLISH SSH CONNECTION FOR USER: root
<alderaan> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o 'IdentityFile="/home/tbowling/.ssh/id_rsa_demo"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="root"' -o ConnectTimeout=10 -o ControlPath=/root/.ansible/cp/01d6e12cb9 alderaan '/bin/sh -c '"'"'echo ~root && sleep 0'"'"''
<alderaan> (0, b'/root\n', b'')
<alderaan> ESTABLISH SSH CONNECTION FOR USER: root
<alderaan> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o 'IdentityFile="/home/tbowling/.ssh/id_rsa_demo"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="root"' -o ConnectTimeout=10 -o ControlPath=/root/.ansible/cp/01d6e12cb9 alderaan '/bin/sh -c '"'"'( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir /root/.ansible/tmp/ansible-tmp-1590778692.539002-466755-187366296496411 && echo ansible-tmp-1590778692.539002-466755-187366296496411="` echo /root/.ansible/tmp/ansible-tmp-1590778692.539002-466755-187366296496411 `" ) && sleep 0'"'"''
<alderaan> (0, b'ansible-tmp-1590778692.539002-466755-187366296496411=/root/.ansible/tmp/ansible-tmp-1590778692.539002-466755-187366296496411\n', b'')
Using module file /usr/lib/python3.7/site-packages/ansible/modules/system/setup.py
<alderaan> PUT /root/.ansible/tmp/ansible-local-466749o064q3yn/tmp7rm5wcza TO /root/.ansible/tmp/ansible-tmp-1590778692.539002-466755-187366296496411/AnsiballZ_setup.py
<alderaan> SSH: EXEC sftp -b - -C -o ControlMaster=auto -o ControlPersist=60s -o 'IdentityFile="/home/tbowling/.ssh/id_rsa_demo"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="root"' -o ConnectTimeout=10 -o ControlPath=/root/.ansible/cp/01d6e12cb9 '[alderaan]'
<alderaan> (0, b'sftp> put /root/.ansible/tmp/ansible-local-466749o064q3yn/tmp7rm5wcza /root/.ansible/tmp/ansible-tmp-1590778692.539002-466755-187366296496411/AnsiballZ_setup.py\n', b'')
<alderaan> ESTABLISH SSH CONNECTION FOR USER: root
<alderaan> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o 'IdentityFile="/home/tbowling/.ssh/id_rsa_demo"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="root"' -o ConnectTimeout=10 -o ControlPath=/root/.ansible/cp/01d6e12cb9 alderaan '/bin/sh -c '"'"'chmod u+x /root/.ansible/tmp/ansible-tmp-1590778692.539002-466755-187366296496411/ /root/.ansible/tmp/ansible-tmp-1590778692.539002-466755-187366296496411/AnsiballZ_setup.py && sleep 0'"'"''
<alderaan> (0, b'', b'')
<alderaan> ESTABLISH SSH CONNECTION FOR USER: root
<alderaan> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o 'IdentityFile="/home/tbowling/.ssh/id_rsa_demo"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="root"' -o ConnectTimeout=10 -o ControlPath=/root/.ansible/cp/01d6e12cb9 -tt alderaan '/bin/sh -c '"'"'/usr/bin/python3 /root/.ansible/tmp/ansible-tmp-1590778692.539002-466755-187366296496411/AnsiballZ_setup.py && sleep 0'"'"''
<alderaan> (0, b'\r\n{"ansible_facts": {"ansible_fips": false, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "root", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "kvm", "ansible_virtualization_role": "host", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "31", "ansible_distribution_major_version": "31", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC0mwu1vp/LE8n7p296f4cTNzcB5v21lbbmLB0M/IOy+abrsks3RCaSnnu2Yo57Jt/RrX/NXDscPBLJqH26xtXYduGBg7r+NEdAh91OlJct1qmeIzkr42i+x33Z7lGJIvD3v0XSOTgnnlt4cg0P1U5wbCuzDZdoIlWE4UDgqHoUMpyOtnz35FOhKVAuyqqM0fS8r9hJgHq0k8B2HcYokxvw1zyCbFnsJZaQhYdHpr+9WG6E/Qh076btVCcBStES+yUJeKu/PUpwBfYM1Hmv/8YT/fTt9FUzUjz81MRxgoj53IhO48Lkw3YVvgo/UrXkDXNahSrqoQch7nHcSplHPt5w6E5HGBaF31tmQotNcbjJaPQGAPwK1otG+PnhENich+2MpynJ5Be6DEujaSnRJwiZNTLZ1cWA3OlbsA5Mg6LL0X4UCcsk2ViGwxiO+qKGXST2vPpLjGwf9dbY5eZDNOBZ7aY7hKSwLYcNcjOY6IjD2Zf41Pqzj1LLQ/Nqxk2PUqU=", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBH4lhgfSMSQpVJ89nrqisouF6AlKP0CUkTX8FOiECo+cS+4QNlOzvs/JyidqaP2AanyqJvzTBY8HsR8gOuC/2u8=", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIMOEc5rDOaRxc08BFH3urjVcS11PNySdWd51kqqo9Brh", "ansible_local": {}, "ansible_fibre_channel_wwn": [], "ansible_system_capabilities_enforced": "True", "ansible_system_capabilities": ["cap_chown", "cap_dac_override", "cap_dac_read_search", "cap_fowner", "cap_fsetid", "cap_kill", "cap_setgid", "cap_setuid", "cap_setpcap", "cap_linux_immutable", "cap_net_bind_service", "cap_net_broadcast", "cap_net_admin", "cap_net_raw", "cap_ipc_lock", "cap_ipc_owner", "cap_sys_module", "cap_sys_rawio", "cap_sys_chroot", "cap_sys_ptrace", "cap_sys_pacct", "cap_sys_admin", "cap_sys_boot", "cap_sys_nice", "cap_sys_resource", "cap_sys_time", "cap_sys_tty_config", "cap_mknod", "cap_lease", "cap_audit_write", "cap_audit_control", "cap_setfcap", "cap_mac_override", "cap_mac_admin", "cap_syslog", "cap_wake_alarm", "cap_block_suspend", "cap_audit_read+ep"], "ansible_dns": {"search": ["attlocal.net", "alderaan.local"], "nameservers": ["192.168.1.254", "2600:1700:5ce0:1e70::1"]}, "ansible_system": "Linux", "ansible_kernel": "5.6.13-200.fc31.x86_64", "ansible_kernel_version": "#1 SMP Thu May 14 23:26:14 UTC 2020", "ansible_machine": "x86_64", "ansible_python_version": "3.7.7", "ansible_fqdn": "alderaan.local", "ansible_hostname": "alderaan", "ansible_nodename": "alderaan.local", "ansible_domain": "local", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "6f0fb4ea4f3242399adead5d572fc4eb", "ansible_apparmor": {"status": "disabled"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/vmlinuz-5.6.13-200.fc31.x86_64", "root": "/dev/mapper/fedora_alderaan-root", "ro": true, "resume": "/dev/mapper/fedora_alderaan-swap", "rd.lvm.lv": "fedora_alderaan/swap", "rhgb": true, "quiet": true}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/vmlinuz-5.6.13-200.fc31.x86_64", "root": "/dev/mapper/fedora_alderaan-root", "ro": true, "resume": "/dev/mapper/fedora_alderaan-swap", "rd.lvm.lv": 
["fedora_alderaan/root", "fedora_alderaan/swap"], "rhgb": true, "quiet": true}, "ansible_env": {"SHELL": "/bin/bash", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "rs=0:di=38;5;33:ln=38;5;51:mh=00:pi=40;38;5;11:so=38;5;13:do=38;5;5:bd=48;5;232;38;5;11:cd=48;5;232;38;5;3:or=48;5;232;38;5;9:mi=01;37;41:su=48;5;196;38;5;15:sg=48;5;11;38;5;16:ca=48;5;196;38;5;226:tw=48;5;10;38;5;16:ow=48;5;10;38;5;21:st=48;5;21;38;5;15:ex=38;5;40:*.tar=38;5;9:*.tgz=38;5;9:*.arc=38;5;9:*.arj=38;5;9:*.taz=38;5;9:*.lha=38;5;9:*.lz4=38;5;9:*.lzh=38;5;9:*.lzma=38;5;9:*.tlz=38;5;9:*.txz=38;5;9:*.tzo=38;5;9:*.t7z=38;5;9:*.zip=38;5;9:*.z=38;5;9:*.dz=38;5;9:*.gz=38;5;9:*.lrz=38;5;9:*.lz=38;5;9:*.lzo=38;5;9:*.xz=38;5;9:*.zst=38;5;9:*.tzst=38;5;9:*.bz2=38;5;9:*.bz=38;5;9:*.tbz=38;5;9:*.tbz2=38;5;9:*.tz=38;5;9:*.deb=38;5;9:*.rpm=38;5;9:*.jar=38;5;9:*.war=38;5;9:*.ear=38;5;9:*.sar=38;5;9:*.rar=38;5;9:*.alz=38;5;9:*.ace=38;5;9:*.zoo=38;5;9:*.cpio=38;5;9:*.7z=38;5;9:*.rz=38;5;9:*.cab=38;5;9:*.wim=38;5;9:*.swm=38;5;9:*.dwm=38;5;9:*.esd=38;5;9:*.jpg=38;5;13:*.jpeg=38;5;13:*.mjpg=38;5;13:*.mjpeg=38;5;13:*.gif=38;5;13:*.bmp=38;5;13:*.pbm=38;5;13:*.pgm=38;5;13:*.ppm=38;5;13:*.tga=38;5;13:*.xbm=38;5;13:*.xpm=38;5;13:*.tif=38;5;13:*.tiff=38;5;13:*.png=38;5;13:*.svg=38;5;13:*.svgz=38;5;13:*.mng=38;5;13:*.pcx=38;5;13:*.mov=38;5;13:*.mpg=38;5;13:*.mpeg=38;5;13:*.m2v=38;5;13:*.mkv=38;5;13:*.webm=38;5;13:*.ogm=38;5;13:*.mp4=38;5;13:*.m4v=38;5;13:*.mp4v=38;5;13:*.vob=38;5;13:*.qt=38;5;13:*.nuv=38;5;13:*.wmv=38;5;13:*.asf=38;5;13:*.rm=38;5;13:*.rmvb=38;5;13:*.flc=38;5;13:*.avi=38;5;13:*.fli=38;5;13:*.flv=38;5;13:*.gl=38;5;13:*.dl=38;5;13:*.xcf=38;5;13:*.xwd=38;5;13:*.yuv=38;5;13:*.cgm=38;5;13:*.emf=38;5;13:*.ogv=38;5;13:*.ogx=38;5;13:*.aac=38;5;45:*.au=38;5;45:*.flac=38;5;45:*.m4a=38;5;45:*.mid=38;5;45:*.midi=38;5;45:*.mka=38;5;45:*.mp3=38;5;45:*.mpc=38;5;45:*.ogg=38;5;45:*.ra=38;5;45:*.wav=38;5;45:*.oga=38;5;45:*.opus=38;5;45:*.spx=38;5;45:*.xspf=38;5;45:", "SSH_CONNECTION": "192.168.1.83 39584 192.168.1.77 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "TERM": "xterm-256color", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "7", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "192.168.1.83 39584 22", "PATH": "/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/2"}, "ansible_is_chroot": false, "ansible_lsb": {}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 32, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_processor": ["0", "AuthenticAMD", "AMD A8-4500M APU with Radeon(tm) HD Graphics", "1", "AuthenticAMD", "AMD A8-4500M APU with Radeon(tm) HD Graphics", "2", "AuthenticAMD", "AMD A8-4500M APU with Radeon(tm) HD Graphics", "3", "AuthenticAMD", "AMD A8-4500M APU with Radeon(tm) HD Graphics"], "ansible_processor_count": 1, "ansible_processor_cores": 2, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 4, "ansible_memtotal_mb": 7420, "ansible_memfree_mb": 4165, "ansible_swaptotal_mb": 3531, "ansible_swapfree_mb": 3531, "ansible_memory_mb": {"real": {"total": 7420, "used": 3255, "free": 4165}, "nocache": {"free": 6767, "used": 653}, "swap": {"total": 3531, "free": 3531, "used": 0, "cached": 0}}, "ansible_bios_date": "02/21/2013", "ansible_bios_version": 
"F.26", "ansible_form_factor": "Notebook", "ansible_product_name": "HP Pavilion g7 Notebook PC", "ansible_product_serial": "5CD2432N3N", "ansible_product_uuid": "32444335-3334-4e32-334e-8434977ef3d8", "ansible_product_version": "0889110002305910000620100", "ansible_system_vendor": "Hewlett-Packard", "ansible_devices": {"dm-1": {"virtual": 1, "links": {"ids": ["dm-name-fedora_alderaan-swap", "dm-uuid-LVM-s9yuXfgWSHiVNvnx9vL0uxL9qtjSmUFzfc66CKG4RGOiLed7caDm8dAg8K46Jgln"], "uuids": ["f4d58419-fcf8-443c-ab2e-09d6a31c8f16"], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7233536", "sectorsize": "512", "size": "3.45 GB", "host": "", "holders": []}, "sr0": {"virtual": 1, "links": {"ids": ["ata-hp_DVD-RAM_UJ8B1_SCC3317114"], "uuids": [], "labels": [], "masters": []}, "vendor": "hp", "model": "DVD-RAM UJ8B1", "sas_address": null, "sas_device_handle": null, "removable": "1", "support_discard": "0", "partitions": {}, "rotational": "1", "scheduler_mode": "bfq", "sectors": "2097151", "sectorsize": "512", "size": "1024.00 MB", "host": "SATA controller: Advanced Micro Devices, Inc. [AMD] FCH SATA Controller [AHCI mode]", "holders": []}, "dm-0": {"virtual": 1, "links": {"ids": ["dm-name-fedora_alderaan-root", "dm-uuid-LVM-s9yuXfgWSHiVNvnx9vL0uxL9qtjSmUFz0Uouh0roQNrIchLO7wH9Trs1698hl4mE"], "uuids": ["5b2bdc56-6671-4242-aebd-cdb09f8f9891"], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "31457280", "sectorsize": "512", "size": "15.00 GB", "host": "", "holders": []}, "sda": {"virtual": 1, "links": {"ids": ["ata-WDC_WDS500G2B0A-00SM50_1940AE801364", "wwn-0x5001b448b1614e83"], "uuids": [], "labels": [], "masters": []}, "vendor": "ATA", "model": "WDC  WDS500G2B0A", "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "wwn": "0x5001b448b1614e83", "partitions": {"sda2": {"links": {"ids": ["ata-WDC_WDS500G2B0A-00SM50_1940AE801364-part2", "wwn-0x5001b448b1614e83-part2"], "uuids": ["96984978-7450-4fbc-8977-166c89f9bd1b"], "labels": [], "masters": []}, "start": "1230848", "sectors": "2097152", "sectorsize": 512, "size": "1.00 GB", "uuid": "96984978-7450-4fbc-8977-166c89f9bd1b", "holders": []}, "sda3": {"links": {"ids": ["ata-WDC_WDS500G2B0A-00SM50_1940AE801364-part3", "lvm-pv-uuid-mcWQhV-EPgR-ksqW-kP7T-qMn9-OSMv-4Ds53d", "wwn-0x5001b448b1614e83-part3"], "uuids": [], "labels": [], "masters": ["dm-0", "dm-1"]}, "start": "3328000", "sectors": "973445120", "sectorsize": 512, "size": "464.17 GB", "uuid": null, "holders": ["fedora_alderaan-swap", "fedora_alderaan-root"]}, "sda1": {"links": {"ids": ["ata-WDC_WDS500G2B0A-00SM50_1940AE801364-part1", "wwn-0x5001b448b1614e83-part1"], "uuids": ["DCF5-A41A"], "labels": [], "masters": []}, "start": "2048", "sectors": "1228800", "sectorsize": 512, "size": "600.00 MB", "uuid": "DCF5-A41A", "holders": []}}, "rotational": "0", "scheduler_mode": "bfq", "sectors": "976773168", "sectorsize": "512", "size": "465.76 GB", "host": "SATA controller: Advanced Micro Devices, Inc. 
[AMD] FCH SATA Controller [AHCI mode]", "holders": []}}, "ansible_device_links": {"ids": {"dm-1": ["dm-name-fedora_alderaan-swap", "dm-uuid-LVM-s9yuXfgWSHiVNvnx9vL0uxL9qtjSmUFzfc66CKG4RGOiLed7caDm8dAg8K46Jgln"], "dm-0": ["dm-name-fedora_alderaan-root", "dm-uuid-LVM-s9yuXfgWSHiVNvnx9vL0uxL9qtjSmUFz0Uouh0roQNrIchLO7wH9Trs1698hl4mE"], "sr0": ["ata-hp_DVD-RAM_UJ8B1_SCC3317114"], "sda3": ["ata-WDC_WDS500G2B0A-00SM50_1940AE801364-part3", "lvm-pv-uuid-mcWQhV-EPgR-ksqW-kP7T-qMn9-OSMv-4Ds53d", "wwn-0x5001b448b1614e83-part3"], "sda2": ["ata-WDC_WDS500G2B0A-00SM50_1940AE801364-part2", "wwn-0x5001b448b1614e83-part2"], "sda1": ["ata-WDC_WDS500G2B0A-00SM50_1940AE801364-part1", "wwn-0x5001b448b1614e83-part1"], "sda": ["ata-WDC_WDS500G2B0A-00SM50_1940AE801364", "wwn-0x5001b448b1614e83"]}, "uuids": {"dm-1": ["f4d58419-fcf8-443c-ab2e-09d6a31c8f16"], "dm-0": ["5b2bdc56-6671-4242-aebd-cdb09f8f9891"], "sda2": ["96984978-7450-4fbc-8977-166c89f9bd1b"], "sda1": ["DCF5-A41A"]}, "labels": {}, "masters": {"sda3": ["dm-0", "dm-1"]}}, "ansible_uptime_seconds": 2562, "ansible_lvm": {"lvs": {"root": {"size_g": "15.00", "vg": "fedora_alderaan"}, "swap": {"size_g": "3.45", "vg": "fedora_alderaan"}}, "vgs": {"fedora_alderaan": {"size_g": "464.17", "free_g": "445.72", "num_lvs": "2", "num_pvs": "1"}}, "pvs": {"/dev/sda3": {"size_g": "464.17", "free_g": "445.72", "vg": "fedora_alderaan"}}}, "ansible_mounts": [{"mount": "/", "device": "/dev/mapper/fedora_alderaan-root", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_total": 16095641600, "size_available": 7547293696, "block_size": 4096, "block_total": 3929600, "block_available": 1842601, "block_used": 2086999, "inode_total": 7864320, "inode_available": 7592043, "inode_used": 272277, "uuid": "5b2bdc56-6671-4242-aebd-cdb09f8f9891"}, {"mount": "/boot", "device": "/dev/sda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "size_total": 1063256064, "size_available": 778571776, "block_size": 4096, "block_total": 259584, "block_available": 190081, "block_used": 69503, "inode_total": 524288, "inode_available": 524255, "inode_used": 33, "uuid": "96984978-7450-4fbc-8977-166c89f9bd1b"}, {"mount": "/boot/efi", "device": "/dev/sda1", "fstype": "vfat", "options": "rw,relatime,fmask=0077,dmask=0077,codepage=437,iocharset=ascii,shortname=winnt,errors=remount-ro", "size_total": 627900416, "size_available": 619220992, "block_size": 4096, "block_total": 153296, "block_available": 151177, "block_used": 2119, "inode_total": 0, "inode_available": 0, "inode_used": 0, "uuid": "DCF5-A41A"}], "ansible_hostnqn": "", "ansible_python": {"version": {"major": 3, "minor": 7, "micro": 7, "releaselevel": "final", "serial": 0}, "version_info": [3, 7, 7, "final", 0], "executable": "/usr/bin/python3", "has_sslcontext": true, "type": "cpython"}, "ansible_date_time": {"year": "2020", "month": "05", "weekday": "Friday", "weekday_number": "5", "weeknumber": "21", "day": "29", "hour": "14", "minute": "58", "second": "14", "epoch": "1590778694", "date": "2020-05-29", "time": "14:58:14", "iso8601_micro": "2020-05-29T18:58:14.257329Z", "iso8601": "2020-05-29T18:58:14Z", "iso8601_basic": "20200529T145814257091", "iso8601_basic_short": "20200529T145814", "tz": "EDT", "tz_offset": "-0400"}, "ansible_iscsi_iqn": "iqn.1994-05.com.redhat:e43b99e2edd", "ansible_interfaces": ["lo", "eno1", "wlo1"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": 
"127.0.0.1", "broadcast": "host", "netmask": "255.0.0.0", "network": "127.0.0.0"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off"}, "timestamping": ["tx_software", "rx_software", "software"], "hw_timestamp_filters": []}, "ansible_eno1": {"device": "eno1", "macaddress": "84:34:97:7e:f3:d8", "mtu": 1500, "active": false, "module": "r8169", "type": "ether", "pciid": "0000:05:00.0", "speed": -1, "promisc": false, "features": {"rx_checksumming": "on", "tx_checksumming": "on", "tx_checksum_ipv4": "on", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "off", "tx_scatter_gather": "off", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "off", "tx_tcp_segmentation": "off", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "off", "generic_segmentation_offload": "off [requested on]", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "on", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", 
"tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off", "rx_all": "off", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off"}, "timestamping": ["tx_software", "rx_software", "software"], "hw_timestamp_filters": []}, "ansible_wlo1": {"device": "wlo1", "macaddress": "20:68:9d:be:47:04", "mtu": 1500, "active": true, "module": "ath9k", "type": "ether", "pciid": "0000:02:00.0", "promisc": false, "ipv4": {"address": "192.168.1.77", "broadcast": "192.168.1.255", "netmask": "255.255.255.0", "network": "192.168.1.0"}, "ipv6": [{"address": "2600:1700:5ce0:1e70:a551:3f5c:3ab1:7446", "prefix": "64", "scope": "global"}, {"address": "fe80::2686:b280:8259:f3bb", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "off", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "off", "tx_scatter_gather": "off [fixed]", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "off", "tx_tcp_segmentation": "off [fixed]", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off [fixed]", "tx_tcp6_segmentation": "off [fixed]", "generic_segmentation_offload": "off [requested on]", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off"}, "timestamping": ["rx_software", "software"], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "192.168.1.254", "interface": 
"wlo1", "address": "192.168.1.77", "broadcast": "192.168.1.255", "netmask": "255.255.255.0", "network": "192.168.1.0", "macaddress": "20:68:9d:be:47:04", "mtu": 1500, "type": "ether", "alias": "wlo1"}, "ansible_default_ipv6": {"gateway": "fe80::16ed:bbff:feaf:a4e5", "interface": "wlo1", "address": "2600:1700:5ce0:1e70:a551:3f5c:3ab1:7446", "prefix": "64", "scope": "global", "macaddress": "20:68:9d:be:47:04", "mtu": 1500, "type": "ether"}, "ansible_all_ipv4_addresses": ["192.168.1.77"], "ansible_all_ipv6_addresses": ["2600:1700:5ce0:1e70:a551:3f5c:3ab1:7446", "fe80::2686:b280:8259:f3bb"], "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": "*", "fact_path": "/etc/ansible/facts.d"}}}\r\n', b'Shared connection to alderaan closed.\r\n')
<alderaan> ESTABLISH SSH CONNECTION FOR USER: root
<alderaan> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o 'IdentityFile="/home/tbowling/.ssh/id_rsa_demo"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="root"' -o ConnectTimeout=10 -o ControlPath=/root/.ansible/cp/01d6e12cb9 alderaan '/bin/sh -c '"'"'rm -f -r /root/.ansible/tmp/ansible-tmp-1590778692.539002-466755-187366296496411/ > /dev/null 2>&1 && sleep 0'"'"''
<alderaan> (0, b'', b'')
ok: [alderaan]
META: ran handlers

TASK [create some test storage] ********************************************************************************************
task path: /home/tbowling/src/ansible-schtuff/storage-test.yml:11

TASK [linux-system-roles.storage : define an empty list of pools to be used in testing] ************************************
task path: /usr/share/ansible/roles/linux-system-roles.storage/tasks/main.yml:2
ok: [alderaan] => {
    "ansible_facts": {
        "_storage_pools_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : define an empty list of volumes to be used in testing] **********************************
task path: /usr/share/ansible/roles/linux-system-roles.storage/tasks/main.yml:6
ok: [alderaan] => {
    "ansible_facts": {
        "_storage_volumes_list": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : include the appropriate backend tasks] **************************************************
task path: /usr/share/ansible/roles/linux-system-roles.storage/tasks/main.yml:10
included: /usr/share/ansible/roles/linux-system-roles.storage/tasks/main-blivet.yml for alderaan

TASK [linux-system-roles.storage : get a list of rpm packages installed on host machine] ***********************************
task path: /usr/share/ansible/roles/linux-system-roles.storage/tasks/main-blivet.yml:2
skipping: [alderaan] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [linux-system-roles.storage : make sure blivet is available] **********************************************************
task path: /usr/share/ansible/roles/linux-system-roles.storage/tasks/main-blivet.yml:7
<alderaan> ESTABLISH SSH CONNECTION FOR USER: root
<alderaan> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o 'IdentityFile="/home/tbowling/.ssh/id_rsa_demo"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="root"' -o ConnectTimeout=10 -o ControlPath=/root/.ansible/cp/01d6e12cb9 alderaan '/bin/sh -c '"'"'echo ~root && sleep 0'"'"''
<alderaan> (0, b'/root\n', b'')
<alderaan> ESTABLISH SSH CONNECTION FOR USER: root
<alderaan> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o 'IdentityFile="/home/tbowling/.ssh/id_rsa_demo"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="root"' -o ConnectTimeout=10 -o ControlPath=/root/.ansible/cp/01d6e12cb9 alderaan '/bin/sh -c '"'"'( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir /root/.ansible/tmp/ansible-tmp-1590778694.8135836-466778-150043181234770 && echo ansible-tmp-1590778694.8135836-466778-150043181234770="` echo /root/.ansible/tmp/ansible-tmp-1590778694.8135836-466778-150043181234770 `" ) && sleep 0'"'"''
<alderaan> (0, b'ansible-tmp-1590778694.8135836-466778-150043181234770=/root/.ansible/tmp/ansible-tmp-1590778694.8135836-466778-150043181234770\n', b'')
Using module file /usr/lib/python3.7/site-packages/ansible/modules/packaging/os/dnf.py
<alderaan> PUT /root/.ansible/tmp/ansible-local-466749o064q3yn/tmpu_a69n1z TO /root/.ansible/tmp/ansible-tmp-1590778694.8135836-466778-150043181234770/AnsiballZ_dnf.py
<alderaan> SSH: EXEC sftp -b - -C -o ControlMaster=auto -o ControlPersist=60s -o 'IdentityFile="/home/tbowling/.ssh/id_rsa_demo"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="root"' -o ConnectTimeout=10 -o ControlPath=/root/.ansible/cp/01d6e12cb9 '[alderaan]'
<alderaan> (0, b'sftp> put /root/.ansible/tmp/ansible-local-466749o064q3yn/tmpu_a69n1z /root/.ansible/tmp/ansible-tmp-1590778694.8135836-466778-150043181234770/AnsiballZ_dnf.py\n', b'')
<alderaan> ESTABLISH SSH CONNECTION FOR USER: root
<alderaan> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o 'IdentityFile="/home/tbowling/.ssh/id_rsa_demo"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="root"' -o ConnectTimeout=10 -o ControlPath=/root/.ansible/cp/01d6e12cb9 alderaan '/bin/sh -c '"'"'chmod u+x /root/.ansible/tmp/ansible-tmp-1590778694.8135836-466778-150043181234770/ /root/.ansible/tmp/ansible-tmp-1590778694.8135836-466778-150043181234770/AnsiballZ_dnf.py && sleep 0'"'"''
<alderaan> (0, b'', b'')
<alderaan> ESTABLISH SSH CONNECTION FOR USER: root
<alderaan> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o 'IdentityFile="/home/tbowling/.ssh/id_rsa_demo"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="root"' -o ConnectTimeout=10 -o ControlPath=/root/.ansible/cp/01d6e12cb9 -tt alderaan '/bin/sh -c '"'"'/usr/bin/python3 /root/.ansible/tmp/ansible-tmp-1590778694.8135836-466778-150043181234770/AnsiballZ_dnf.py && sleep 0'"'"''
<alderaan> (0, b'\r\n{"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["python3-blivet"], "state": "present", "allow_downgrade": false, "autoremove": false, "bugfix": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "lock_timeout": 30, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "releasever": null}}}\r\n', b'Shared connection to alderaan closed.\r\n')
<alderaan> ESTABLISH SSH CONNECTION FOR USER: root
<alderaan> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o 'IdentityFile="/home/tbowling/.ssh/id_rsa_demo"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="root"' -o ConnectTimeout=10 -o ControlPath=/root/.ansible/cp/01d6e12cb9 alderaan '/bin/sh -c '"'"'rm -f -r /root/.ansible/tmp/ansible-tmp-1590778694.8135836-466778-150043181234770/ > /dev/null 2>&1 && sleep 0'"'"''
<alderaan> (0, b'', b'')
ok: [alderaan] => {
    "changed": false,
    "invocation": {
        "module_args": {
            "allow_downgrade": false,
            "autoremove": false,
            "bugfix": false,
            "conf_file": null,
            "disable_excludes": null,
            "disable_gpg_check": false,
            "disable_plugin": [],
            "disablerepo": [],
            "download_dir": null,
            "download_only": false,
            "enable_plugin": [],
            "enablerepo": [],
            "exclude": [],
            "install_repoquery": true,
            "install_weak_deps": true,
            "installroot": "/",
            "list": null,
            "lock_timeout": 30,
            "name": [
                "python3-blivet"
            ],
            "releasever": null,
            "security": false,
            "skip_broken": false,
            "state": "present",
            "update_cache": false,
            "update_only": false,
            "validate_certs": true
        }
    },
    "msg": "Nothing to do",
    "rc": 0,
    "results": []
}

TASK [linux-system-roles.storage : initialize internal facts] **************************************************************
task path: /usr/share/ansible/roles/linux-system-roles.storage/tasks/main-blivet.yml:18
ok: [alderaan] => {
    "ansible_facts": {
        "_storage_pools": [],
        "_storage_vol_defaults": [],
        "_storage_vol_pools": [],
        "_storage_vols_no_defaults": [],
        "_storage_vols_no_defaults_by_pool": {},
        "_storage_vols_w_defaults": [],
        "_storage_volumes": []
    },
    "changed": false
}

TASK [linux-system-roles.storage : Apply defaults to pools and volumes [1/6]] **********************************************
task path: /usr/share/ansible/roles/linux-system-roles.storage/tasks/main-blivet.yml:28
ok: [alderaan] => (item={'name': 'fedora_alderaan', 'state': 'present', 'volumes': [{'name': 'test', 'size': '1G', 'fs_label': 'test', 'mount_point': '/mnt/test'}]}) => {
    "ansible_facts": {
        "_storage_pools": [
            {
                "name": "fedora_alderaan",
                "state": "present",
                "type": "lvm",
                "volumes": [
                    {
                        "fs_label": "test",
                        "mount_point": "/mnt/test",
                        "name": "test",
                        "size": "1G"
                    }
                ]
            }
        ]
    },
    "ansible_loop_var": "pool",
    "changed": false,
    "pool": {
        "name": "fedora_alderaan",
        "state": "present",
        "volumes": [
            {
                "fs_label": "test",
                "mount_point": "/mnt/test",
                "name": "test",
                "size": "1G"
            }
        ]
    }
}

TASK [linux-system-roles.storage : Apply defaults to pools and volumes [2/6]] **********************************************
task path: /usr/share/ansible/roles/linux-system-roles.storage/tasks/main-blivet.yml:36
ok: [alderaan] => (item=[{'state': 'present', 'type': 'lvm', 'name': 'fedora_alderaan', 'volumes': [{'name': 'test', 'size': '1G', 'fs_label': 'test', 'mount_point': '/mnt/test'}]}, {'name': 'test', 'size': '1G', 'fs_label': 'test', 'mount_point': '/mnt/test'}]) => {
    "ansible_facts": {
        "_storage_vol_defaults": [
            {
                "fs_create_options": "",
                "fs_label": "",
                "fs_overwrite_existing": true,
                "fs_type": "xfs",
                "mount_check": 0,
                "mount_device_identifier": "uuid",
                "mount_options": "defaults",
                "mount_passno": 0,
                "mount_point": "",
                "size": 0,
                "state": "present",
                "type": "lvm"
            }
        ],
        "_storage_vol_pools": [
            "fedora_alderaan"
        ],
        "_storage_vols_no_defaults": [
            {
                "fs_label": "test",
                "mount_point": "/mnt/test",
                "name": "test",
                "size": "1G"
            }
        ]
    },
    "ansible_loop_var": "item",
    "changed": false,
    "item": [
        {
            "name": "fedora_alderaan",
            "state": "present",
            "type": "lvm",
            "volumes": [
                {
                    "fs_label": "test",
                    "mount_point": "/mnt/test",
                    "name": "test",
                    "size": "1G"
                }
            ]
        },
        {
            "fs_label": "test",
            "mount_point": "/mnt/test",
            "name": "test",
            "size": "1G"
        }
    ]
}

TASK [linux-system-roles.storage : Apply defaults to pools and volumes [3/6]] **********************************************
task path: /usr/share/ansible/roles/linux-system-roles.storage/tasks/main-blivet.yml:44
ok: [alderaan] => (item=[{'name': 'test', 'size': '1G', 'fs_label': 'test', 'mount_point': '/mnt/test'}, {'state': 'present', 'type': 'lvm', 'size': 0, 'fs_type': 'xfs', 'fs_label': '', 'fs_create_options': '', 'fs_overwrite_existing': True, 'mount_point': '', 'mount_options': 'defaults', 'mount_check': 0, 'mount_passno': 0, 'mount_device_identifier': 'uuid'}]) => {
    "ansible_facts": {
        "_storage_vols_w_defaults": [
            {
                "fs_create_options": "",
                "fs_label": "test",
                "fs_overwrite_existing": true,
                "fs_type": "xfs",
                "mount_check": 0,
                "mount_device_identifier": "uuid",
                "mount_options": "defaults",
                "mount_passno": 0,
                "mount_point": "/mnt/test",
                "name": "test",
                "pool": "fedora_alderaan",
                "size": "1G",
                "state": "present",
                "type": "lvm"
            }
        ]
    },
    "ansible_index_var": "idx",
    "ansible_loop_var": "item",
    "changed": false,
    "idx": 0,
    "item": [
        {
            "fs_label": "test",
            "mount_point": "/mnt/test",
            "name": "test",
            "size": "1G"
        },
        {
            "fs_create_options": "",
            "fs_label": "",
            "fs_overwrite_existing": true,
            "fs_type": "xfs",
            "mount_check": 0,
            "mount_device_identifier": "uuid",
            "mount_options": "defaults",
            "mount_passno": 0,
            "mount_point": "",
            "size": 0,
            "state": "present",
            "type": "lvm"
        }
    ]
}

TASK [linux-system-roles.storage : Apply defaults to pools and volumes [4/6]] **********************************************
task path: /usr/share/ansible/roles/linux-system-roles.storage/tasks/main-blivet.yml:52
ok: [alderaan] => (item={'state': 'present', 'type': 'lvm', 'name': 'fedora_alderaan', 'volumes': [{'name': 'test', 'size': '1G', 'fs_label': 'test', 'mount_point': '/mnt/test'}]}) => {
    "ansible_facts": {
        "_storage_vols_no_defaults_by_pool": {
            "fedora_alderaan": [
                {
                    "fs_create_options": "",
                    "fs_label": "test",
                    "fs_overwrite_existing": true,
                    "fs_type": "xfs",
                    "mount_check": 0,
                    "mount_device_identifier": "uuid",
                    "mount_options": "defaults",
                    "mount_passno": 0,
                    "mount_point": "/mnt/test",
                    "name": "test",
                    "pool": "fedora_alderaan",
                    "size": "1G",
                    "state": "present",
                    "type": "lvm"
                }
            ]
        }
    },
    "ansible_loop_var": "item",
    "changed": false,
    "item": {
        "name": "fedora_alderaan",
        "state": "present",
        "type": "lvm",
        "volumes": [
            {
                "fs_label": "test",
                "mount_point": "/mnt/test",
                "name": "test",
                "size": "1G"
            }
        ]
    }
}

TASK [linux-system-roles.storage : Apply defaults to pools and volumes [5/6]] **********************************************
task path: /usr/share/ansible/roles/linux-system-roles.storage/tasks/main-blivet.yml:58
ok: [alderaan] => (item={'state': 'present', 'type': 'lvm', 'name': 'fedora_alderaan', 'volumes': [{'name': 'test', 'size': '1G', 'fs_label': 'test', 'mount_point': '/mnt/test'}]}) => {
    "ansible_facts": {
        "_storage_pools": [
            {
                "name": "fedora_alderaan",
                "state": "present",
                "type": "lvm",
                "volumes": [
                    {
                        "fs_create_options": "",
                        "fs_label": "test",
                        "fs_overwrite_existing": true,
                        "fs_type": "xfs",
                        "mount_check": 0,
                        "mount_device_identifier": "uuid",
                        "mount_options": "defaults",
                        "mount_passno": 0,
                        "mount_point": "/mnt/test",
                        "name": "test",
                        "pool": "fedora_alderaan",
                        "size": "1G",
                        "state": "present",
                        "type": "lvm"
                    }
                ]
            }
        ]
    },
    "ansible_index_var": "idx",
    "ansible_loop_var": "pool",
    "changed": false,
    "idx": 0,
    "pool": {
        "name": "fedora_alderaan",
        "state": "present",
        "type": "lvm",
        "volumes": [
            {
                "fs_label": "test",
                "mount_point": "/mnt/test",
                "name": "test",
                "size": "1G"
            }
        ]
    }
}

TASK [linux-system-roles.storage : Apply defaults to pools and volumes [6/6]] **********************************************
task path: /usr/share/ansible/roles/linux-system-roles.storage/tasks/main-blivet.yml:67

TASK [linux-system-roles.storage : debug] **********************************************************************************
task path: /usr/share/ansible/roles/linux-system-roles.storage/tasks/main-blivet.yml:79
ok: [alderaan] => {
    "_storage_pools": [
        {
            "name": "fedora_alderaan",
            "state": "present",
            "type": "lvm",
            "volumes": [
                {
                    "fs_create_options": "",
                    "fs_label": "test",
                    "fs_overwrite_existing": true,
                    "fs_type": "xfs",
                    "mount_check": 0,
                    "mount_device_identifier": "uuid",
                    "mount_options": "defaults",
                    "mount_passno": 0,
                    "mount_point": "/mnt/test",
                    "name": "test",
                    "pool": "fedora_alderaan",
                    "size": "1G",
                    "state": "present",
                    "type": "lvm"
                }
            ]
        }
    ]
}

TASK [linux-system-roles.storage : debug] **********************************************************************************
task path: /usr/share/ansible/roles/linux-system-roles.storage/tasks/main-blivet.yml:82
ok: [alderaan] => {
    "_storage_volumes": []
}

TASK [linux-system-roles.storage : manage the pools and volumes to match the specified state] ******************************
task path: /usr/share/ansible/roles/linux-system-roles.storage/tasks/main-blivet.yml:85
<alderaan> ESTABLISH SSH CONNECTION FOR USER: root
<alderaan> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o 'IdentityFile="/home/tbowling/.ssh/id_rsa_demo"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="root"' -o ConnectTimeout=10 -o ControlPath=/root/.ansible/cp/01d6e12cb9 alderaan '/bin/sh -c '"'"'echo ~root && sleep 0'"'"''
<alderaan> (0, b'/root\n', b'')
<alderaan> ESTABLISH SSH CONNECTION FOR USER: root
<alderaan> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o 'IdentityFile="/home/tbowling/.ssh/id_rsa_demo"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="root"' -o ConnectTimeout=10 -o ControlPath=/root/.ansible/cp/01d6e12cb9 alderaan '/bin/sh -c '"'"'( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir /root/.ansible/tmp/ansible-tmp-1590778707.7451808-467077-127951557497981 && echo ansible-tmp-1590778707.7451808-467077-127951557497981="` echo /root/.ansible/tmp/ansible-tmp-1590778707.7451808-467077-127951557497981 `" ) && sleep 0'"'"''
<alderaan> (0, b'ansible-tmp-1590778707.7451808-467077-127951557497981=/root/.ansible/tmp/ansible-tmp-1590778707.7451808-467077-127951557497981\n', b'')
Using module file /usr/share/ansible/roles/linux-system-roles.storage/library/blivet.py
<alderaan> PUT /root/.ansible/tmp/ansible-local-466749o064q3yn/tmpfp4vay0f TO /root/.ansible/tmp/ansible-tmp-1590778707.7451808-467077-127951557497981/AnsiballZ_blivet.py
<alderaan> SSH: EXEC sftp -b - -C -o ControlMaster=auto -o ControlPersist=60s -o 'IdentityFile="/home/tbowling/.ssh/id_rsa_demo"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="root"' -o ConnectTimeout=10 -o ControlPath=/root/.ansible/cp/01d6e12cb9 '[alderaan]'
<alderaan> (0, b'sftp> put /root/.ansible/tmp/ansible-local-466749o064q3yn/tmpfp4vay0f /root/.ansible/tmp/ansible-tmp-1590778707.7451808-467077-127951557497981/AnsiballZ_blivet.py\n', b'')
<alderaan> ESTABLISH SSH CONNECTION FOR USER: root
<alderaan> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o 'IdentityFile="/home/tbowling/.ssh/id_rsa_demo"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="root"' -o ConnectTimeout=10 -o ControlPath=/root/.ansible/cp/01d6e12cb9 alderaan '/bin/sh -c '"'"'chmod u+x /root/.ansible/tmp/ansible-tmp-1590778707.7451808-467077-127951557497981/ /root/.ansible/tmp/ansible-tmp-1590778707.7451808-467077-127951557497981/AnsiballZ_blivet.py && sleep 0'"'"''
<alderaan> (0, b'', b'')
<alderaan> ESTABLISH SSH CONNECTION FOR USER: root
<alderaan> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o 'IdentityFile="/home/tbowling/.ssh/id_rsa_demo"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="root"' -o ConnectTimeout=10 -o ControlPath=/root/.ansible/cp/01d6e12cb9 -tt alderaan '/bin/sh -c '"'"'/usr/bin/python3 /root/.ansible/tmp/ansible-tmp-1590778707.7451808-467077-127951557497981/AnsiballZ_blivet.py && sleep 0'"'"''
<alderaan> (1, b'Traceback (most recent call last):\r\n  File "/root/.ansible/tmp/ansible-tmp-1590778707.7451808-467077-127951557497981/AnsiballZ_blivet.py", line 102, in <module>\r\n    _ansiballz_main()\r\n  File "/root/.ansible/tmp/ansible-tmp-1590778707.7451808-467077-127951557497981/AnsiballZ_blivet.py", line 94, in _ansiballz_main\r\n    invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)\r\n  File "/root/.ansible/tmp/ansible-tmp-1590778707.7451808-467077-127951557497981/AnsiballZ_blivet.py", line 40, in invoke_module\r\n    runpy.run_module(mod_name=\'ansible.modules.blivet\', init_globals=None, run_name=\'__main__\', alter_sys=True)\r\n  File "/usr/lib64/python3.7/runpy.py", line 205, in run_module\r\n    return _run_module_code(code, init_globals, run_name, mod_spec)\r\n  File "/usr/lib64/python3.7/runpy.py", line 96, in _run_module_code\r\n    mod_name, mod_spec, pkg_name, script_name)\r\n  File "/usr/lib64/python3.7/runpy.py", line 85, in _run_code\r\n    exec(code, run_globals)\r\n  File "/tmp/ansible_blivet_payload_6a_40ky_/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 730, in <module>\r\n  File "/tmp/ansible_blivet_payload_6a_40ky_/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 727, in main\r\n  File "/tmp/ansible_blivet_payload_6a_40ky_/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 692, in run_module\r\n  File "/tmp/ansible_blivet_payload_6a_40ky_/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 539, in manage_pool\r\n  File "/tmp/ansible_blivet_payload_6a_40ky_/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 457, in manage\r\n  File "/tmp/ansible_blivet_payload_6a_40ky_/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 381, in _look_up_disks\r\nKeyError: \'disks\'\r\n', b'Shared connection to alderaan closed.\r\n')
<alderaan> Failed to connect to the host via ssh: Shared connection to alderaan closed.
<alderaan> ESTABLISH SSH CONNECTION FOR USER: root
<alderaan> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o 'IdentityFile="/home/tbowling/.ssh/id_rsa_demo"' -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="root"' -o ConnectTimeout=10 -o ControlPath=/root/.ansible/cp/01d6e12cb9 alderaan '/bin/sh -c '"'"'rm -f -r /root/.ansible/tmp/ansible-tmp-1590778707.7451808-467077-127951557497981/ > /dev/null 2>&1 && sleep 0'"'"''
<alderaan> (0, b'', b'')
The full traceback is:
Traceback (most recent call last):
  File "/root/.ansible/tmp/ansible-tmp-1590778707.7451808-467077-127951557497981/AnsiballZ_blivet.py", line 102, in <module>
    _ansiballz_main()
  File "/root/.ansible/tmp/ansible-tmp-1590778707.7451808-467077-127951557497981/AnsiballZ_blivet.py", line 94, in _ansiballz_main
    invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)
  File "/root/.ansible/tmp/ansible-tmp-1590778707.7451808-467077-127951557497981/AnsiballZ_blivet.py", line 40, in invoke_module
    runpy.run_module(mod_name='ansible.modules.blivet', init_globals=None, run_name='__main__', alter_sys=True)
  File "/usr/lib64/python3.7/runpy.py", line 205, in run_module
    return _run_module_code(code, init_globals, run_name, mod_spec)
  File "/usr/lib64/python3.7/runpy.py", line 96, in _run_module_code
    mod_name, mod_spec, pkg_name, script_name)
  File "/usr/lib64/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "/tmp/ansible_blivet_payload_6a_40ky_/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 730, in <module>
  File "/tmp/ansible_blivet_payload_6a_40ky_/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 727, in main
  File "/tmp/ansible_blivet_payload_6a_40ky_/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 692, in run_module
  File "/tmp/ansible_blivet_payload_6a_40ky_/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 539, in manage_pool
  File "/tmp/ansible_blivet_payload_6a_40ky_/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 457, in manage
  File "/tmp/ansible_blivet_payload_6a_40ky_/ansible_blivet_payload.zip/ansible/modules/blivet.py", line 381, in _look_up_disks
KeyError: 'disks'
fatal: [alderaan]: FAILED! => {
    "changed": false,
    "module_stderr": "Shared connection to alderaan closed.\r\n",
    "module_stdout": "Traceback (most recent call last):\r\n  File \"/root/.ansible/tmp/ansible-tmp-1590778707.7451808-467077-127951557497981/AnsiballZ_blivet.py\", line 102, in <module>\r\n    _ansiballz_main()\r\n  File \"/root/.ansible/tmp/ansible-tmp-1590778707.7451808-467077-127951557497981/AnsiballZ_blivet.py\", line 94, in _ansiballz_main\r\n    invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)\r\n  File \"/root/.ansible/tmp/ansible-tmp-1590778707.7451808-467077-127951557497981/AnsiballZ_blivet.py\", line 40, in invoke_module\r\n    runpy.run_module(mod_name='ansible.modules.blivet', init_globals=None, run_name='__main__', alter_sys=True)\r\n  File \"/usr/lib64/python3.7/runpy.py\", line 205, in run_module\r\n    return _run_module_code(code, init_globals, run_name, mod_spec)\r\n  File \"/usr/lib64/python3.7/runpy.py\", line 96, in _run_module_code\r\n    mod_name, mod_spec, pkg_name, script_name)\r\n  File \"/usr/lib64/python3.7/runpy.py\", line 85, in _run_code\r\n    exec(code, run_globals)\r\n  File \"/tmp/ansible_blivet_payload_6a_40ky_/ansible_blivet_payload.zip/ansible/modules/blivet.py\", line 730, in <module>\r\n  File \"/tmp/ansible_blivet_payload_6a_40ky_/ansible_blivet_payload.zip/ansible/modules/blivet.py\", line 727, in main\r\n  File \"/tmp/ansible_blivet_payload_6a_40ky_/ansible_blivet_payload.zip/ansible/modules/blivet.py\", line 692, in run_module\r\n  File \"/tmp/ansible_blivet_payload_6a_40ky_/ansible_blivet_payload.zip/ansible/modules/blivet.py\", line 539, in manage_pool\r\n  File \"/tmp/ansible_blivet_payload_6a_40ky_/ansible_blivet_payload.zip/ansible/modules/blivet.py\", line 457, in manage\r\n  File \"/tmp/ansible_blivet_payload_6a_40ky_/ansible_blivet_payload.zip/ansible/modules/blivet.py\", line 381, in _look_up_disks\r\nKeyError: 'disks'\r\n",
    "msg": "MODULE FAILURE\nSee stdout/stderr for the exact error",
    "rc": 1
}

PLAY RECAP *****************************************************************************************************************
alderaan                   : ok=13   changed=0    unreachable=0    failed=1    skipped=2    rescued=0    ignored=0   
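
The traceback above shows the role's blivet module failing in _look_up_disks() with KeyError: 'disks', because the pool spec built from the playbook contains no disks entry at all. A minimal sketch of that failure mechanism (illustrative Python only, not the role's actual source) looks like this:

# Illustrative sketch of the failure mode; not the actual blivet.py code.
# When the playbook omits 'disks', the pool dict has no such key, so a plain
# pool['disks'] lookup raises the KeyError seen in the traceback.
def look_up_disks(pool, all_devices):
    # Resolve the pool's member disks from the names given in the spec.
    return [dev for dev in all_devices if dev in pool['disks']]

pool_spec = {"name": "fedora_alderaan", "state": "present",
             "volumes": [{"name": "test", "size": "1G"}]}  # no 'disks' key

try:
    look_up_disks(pool_spec, ["sda", "sda1", "sda2", "sda3"])
except KeyError as err:
    print("KeyError:", err)  # -> KeyError: 'disks'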

@tabowling (Contributor, Author)

It does work successfully if I add disks: ['sda3'] to the playbook:

- hosts: all
  remote_user: root

#  become: yes
#  become_method: sudo
#  become_user: root

  vars:

  tasks:
    - name: create some test storage
      include_role:
        name: linux-system-roles.storage
      vars:
        storage_pools:
          - name: fedora_alderaan
            disks: ['sda3']
            # type: lvm
            state: present
            volumes:
              - name: test
                size: "1G"    
                # type: lvm
                # fs_type: xfs
                fs_label: "test"
                mount_point: '/mnt/test'

Results after a successful run:

[root@alderaan ~]# lvs
  LV   VG              Attr       LSize  Pool Origin Data%  Meta%  Move Log Cpy%Sync Convert
  root fedora_alderaan -wi-ao---- 15.00g                                                    
  swap fedora_alderaan -wi-ao---- <3.45g                                                    

[root@alderaan ~]# df -h
Filesystem                        Size  Used Avail Use% Mounted on
devtmpfs                          3.7G     0  3.7G   0% /dev
tmpfs                             3.7G     0  3.7G   0% /dev/shm
tmpfs                             3.7G  1.4M  3.7G   1% /run
/dev/mapper/fedora_alderaan-root   15G  8.0G  7.1G  54% /
tmpfs                             3.7G  588K  3.7G   1% /tmp
/dev/sda2                        1014M  272M  743M  27% /boot
/dev/sda1                         599M  8.3M  591M   2% /boot/efi
tmpfs                             743M     0  743M   0% /run/user/0
/dev/mapper/fedora_alderaan-test 1014M   40M  975M   4% /mnt/test

@dwlehman (Collaborator) commented Jun 1, 2020

So the request is to interpret a pool with no disks as a lookup by whatever information is provided? That's pretty reasonable, but understand that it will be unusable when the pool isn't pre-existing.
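
As a rough sketch of that idea (the function and variable names below are hypothetical, not the role's actual implementation), the disk lookup could fall back to the member PVs of an already-existing VG with the same name, which is exactly what the gathered facts above report (ansible_lvm shows /dev/sda3 backing fedora_alderaan), and fail only when the pool does not exist and no disks were given:

# Hypothetical sketch of the requested behaviour; names are illustrative only.
def resolve_disks(pool_spec, existing_vgs):
    # existing_vgs maps VG name -> list of member PV device names,
    # e.g. {"fedora_alderaan": ["sda3"]} as reported by the gathered facts.
    disks = pool_spec.get("disks", [])
    if disks:
        return disks
    if pool_spec["name"] in existing_vgs:
        # The pool already exists: reuse its current member disks.
        return existing_vgs[pool_spec["name"]]
    raise ValueError("pool '%s' does not exist and no disks were specified"
                     % pool_spec["name"])

print(resolve_disks({"name": "fedora_alderaan"}, {"fedora_alderaan": ["sda3"]}))
# -> ['sda3']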

@dwlehman (Collaborator) commented Jul 9, 2020

#59 should enable this as part of improved handling for missing parameters (fbb0a9e). It includes a test for this specific case.
