ovirt_disk upload with wait set to false always fails #68487

Closed
jan-zmeskal opened this issue Mar 26, 2020 · 3 comments
Labels: affects_2.9, bug, cloud, module, ovirt, support:community, traceback

SUMMARY

When you upload a disk to RHV using the ovirt_disk module, the task always fails with an error if you set the wait parameter to false. However, the disk is actually uploaded successfully.

ISSUE TYPE
  • Bug Report
COMPONENT NAME

ovirt_disk

ANSIBLE VERSION
ansible 2.9.5
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible
  python version = 2.7.5 (default, Sep 26 2019, 13:23:47) [GCC 4.8.5 20150623 (Red Hat 4.8.5-39)]
CONFIGURATION

OS / ENVIRONMENT

RHEL 7.8
rhvm-4.3.9.0-0.1.el7.noarch

STEPS TO REPRODUCE

First, run this playbook; it will succeed.

- hosts: localhost
  gather_facts: no
  tasks:
    - ovirt_auth:
        url: https://<engine_fqdn>/ovirt-engine/api
        username: admin@internal
        password: '<password>'
        insecure: yes

    - ovirt_disk:
        auth: "{{ ovirt_auth }}"
        name: rhcos_uploaded
        size: 1GiB
        format: raw
        interface: virtio_scsi
        storage_domain: nfs_0
        bootable: yes
        sparse: no
        sparsify: no
        timeout: 3600
        upload_image_path: /root/Fedora-Cloud-Base-31-1.9.x86_64.raw.xz
        wait: yes

Now run the same playbook again, with only the wait parameter changed to no (see the modified task below). The playbook will fail with the error described under ACTUAL RESULTS.
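
For clarity, the only change in the second run is the wait parameter on the ovirt_disk task; a minimal sketch of the modified task, using the same placeholder values as above:

    - ovirt_disk:
        auth: "{{ ovirt_auth }}"
        name: rhcos_uploaded
        size: 1GiB
        format: raw
        interface: virtio_scsi
        storage_domain: nfs_0
        bootable: yes
        sparse: no
        sparsify: no
        timeout: 3600
        upload_image_path: /root/Fedora-Cloud-Base-31-1.9.x86_64.raw.xz
        wait: no   # the only change from the first, successful run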

EXPECTED RESULTS

The playbook should succeed even though it does not yet know whether the disk will actually be uploaded to RHV. Once the upload has been handed over to RHV, the playbook does not need to care about it any more if we choose not to wait for the result of the upload operation.
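
A possible interim workaround (my suggestion, not part of the original report) is to keep wait: yes inside the module but launch the task asynchronously with Ansible's async/poll, so the play itself does not block on the upload; the result can be collected later with async_status if needed:

    # Sketch under the assumption that fire-and-forget via async is acceptable:
    # the module still waits internally (wait: yes), but the play moves on
    # immediately because poll is 0.
    - ovirt_disk:
        auth: "{{ ovirt_auth }}"
        name: rhcos_uploaded
        size: 1GiB
        format: raw
        interface: virtio_scsi
        storage_domain: nfs_0
        upload_image_path: /root/Fedora-Cloud-Base-31-1.9.x86_64.raw.xz
        wait: yes
      async: 3600          # allow up to an hour for the upload job
      poll: 0              # do not block the play
      register: upload_job

    # Optionally, check on the upload later in the play:
    - async_status:
        jid: "{{ upload_job.ansible_job_id }}"
      register: upload_result
      until: upload_result.finished
      retries: 120
      delay: 30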

ACTUAL RESULTS
ansible-playbook 2.9.5
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 2.7.5 (default, Sep 26 2019, 13:23:47) [GCC 4.8.5 20150623 (Red Hat 4.8.5-39)]
Using /etc/ansible/ansible.cfg as config file
host_list declined parsing /etc/ansible/hosts as it did not pass its verify_file() method
script declined parsing /etc/ansible/hosts as it did not pass its verify_file() method
auto declined parsing /etc/ansible/hosts as it did not pass its verify_file() method
Parsed /etc/ansible/hosts inventory source with ini plugin
[WARNING]: provided hosts list is empty, only localhost is available. Note that the implicit localhost does not match 'all'

PLAYBOOK: upload.yaml ************************************************************************************************************************************************************************
1 plays in upload.yaml

PLAY [localhost] *****************************************************************************************************************************************************************************
META: ran handlers

TASK [ovirt_auth] ****************************************************************************************************************************************************************************
task path: /root/upload.yaml:4
<127.0.0.1> ESTABLISH LOCAL CONNECTION FOR USER: root
<127.0.0.1> EXEC /bin/sh -c 'echo ~root && sleep 0'
<127.0.0.1> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp/ansible-tmp-1585216003.07-185778057869515 `" && echo ansible-tmp-1585216003.07-185778057869515="` echo /root/.ansible/tmp/ansible-tmp-1585216003.07-185778057869515 `" ) && sleep 0'
Using module file /usr/lib/python2.7/site-packages/ansible/modules/cloud/ovirt/ovirt_auth.py
<127.0.0.1> PUT /root/.ansible/tmp/ansible-local-6958pIDYnc/tmpPeM7aQ TO /root/.ansible/tmp/ansible-tmp-1585216003.07-185778057869515/AnsiballZ_ovirt_auth.py
<127.0.0.1> EXEC /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1585216003.07-185778057869515/ /root/.ansible/tmp/ansible-tmp-1585216003.07-185778057869515/AnsiballZ_ovirt_auth.py && sleep 0'
<127.0.0.1> EXEC /bin/sh -c '/usr/bin/python2 /root/.ansible/tmp/ansible-tmp-1585216003.07-185778057869515/AnsiballZ_ovirt_auth.py && sleep 0'
<127.0.0.1> EXEC /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1585216003.07-185778057869515/ > /dev/null 2>&1 && sleep 0'
ok: [localhost] => {
    "ansible_facts": {
        "ovirt_auth": {
            "ca_file": null, 
            "compress": true, 
            "headers": null, 
            "insecure": true, 
            "kerberos": false, 
            "timeout": 0, 
            "token": "78HN8BKcOAaOIzuSuVRMisPkHHF6WMayf57mobuf_2U404HPcjZkBdZRgJUTHKA_Z639mF8fs-zudW40a1MorA", 
            "url": "https://<engine_fqdn>/ovirt-engine/api"
        }
    }, 
    "changed": false, 
    "invocation": {
        "module_args": {
            "ca_file": null, 
            "compress": true, 
            "headers": null, 
            "hostname": null, 
            "insecure": true, 
            "kerberos": false, 
            "ovirt_auth": null, 
            "password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", 
            "state": "present", 
            "timeout": 0, 
            "token": null, 
            "url": "https://<engine_fqdn>/ovirt-engine/api", 
            "username": "admin@internal"
        }
    }
}

TASK [ovirt_disk] ****************************************************************************************************************************************************************************
task path: /root/upload.yaml:10
<127.0.0.1> ESTABLISH LOCAL CONNECTION FOR USER: root
<127.0.0.1> EXEC /bin/sh -c 'echo ~root && sleep 0'
<127.0.0.1> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp/ansible-tmp-1585216008.87-27215033466631 `" && echo ansible-tmp-1585216008.87-27215033466631="` echo /root/.ansible/tmp/ansible-tmp-1585216008.87-27215033466631 `" ) && sleep 0'
Using module file /usr/lib/python2.7/site-packages/ansible/modules/cloud/ovirt/ovirt_disk.py
<127.0.0.1> PUT /root/.ansible/tmp/ansible-local-6958pIDYnc/tmpfOSn3I TO /root/.ansible/tmp/ansible-tmp-1585216008.87-27215033466631/AnsiballZ_ovirt_disk.py
<127.0.0.1> EXEC /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1585216008.87-27215033466631/ /root/.ansible/tmp/ansible-tmp-1585216008.87-27215033466631/AnsiballZ_ovirt_disk.py && sleep 0'
<127.0.0.1> EXEC /bin/sh -c '/usr/bin/python2 /root/.ansible/tmp/ansible-tmp-1585216008.87-27215033466631/AnsiballZ_ovirt_disk.py && sleep 0'
<127.0.0.1> EXEC /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1585216008.87-27215033466631/ > /dev/null 2>&1 && sleep 0'
The full traceback is:
Traceback (most recent call last):
  File "/tmp/ansible_ovirt_disk_payload_lTZEhw/ansible_ovirt_disk_payload.zip/ansible/modules/cloud/ovirt/ovirt_disk.py", line 743, in main
  File "/tmp/ansible_ovirt_disk_payload_lTZEhw/ansible_ovirt_disk_payload.zip/ansible/modules/cloud/ovirt/ovirt_disk.py", line 480, in upload_disk_image
  File "/tmp/ansible_ovirt_disk_payload_lTZEhw/ansible_ovirt_disk_payload.zip/ansible/modules/cloud/ovirt/ovirt_disk.py", line 363, in transfer
  File "/usr/lib64/python2.7/site-packages/ovirtsdk4/services.py", line 13081, in add
    return self._internal_add(image_transfer, headers, query, wait)
  File "/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py", line 232, in _internal_add
    return future.wait() if wait else future
  File "/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py", line 55, in wait
    return self._code(response)
  File "/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py", line 229, in callback
    self._check_fault(response)
  File "/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py", line 132, in _check_fault
    self._raise_error(response, body)
  File "/usr/lib64/python2.7/site-packages/ovirtsdk4/service.py", line 118, in _raise_error
    raise error
Error: Fault reason is "Operation Failed". Fault detail is "[Cannot transfer Virtual Disk: The following disks are locked: rhcos_uploaded. Please try again in a few minutes.]". HTTP response code is 409.
fatal: [localhost]: FAILED! => {
    "changed": false, 
    "invocation": {
        "module_args": {
            "activate": null, 
            "auth": {
                "ca_file": null, 
                "compress": true, 
                "headers": null, 
                "insecure": true, 
                "kerberos": false, 
                "timeout": 0, 
                "token": "78HN8BKcOAaOIzuSuVRMisPkHHF6WMayf57mobuf_2U404HPcjZkBdZRgJUTHKA_Z639mF8fs-zudW40a1MorA", 
                "url": "https://<engine_fqdn>/ovirt-engine/api"
            }, 
            "bootable": true, 
            "content_type": "data", 
            "description": null, 
            "download_image_path": null, 
            "fetch_nested": false, 
            "force": false, 
            "format": "raw", 
            "host": null, 
            "id": "6156da4e-6995-44cd-9261-eb0ba7f4b91c", 
            "image_provider": null, 
            "interface": "virtio_scsi", 
            "logical_unit": null, 
            "name": "rhcos_uploaded", 
            "nested_attributes": [], 
            "openstack_volume_type": null, 
            "poll_interval": 3, 
            "profile": null, 
            "quota_id": null, 
            "shareable": null, 
            "size": "1GiB", 
            "sparse": false, 
            "sparsify": false, 
            "state": "present", 
            "storage_domain": "nfs_0", 
            "storage_domains": null, 
            "timeout": 3600, 
            "upload_image_path": "/root/Fedora-Cloud-Base-31-1.9.x86_64.raw.xz", 
            "vm_id": null, 
            "vm_name": null, 
            "wait": false, 
            "wipe_after_delete": null
        }
    }, 
    "msg": "Fault reason is \"Operation Failed\". Fault detail is \"[Cannot transfer Virtual Disk: The following disks are locked: rhcos_uploaded. Please try again in a few minutes.]\". HTTP response code is 409."
}

PLAY RECAP ***********************************************************************************************************************************************************************************
localhost                  : ok=1    changed=0    unreachable=0    failed=1    skipped=0    rescued=0    ignored=0   

ansibot commented Mar 26, 2020

Files identified in the description:

If these files are incorrect, please update the component name section of the description or use the !component bot command.

click here for bot help

ansibot added the affects_2.9, bug, cloud, module, needs_triage, ovirt, support:community, and traceback labels on Mar 26, 2020

mnecas commented Apr 17, 2020

We will continue working on this issue in the collection.


mnecas commented Apr 17, 2020

close_me

ansibot closed this as completed on Apr 24, 2020
sivel removed the needs_triage label on Apr 29, 2020
ansible locked and limited conversation to collaborators on May 22, 2020