Failing to create a LV thinpool #43

Open
papanito opened this issue Dec 10, 2018 · 1 comment

Comments

@papanito (Contributor)

I have 3 bare-metal servers, each with 3 disks configured as a RAID array. The array is split into 3 partitions, and I want to use one of them for my Gluster cluster.

I've tried to configure a pool as follows:

---
- name: Create a GlusterFS brick on the servers
  remote_user: root
  hosts: gluster
  gather_facts: false
  vars:
    gluster_infra_disktype: JBOD
    # Data alignment; for JBOD the default is 256K if not provided.
    gluster_infra_dalign: 256K
    # VDO creation
    #gluster_infra_vdo:
      #- { name: 'hc_vdo_1', device: '/dev/md3' }
    gluster_infra_volume_groups:
      - { vgname: 'vg_md3', pvname: '/dev/md3' }
    # https://de.slideshare.net/GlusterCommunity/data-reduction-for-gluster-with-vdo
    # thinpoolname is optional; if not provided, `vgname` followed by _thinpool is
    # used as the name. poolmetadatasize is optional; the default of 16G is used.
    gluster_infra_thinpools:
      - { vgname: 'vg_md3', thinpoolname: 'vg_md3_thinpool', thinpoolsize: '1.6T', poolmetadatasize: '2G' }
    # Thin volumes for the brick. `thinpoolname` is optional; if omitted, `vgname` followed by _thinpool is used.
    gluster_infra_lv_logicalvols:
      - { vgname: 'vg_md3', thinpool: 'vg_md3_thinpool', vname: 'vg_md3_thinlv', lvsize: '1.8T' }
    gluster_infra_mount_devices:
      - { path: '/data/gluster/brick1', vgname: 'vg_md3', lvname: 'vg_md3_thinlv' }

  roles:
  - gluster.infra

Unfortunately this fails with:

TASK [gluster.infra/roles/backend_setup : Create a LV thinpool] *******************************************************************************************
failed: [prds0001] (item={'vgname': 'vg_md3', 'thinpoolname': 'vg_md3_thinpool', 'thinpoolsize': '1.6T', 'poolmetadatasize': '2G'}) => {"changed": false, "item": {"poolmetadatasize": "2G", "thinpoolname": "vg_md3_thinpool", "thinpoolsize": "1.6T", "vgname": "vg_md3"}, "module_stderr": "Shared connection to prds0001 closed.\r\n", "module_stdout": "Traceback (most recent call last):\r\n  File \"/root/.ansible/tmp/ansible-tmp-1544421225.6793776-67207272652594/AnsiballZ_lvol.py\", line 113, in <module>\r\n    _ansiballz_main()\r\n  File \"/root/.ansible/tmp/ansible-tmp-1544421225.6793776-67207272652594/AnsiballZ_lvol.py\", line 105, in _ansiballz_main\r\n    invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)\r\n  File \"/root/.ansible/tmp/ansible-tmp-1544421225.6793776-67207272652594/AnsiballZ_lvol.py\", line 48, in invoke_module\r\n    imp.load_module('__main__', mod, module, MOD_DESC)\r\n  File \"/usr/lib/python3.5/imp.py\", line 234, in load_module\r\n    return load_source(name, filename, file)\r\n  File \"/usr/lib/python3.5/imp.py\", line 170, in load_source\r\n    module = _exec(spec, sys.modules[name])\r\n  File \"<frozen importlib._bootstrap>\", line 626, in _exec\r\n  File \"<frozen importlib._bootstrap_external>\", line 673, in exec_module\r\n  File \"<frozen importlib._bootstrap>\", line 222, in _call_with_frames_removed\r\n  File \"/tmp/ansible_lvol_payload_yyp74adb/__main__.py\", line 557, in <module>\r\n  File \"/tmp/ansible_lvol_payload_yyp74adb/__main__.py\", line 510, in main\r\nValueError: invalid literal for int() with base 10: '1.6'\r\n", "msg": "MODULE FAILURE\nSee stdout/stderr for the exact error", "rc": 1}
@sac (Member) commented May 8, 2019

@papanito firstly, I'm sorry I missed this.
I think this could be a limitation of the lvol module; it expects only integer sizes. Can you please provide something like 163G instead and check?
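
For reference, a minimal sketch of that workaround, assuming the fractional "T" values are the only problem: express the sizes as whole numbers of gigabytes (1.6T is roughly 1638G and 1.8T roughly 1843G; the exact figures here are illustrative, not a confirmed fix) so the module's integer parsing no longer fails. The variable names are the ones from the playbook above.

    gluster_infra_thinpools:
      # whole-number size instead of '1.6T' (illustrative value)
      - { vgname: 'vg_md3', thinpoolname: 'vg_md3_thinpool', thinpoolsize: '1638G', poolmetadatasize: '2G' }
    gluster_infra_lv_logicalvols:
      # whole-number size instead of '1.8T' (illustrative value)
      - { vgname: 'vg_md3', thinpool: 'vg_md3_thinpool', vname: 'vg_md3_thinlv', lvsize: '1843G' }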
