
Proxmox 7 - VM TrueNAS Core vs Scale #108

Closed
sonapsent opened this issue Nov 17, 2021 · 6 comments
Assignees
Labels
Diagnose (Diagnose/Discuss an issue or concern), wontfix (This will not be worked on)

Comments

@sonapsent

I have two Proxmox servers and have been using your deb install for a few months now. All TrueNAS installs are VMs on each Proxmox host with HBA passthrough. On the new second server I thought I would try the Scale RC1, since being Linux-based it should behave better as a VM and it supports the QEMU guest agent. Below are the errors I received while cloning a random VM to the new Scale install with ZFS over iSCSI. I shut down the Scale VM and created a second TrueNAS Core VM; after importing the ZFS pool and settings, it worked with no errors. For now I will continue using TrueNAS Core. Let me know if you need more info or would like me to try something, as I can switch back fairly easily.

Notes:

  • Blocksize warning appears even though the ZFS pool has ashift 12
  • The first disk seems to have been created as pool/dataset/zvol when viewed on TrueNAS Scale, but is not found, and the task fails before cloning the second disk

Output:
create full clone of drive sata0 (zScuzzy:vm-106-disk-0)
Warning: volblocksize (4096) is less than the default minimum block size (8192).
To reduce wasted space a volblocksize of 16384 is recommended.
iscsiadm: No session found.
iscsiadm: No session found.
transferred 0.0 B of 52.0 MiB (0.00%)
...
transferred 52.0 MiB of 52.0 MiB (100.00%)
cannot open 'OmegaPool-oscuzzy/vm-113-disk-0': dataset does not exist
create full clone of drive sata1 (zScuzzy:vm-106-disk-1)
cannot open 'OmegaPool-oscuzzy': no such pool
Could not find lu_name for zvol vm-113-disk-0 at /usr/share/perl5/PVE/Storage/ZFSPlugin.pm line 118.
TASK ERROR: clone failed: command '/usr/bin/ssh -o 'BatchMode=yes' -i /etc/pve/priv/zfs/192.168.XX.XX_id_rsa root@192.168.XX.XX zfs create -s -b 4k -V 8388608k OmegaPool-oscuzzy/vm-113-disk-0' failed: exit code 1

@TheGrandWazoo TheGrandWazoo self-assigned this Nov 21, 2021
@TheGrandWazoo TheGrandWazoo added the Diagnose Diagnose/Discuss a issue or concern label Nov 21, 2021
@TheGrandWazoo
Owner

Please check issue #87 to see if this may be your issue.

@sonapsent
Author

sonapsent commented Nov 21, 2021

No, I do not think this is the same issue. The SSL redirect is unchecked in TrueNAS Scale, and all my settings in Proxmox are the same as when I use the same pool from a different TrueNAS Core VM, which works fine.

I noticed the generated command below is wrong:
'/usr/bin/ssh -o 'BatchMode=yes' -i /etc/pve/priv/zfs/192.168.1.50_id_rsa root@192.168.1.50 zfs create -s -b 4k -V 8388608k OmegaPool-oscuzzy/vm-113-disk-0'

This should have a "/", not a "-": OmegaPool/oscuzzy/vm-113-disk-0
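
The mismatch can be reproduced with a small sketch. The `mangle_pool` helper below is hypothetical, modeled on the dash substitution seen in the failing command; it is not the plugin's actual code:

```python
# Hypothetical reproduction of the pool-name mangling (not the plugin's real code).
# The failing clone command shows "/" in the configured pool name replaced with "-",
# producing a dataset path that does not exist on the TrueNAS side.

def mangle_pool(pool: str) -> str:
    """Replace slashes with dashes, as seen in the failing zfs create command."""
    return pool.replace("/", "-")

configured = "OmegaPool/oscuzzy"  # pool value from storage.cfg
zvol = "vm-113-disk-0"

generated = f"{mangle_pool(configured)}/{zvol}"  # what the clone task ran
expected = f"{configured}/{zvol}"                # what actually exists

print(generated)  # OmegaPool-oscuzzy/vm-113-disk-0 -> "dataset does not exist"
print(expected)   # OmegaPool/oscuzzy/vm-113-disk-0
```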

Is anyone else successfully running Proxmox 7.0 and TrueNAS Scale? I realize Scale is only at the release-candidate stage.

@TheGrandWazoo
Owner

TheGrandWazoo commented Nov 22, 2021

I believe so; I have a couple of users who have it running. I have not yet. I had a bad HDD that I need to replace before I can get Scale up and running. Someone had to use two targets between colons (e.g. iqn.org.something:target1:another-target). I believe it is in one of the issues.

Could you provide the syslog output from this? I remember having to do something with a character on TrueNAS SCALE in the very beginning.

@TheGrandWazoo
Owner

Look at Issue #75 and why the slash (/) was turned into a dash (-).

@sonapsent
Author

Can you tell me what using two targets accomplishes? How would I set that up in TrueNAS?

Here is my storage.cfg:

zfs: oScuzzy
        blocksize 4k
        iscsiprovider freenas
        pool OmegaPool/oscuzzy
        portal 192.168.1.50
        target iqn.2005-10.org.freenas.ctl:oscuzzy
        content images
        freenas_apiv4_host 192.168.1.50
        freenas_password *removed*
        freenas_use_ssl 0
        freenas_user root
        nowritecache 0
        sparse 1
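
Note that the `pool` option above contains a slash. For reference, a stanza like this can be read into key/value pairs with a minimal sketch (illustrative only; Proxmox's own parser in pve-storage is the authority):

```python
# Minimal sketch of parsing a Proxmox storage.cfg stanza into a dict.
# Illustrative only; Proxmox has its own parser in pve-storage.

cfg = """\
zfs: oScuzzy
        blocksize 4k
        iscsiprovider freenas
        pool OmegaPool/oscuzzy
        portal 192.168.1.50
"""

lines = cfg.strip().splitlines()
storage_type, storage_id = (s.strip() for s in lines[0].split(":"))
options = dict(line.strip().split(None, 1) for line in lines[1:])

print(storage_id)         # oScuzzy
print(options["pool"])    # OmegaPool/oscuzzy -- note the slash
```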

I can successfully create a new disk for an existing VM and it shows up in TrueNAS Scale, but when I delete it in Proxmox it is not removed in TrueNAS Scale. Those logs are not terribly interesting, as the operations appear successful in Proxmox.

Here is the syslog capturing a failure when cloning VM 113 to VM 106:

Nov 21 21:11:10 pve1 pvedaemon[1314744]: <root@pam> starting task UPID:pve1:0022A76A:00EBE43D:619AFC3E:qmclone:113:root@pam:
Nov 21 21:11:11 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::run_lun_command : create_lu(/dev/zvol/OmegaPool/oscuzzy/vm-106-disk-0)
Nov 21 21:11:11 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::run_create_lu : called with (method=create_lu; param[0]=/dev/zvol/OmegaPool/oscuzzy/vm-106-disk-0)
Nov 21 21:11:11 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_get_first_available_lunid : called
Nov 21 21:11:11 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_get_targetid : called
Nov 21 21:11:11 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_iscsi_get_target : called
Nov 21 21:11:11 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_api_call : called for host '192.168.1.50'
Nov 21 21:11:11 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_api_call : successful
Nov 21 21:11:11 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_iscsi_get_target : successful
Nov 21 21:11:11 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_get_targetid : successful : 1
Nov 21 21:11:11 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_iscsi_get_target_to_extent : called
Nov 21 21:11:11 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_api_call : called for host '192.168.1.50'
Nov 21 21:11:11 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_api_call : successful
Nov 21 21:11:11 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_iscsi_get_target_to_extent : successful
Nov 21 21:11:11 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_get_first_available_lunid : 3
Nov 21 21:11:11 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::run_list_lu : called with (method=create_lu; result_value_type=name; object=/dev/zvol/OmegaPool/oscuzzy/vm-106-disk-0)
Nov 21 21:11:11 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_list_lu : called
Nov 21 21:11:11 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_iscsi_get_target : called
Nov 21 21:11:11 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_api_call : called for host '192.168.1.50'
Nov 21 21:11:11 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_api_call : successful
Nov 21 21:11:11 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_iscsi_get_target : successful
Nov 21 21:11:11 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_get_targetid : called
Nov 21 21:11:11 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_iscsi_get_target : called
Nov 21 21:11:11 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_api_call : called for host '192.168.1.50'
Nov 21 21:11:11 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_api_call : successful
Nov 21 21:11:11 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_iscsi_get_target : successful
Nov 21 21:11:11 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_get_targetid : successful : 1
Nov 21 21:11:11 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_iscsi_get_target_to_extent : called
Nov 21 21:11:11 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_api_call : called for host '192.168.1.50'
Nov 21 21:11:11 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_api_call : successful
Nov 21 21:11:11 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_iscsi_get_target_to_extent : successful
Nov 21 21:11:11 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_iscsi_get_extent : called
Nov 21 21:11:11 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_api_call : called for host '192.168.1.50'
Nov 21 21:11:11 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_api_call : successful
Nov 21 21:11:11 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_iscsi_get_extent : successful
Nov 21 21:11:11 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_list_lu : successful
Nov 21 21:11:11 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::run_list_lu : Verifing 'zvol/OmegaPool/oscuzzy/vm-201-disk-0' and '/dev/zvol/OmegaPool/oscuzzy/vm-106-disk-0'
Nov 21 21:11:11 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::run_list_lu : Verifing 'zvol/OmegaPool/vm-110-disk-0' and '/dev/zvol/OmegaPool/oscuzzy/vm-106-disk-0'
Nov 21 21:11:11 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::run_list_lu : Verifing 'zvol/OmegaPool/oscuzzy/vm-113-disk-0' and '/dev/zvol/OmegaPool/oscuzzy/vm-106-disk-0'
Nov 21 21:11:11 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::run_list_lu(/dev/zvol/OmegaPool/oscuzzy/vm-106-disk-0) : name : lun not found
Nov 21 21:11:11 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_get_targetid : called
Nov 21 21:11:11 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_iscsi_get_target : called
Nov 21 21:11:11 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_api_call : called for host '192.168.1.50'
Nov 21 21:11:11 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_api_call : successful
Nov 21 21:11:11 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_iscsi_get_target : successful
Nov 21 21:11:11 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_get_targetid : successful : 1
Nov 21 21:11:11 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_iscsi_create_extent : called with (lun_path=/dev/zvol/OmegaPool/oscuzzy/vm-106-disk-0)
Nov 21 21:11:11 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_api_call : called for host '192.168.1.50'
Nov 21 21:11:11 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_api_call : successful
Nov 21 21:11:11 pve1 pvedaemon[2271082]: FreeNAS::API::create_extent(lun_path=zvol/OmegaPool/oscuzzy/vm-106-disk-0) : successful
Nov 21 21:11:11 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_iscsi_create_target_to_extent : called with (target_id=1, extent_id=6, lun_id=3)
Nov 21 21:11:11 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_api_call : called for host '192.168.1.50'
Nov 21 21:11:11 pve1 pvestatd[4163]: command '/usr/bin/ssh -o 'BatchMode=yes' -i /etc/pve/priv/zfs/192.168.1.51_id_rsa root@192.168.1.51 zfs get -o value -Hp available,used OmegaPool/oscuzzy' failed: exit code 255
Nov 21 21:11:12 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_api_call : successful
Nov 21 21:11:12 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_iscsi_create_target_to_extent(target_id=1, extent_id=6, lun_id=3) : successful
Nov 21 21:11:12 pve1 pvedaemon[2271082]: FreeNAS::create_lu(lun_path=/dev/zvol/OmegaPool/oscuzzy/vm-106-disk-0, lun_id=3) : successful
Nov 21 21:11:12 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::run_lun_command : add_view()
Nov 21 21:11:12 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::run_lun_command : list_lu(/dev/zvol/OmegaPool-oscuzzy/vm-106-disk-0)
Nov 21 21:11:12 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::run_list_lu : called with (method=list_lu; result_value_type=name; object=/dev/zvol/OmegaPool-oscuzzy/vm-106-disk-0)
Nov 21 21:11:12 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_list_lu : called
Nov 21 21:11:12 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_iscsi_get_target : called
Nov 21 21:11:12 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_api_call : called for host '192.168.1.50'
Nov 21 21:11:12 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_api_call : successful
Nov 21 21:11:12 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_iscsi_get_target : successful
Nov 21 21:11:12 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_get_targetid : called
Nov 21 21:11:12 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_iscsi_get_target : called
Nov 21 21:11:12 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_api_call : called for host '192.168.1.50'
Nov 21 21:11:12 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_api_call : successful
Nov 21 21:11:12 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_iscsi_get_target : successful
Nov 21 21:11:12 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_get_targetid : successful : 1
Nov 21 21:11:12 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_iscsi_get_target_to_extent : called
Nov 21 21:11:12 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_api_call : called for host '192.168.1.50'
Nov 21 21:11:12 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_api_call : successful
Nov 21 21:11:12 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_iscsi_get_target_to_extent : successful
Nov 21 21:11:12 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_iscsi_get_extent : called
Nov 21 21:11:12 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_api_call : called for host '192.168.1.50'
Nov 21 21:11:12 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_api_call : successful
Nov 21 21:11:12 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_iscsi_get_extent : successful
Nov 21 21:11:12 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_list_lu : successful
Nov 21 21:11:12 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::run_list_lu : Verifing 'zvol/OmegaPool/oscuzzy/vm-201-disk-0' and '/dev/zvol/OmegaPool-oscuzzy/vm-106-disk-0'
Nov 21 21:11:12 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::run_list_lu : Verifing 'zvol/OmegaPool/vm-110-disk-0' and '/dev/zvol/OmegaPool-oscuzzy/vm-106-disk-0'
Nov 21 21:11:12 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::run_list_lu : Verifing 'zvol/OmegaPool/oscuzzy/vm-113-disk-0' and '/dev/zvol/OmegaPool-oscuzzy/vm-106-disk-0'
Nov 21 21:11:12 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::run_list_lu : Verifing 'zvol/OmegaPool/oscuzzy/vm-106-disk-0' and '/dev/zvol/OmegaPool-oscuzzy/vm-106-disk-0'
Nov 21 21:11:12 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::run_list_lu(/dev/zvol/OmegaPool-oscuzzy/vm-106-disk-0) : name : lun not found
Nov 21 21:11:12 pve1 pvedaemon[2271082]: VM 113 qmp command failed - VM 113 not running
Nov 21 21:11:13 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::run_lun_command : list_lu(/dev/zvol/OmegaPool-oscuzzy/vm-106-disk-0)
Nov 21 21:11:13 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::run_list_lu : called with (method=list_lu; result_value_type=name; object=/dev/zvol/OmegaPool-oscuzzy/vm-106-disk-0)
Nov 21 21:11:13 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_list_lu : called
Nov 21 21:11:13 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_iscsi_get_target : called
Nov 21 21:11:13 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_api_call : called for host '192.168.1.50'
Nov 21 21:11:13 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_api_call : successful
Nov 21 21:11:13 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_iscsi_get_target : successful
Nov 21 21:11:13 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_get_targetid : called
Nov 21 21:11:13 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_iscsi_get_target : called
Nov 21 21:11:13 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_api_call : called for host '192.168.1.50'
Nov 21 21:11:13 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_api_call : successful
Nov 21 21:11:13 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_iscsi_get_target : successful
Nov 21 21:11:13 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_get_targetid : successful : 1
Nov 21 21:11:13 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_iscsi_get_target_to_extent : called
Nov 21 21:11:13 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_api_call : called for host '192.168.1.50'
Nov 21 21:11:13 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_api_call : successful
Nov 21 21:11:13 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_iscsi_get_target_to_extent : successful
Nov 21 21:11:13 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_iscsi_get_extent : called
Nov 21 21:11:13 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_api_call : called for host '192.168.1.50'
Nov 21 21:11:13 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_api_call : successful
Nov 21 21:11:13 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_iscsi_get_extent : successful
Nov 21 21:11:13 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::freenas_list_lu : successful
Nov 21 21:11:13 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::run_list_lu : Verifing 'zvol/OmegaPool/oscuzzy/vm-201-disk-0' and '/dev/zvol/OmegaPool-oscuzzy/vm-106-disk-0'
Nov 21 21:11:13 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::run_list_lu : Verifing 'zvol/OmegaPool/vm-110-disk-0' and '/dev/zvol/OmegaPool-oscuzzy/vm-106-disk-0'
Nov 21 21:11:13 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::run_list_lu : Verifing 'zvol/OmegaPool/oscuzzy/vm-113-disk-0' and '/dev/zvol/OmegaPool-oscuzzy/vm-106-disk-0'
Nov 21 21:11:13 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::run_list_lu : Verifing 'zvol/OmegaPool/oscuzzy/vm-106-disk-0' and '/dev/zvol/OmegaPool-oscuzzy/vm-106-disk-0'
Nov 21 21:11:13 pve1 pvedaemon[2271082]: PVE::Storage::LunCmd::FreeNAS::run_list_lu(/dev/zvol/OmegaPool-oscuzzy/vm-106-disk-0) : name : lun not found
Nov 21 21:11:13 pve1 pvedaemon[2271082]: Could not find lu_name for zvol vm-106-disk-0 at /usr/share/perl5/PVE/Storage/ZFSPlugin.pm line 118.
Nov 21 21:11:13 pve1 pvedaemon[2271082]: clone failed: Could not find lu_name for zvol vm-106-disk-0 at /usr/share/perl5/PVE/Storage/ZFSPlugin.pm line 118.
Nov 21 21:11:13 pve1 pvedaemon[1314744]: <root@pam> end task UPID:pve1:0022A76A:00EBE43D:619AFC3E:qmclone:113:root@pam: clone failed: Could not find lu_name for zvol vm-106-disk-0 at /usr/share/perl5/PVE/Storage/ZFSPlugin.pm line 118.
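
The repeated "lun not found" lines above come down to a string comparison that can never match once the pool name is mangled: the extents registered on TrueNAS keep the real slash path, while the lookup key carries the dash form. A toy recreation (hypothetical, not the plugin's code; the exact comparison the plugin uses is assumed here to be a suffix match):

```python
# Toy recreation of the run_list_lu comparison seen in the syslog above.
# Extent names on TrueNAS keep the real path (with slashes), while the
# lookup key has the mangled dash form, so no entry ever matches.

extents = [
    "zvol/OmegaPool/oscuzzy/vm-201-disk-0",
    "zvol/OmegaPool/vm-110-disk-0",
    "zvol/OmegaPool/oscuzzy/vm-113-disk-0",
    "zvol/OmegaPool/oscuzzy/vm-106-disk-0",
]
lookup = "/dev/zvol/OmegaPool-oscuzzy/vm-106-disk-0"

match = next((e for e in extents if lookup.endswith(e)), None)
print(match)  # None -> "lun not found"
```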

@stale

stale bot commented Jan 21, 2022

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

@stale stale bot added the wontfix This will not be worked on label Jan 21, 2022
@stale stale bot closed this as completed Jan 28, 2022