
[Veeam] Find VBRViDatastore and VBRServer by name#6582

Merged
DaanHoogland merged 1 commit into apache:main from scclouds:veeam-find-VBRViDatastore-VBRServer-by-name
Aug 2, 2022

Conversation


@SadiJr SadiJr commented Jul 28, 2022

Description

When using the VMware hypervisor with the Veeam plugin active, ACS searches Veeam for the VBRViDatastore using the storage's UUID in ACS, and for the VBRServer using the host's IP in ACS. However, in some scenarios these components are identified in Veeam by their names instead, which causes an exception in ACS. This PR fixes that behavior by extending the searches for VBRViDatastore and VBRServer to also match by name.
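The fallback described above can be sketched as follows (a minimal illustration only: the actual CloudStack Veeam plugin is written in Java, and the record structure and function name here are hypothetical):

```python
def find_datastore(datastores, storage_uuid, storage_name):
    """Locate a Veeam datastore record for an ACS storage pool.

    Hypothetical sketch: try to match by the ACS storage UUID first,
    then fall back to matching by the storage name, since some Veeam
    setups identify the datastore by name rather than by UUID.
    """
    # Primary lookup: match by UUID.
    for ds in datastores:
        if ds["id"] == storage_uuid:
            return ds
    # Fallback: match by name (the scenario this PR addresses).
    for ds in datastores:
        if ds["name"] == storage_name:
            return ds
    return None  # the caller raises an error if nothing matched


# Example: Veeam identifies the datastore by name, not by the ACS UUID.
datastores = [{"id": "datastore-12", "name": "primary-storage-1"}]
match = find_datastore(datastores, "3f2c-acs-uuid", "primary-storage-1")
print(match["name"])  # falls back to the name match: primary-storage-1
```

The same uuid-first, name-fallback pattern applies to the VBRServer lookup, with the host IP in place of the storage UUID.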

Types of changes

  • Breaking change (fix or feature that would cause existing functionality to change)
  • New feature (non-breaking change which adds functionality)
  • Bug fix (non-breaking change which fixes an issue)
  • Enhancement (improves an existing feature and functionality)
  • Cleanup (Code refactoring and cleanup, that may add test cases)

Feature/Enhancement Scale or Bug Severity

Feature/Enhancement Scale

  • Major
  • Minor

How Has This Been Tested?

It was tested in a local lab:

  1. I added a new storage and a new host to my lab, and made sure that, in Veeam, these components are identified by their names;
  2. I created a VM (with its volume on the new storage and the VM on the new host) and added it to a backup offering;
  3. I made some manual backups;
  4. I tried to restore a backed-up volume of this VM;
  5. Before the fix, an exception occurred because ACS could not find the VBRViDatastore/VBRServer in Veeam;
  6. With the fix, ACS restores the volume without problems.

@sonarqubecloud

SonarCloud Quality Gate failed.

Bug: C (1 Bug)
Vulnerability: A (0 Vulnerabilities)
Security Hotspot: A (0 Security Hotspots)
Code Smell: A (1 Code Smell)
Coverage: 0.0%
Duplication: 0.0%

Contributor

@DaanHoogland DaanHoogland left a comment


clgtm

@DaanHoogland
Contributor

@blueorangutan package

@blueorangutan

@DaanHoogland a Jenkins job has been kicked to build packages. It will be bundled with KVM, XenServer and VMware SystemVM templates. I'll keep you posted as I make progress.

@blueorangutan

Packaging result: ✔️ el7 ✔️ el8 ✔️ debian ✔️ suse15. SL-JID 3873

@DaanHoogland
Contributor

@blueorangutan test centos7 vmware-67u3

@blueorangutan

@DaanHoogland a Trillian-Jenkins test job (centos7 mgmt + vmware-67u3) has been kicked to run smoke tests

Contributor

@JoaoJandre JoaoJandre left a comment


CLGM

@blueorangutan

Trillian test result (tid-4588)
Environment: vmware-67u3 (x2), Advanced Networking with Mgmt server 7
Total time taken: 39550 seconds
Marvin logs: https://github.com/blueorangutan/acs-prs/releases/download/trillian/pr6582-t4588-vmware-67u3.zip
Smoke tests completed. 101 look OK, 0 have errors

@DaanHoogland
Contributor

As this requires a Veeam-augmented installation, I'm going to trust the submitter's testing.

@DaanHoogland DaanHoogland merged commit 6ba0ef2 into apache:main Aug 2, 2022
neogismm pushed a commit to neogismm/cloudstack that referenced this pull request Aug 6, 2022
Co-authored-by: Rafael Weingärtner <rafaelweingartner@gmail.com>


5 participants