
log shipping enhancements #5278

Open
potatoqualitee opened this issue Mar 22, 2019 · 7 comments

Labels
azure sql: Command has issue with Azure SQL due to unsupported SMO or T-SQL code.
feature
@potatoqualitee (Member)
please explore log shipping to blob storage

@potatoqualitee added the feature and azure sql labels on Mar 22, 2019
@wsmelton (Member)
Ouch that would introduce so much latency...

@potatoqualitee (Member, Author)
could it still be useful for Managed Instance migrations?

@sanderstad (Contributor)
I'm exploring this scenario, and I don't think the latency is that big if it's done correctly.
In case blob storage is used, what I would like to do is copy the files to the local server in the backup job on the primary and in the copy job on the secondary.
It does involve a lot of moving parts, so this will not be an easy feature, but I'll get it to work.
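
For anyone who wants to experiment in the meantime, dbatools already supports backup to URL, so a minimal sketch of the backup half might look like this (the instance name, database, and container URL are placeholders, and a credential for the container must already exist on the instance):

# Take a log backup straight to an Azure blob container.
# Assumes a SAS-based credential named after the container URL
# has already been created on the instance.
Backup-DbaDatabase -SqlInstance sql01 -Database MyDb -Type Log `
    -AzureBaseUrl "https://myaccount.blob.core.windows.net/logship"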

@WeyardWiz
@sanderstad Hey Sander, is there any update on this? I would really like a log shipping solution for Azure Blob Storage, and this dbatools cmdlet would be perfect if it had the option to store backups not just on a file share (e.g. a network drive) but in Azure Blob Storage as well. Letting us choose the storage option would literally make the cmdlet a masterpiece! I also requested this formally last year but haven't received any updates, so I thought I'd circle back here.

@winthropgutmann-ef commented Jun 7, 2022

@WeyardWiz If it helps, we use cloud storage as the backend for our backup operations, in addition to supporting DBA tools, specifically for log shipping. The S3 bucket is encrypted with an AWS-managed key; we have an IAM role attached to the SQL instances and a policy that allows read and write permissions on the bucket, along with permission to decrypt via KMS.

We mount the bucket using rclone, which runs under nssm (the Non-Sucking Service Manager) at boot to mount the bucket to a local volume. We then expose the mount as a file share via PowerShell, which is where we assign the ACL permissions.
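
Registering the rclone mount as a Windows service with nssm might look something like this (the service name, remote name, and paths are illustrative, not our exact setup):

# Register rclone as a service so the mount comes up at boot
nssm install rclone-s3 "C:\scripts\rclone\rclone.exe" "mount remote:my-bucket X: --config=C:\scripts\rclone\rclone.conf --vfs-cache-mode full"
nssm set rclone-s3 AppDirectory C:\scripts\rclone
nssm start rclone-s3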

We back up to this location, skip the "copy" step in log shipping, and restore right from the network share (the bucket). We replaced Idera SQLsafe with the Ola Hallengren backup scripts, and that gave better throughput writing to and reading from S3 than the third-party tool, because the latter does not support multiple backup files. If I understand correctly, backing up to native URL storage (Azure Blob Storage) may also not let you specify multiple backup files per backup operation, which might be prohibitive from a performance point of view.
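
To illustrate the multiple-file point, striping a backup across several files in one operation is where the throughput gain comes from; with dbatools that might look like this (instance, database, and share are placeholders):

# Stripe the backup across 4 files on the mounted share for throughput
Backup-DbaDatabase -SqlInstance sql01 -Database MyDb `
    -Path \\fileserver\backups -FileCount 4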

Our rclone mount command looks something like this:
mount <rclone storage object>:<aws-s3-bucket-name/directory> <drive/mount letter> --config="C:\scripts\rclone\rclone.conf" --vfs-cache-mode full --cache-dir <local NVMe storage path> --cache-chunk-total-size 100G --no-modtime -o GroupName=Administrators --file-perms 0777 --dir-perms 0777

To satisfy the log shipping requirement of a shared folder, we run something like this (the -ErrorAction keeps the check from throwing when the share does not exist yet):

if (-not (Get-SmbShare -Name $ShareName -ErrorAction SilentlyContinue)) {
	New-SmbShare -Name $ShareName -Path $Path -FullAccess $ListOfDomainAccounts
}
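
If the share ACLs need adjusting after the share exists, something like the following would do it (the account name is a placeholder):

# Grant an additional account full access on the existing share
Grant-SmbShareAccess -Name $ShareName -AccountName "CONTOSO\sqlsvc" -AccessRight Full -Force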

Hopefully you'll find some inspiration here

@WeyardWiz
(quotes @winthropgutmann-ef's comment above in full)

Interesting! You know, the funny thing is I was able to bypass the SMB constraint in my organization simply by connecting to a mobile hotspot, apparently... Nonetheless, there's obviously no hotspot on the VMs, so that won't work long term, LOL.

rclone might be what I'll try next in the meantime while they update the Invoke function. By the way, I don't understand this line you mentioned: "We replaced Idera SQLsafe with the Ola backup scripts". What are Idera SQLsafe and the Ola backup scripts? Are they open-source log shipping scripts that work as an alternative to the Invoke-DbaLogShipping command? I searched for so long for log shipping PowerShell scripts, and the closest I found was the dbatools Invoke-DbaLogShipping command.

@winthropgutmann-ef
@WeyardWiz Sorry, I made a few assumptions. When I mentioned:

We replaced Idera SQLsafe with the Ola Hallengren backup scripts, and that gave better throughput writing to and reading from S3 than the third-party tool, because the latter does not support multiple backup files. If I understand correctly, backing up to native URL storage (Azure Blob Storage) may also not let you specify multiple backup files per backup operation, which might be prohibitive from a performance point of view.

I was trying to speak to the performance of using a mounted bucket versus a third-party tool that can write directly to cloud storage. Idera SQLsafe is a SQL Server backup vendor. Ola Hallengren is a well-known SQL Server author who has arguably written the best out-of-the-box maintenance solution, covering backups and index/statistics maintenance. Neither is an alternative to dbatools log shipping, though you may find that using Ola's scripts in conjunction with the dbatools library makes your life easier.
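
For context, dbatools can even install Ola's solution for you; a minimal sketch (the instance and backup path are placeholders):

# Install Ola Hallengren's backup jobs on an instance via dbatools
Install-DbaMaintenanceSolution -SqlInstance sql01 -Database master `
    -BackupLocation \\fileserver\backups -Solution Backup -InstallJobs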
