log shipping enhancements #5278
Comments
Ouch, that would introduce so much latency...
Could it still be useful for Managed Instance migrations?
I'm exploring this scenario and I don't think there is such big latency there if done correctly.
@sanderstad Hey Sander, is there any update on this? I'd really like a log shipping solution to Azure Blob Storage, and this dbatools cmdlet would be perfect if it had the option to store not just to a file share (e.g. a network drive) but to Azure Blob Storage as well. Let us choose the storage option; this would literally make the cmdlet a masterpiece! I've also requested this formally last year, but haven't received updates and thought I'd circle back here on it.
@WeyardWiz If it helps, we use cloud storage as our backend for backup operations, in addition to supporting dbatools, specifically for log shipping. The S3 bucket is encrypted with an AWS managed key; we have an IAM role attached to the SQL instances and a policy that allows read and write permissions on the bucket, along with permission to decrypt via KMS. We mount the bucket using rclone, which is run with nssm (the Non-Sucking Service Manager) on boot to mount the bucket to a local volume. We expose this new mount as a file share via PowerShell; this is where we assign ACL permissions. We back up to this location, skipping the "copy" step of log shipping, and restore right from the network share (bucket).

We replaced Idera SQLsafe with the Ola backup scripts, which gave us better throughput writing to and reading from S3 than the third-party tool, because SQLsafe does not support multiple backup files. If I understand correctly, when backing up to native URL storage (Azure Blob Storage) you may not be able to specify multiple backup files per backup operation, which might be prohibitive from a performance point of view.

Our rclone mount command looks something like this: To satisfy the log shipping requirement of a shared folder we run something like this
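The actual commands were not included above, but the described setup (mount the bucket with rclone, keep it alive as a service with nssm, expose it as a share) might be sketched roughly as follows. The remote name `s3remote`, bucket name, drive letter, service name, share name, and account are all hypothetical placeholders, not the author's actual values:

```shell
# Hypothetical sketch of the mount-and-share setup described above.
# s3remote, my-backup-bucket, X:, rclone-s3, SqlBackups, and the account
# name are placeholders. rclone mount on Windows also requires WinFsp.

# Mount the S3 bucket as a local drive; --vfs-cache-mode writes buffers
# writes locally so backup files behave like they would on a normal disk.
rclone mount s3remote:my-backup-bucket X: --vfs-cache-mode writes

# Register the same mount as a Windows service with nssm so it runs on boot.
nssm install rclone-s3 "C:\rclone\rclone.exe" mount s3remote:my-backup-bucket X: --vfs-cache-mode writes

# From PowerShell, expose the mounted drive as a file share and assign
# access, satisfying log shipping's shared-folder requirement:
#   New-SmbShare -Name SqlBackups -Path X:\ -FullAccess "DOMAIN\sqlsvc"
```

With the share in place, log shipping can back up to and restore from `\\server\SqlBackups` while the data actually lands in the bucket.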
Hopefully you'll find some inspiration here.
Interesting! You know, the funny thing is I was able to bypass the SMB constraint issue in my organization by simply connecting to a mobile hotspot, apparently... Nonetheless, obviously there's no hotspot on the VMs, so that won't work long-term LOL. rclone might be what I try next in the meantime while they update the Invoke function. Btw, I don't understand this line you mentioned: "We replaced Idera SQLsafe with the Ola backup script". What are Idera SQLsafe and the Ola backup script? Are these open-source log shipping scripts as an alternative to the InvokeDbaLogShipping command? I was searching for so long for log shipping PowerShell scripts, and the closest I found was this dbatools InvokeDbaLogShipping command.
@WeyardWiz Sorry, I made a few assumptions. When I had mentioned:
I was trying to speak to the performance of using a mounted bucket vs. using a third-party tool that can write directly to cloud storage. Idera SQLsafe is a SQL Server backup vendor. Ola Hallengren is a well-known SQL author who has arguably written the best out-of-the-box maintenance solutions, covering backups and index/stats maintenance. They are not an alternative to dbatools log shipping, though you may find that using Ola's scripts in conjunction with the dbatools library makes your life easier.
please explore log shipping to blob storage