Is your feature request related to a problem? Please describe.
I have a need to restore data from Azure Blob Storage without copying it locally first. The backups were created with Ola's script directly to a file share and then transferred to Blob Storage. The files retain their original path (i.e. {databaseName}/{full|diff|log}/{original filename}) in the storage container.
Describe the solution you'd like
Be able to pass sp_DatabaseRestore the location of the backups and an access key, and have it restore directly from the cloud.
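For reference, the mechanism this would lean on is SQL Server's native restore-from-URL support. A minimal sketch, assuming a placeholder account, container, and file name (and a credential matching the container URL, sketched further down):

```sql
-- Placeholder URL; requires a credential named after the container (see the credential sketch below).
RESTORE DATABASE MyDatabase
FROM URL = N'https://myaccount.blob.core.windows.net/sqlbackups/MyDatabase/FULL/MyDatabase_FULL.bak'
WITH NORECOVERY, REPLACE;
```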
Describe alternatives you've considered
Downloading the backups to a file share first, then restoring.
Are you ready to build the code for the feature?
I've got this in development and should have a pull request soon.
It will have limitations:
- The SQL Server will need to be able to run PowerShell, and the Az.Storage module will need to be available to install and import. I don't know how people will feel about this; I have an alternative below.
- The user will need to create and provide a shared access key so that PowerShell can query Blob Storage.
- A credential with access to the container will already need to exist on the instance to do the restore; a sketch of that credential follows this list.
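For reference, that credential would follow the documented SAS-based pattern for restore from URL. A minimal sketch, assuming a placeholder storage account, container, and token:

```sql
-- The credential name must exactly match the container URL used in RESTORE ... FROM URL.
-- Placeholder account and container; the SECRET is a SAS token without the leading '?'.
CREATE CREDENTIAL [https://myaccount.blob.core.windows.net/sqlbackups]
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
SECRET = 'sv=2022-11-02&ss=b&srt=co&sp=rl&se=2025-12-31&sig=<signature>';
```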
The main issue I'm facing is getting the list of files to potentially restore. I essentially need two different ways to access Azure Storage: first for PowerShell to list the files, then for SQL Server to restore them. The second one I'm OK with; it's a SQL credential, and it works the way Microsoft seems to suggest. With this approach, we'd need to add the following parameters to the script:
This allows SQL Server to import and run PowerShell commands to query the storage container and retrieve a list of files that match the pattern in @BackupPathFull (and diffs and logs); a rough sketch of that listing step is below.
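As an illustration of what that listing step could look like (this is not the PR code; xp_cmdshell is just one way for SQL Server to shell out to PowerShell, and the account, container, SAS token, and prefix are all placeholders):

```sql
-- Hypothetical sketch: shell out to PowerShell to list blobs under the FULL folder.
-- Assumes xp_cmdshell is enabled and the Az.Storage module is installed on the server.
DECLARE @Cmd NVARCHAR(4000) = N'powershell.exe -NoProfile -Command "'
    + N'$ctx = New-AzStorageContext -StorageAccountName ''myaccount'' -SasToken ''<sas>''; '
    + N'Get-AzStorageBlob -Container ''sqlbackups'' -Prefix ''MyDatabase/FULL/'' -Context $ctx '
    + N'| Select-Object -ExpandProperty Name"';

CREATE TABLE #BlobFiles (BlobName NVARCHAR(1024));
INSERT INTO #BlobFiles EXEC master..xp_cmdshell @Cmd;

-- xp_cmdshell pads its output with NULL rows; filter those out.
SELECT BlobName FROM #BlobFiles WHERE BlobName IS NOT NULL;
```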
Option 2 is to use the comma-separated abilities of @BackupPathFull (and diffs and logs). You do the work first to retrieve the list and pass the string in; we then leverage the split and do the restore, which only requires a SQL credential to be created (a sketch of a caller-side invocation follows).
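To make Option 2 concrete, a caller-side invocation might look something like this (a sketch only: the URLs are placeholders, and comma-separated URL lists in the path parameters are the proposed behaviour, not necessarily what the procedure does today):

```sql
-- Hypothetical Option 2 call: the user lists the blobs themselves and passes full URLs,
-- relying on the container credential above for RESTORE ... FROM URL access.
DECLARE @LogFiles NVARCHAR(MAX) =
      N'https://myaccount.blob.core.windows.net/sqlbackups/MyDatabase/LOG/MyDatabase_LOG_1.trn,'
    + N'https://myaccount.blob.core.windows.net/sqlbackups/MyDatabase/LOG/MyDatabase_LOG_2.trn';

EXEC dbo.sp_DatabaseRestore
    @Database = N'MyDatabase',
    @BackupPathFull = N'https://myaccount.blob.core.windows.net/sqlbackups/MyDatabase/FULL/MyDatabase_FULL.bak',
    @BackupPathLog = @LogFiles,
    @RunRecovery = 1;
```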
I'm in favour of the first option as it makes my life simpler, and I'll create the pull request that way, but I'm open to feedback if that's going to be a problem (users can always just ignore this feature if it is a problem in their environment).
I'm totally fine with the first option because I think the use case is so limited today. So few folks are asking for it that ANY way we support it is probably good. I won't be able to help with the code, so I wanna empower you to do it the fastest, easiest way you can, and in a way that you can support if questions come up. Sounds good!
I hit some more problems. The current plan is to leave it up to the user to pass in a full list of files along with the folder paths. I ran into too many access problems trying to execute the commands from within SQL in my production environment.