Enable sp_DatabaseRestore to use Blob Storage #3572

Open
gdoddsy opened this issue Sep 24, 2024 · 2 comments

Comments

@gdoddsy
Contributor

gdoddsy commented Sep 24, 2024

Is your feature request related to a problem? Please describe.
I need to restore databases from Azure Blob Storage (without copying the backups locally first). The backups were created with Ola's script directly to a file share and then transferred to Blob Storage. The files retain their original path (i.e. {databaseName}/{full|diff|log}/{original filename}) in the storage container.

Describe the solution you'd like
Be able to pass sp_DatabaseRestore the location of the backups and an access key, and have it restore directly from the cloud.

Describe alternatives you've considered
Downloading the backups to a file share first, then restoring from there.

Are you ready to build the code for the feature?
I've got this in development and should have a pull request soon.

It will have limitations:

  1. The SQL Server host will need to be able to run PowerShell, and the Az.Storage module will need to be available to install and import. I don't know how people will feel about this; I have an alternative below.
  2. The user will need to create and provide a shared access key so that PowerShell can query the blob storage.
  3. The server will need a credential already created with access to the container in order to do the restore.
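For the third limitation, the credential would be the standard SQL Server shared-access-signature credential whose name matches the container URL. A minimal sketch, with placeholder account, container, and token values:

```sql
-- Sketch only: storage account, container, and SAS token are placeholders.
-- The credential name must exactly match the container URL (no trailing slash).
CREATE CREDENTIAL [https://yourstorageaccount.blob.core.windows.net/yourcontainer]
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
     SECRET   = 'sv=2024-01-01&ss=b&srt=co&sp=rl&sig=...';  -- SAS token without the leading '?'
```

SQL Server then resolves this credential automatically whenever a RESTORE targets a URL inside that container.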

The main issue I'm facing is getting the list of files to potentially restore. I essentially need two different ways to access Azure Storage: first for PowerShell to list the files, then for SQL Server to restore them. The second one I'm fine with; it's a SQL credential, and it works the way MS seems to suggest. With this approach, we'd need to add the following parameters to the script:

EXEC sp_DatabaseRestore @Database = N'Database',				-- nvarchar(128)
						@BackupPathFull = N'Database/Full',	-- nvarchar(260)
						@BackupPathDiff = N'Database/Diff',	-- nvarchar(260)
						@RunCheckDB = 0,					-- bit
						@RestoreDiff = 1,					-- bit
						@RunRecovery = 1,					-- bit
						@StopAt = N'20240921000000',		-- nvarchar(14)
						@DatabaseOwner = 'sa',				-- sysname
						@Debug = 0,
						@Execute = 'Y',
						@MoveFiles = 1,
						@MoveDataDrive = 'C:\Database\Data',
						@MoveLogDrive = 'C:\Database\Logs',
--New params:
						@BackupSource = 'URL',
						@StorageAccountName = 'YourStorageAccount',
						@StorageContainerName = 'YourContainer',
						@StorageAccountKey = 'ProvideAKey'

This allows SQL Server to import and run PowerShell commands to query the storage container and retrieve a list of files matching the pattern in @BackupPathFull (and the diff and log paths).
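One way the listing step could work, sketched in T-SQL since the proposal has SQL Server driving PowerShell itself. This assumes xp_cmdshell is enabled and Az.Storage is installed on the host; account, key, container, and prefix values are placeholders:

```sql
-- Sketch only: all names and keys below are placeholders, and xp_cmdshell
-- must be enabled (which is itself a hard sell in many environments).
DECLARE @cmd nvarchar(4000) = N'powershell -NoProfile -Command "'
    + N'$ctx = New-AzStorageContext -StorageAccountName ''YourStorageAccount'' -StorageAccountKey ''ProvideAKey''; '
    + N'Get-AzStorageBlob -Container ''YourContainer'' -Prefix ''Database/Full/'' -Context $ctx '
    + N'| Select-Object -ExpandProperty Name"';

-- xp_cmdshell returns one nvarchar output column per line of console output.
CREATE TABLE #BlobFiles (BlobName nvarchar(1024) NULL);
INSERT INTO #BlobFiles EXEC master.sys.xp_cmdshell @cmd;

SELECT BlobName FROM #BlobFiles WHERE BlobName IS NOT NULL;
```

The resulting blob names could then be turned into full URLs for the restore step.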

Option 2 is to use the comma-separated abilities of @BackupPathFull (and the diff and log paths). You do the work first to retrieve the file list and pass the string in; we then leverage the split and do the restore, which only requires a SQL credential to be created.
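Under either option, the restore step itself would boil down to standard restore-from-URL statements. An illustration of the kind of statement the procedure would generate, assuming a credential matching the container URL already exists (file names and paths are made up):

```sql
-- Illustration only: the URL, logical file names, and target paths are placeholders.
RESTORE DATABASE [Database]
FROM URL = N'https://yourstorageaccount.blob.core.windows.net/yourcontainer/Database/Full/Database_FULL_20240921_000000.bak'
WITH NORECOVERY,
     MOVE N'Database'     TO N'C:\Database\Data\Database.mdf',
     MOVE N'Database_log' TO N'C:\Database\Logs\Database_log.ldf';
```

Diff and log restores would follow the same shape, with RECOVERY (or STOPAT) applied on the final statement.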

I'm in favour of the first option because it makes my life simpler, and I'll create the pull request that way, but I'm open to feedback if that's going to be a problem (users can always ignore this feature if it's an issue in their environment).

@BrentOzar
Member

I'm totally fine with the first option because I think the use case is so limited today. So few folks are asking to do it, so ANY way that we support it is probably good. I won't be able to help with the code, so I wanna empower you to be able to do it the fastest, easiest way you can, and in a way that you can support if questions come up. Sounds good!

@gdoddsy
Contributor Author

gdoddsy commented Oct 8, 2024

I hit some more problems. The current plan is to leave it up to the user to pass in a full list of files along with the folder paths. I ran into too many access problems trying to execute the commands from within SQL in my production environment.

I'll be back in a week or 2 to fix it.
