Backing up a SQL database to Amazon S3 using PowerShell

Amazon’s S3 service provides a cheap and reliable way of storing backups, with plenty of options to simplify their management. This post talks about how I’m handling backups for this blog.

The blog’s content doesn’t change dramatically, so daily full backups are fine for this scenario, and these will be kept for 7 days.

Setting up the S3 bucket

Create a new S3 bucket to store your backups. This example uses the name “s3-sql-backups” and assumes you are using the AWS console.

Select the newly created bucket, select “Properties” from the options on the top right, then select “Lifecycle”. S3 lifecycle rules allow you to automatically delete content, or archive it to the Glacier service, once a file reaches a certain age. Create a new lifecycle rule for the whole bucket that permanently deletes each file after 7 days. This will stop you running up a large S3 bill from storing old backups. Alter the time range to suit your needs, or consider pushing the backups to Glacier for cheaper long-term storage.
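The same rule can also be created from PowerShell instead of the console. This is a sketch using the AWS Tools for PowerShell; the rule ID is arbitrary, and the exact parameter names may differ slightly between module versions, so check them against your installed module.

```powershell
Import-Module AWSPowerShell

# Build a lifecycle rule that expires objects 7 days after creation
$rule = New-Object Amazon.S3.Model.LifecycleRule
$rule.Id = 'expire-old-backups'   # arbitrary identifier
$rule.Status = [Amazon.S3.Model.LifecycleRuleStatus]::Enabled
$rule.Expiration = New-Object Amazon.S3.Model.LifecycleRuleExpiration
$rule.Expiration.Days = 7

# Apply the rule to the whole bucket
Write-S3LifecycleConfiguration -BucketName 's3-sql-backups' -Configuration_Rule $rule
```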

Your script will need access to this bucket. Create an IAM user or role for it, and grant that user or role full control of the bucket with the following access control policy, replacing s3-sql-backups with the name of your bucket.
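The original policy document isn’t reproduced here; a minimal policy granting full control of just this bucket (and nothing else) might look like the following:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "s3:*",
      "Resource": [
        "arn:aws:s3:::s3-sql-backups",
        "arn:aws:s3:::s3-sql-backups/*"
      ]
    }
  ]
}
```

Note that both the bucket ARN and the `/*` object ARN are needed: bucket-level operations check the former, object uploads the latter.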

The Script

The script is straightforward and uses the SQL backup cmdlets described by Allen White. The backup is created with a timestamp in the name, copied to S3, and then deleted from the local machine (along with any other old backups).

The variables at the start of the script need to be updated to point to the correct database, S3 bucket and backup location. The script assumes Windows Authentication is used to connect to the database. If you are running the script on an EC2 instance and have applied the security policy to the IAM role used by the instance, the AccessKey and SecretKey parameters can be removed from the Write-S3Object command.
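The script itself didn’t survive in this copy of the post, so here is a sketch reconstructed from the description above. The variable values and database name are illustrative placeholders, not the original script verbatim.

```powershell
Import-Module SQLPS -DisableNameChecking   # provides Backup-SqlDatabase
Import-Module AWSPowerShell                # provides Write-S3Object

$server     = '.'
$database   = 'MyBlogDb'
$s3Bucket   = 's3-sql-backups'
$backupPath = 'C:\SqlBackup\'
$region     = 'us-east-1'
$accessKey  = 'USER-ACCESS-KEY'   # remove if using an EC2 instance role
$secretKey  = 'USER-SECRET-KEY'

# Create the backup with a timestamp in the file name
$timestamp = Get-Date -Format yyyyMMddHHmmss
$fileName  = "$database-$timestamp.bak"
$filePath  = Join-Path $backupPath $fileName
Backup-SqlDatabase -ServerInstance $server -Database $database -BackupFile $filePath

# Copy it to S3, then delete local copies (including older backups)
Write-S3Object -BucketName $s3Bucket -File $filePath -Key $fileName -Region $region -AccessKey $accessKey -SecretKey $secretKey
Remove-Item (Join-Path $backupPath "$database-*.bak")
```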


The script then needs to be scheduled to run every night. I’m using scheduled tasks for this, creating a task that runs nightly and triggers the PowerShell script by running Powershell.exe with the arguments -ExecutionPolicy Bypass C:\SqlBackup\SqlBackupToS3.ps1.
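For reference, a task like that can also be created from an elevated command prompt rather than the Task Scheduler UI; the task name and start time here are just examples:

```
schtasks /Create /TN "SqlBackupToS3" /SC DAILY /ST 01:00 /TR "Powershell.exe -ExecutionPolicy Bypass C:\SqlBackup\SqlBackupToS3.ps1"
```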


Leave the script to run and check your S3 bucket to ensure the backups are being saved correctly. There is no notification if a backup fails, which isn’t ideal, but this could be added in some way.
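As a sketch of one way to add that, the upload could be wrapped in try/catch and a message published to an SNS topic on failure. Publish-SNSNotification is from the AWS Tools for PowerShell; the topic ARN below is a placeholder for a topic you would create and subscribe an email address to.

```powershell
try {
    Write-S3Object -BucketName $s3Bucket -File $filePath -Key $fileName -Region $region -ErrorAction Stop
}
catch {
    # Hypothetical alert topic; replace with your own topic's ARN
    Publish-SNSNotification -TopicArn 'arn:aws:sns:us-east-1:123456789012:backup-alerts' `
        -Subject 'SQL backup to S3 failed' `
        -Message $_.Exception.Message `
        -Region $region
}
```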

  • Ed Walsh

    Brilliant script.

    I expanded it a touch to allow it to back up all user databases and also to compress the .bak file.

    If you are using PS v4 you need to add the PowerShell Community Extensions; if you have PS v5 you can use the new compression cmdlet: Compress-Archive.

    Hope you find this useful.

    Ed W.


    $server = '.'
    $s3Bucket = 'bucket_name'
    $backupPath = 'S:\SqlBackup\' # Change to a location that works for you.
    $region = 'us-east-1'

    # accessKey and secretKey can be removed if running on an EC2 instance and using IAM roles for security
    $accessKey = 'USER-ACCESS-KEY'
    $secretKey = 'USER-SECRET-KEY'

    # Get all user databases (dbid > 4 skips the system databases)
    $databases = Invoke-Sqlcmd -ServerInstance $server -Query "SELECT [name]
    FROM master.dbo.sysdatabases
    WHERE dbid > 4 ;"

    # loop through the user databases and back each one up
    foreach ($database in $databases)
    {
        $timestamp = Get-Date -Format yyyyMMddHHmmss
        $fileName = "$($database.name)_$timestamp.bak"
        $zipFileName = "$($database.name)_$timestamp.zip"
        $filePath = Join-Path $backupPath $fileName
        $zipFilePath = Join-Path $backupPath $zipFileName

        Backup-SqlDatabase -ServerInstance $server -Database $database.name -BackupFile $filePath #-Compression On # Compression will not work with SQL Express
        # If you are using SQL Express, leave -Compression On commented out above and use the Write-Zip line below.
        # You need to install the PowerShell Community Extensions for Write-Zip to work. If you are using PowerShell v5
        # you can alter the line below to use the native cmdlet Compress-Archive instead.
        Write-Zip -Path $filePath -OutputPath $zipFilePath

        Write-S3Object -BucketName $s3Bucket -File $zipFilePath -Key $zipFileName -Region $region -AccessKey $accessKey -SecretKey $secretKey

        # Clean up the local backup and zip files for this database
        Remove-Item "$backupPath$($database.name)*.bak"
        Remove-Item "$backupPath$($database.name)*.zip"
    }


  • Ofer Groman

    Man, you are a life saver. You saved me hours of frustration. Such a simple and easy to follow implementation.

    Thank you so much for this.

  • A Sing
    Hi Rhys, this is an awesome post, helps a lot. It looks like it is a 2-step process: 1. take a backup on a local drive and 2. upload it to S3. Is there a way to have SQL Server back up directly to S3? Thanks in advance
  • Hi, I'm not sure. I think you'd have to mount the S3 bucket as a Windows drive; there are some third-party tools that might allow you to do this at the moment, but I'm not aware of any official AWS way of doing it. Rhys