Copying all files from an AWS S3 bucket using PowerShell

The AWS PowerShell tools allow you to quickly and easily interact with the AWS APIs.

To save a copy of all files in an S3 bucket, or a folder within a bucket, you first need to get a list of all the objects, and then download each object individually, as the script below does. You will need to have installed the AWS Tools for PowerShell and use an AWS Access Key that has read and list access to your bucket.
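
A minimal sketch of that list-then-download approach (not the original script; all key, bucket, and path values below are placeholders):

$accessKey = "YOUR-ACCESS-KEY"
$secretKey = "YOUR-SECRET-KEY"
$region    = "eu-west-1"
$bucket    = "your-bucket"
$keyPrefix = "your-folder/"   # leave blank to copy the whole bucket
$localPath = "C:\s3-copy"

# List every object under the prefix, then download each one in turn
$objects = Get-S3Object -BucketName $bucket -KeyPrefix $keyPrefix -AccessKey $accessKey -SecretKey $secretKey -Region $region
foreach ($object in $objects) {
    # Skip zero-byte "folder" marker keys
    if ($object.Key.EndsWith("/")) { continue }
    $localFilePath = Join-Path $localPath ($object.Key -replace [regex]::Escape($keyPrefix), '')
    Copy-S3Object -BucketName $bucket -Key $object.Key -LocalFile $localFilePath -AccessKey $accessKey -SecretKey $secretKey -Region $region
}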

  • Sudarshan

    Please check $secretAccessKey. It has a typo.

  • Thanks Sudarshan - I have updated the script!

  • Phil

    BTW (and firstly, thanks - this helped me a lot), if you run this in Chef under the powershell_script resource, you need to remove the space between the ending ) and the beginning {. I was getting errors about missing blocks; so, for example:

    if($somevarcondition -ne ''){

    Cheers

    Phil

  • Ed W.

    Any way to have this copy only files created on S3 within a given date range?

  • Ed W

    I used your SQL backup script from another post and altered it to back up all user databases and to compress them before moving them to S3.

    I wanted to give developers a tool to recall the backups from S3, so I took your script here and altered it a touch.

    It's not exactly what I want, but it works. It starts by listing all the files in the S3 bucket, then prompts the user to enter the full key. It then copies the file from S3 to the local machine.

    Ideally I would like to prompt the user for a date and pull down all files created on S3 on that date, or use the date in the file name to pull them down (see the sketch after the script below). I ran out of time to work on this, but it works as is and the developers love it: they don't have to come to me to get backup files so they can restore on their dev boxes. Maybe I will revisit it to add the date part.

    ----------------

    # Your account access key - must have read access to your S3 bucket
    $accessKey = "USER-ACCESS-KEY"

    # Your account secret access key
    $secretKey = "USER-SECRET-KEY"

    # The region associated with your bucket e.g. eu-west-1, us-east-1 etc. (see docs.aws.amazon.com/.../using-regions-availability-zones.html)
    $region = "us-east-1"

    # The name of your S3 bucket
    $bucket = "bucket_name"

    # The folder in your bucket to copy, including trailing slash. Leave blank to copy the entire bucket
    $keyPrefix = "bucket folder name/"

    # If PS was opened as Admin, the next line relaxes the execution policy
    powershell.exe Set-ExecutionPolicy Unrestricted

    # List all files in the S3 bucket so the user can pick the one they want to pull down
    Get-S3Object -BucketName $bucket -KeyPrefix $keyPrefix -AccessKey $accessKey -SecretKey $secretKey -Region $region

    # File name user input prompt
    # Warn the user that they needed to open PS as an Admin
    Write-Host "///////////////////////////////////////////\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\"
    Write-Host "//////////////// You need to have launched PowerShell as an Admin for this to work! \\\\\\\\\\\\\\\\\\\\\\\\\\\\"
    Write-Host "///////////////////////////////////////////\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\"
    Write-Host "."
    Write-Host "*=*=**=*=*=*=**=*=*=*=**=*=*=*=**=*=*=*=**=*=*=*=**=*=*=*=**=*=*=*=**=*=*=*=*=*=*=*=**=*=*=*=*=*=*=*=**=*=*=*=*=*=*=*=**=*=*=*=*"
    Write-Host "*=*=**=*=*=*=**=*=*=*=**=*=*=*=**=*=*=*=**=*=*=*=**=*=*=*=**=*=*=*=**=*=*=*=*=*=*=*=**=*=*=*=*=*=*=*=**=*=*=*=*=*=*=*=**=*=*=*=*"
    Write-Host "Input the full key name (including path) of the file you want from the list above"
    Write-Host "Example: developers/DATA_EXAMPLE-20151024131327.zip"
    $key1 = Read-Host -Prompt 'Enter Key'

    # The local file path where files should be copied
    $localPath = "S:\SQL_Backup_Recovery" # Change to a location that works for you

    # Strip the key prefix so the local file name is just the file itself
    $key2 = $key1 -replace $keyPrefix, ''
    $localFilePath = Join-Path $localPath $key2

    # Show where the file will be saved
    Write-Host $localFilePath

    Copy-S3Object -BucketName $bucket -Key $key1 -LocalFile $localFilePath -AccessKey $accessKey -SecretKey $secretKey -Region $region

    Write-Host "*=*=**=*=*=*=**=*=*=*=**=*=*=*=**=*=*=*=**=*=*=*=**=*=*=*=**=*=*=*=**=*=*=*=*=*=*=*=**=*=*=*=*=*=*=*=**=*=*=*=*=*=*=*=**=*=*=*=*"
    Write-Host "*=*=*=*=* If there were no errors you can find your file in $localFilePath *=*=*=*"

  • steveh

    For the same problem domain (a related tool), this seems to work in PowerShell:

    cd '\SAS Software Depot'

    aws s3 cp . s3://mybucket/SASSoftwareDepot/ --recursive

    Actually, it might have been just the spaces in the Windows path that were the problem, but using the cwd, i.e. '.', in the aws-cli command bypasses a world of potential issues!

  • Karim

    How can I use these commands in the user data?

    I need to copy a file from S3 to my EC2 server when it is instantiated for the first time, so I need to add this command to the user data. (The server is Windows Server.)

  • Karim - you can add PowerShell scripts as User Data to run at start-up of your instance using the instructions here: docs.aws.amazon.com/.../ec2-instance-metadata.html
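
    For a Windows instance, the User Data just needs the script wrapped in <powershell> tags. A rough sketch (the bucket, key, and paths are made up; this also assumes the instance has an IAM role with access to the bucket, so no access keys are needed):

    <powershell>
    # Hypothetical example: pull one file down at first boot
    Copy-S3Object -BucketName "my-bucket" -Key "setup/app-config.zip" -LocalFile "C:\app-config.zip" -Region "us-east-1"
    </powershell>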

    Thanks, Rhys

  • felipe

    How can I set a log file in this script?

  • felipe - what do you want to log?
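
    If the goal is simply a record of what the script did, one minimal option (a suggestion, not part of the original script) is PowerShell's built-in transcript:

    # Capture all console output from the session to a log file
    Start-Transcript -Path "C:\logs\s3-copy.log" -Append
    # ... run the copy commands here ...
    Stop-Transcript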

  • Mabel

    Thanks for the script :) but it would be really good if you could include something to check whether the file has already been copied. Thanks so much.
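
    A rough sketch of such a check, reusing the variable names from Ed W's script above: test whether the file already exists locally before downloading.

    # Hypothetical tweak: only download when the file is not already present locally
    if (-not (Test-Path $localFilePath)) {
        Copy-S3Object -BucketName $bucket -Key $key1 -LocalFile $localFilePath -AccessKey $accessKey -SecretKey $secretKey -Region $region
    }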

  • Bebe

    Thanks for the wonderful script, but do you have anything that will only copy the files that I haven't downloaded yet? Thanks so much.

  • Shilpa

    I'm trying to copy a folder from an S3 bucket to my Windows EC2 instance.
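
    For a whole folder, a single cmdlet call may be enough, since Read-S3Object can download everything under a key prefix in one go. A minimal sketch (the bucket, prefix, and folder are placeholders):

    # Hypothetical example: download everything under a prefix to a local folder
    Read-S3Object -BucketName "my-bucket" -KeyPrefix "my-folder/" -Folder "C:\my-folder" -Region "us-east-1"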

  • Lipsa Parida

    Really really helpful content. Thank you for providing such a convenient and widely used code snippet and making it look so easy.

  • James

    Can you modify this code to download all files and all of their VERSIONS as well?
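
    Nothing in the original script handles versions, but on a versioned bucket a rough sketch might combine Get-S3Version with a per-version download. The -VersionId parameter on Copy-S3Object is an assumption here; check the parameter list in your version of the AWS Tools for PowerShell.

    # Hypothetical sketch: list every version under the prefix, then download each one
    # Assumes Copy-S3Object accepts -VersionId in your module version
    $versions = (Get-S3Version -BucketName $bucket -Prefix $keyPrefix -AccessKey $accessKey -SecretKey $secretKey -Region $region).Versions
    foreach ($v in $versions) {
        # Skip delete markers, which have no content to download
        if ($v.IsDeleteMarker) { continue }
        # Embed the version id in the local name so versions don't overwrite each other
        $target = Join-Path $localPath (($v.Key -replace '/', '_') + "." + $v.VersionId)
        Copy-S3Object -BucketName $bucket -Key $v.Key -VersionId $v.VersionId -LocalFile $target -AccessKey $accessKey -SecretKey $secretKey -Region $region
    }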