Copying all files from an AWS S3 bucket using PowerShell

The AWS Tools for PowerShell allow you to quickly and easily interact with the AWS APIs.

To save a copy of all files in an S3 bucket, or in a folder within a bucket, you first need to get a list of all the objects, and then download each object individually, as the script below does. You will need to have installed the AWS Tools for PowerShell and use an AWS access key that has read and list access to your bucket.
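The script itself did not survive in this copy of the post, but the approach it describes can be sketched as follows. This is a minimal sketch, not the original script: the bucket name, key prefix, local path, and credential variables are placeholders you would replace with your own values.

```powershell
# Sketch: download every object under a prefix in an S3 bucket.
# $accessKey / $secretKey, bucket, prefix, and path are assumptions.
Import-Module AWSPowerShell

$bucket    = "your-bucket-name"
$keyPrefix = "your/folder/"        # leave empty to copy the whole bucket
$localPath = "C:\s3-downloads"

# First, list all the objects under the prefix...
$objects = Get-S3Object -BucketName $bucket -KeyPrefix $keyPrefix `
    -AccessKey $accessKey -SecretKey $secretKey

# ...then download each object individually.
foreach ($object in $objects) {
    $localFileName = $object.Key -replace $keyPrefix, ''
    if ($localFileName -ne '') {
        $localFilePath = Join-Path $localPath $localFileName
        Copy-S3Object -BucketName $bucket -Key $object.Key `
            -LocalFile $localFilePath `
            -AccessKey $accessKey -SecretKey $secretKey
    }
}
```

The empty-name check skips the "folder" placeholder object that S3 returns when a prefix ends in `/`.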

  • James - you should be able to do this by using the Get-S3Version command (docs.aws.amazon.com/.../index.html) and looping through each of the versions; you can then use the -VersionId parameter when calling Copy-S3Object. Hope this helps!
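  A rough sketch of that version loop, reusing the placeholder bucket, path, and credential variables from the main script:

  ```powershell
  # Sketch: download every stored version of every object.
  # Get-S3Version returns a listing whose Versions collection
  # holds one entry per object version.
  $response = Get-S3Version -BucketName $bucket `
      -AccessKey $accessKey -SecretKey $secretKey

  foreach ($version in $response.Versions) {
      # Suffix each file with its version id so versions don't collide.
      $localFilePath = Join-Path $localPath `
          ("$($version.Key).$($version.VersionId)")
      Copy-S3Object -BucketName $bucket -Key $version.Key `
          -VersionId $version.VersionId -LocalFile $localFilePath `
          -AccessKey $accessKey -SecretKey $secretKey
  }
  ```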

  • Mabel and Bebe - you could check whether the file ($localFilePath) exists, and if it does, skip copying it. This would let you avoid copying files that have already been copied, although if a file had been updated in S3 you wouldn't get the updated version; you would need to do some more advanced checks if that is needed.
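  That check is a one-line guard around the copy call in the main loop, for example:

  ```powershell
  # Sketch: only download objects not already present on disk.
  # Note: this does not detect files that were updated in S3;
  # comparing $object.LastModified to the local file's timestamp
  # would be one way to add that more advanced check.
  if (-not (Test-Path $localFilePath)) {
      Copy-S3Object -BucketName $bucket -Key $object.Key `
          -LocalFile $localFilePath `
          -AccessKey $accessKey -SecretKey $secretKey
  }
  ```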

  • Shilpa - this script should work on an EC2 instance. If your instance is running with an IAM profile that has access to the S3 bucket, you can remove the -AccessKey and -SecretKey arguments from the calls, as credentials will be picked up from the profile.
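  With the credential arguments removed, the loop reduces to something like this sketch (same placeholder variables as above):

  ```powershell
  # Sketch: on an EC2 instance with an IAM instance profile,
  # omit -AccessKey/-SecretKey and the SDK resolves credentials itself.
  $objects = Get-S3Object -BucketName $bucket -KeyPrefix $keyPrefix
  foreach ($object in $objects) {
      $localFilePath = Join-Path $localPath ($object.Key -replace $keyPrefix, '')
      Copy-S3Object -BucketName $bucket -Key $object.Key -LocalFile $localFilePath
  }
  ```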