@haydenk
Created April 28, 2026 14:35
[AWS] Bash script that copies S3 objects from a source bucket to a target bucket, partitioning keys by date (year=YYYY/month=MM/day=DD) and skipping objects that already exist in the target.
#!/usr/bin/env bash
# Copy S3 objects from a source bucket to a target bucket, partitioning the
# destination keys by date (year=YYYY/month=MM/day=DD) and skipping objects
# whose filename already exists somewhere in the target bucket.
#
# Prereq: fetch the source bucket listing first with
# aws s3api list-objects-v2 --bucket your-source-bucket --query Contents > ~/files.json
set -euo pipefail

jsonFile="$HOME/files.json"
sourceBucket="your-source-bucket-name"
targetBucket="your-target-bucket-name"

# Count target-bucket keys whose name ends with the given filename.
# Note: this lists the whole target bucket on every call, so it is slow for
# large buckets. length(null) yields null when the bucket is empty, so
# normalize that to 0.
keyExists() {
  local searchKey=$1 count
  count=$(aws s3api list-objects-v2 --bucket "$targetBucket" \
    --query "length(Contents[?ends_with(Key, \`$searchKey\`)])")
  [[ $count == null ]] && count=0
  echo "$count"
}

# Read one JSON object per line; a while-read loop avoids the word-splitting
# that `for object in $(jq ...)` suffers when values contain spaces.
jq -c '.[]' "$jsonFile" | while read -r object; do
  key=$(jq -r '.Key' <<<"$object")
  searchFilename=$(basename "$key")
  if [[ $(keyExists "$searchFilename") -gt 0 ]]; then
    continue
  fi
  lastModified=$(jq -r '.LastModified' <<<"$object")
  copySource="$sourceBucket/$key"
  # -u partitions by the object's UTC timestamp (S3 LastModified is UTC),
  # so day boundaries do not drift with the local timezone.
  targetKey="usage/$(date -u -d "$lastModified" '+year=%Y/month=%m/day=%d')/$searchFilename"
  echo "$copySource -> $targetBucket/$targetKey"
  aws s3api copy-object --copy-source "$copySource" --key "$targetKey" --bucket "$targetBucket" | jq .
done
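For reference, here is how a single listing entry maps to a partitioned target key. This is a minimal sketch of the key-construction step only: the `Key` and `LastModified` values are hypothetical placeholders in the shape `list-objects-v2` returns, and `date -d`/`-u` assumes GNU date.

```shell
# Key and LastModified as they would appear in one ~/files.json entry
# (hypothetical example values):
key="usage/report.csv"
lastModified="2026-04-28T14:35:07.000Z"

# Build the partitioned destination key from the object's UTC timestamp;
# -u keeps the day boundary independent of the local timezone.
targetKey="usage/$(date -u -d "$lastModified" '+year=%Y/month=%m/day=%d')/$(basename "$key")"

echo "$targetKey"
# → usage/year=2026/month=04/day=28/report.csv
```

A timestamp just after midnight UTC would land in a different day partition under local time in many zones, which is why the script formats in UTC.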