How to include and copy files that are in the current directory to S3 (and not recursively)

I have some files that I want to copy to S3. Rather than making one call per file, I want to include them all in a single call (to be as efficient as possible).

However, I only seem to get it to work if I add the --recursive flag, which makes it look in all child directories (all the files I want are in the current directory only).

So this is the command I have now, which works:

aws s3 cp --dryrun . s3://mybucket --recursive --exclude "*" --include "*.jpg"

But ideally I would like to remove --recursive to stop it traversing, e.g. something like this (which does not work):

aws s3 cp --dryrun . s3://mybucket --exclude "*" --include "*.jpg"

(I have simplified the example; in my script I have several different include patterns.)

1 Answer

  • AWS CLI's S3 wildcard support is a bit primitive, but you can use multiple --exclude options to accomplish this. Note: the order of includes and excludes is important, because filters that appear later in the command take precedence over earlier ones.

    aws s3 cp --dryrun . s3://mybucket --recursive --exclude "*" --include "*.jpg" --exclude "*/*"
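
    If you have several include patterns, as mentioned in the question, the same approach extends: re-include each pattern after the blanket --exclude "*", and keep --exclude "*/*" last so files in subdirectories stay excluded. A minimal sketch, where "*.png" is just a hypothetical second pattern:

    aws s3 cp --dryrun . s3://mybucket --recursive --exclude "*" --include "*.jpg" --include "*.png" --exclude "*/*"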
    
