bash - Looking to download AWS ELB logs from S3 and remove them
I'm trying to use the AWS CLI to move files from S3 to EBS on an EC2 instance. The issue I'm having is that there is no "move" command in the AWS CLI; it would make life easier if there was.
From a logic standpoint, I need to create a script that copies data from an S3 bucket (s3://bucket_name/awslogs/...) and then removes each file it has copied. I know I can set lifecycle rules to expire the data, but in case the script that copies the data from S3 to EBS doesn't run, I don't want to lose the data.
The AWS CLI supports recursive copies and removes, but I need some type of command that executes "aws s3 cp" with a filename variable and then executes "aws s3 rm" on that same filename. I've searched and don't know of a tool/script that already does this. IANAP, so I wouldn't know how to do this with a Python boto script; I'm hoping there is an easy way to do it with a bash shell script. Any help is appreciated. Thanks.
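To illustrate the kind of loop I'm imagining (an untested sketch; the bucket name, prefix, and local path below are just placeholders):
#!/bin/bash
# Untested sketch: copy each object down from S3, then remove it from the bucket,
# but only if the copy succeeded. Bucket, prefix, and target are placeholders.
bucket='bucket_name'
prefix='awslogs/'
target='/mnt/ebs/logs/'

mkdir -p "${target}"

# List the keys under the prefix (this simple parsing assumes no spaces in key names),
# copy each one locally, and delete it from S3 only if the copy succeeded.
for key in $(aws s3 ls "s3://${bucket}/${prefix}" --recursive | awk '{print $4}'); do
    if aws s3 cp "s3://${bucket}/${key}" "${target}"; then
        aws s3 rm "s3://${bucket}/${key}"
    fi
done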
I'd suggest using s3cmd for this task. s3cmd is based on boto and supports syncing directories to and from S3. In your case you'd get a reliable sync between S3 and a local path on the machine executing s3cmd.
For example:
$ mkdir /home/user/logs/
$ s3cmd sync s3://org.example.mybucket/ /home/user/logs/
There is one flaw, however: s3cmd sync has no parameter to delete the source files after syncing them to the local filesystem. You still need a script that iterates over the files synced to local disk and calls, for example:
$ s3cmd del s3://org.example.mybucket/log.txt
So, assuming there are no subdirectories in the synced bucket, a script like this should do the trick:
#!/bin/bash
bucket='org.example.mybucket'
target='/home/user/logs/'

# Ensure the target directory exists
mkdir -p ${target}

# Sync the files down from the bucket
s3cmd sync s3://${bucket}/ ${target}

# Iterate over the synced files and delete each one from the bucket
for filename in ${target}*; do
    s3cmd del s3://${bucket}/$(basename ${filename})
done
Please be careful: I wrote this script off the top of my head without testing it, so it might contain errors.
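If the bucket does contain subdirectories, the delete loop could instead walk the local tree and rebuild each S3 key from the path relative to the target directory. An equally untested variant, reusing the bucket and target variables from the script above:
# Untested variant for buckets with subdirectories:
# derive the S3 key by stripping the local ${target} prefix from each file path.
find "${target}" -type f | while read -r filename; do
    key="${filename#${target}}"
    s3cmd del "s3://${bucket}/${key}"
done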