Handle outdated s3 backups

Lately I have been using S3 backups as described in the Roots guide:

It works fine, but I don’t know how to remove old backups from the S3 bucket. I have no experience with S3, and I tried to manage it with a lifecycle setup, unsuccessfully. Is it possible to remove old backups automatically (for example, keep only the latest 3 backups)?

You could use lifecycle rules to expire old objects, transition them to cheaper storage tiers, or just delete them:

https://docs.aws.amazon.com/AmazonS3/latest/dev/object-lifecycle-mgmt.html
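As a minimal sketch of the lifecycle approach, a rule like the following expires backups after a fixed number of days. The bucket name, the `backups/` prefix, and the 30-day window are all assumptions for illustration; it assumes the AWS CLI is installed and configured with credentials for the bucket.

```shell
# Hypothetical bucket name and prefix -- adjust to your setup.
# Expires any object under backups/ 30 days after it was created.
aws s3api put-bucket-lifecycle-configuration \
  --bucket my-backup-bucket \
  --lifecycle-configuration '{
    "Rules": [{
      "ID": "expire-old-backups",
      "Status": "Enabled",
      "Filter": { "Prefix": "backups/" },
      "Expiration": { "Days": 30 }
    }]
  }'
```

Note this is purely time-based: every object older than 30 days is deleted, regardless of how many backups remain.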


Thank you for the answer! OK, so it is possible, but I’d prefer to keep the n latest files rather than expiring based on time. The problem I see with expiration after n days is that if the server stops making backups (for any reason), all the backups will eventually expire and disappear.

I’m reading the docs. I’ll come back and mark the thread solved if I find the right way to do it.


The simplest way I found to keep at least one copy of each backup is:

1. Remove the timestamp from the file name:

Replace

# site/scripts/backup-to-s3.sh
ARCHIVE_FILENAME=$SITE-$ENVIRONMENT-$TIMESTAMP.tar.gz

with

# site/scripts/backup-to-s3.sh
ARCHIVE_FILENAME=$SITE-$ENVIRONMENT.tar.gz

2. Enable versioning for the bucket.

3. Set a bucket lifecycle rule that expires previous (noncurrent) versions and leaves the current version untouched.
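Steps 2 and 3 could be sketched with the AWS CLI like this. The bucket name and the 30-day window are assumptions; with a fixed file name, each new backup overwrites the object and the old copy becomes a noncurrent version, which the rule then expires. The current (latest) version is never touched by a noncurrent-version rule, so at least one backup always survives even if the server stops uploading.

```shell
# Hypothetical bucket name -- adjust to your setup.
# Step 2: enable versioning, so overwriting the backup keeps
# the previous copy as a noncurrent version.
aws s3api put-bucket-versioning \
  --bucket my-backup-bucket \
  --versioning-configuration Status=Enabled

# Step 3: expire noncurrent versions 30 days after they are superseded.
# The empty Filter applies the rule to the whole bucket.
aws s3api put-bucket-lifecycle-configuration \
  --bucket my-backup-bucket \
  --lifecycle-configuration '{
    "Rules": [{
      "ID": "expire-old-backup-versions",
      "Status": "Enabled",
      "Filter": {},
      "NoncurrentVersionExpiration": { "NoncurrentDays": 30 }
    }]
  }'
```

If you specifically want "keep the n latest" semantics, S3 also supports a `NewerNoncurrentVersions` field alongside `NoncurrentDays` in `NoncurrentVersionExpiration`, which retains that many noncurrent versions regardless of age.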

Yep looks good to me!

This topic was automatically closed after 42 days. New replies are no longer allowed.