Automate S3 Like a Pro: CLI, Bash Scripts, and Lifecycle Rules


Last Updated: June 26, 2025

Before You Begin

Make sure you have:

  • AWS CLI installed (aws --version)
  • An AWS IAM user or role with S3 permissions (s3:PutObject, s3:GetObject, s3:ListBucket, etc.)
  • Access to a Linux terminal (local or EC2)
  • A test S3 bucket already created

Assumptions

I made the following assumptions about you:

  • You are familiar with basic AWS services, especially S3
  • You are comfortable running CLI commands and writing shell scripts
  • You are looking to eliminate manual file uploads, backups, or retention management

Why Automate S3?

S3 is a solid place to store logs, backups, static assets, reports, or exported data. Automating these interactions saves time, prevents mistakes, and gives you more control over your file lifecycle.

With just a few lines of CLI or Bash, you can:

  • Sync directories to or from S3
  • Schedule routine backups
  • Control storage costs with lifecycle rules

Let’s walk through how to set it all up.

Syncing Files with aws s3 sync

The sync command is your go-to for copying entire directories efficiently.
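For example, to push a local log directory to S3 (the bucket name `my-bucket` is a placeholder — substitute your own):

```shell
# Sync /var/log up to S3; only changed files are transferred.
# --delete removes remote objects that no longer exist locally.
aws s3 sync /var/log s3://my-bucket/logs/ --delete
```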

What this does:

  • Uploads all contents of /var/log to your S3 bucket
  • Only updates changed files (based on size and timestamp)
  • Deletes remote files that no longer exist locally (when the --delete flag is used)

Common flags:

| Flag | Description |
| --- | --- |
| `--delete` | Deletes files from S3 that are no longer in your source folder |
| `--exclude` / `--include` | Filters specific file types |
| `--dryrun` | Preview changes before running them |
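Combining them, here is a safe way to preview a filtered sync before committing to it (bucket name and pattern are illustrative):

```shell
# Preview a sync that skips temporary files -- nothing is transferred
# until you rerun the command without --dryrun.
aws s3 sync /var/log s3://my-bucket/logs/ \
    --exclude "*.tmp" \
    --dryrun
```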

Uploading and Downloading Files

For one-off uploads and downloads, use the cp command.

You can also use --recursive for directory-level transfers.
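Assuming a bucket named `my-bucket`, typical one-off transfers look like this:

```shell
# Upload a single file
aws s3 cp backup.tar.gz s3://my-bucket/backups/

# Download a single file
aws s3 cp s3://my-bucket/backups/backup.tar.gz ./restore/

# Copy an entire directory with --recursive
aws s3 cp ./reports s3://my-bucket/reports/ --recursive
```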

Writing a Scheduled Backup Script

Let’s build a Bash script to back up a directory to S3 every night.
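A minimal sketch of such a script, saved as e.g. `s3-backup.sh` (the source directory and bucket name are placeholders — adjust them for your environment):

```shell
#!/usr/bin/env bash
# s3-backup.sh -- nightly backup of a local directory to S3.
# SOURCE_DIR and BUCKET are placeholders; substitute your own values.
set -euo pipefail

SOURCE_DIR="/var/log"
BUCKET="s3://my-bucket/backups"
TIMESTAMP="$(date +%F)"   # e.g. 2025-06-26

# Sync into a dated prefix so each night's run lands in its own folder
aws s3 sync "$SOURCE_DIR" "$BUCKET/$TIMESTAMP/" \
    && echo "Backup of $SOURCE_DIR completed at $(date)"
```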

Make it executable:
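Assuming the script is saved as `s3-backup.sh` in the current directory:

```shell
# Create the file if it does not exist yet, then mark it executable
touch s3-backup.sh   # no-op if the script already exists
chmod +x s3-backup.sh
```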

Automating with systemd Timer (Optional)

If you want to automate the script using systemd (instead of cron), here’s a quick setup:

my-s3-backup.service
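A minimal oneshot unit, assuming the script lives at `/usr/local/bin/s3-backup.sh`:

```ini
[Unit]
Description=Nightly S3 backup
Wants=network-online.target
After=network-online.target

[Service]
Type=oneshot
ExecStart=/usr/local/bin/s3-backup.sh
```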

my-s3-backup.timer
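A matching timer that fires daily at 2am (`Persistent=true` runs a missed backup at the next boot):

```ini
[Unit]
Description=Run my-s3-backup.service daily at 2am

[Timer]
OnCalendar=*-*-* 02:00:00
Persistent=true

[Install]
WantedBy=timers.target
```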

Enable and start:
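With both files placed in `/etc/systemd/system/`:

```shell
# Reload systemd so it picks up the new units, then enable the timer
sudo systemctl daemon-reload
sudo systemctl enable --now my-s3-backup.timer

# Verify the timer is scheduled
systemctl list-timers my-s3-backup.timer
```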

Managing Old Files with S3 Lifecycle Rules

You can save on storage costs by setting lifecycle rules that:

  • Transition older files to Glacier or Deep Archive
  • Auto-delete files after a set number of days

You can do this from the AWS Console or using a JSON configuration with the CLI.

Example rule (via Console):

  • Prefix: logs/
  • Transition to Glacier after 30 days
  • Expire after 180 days
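The same rule expressed as a JSON configuration and applied via the CLI (the bucket name `my-bucket` is a placeholder):

```shell
# Write the lifecycle rule to a file, then apply it to the bucket.
cat > lifecycle.json <<'EOF'
{
  "Rules": [
    {
      "ID": "archive-then-expire-logs",
      "Filter": { "Prefix": "logs/" },
      "Status": "Enabled",
      "Transitions": [
        { "Days": 30, "StorageClass": "GLACIER" }
      ],
      "Expiration": { "Days": 180 }
    }
  ]
}
EOF

aws s3api put-bucket-lifecycle-configuration \
    --bucket my-bucket \
    --lifecycle-configuration file://lifecycle.json
```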

A Note on Security

Never hardcode credentials in your scripts.

Use one of the following instead:

  • aws configure to set up credentials locally (stored in ~/.aws/credentials)
  • Environment variables (AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY)
  • IAM roles if running in EC2 or Lambda
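For example, via environment variables (placeholder values — never commit real keys to a script or repository):

```shell
# Placeholder credentials -- substitute your own, ideally injected
# from a secrets manager rather than typed into a shell history.
export AWS_ACCESS_KEY_ID="AKIAEXAMPLEKEY"
export AWS_SECRET_ACCESS_KEY="example-secret-key"
export AWS_DEFAULT_REGION="us-east-1"
```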

Always apply the principle of least privilege.

Alternatives

You could also look into:

  • rclone – for more flexible syncing and encryption
  • s3cmd – an older CLI tool that some still prefer
  • Lambda + EventBridge – for trigger-based automation (e.g., run a backup on upload)

But for most Linux users and CLI workflows, the native AWS CLI is fast, battle-tested, and widely supported.

Summary

Here’s what we covered:

  • Sync entire folders to and from S3
  • Automate backups using Bash scripts and systemd timers
  • Set up lifecycle rules to save costs
  • Upload/download single files as needed
  • Keep it secure using environment variables or IAM roles

Want the downloadable backup script and .service/.timer files?

You can download all three files here as a ZIP archive.

The ZIP includes:

  • s3-backup.sh: the Bash script to sync local logs to S3
  • my-s3-backup.service: the systemd unit to trigger the backup
  • my-s3-backup.timer: the timer to run it daily at 2am

Want more tutorials like this?

Subscribe and get actionable DevOps & Linux automation tips straight to your inbox.

