This is a bash script that creates a compressed backup of a directory and moves the resulting archive to a configured S3 bucket, using an AWS IAM user with S3 full access. As a new feature, the script works on both Ubuntu and RedHat repositories and installs its dependencies by itself. So, let's roll.
- Easy for anyone to configure
- Compresses the given directory into an archive
- Uploads the compressed backup of whichever directory you enter to S3
- All the steps are covered below, including AWS IAM user and S3 bucket creation
- Includes AWS CLI installation; the script detects whether you are on a Debian or RedHat repository and installs the package accordingly
- All values are taken from your answers while the script runs
- This script is not suitable for a cron job; it is a manual, interactive tool that asks a few questions and builds everything from your answers
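The "installs its dependencies by itself" feature comes down to probing for a package manager. A minimal standalone sketch of that check (using `command -v` rather than the `which` call the script itself uses):

```shell
#!/bin/bash
# Probe for apt (Debian/Ubuntu) or yum (RedHat/CentOS) and report which
# package manager would be used to install the AWS CLI. Standalone sketch;
# the real script stores the exit status of `which` for each tool instead.
if command -v apt >/dev/null 2>&1; then
  pm="apt"
elif command -v yum >/dev/null 2>&1; then
  pm="yum"
else
  pm="none"
fi
echo "package manager detected: $pm"
```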
- Basic knowledge of bash
- Basic knowledge of the AWS IAM and S3 services
- Your IAM user credentials at hand; you enter them when the script prompts
- Log in to your AWS account as the root user and go to IAM
- Go to Access Management >> Users
- Click Add User (top right corner)
- Enter any username you like and choose "Programmatic access" >> click Next: Permissions
- Set Permissions >> select "Attach existing policies directly" >> choose "AmazonS3FullAccess" >> click Next: Tags
- Add Tags (optional) >> enter a key and value as you like, or leave them blank
- Review your user details and click "Create User"
- Store the credentials on your local machine
Reference URL:: IAM User creation article
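If you prefer the CLI to the console, the same user can be created with `aws iam` commands. The sketch below only prints the commands rather than executing them; `backup-user` is an example name, and you would need an already-configured admin profile to actually run them:

```shell
#!/bin/bash
# Print (not execute) the AWS CLI equivalents of the console steps above.
# "backup-user" is an example username, not something the script requires.
user="backup-user"
cmds=$(cat <<EOF
aws iam create-user --user-name $user
aws iam attach-user-policy --user-name $user --policy-arn arn:aws:iam::aws:policy/AmazonS3FullAccess
aws iam create-access-key --user-name $user
EOF
)
echo "$cmds"
```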
- Go to S3 > Click Create Bucket
- Enter any bucket name you like and enable versioning (if you upload the same file multiple times, or upload modified copies, S3 stores each one as a version, much like Git)
- Click create bucket
Reference URL:: Creating an S3 bucket; please use this doc, and you can secure your bucket to the IAM user with an S3 bucket policy
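The bucket steps above also have CLI equivalents. As before, this sketch only prints the commands; the bucket name is an example, and the `LocationConstraint` is required for regions other than us-east-1:

```shell
#!/bin/bash
# Print (not execute) the CLI equivalents of the console steps above.
# Bucket name and region are example values.
bucket="my-backup-bucket"
region="ap-south-1"
cmds=$(cat <<EOF
aws s3api create-bucket --bucket $bucket --region $region --create-bucket-configuration LocationConstraint=$region
aws s3api put-bucket-versioning --bucket $bucket --versioning-configuration Status=Enabled
EOF
)
echo "$cmds"
```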
yum install -y git    # on Debian/Ubuntu: apt install -y git
git clone https://github.com/yousafkhamza/backup-to-s3-bashscript.git
cd backup-to-s3-bashscript
chmod +x backup-to-S3.sh
Change your creds and bucket name in the var.py file
Command to run the script::
[root@ip-172-31-10-180 backup-to-s3-bashscript]# ./backup-to-S3.sh
# --------------------------- or --------------------------------- #
[root@ip-172-31-10-180 backup-to-s3-bashscript]# bash backup-to-S3.sh
[root@ip-172-31-10-180 backup-to-s3-bashscript]# ./backup-to-S3.sh
AWS Package is installed.....
Start the script...
Please configure your IAM user creds on the server
AWS Access Key ID [None]: # <--------- Enter your access_key here.
AWS Secret Access Key [None]: # <--------- Enter your secret key here.
Default region name [None]: ap-south-1 # <--------- Enter the region you need
Default output format [None]: json # <---------- Default output format is json
Let's roll to create your backup to S3
Please crosscheck the credentials given below
aws_access_key_id = < your access_key, confirmed back by the script >
aws_secret_access_key = < your secret_key, confirmed back by the script >
Do you need to reconfigure the same [Y/N]: n
Let's roll to create your backup to S3
Enter your directory path (the directory will be compressed as a tar.gz file): /root/Python
Taking a temporary local copy of the directory.........
tar: Removing leading `/' from member names
/root/Python/
/root/Python/test/
/root/Python/test/file.txt
/root/Python/test/two.txt
Local backup taken successfully
Enter your Bucket Name (S3 Destination): yousaf-test
Moving backup to S3.......
upload: ../../tmp/Python-05082021.tar.gz to s3://yousaf-test/backup/Python-05082021.tar.gz
Removing local backup.....
Local backup removed successfully
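For reference, `aws configure --profile backups3` writes your answers to two files under `~/.aws/`, roughly in this form (the key values shown are placeholders):

```ini
# ~/.aws/credentials
[backups3]
aws_access_key_id = <your access key>
aws_secret_access_key = <your secret key>

# ~/.aws/config
[profile backups3]
region = ap-south-1
output = json
```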
vim backup-to-S3.sh
#!/bin/bash
Ubuntu=$(which apt > /dev/null 2>&1; echo $?)
Redhat=$(which yum > /dev/null 2>&1; echo $?)
AWS=$(which aws > /dev/null 2>&1; echo $?)
date=$(date "+%d%m%Y")
if [[ "$AWS" -eq 0 ]]; then
echo "AWS Package is installed....."
echo ""
echo "Start the script..."
if [[ $(ls ~/.aws/credentials > /dev/null 2>&1; echo $?) -ne 0 ]]; then
echo ""
echo "Please configure your IAM user creds on the server"
echo ""
aws configure --profile backups3
echo ""
echo "Let's roll to create your backup to S3"
fi
elif [[ "$Ubuntu" -eq 0 ]]; then
echo "AWS Package is not installed on your debian distro. Installing AWS package...."
sleep 1
echo ""
sudo apt install -y awscli
echo "Please configure your IAM user creds on the server"
echo ""
aws configure --profile backups3
echo ""
echo "Let's roll to create your backup to S3"
elif [[ "$Redhat" -eq 0 ]]; then
echo "AWS Package is not installed on your RedHat distro. Installing AWS package...."
sleep 1
echo ""
sudo yum install -y awscli
echo "Please configure your IAM user creds on the server"
echo ""
aws configure --profile backups3
echo ""
echo "Let's roll to create your backup to S3"
else
echo "Please install AWS Package..... and retry the same"
exit 1
fi
isInFileA=$(cat ~/.aws/credentials 2>/dev/null | grep -c "backups3")
isInFileB=$(cat ~/.aws/config 2>/dev/null | grep -c "backups3")
CredInServer=~/.aws/credentials
# AWS Configuration on the server
if [ -f "$CredInServer" ] && [ "$isInFileA" -eq 1 ] && [ "$isInFileB" -eq 1 ]; then
echo ""
echo "Please crosscheck the credentials given below"
echo ""
grep -A 2 "backups3" "$CredInServer" | tail -n2
echo ""
read -p "Do you need to reconfigure the same [Y/N]: " con1
if [[ "$con1" =~ ^([yY][eE][sS]|[yY])+$ ]]; then
aws configure --profile backups3
else
echo ""
echo "Let's roll to create your backup to S3"
fi
else
echo ""
echo "Please configure your IAM user creds on the server"
echo ""
aws configure --profile backups3
echo ""
echo "Let's roll to create your backup to S3"
fi
isInFileA=$(cat ~/.aws/credentials 2>/dev/null | grep -c "backups3")
isInFileB=$(cat ~/.aws/config 2>/dev/null | grep -c "backups3")
CredInServer=~/.aws/credentials
# Taking a local backup before the S3 upload
if [ -f "$CredInServer" ] && [ "$isInFileA" -eq 1 ] && [ "$isInFileB" -eq 1 ]; then
echo ""
read -p "Enter your directory path (the directory will be compressed as a tar.gz file): " path
BackupName=$(echo "$path" | awk -F "/" '{print $NF}')
if [ -z "$path" ]; then
echo "Please specify an absolute directory path"
exit 1
else
if [[ "$path" == */ ]]; then
echo "The directory path you entered ends with /; please remove the trailing / and try again"
else
if [ -d "$path" ]; then
echo ""
echo "Taking a temporary local copy of the directory........."
echo ""
sleep 2
rm -f "/tmp/$BackupName"-*.tar.gz
tar -czvf "/tmp/$BackupName-$date.tar.gz" "$path"/
echo ""
echo "Local backup taken successfully"
# Backup Copy to S3
echo ""
read -p "Enter your Bucket Name (S3 Destination): " bucket
if [ -z "$bucket" ]; then
echo "Please specify a Bucket name"
exit 1
else
if aws s3 --profile backups3 ls | grep -qw "$bucket"; then
echo "Moving backup to S3......."
aws s3 --profile backups3 cp "/tmp/$BackupName-$date.tar.gz" "s3://$bucket/backup/"
echo ""
echo "Removing local backup....."
rm -f "/tmp/$BackupName"-*.tar.gz
echo "Local backup removed successfully"
else
echo ""
echo "Please enter a valid bucket name"
exit 1
fi
fi
else
echo ""
echo "Please enter a valid absolute directory path"
fi
fi
fi
fi
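The naming convention the script uses for the archive, `<directory name>-<ddmmYYYY>.tar.gz` placed in `/tmp`, can be reproduced on its own; `demo-src` below is a throwaway example directory created just for the demonstration:

```shell
#!/bin/bash
# Reproduce the script's backup naming: last path component + ddmmYYYY date,
# compressed with gzip into /tmp. "demo-src" is a throwaway example dir.
path="/tmp/demo-src"
mkdir -p "$path" && echo "hello" > "$path/file.txt"
date=$(date "+%d%m%Y")
BackupName=$(echo "$path" | awk -F "/" '{print $NF}')
tar -czf "/tmp/$BackupName-$date.tar.gz" -C "$(dirname "$path")" "$BackupName"
ls -l "/tmp/$BackupName-$date.tar.gz"
```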
It's a simple bash script that takes a compressed backup of a directory and then moves it to the S3 bucket you specify, with the help of an AWS IAM user. It may help anyone who has faced issues moving backups to S3, so it could be useful for cloud/Linux/DevOps engineers.