
This project was forked from yousafkhamza/backup-to-s3-bashscript.



Directory Backup to S3 (Bash Script)



Description

This is a bash script that creates a compressed backup of a directory and moves the archive to a configured S3 bucket, using an AWS IAM user with S3 full access. As a new feature, the script works on both Ubuntu and RedHat repositories and installs its dependencies itself. So, let's roll.


Feature

  • Easy for anyone to configure
  • Compresses the chosen directory into an archive
  • The directory you enter is compressed and moved to S3 as a backup
  • All the steps are documented, including AWS IAM user and S3 bucket creation
  • Includes AWS CLI installation: the script detects whether you are on a Debian or RedHat repository and installs the package accordingly
  • All values are supplied interactively as you answer the script's prompts

Cons

  • This script is interactive, so it is not suitable for a cron job as-is; it asks a few questions at run time and uses your answers
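If you do want unattended runs (e.g. from cron), the interactive prompts can be factored out into functions that take arguments instead. The sketch below is my own illustration, not part of the repository: the function names, paths, and bucket name are placeholders, while the `/tmp/<name>-<ddmmYYYY>.tar.gz` naming and the `backups3` profile mirror what the script does.

```shell
#!/bin/bash
# Hypothetical non-interactive variant of the script's core steps,
# suitable for cron. Names here are illustrative, not from the repo.
set -euo pipefail

# Compress a directory into /tmp/<name>-<ddmmYYYY>.tar.gz and print the path.
make_archive() {
    local src_dir=$1
    local name stamp archive
    name=$(basename "$src_dir")
    stamp=$(date "+%d%m%Y")            # same date format the script uses
    archive="/tmp/${name}-${stamp}.tar.gz"
    # -C avoids tar's "Removing leading /" warning from absolute paths
    tar -czf "$archive" -C "$(dirname "$src_dir")" "$name"
    printf '%s\n' "$archive"
}

# Upload the archive using the same "backups3" profile, then clean up.
upload_archive() {
    local archive=$1 bucket=$2
    aws s3 --profile backups3 cp "$archive" "s3://${bucket}/backup/"
    rm -f "$archive"
}

# Cron usage sketch (hypothetical directory and bucket):
#   archive=$(make_archive /root/Python)
#   upload_archive "$archive" my-backup-bucket
```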

Prerequisites

  • Basic knowledge of bash
  • Basic knowledge of the AWS IAM and S3 services
  • Your IAM user credentials; enter them when the script prompts for them

IAM User Creation Steps

  1. Log in to your AWS account as a root user and go to IAM
  2. Go to Access Management >> Users
  3. Click Add User (top right corner)
  4. Enter any username you like and choose "Programmatic access" >> click Next: Permissions
  5. Set permissions >> select "Attach existing policies directly" >> choose "AmazonS3FullAccess" >> click Next: Tags
  6. Add tags (optional) >> enter a key and value if you like, or leave them blank
  7. Review your user details and click "Create User"
  8. Store the credentials locally

Reference URL: IAM user creation article
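The same user can also be created from the command line with the AWS CLI. This is a sketch under the assumption that you already have admin credentials configured; the username `backup-s3-user` is a placeholder of my choosing, not something from the repository.

```shell
# Sketch only: requires already-configured admin credentials.
# "backup-s3-user" is a hypothetical username; pick your own.
aws iam create-user --user-name backup-s3-user

# Attach the managed AmazonS3FullAccess policy, as in step 5 above.
aws iam attach-user-policy \
    --user-name backup-s3-user \
    --policy-arn arn:aws:iam::aws:policy/AmazonS3FullAccess

# Generate the programmatic-access key pair; store the output safely.
aws iam create-access-key --user-name backup-s3-user
```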


S3 Bucket Creation

  1. Go to S3 > click Create Bucket
  2. Enter any bucket name you wish, and please enable versioning (if you upload the same file multiple times, or upload modified versions, S3 stores them as versions, much like Git)
  3. Click Create Bucket


Reference URL: Creating an S3 bucket. Please use this doc; you can also restrict the bucket to your IAM user with an S3 bucket policy.
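The bucket can likewise be created and versioned from the CLI. Another hedged sketch: `my-backup-bucket` and the `ap-south-1` region (the region used later in the walkthrough) are placeholders you should replace.

```shell
# Sketch only: bucket name and region are placeholders.
aws s3api create-bucket \
    --bucket my-backup-bucket \
    --region ap-south-1 \
    --create-bucket-configuration LocationConstraint=ap-south-1

# Enable versioning, as in step 2 above.
aws s3api put-bucket-versioning \
    --bucket my-backup-bucket \
    --versioning-configuration Status=Enabled
```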


Prerequisite (dependency package)

yum install -y git        # on Debian/Ubuntu: apt install -y git

How to get

git clone https://github.com/yousafkhamza/backup-to-s3-bashscript.git
cd backup-to-s3-bashscript
chmod +x backup-to-S3.sh

You will be prompted for your credentials and the bucket name while the script runs.

Command to run the script:

[root@ip-172-31-10-180 backup-to-s3-bashscript]# ./backup-to-S3.sh
# --------------------------- or --------------------------------- #
[root@ip-172-31-10-180 backup-to-s3-bashscript]# bash backup-to-S3.sh

The output looks like this:

[root@ip-172-31-10-180 backup-to-s3-bashscript]# ./backup-to-S3.sh
AWS Package is installed.....

Start the script...

Please configure your IAM user creds on the server

AWS Access Key ID [None]:                               # <--------- Enter your access key here.
AWS Secret Access Key [None]:                           # <--------- Enter your secret key here.
Default region name [None]: ap-south-1                  # <--------- Enter the region you need.
Default output format [None]: json                      # <--------- The default output format is json.

Let's roll to create your backup to S3

Please crosscheck the credentials given below

aws_access_key_id = < your access_key, echoed back by the script >
aws_secret_access_key = < your secret_key, echoed back by the script >

Do you need to reconfigure the same [Y/N]: n

Let's roll to create your backup to S3

Enter your directory path(The directory will be compressed as a tar.gz file): /root/Python

Taking a temporary local backup of the directory.........

tar: Removing leading `/' from member names
/root/Python/
/root/Python/test/
/root/Python/test/file.txt
/root/Python/test/two.txt

Local backup taken successfully....

Enter your Bucket Name (S3 Destination): yousaf-test
Backup Moving to S3.......
upload: ../../tmp/Python-05082021.tar.gz to s3://yousaf-test/backup/Python-05082021.tar.gz

Removing local backup.....
Local backup removed successfully

Output (screenshot)

View of the S3 bucket


Behind the code

vim backup-to-S3.sh

#!/bin/bash

Ubuntu=$(which apt > /dev/null 2>&1; echo $?)
Redhat=$(which yum > /dev/null 2>&1; echo $?)
AWS=$(which aws > /dev/null 2>&1; echo $?)
date=$(date "+%d%m%Y")

if [[ "$AWS" -eq 0 ]]; then
	echo "AWS Package is installed....."
	echo ""
	echo "Start the script..."

	if [[ $(ls ~/.aws/credentials > /dev/null 2>&1; echo $?) -ne 0 ]]; then
		echo ""
		echo "Please configure your IAM user creds on the server"
		echo ""
		aws configure --profile backups3
		echo ""
		echo "Let's roll to create your backup to S3"
	fi
elif [[ "$Ubuntu" -eq 0 ]]; then
	echo "AWS Package is not installed on your Debian distro. Installing AWS package...."
	sleep 1
	echo ""
	sudo apt install -y awscli
	echo "Please configure your IAM user creds on the server"
	echo ""
	aws configure --profile backups3
	echo ""
	echo "Let's roll to create your backup to S3"
elif [[ "$Redhat" -eq 0 ]]; then
	echo "AWS Package is not installed on your RedHat distro. Installing AWS package...."
	sleep 1
	echo ""
	sudo yum install -y awscli
	echo "Please configure your IAM user creds on the server"
	echo ""
	aws configure --profile backups3
	echo ""
	echo "Let's roll to create your backup to S3"
else
	echo "Please install the AWS package and retry"
	exit 1
fi

isInFileA=$(cat ~/.aws/credentials 2> /dev/null | grep -c "backups3")
isInFileB=$(cat ~/.aws/config 2> /dev/null | grep -c "backups3")
CredInServer=~/.aws/credentials
# AWS configuration on the server
if [ -f "$CredInServer" ] && [ "$isInFileA" -eq 1 ] && [ "$isInFileB" -eq 1 ]; then
	echo ""
	echo "Please crosscheck the credentials given below"
	echo ""
	grep -A 2 "backups3" "$CredInServer" | tail -n2
	echo ""
	read -p "Do you need to reconfigure the same [Y/N]: " con1
	if [[ "$con1" =~ ^([yY][eE][sS]|[yY])+$ ]]; then
		aws configure --profile backups3
	else
		echo ""
		echo "Let's roll to create your backup to S3"
	fi
else
	echo ""
	echo "Please configure your IAM user creds on the server"
	echo ""
	aws configure --profile backups3
	echo ""
	echo "Let's roll to create your backup to S3"
fi

isInFileA=$(cat ~/.aws/credentials 2> /dev/null | grep -c "backups3")
isInFileB=$(cat ~/.aws/config 2> /dev/null | grep -c "backups3")
# Taking a local backup before the S3 upload
if [ -f "$CredInServer" ] && [ "$isInFileA" -eq 1 ] && [ "$isInFileB" -eq 1 ]; then
	echo ""
	read -p "Enter your directory path(The directory will be compressed as a tar.gz file): " path
	BackupName=$(echo "$path" | awk -F "/" '{print $NF}')
	if [ -z "$path" ]; then
		echo "Please specify an absolute directory path"
		exit 1
	else
		if [[ "$path" == */ ]]; then
			echo "The entered directory path ends with /; please remove the trailing / and retry"
		else
			if [ -d "$path" ]; then
				echo ""
				echo "Taking a temporary local backup of the directory........."
				echo ""
				sleep 2
				rm -f /tmp/"$BackupName"-*.tar.gz
				# -z is required so the .tar.gz archive is actually gzip-compressed
				tar -czvf "/tmp/$BackupName-$date.tar.gz" "$path"/
				echo ""
				echo "Local backup taken successfully...."
				# Copy the backup to S3
				echo ""
				read -p "Enter your Bucket Name (S3 Destination): " bucket
				if [ -z "$bucket" ]; then
					echo "Please specify a bucket name"
					exit 1
				else
					if [ $(aws s3 --profile backups3 ls | grep -w "$bucket" > /dev/null 2>&1; echo $?) -eq 0 ]; then
						echo "Backup Moving to S3......."
						aws s3 --profile backups3 cp "/tmp/$BackupName-$date.tar.gz" "s3://$bucket/backup/"
						echo ""
						echo "Removing local backup....."
						rm -f /tmp/"$BackupName"-*.tar.gz
						echo "Local backup removed successfully"
					else
						echo ""
						echo "Please enter a valid bucket name"
						exit 1
					fi
				fi
			else
				echo ""
				echo "Enter a valid absolute directory path"
			fi
		fi
	fi
fi

Conclusion

It's a simple bash script that takes a backup of a directory (compressing it) and then moves the archive to the S3 bucket you specify, with the help of an AWS IAM user. It may be helpful to anyone who has faced issues moving backups to S3, so it could be useful for cloud/Linux/DevOps engineers.

⚙️ Connect with Me

