How to back up files from one Linux server to another automatically with a cron job?



Fli
11-04-2014, 02:42 PM
How to set up an automatic backup script (backupremotelly.sh) that regularly backs up a local directory from one Linux server to another.

PS: if you need a backup in the opposite direction, either modify this script OR use this other script, which downloads the backups and lives on the backup server rather than the source: http://internetlifeforum.com/linux-forums/3083-script-regularly-download-backups-remote-server/

Here we continue with the script that uploads the main server's backups to a remote backup server (this script lives on the main/source server).

If you don't have a backup Linux server yet, get a VPS from: http://instantcpanelhosting.com/cart.php?gid=4
After ordering, you will receive your SSH login details. You don't even need to log in to this backup server unless you have to restore a backup; just continue with this tutorial:

1. Log in via SSH to the primary Linux server that you want to back up.
Create a directory that will hold this server's file backups. For example, if your server has a control panel that can create file and MySQL backups automatically and store them locally, point it at a directory such as /backup (mkdir /backup).
Once backups are there, create a script that copies them regularly to the remote server.
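If your control panel is not producing backups yet, a quick sketch like this confirms the directory works (paths and the archive name are example values; in the tutorial the directory is /backup):

```shell
# Create the backup directory and a dated test archive in it, so the first
# rsync run has something to transfer. backupdir=/backup in the tutorial;
# a temporary directory is the fallback here only so the sketch is safe
# to run as-is without root.
backupdir=${backupdir:-$(mktemp -d)}
mkdir -p "$backupdir"
tar -czf "$backupdir/test-backup-$(date +%F).tar.gz" -C /etc hostname
ls -lh "$backupdir"
```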

2. Enter your local server backup directory (example: cd /backup), then create the external backup script file and open it:
touch backupremotelly.sh;chmod 700 backupremotelly.sh;vi backupremotelly.sh

3. Paste the following:

#!/bin/bash

#### ABOUT ####

# This SSH/rsync backup script can:
# - set up password-less SSH access to the remote (backup) server
# - back up a local server directory to a remote server directory over SSH using rsync
# - optionally delete backups on the remote server that are older than X hours

#### VARIABLES ####

# If you need to back up only the single newest directory, whose name is not static (e.g. a cPanel daily full backup named 2016-03-02), use the following 2 commented-out lines instead of the src= below. The command "$(ls -t -A1 /vz/root/860/backup/|head -n1)" returns the name of the most recently modified/created file or directory. If each backup has a different name, you should also enable deletion of old backups, otherwise the remote disk will fill up indefinitely:
# src="/vz/root/860/backup/$(ls -t -A1 /vz/root/860/backup/|head -n1)"
# dest=/backup/860/daily/
src=/backup/
dest=/backup/cpanelvps/
destusr=root
destip=DESTINATION_SERVER_IP_WHERE_WE_UPLOAD_BACKUPS
destport=DESTINATION_SERVER_PORT_NUMBER_ie_22

# delete backup files older than X hours from the remote server to prevent the disk filling up
# This can be a disk-I/O-intensive task. If enabled, it runs on every script run.
# deletehours = files older than X hours will be deleted
deleteold=y
deletehours=72

# print variables
echo "Source: $src and destination: $destusr@$destip:$dest"

#### SET WHETHER THE SCRIPT ASKS ABOUT SSH KEY GENERATION OR BACKS UP IMMEDIATELY ####

# generate an SSH key and upload it to the remote server (first-time/initial setup)?
# y = yes, we are setting up a new remote backup server
# n = no, just do the rsync, we already have password-less access
# <empty> (keygen=) = ask on each script run
keygen=y

#### PROMPT TO ASK IF SSH GENERATION IS NEEDED ####

if [ "$keygen" == "" ];then
echo "First time? Generate an SSH access key and copy it to the remote server to set up password-less access.
y = yes
any other key = just do the rsync now"
read keygen
fi

#### SSH ACCESS KEY GENERATION AND UPLOAD ####

if [ "$keygen" == "y" ];then

ssh-keygen
ssh-copy-id -i ~/.ssh/id_rsa.pub -p "$destport" "$destusr@$destip"
ssh -p "$destport" "$destusr@$destip" mkdir -p "$dest"

else

#### PRUNE OLD BACKUPS ON REMOTE SERVER ####

if [ "$deleteold" == "y" ];then
echo "Deleting files older than $deletehours hours in $destusr@$destip:$dest"
ssh -p "$destport" "$destusr@$destip" tmpwatch -m "$deletehours" "$dest"
fi

#### DOING NEW BACKUP VIA RSYNC ####


rsync -av "$src" -e "ssh -p $destport" "$destusr@$destip:$dest"
# add -z to compress (more CPU load, less traffic)
# drop -v (verbose) if the per-file output is too noisy

fi


4. Save the changes and run the script for the first time: sh backupremotelly.sh
(the SSH key access will be set up; during key generation do not enter any passphrase, so that future automated backups run password-less)

If you end up logged in on the remote server, execute "exit" to log out.
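As an aside, if you ever need to redo the key setup by hand instead of via keygen=y, a rough equivalent is the following (the port and IP in the commented line are placeholders you must replace):

```shell
# Create the key directory if needed, then generate a password-less key.
# The empty passphrase (-N "") is what lets cron run the backup without prompting.
mkdir -p ~/.ssh && chmod 700 ~/.ssh
[ -f ~/.ssh/id_rsa ] || ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa
# Then push the public key to the backup server (placeholder values):
# ssh-copy-id -i ~/.ssh/id_rsa.pub -p 22 root@DESTINATION_SERVER_IP
```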

5. Open the script again (vi backupremotelly.sh) and change "keygen=y" to "keygen=n", so from now on the script only does backups, without asking.

Test the first backup by running the script again: sh backupremotelly.sh

Then you can log in to the remote server (ssh -p $destport $destusr@$destip) and check the destination directory to confirm the backup was created. If all is OK, "exit" out of the remote server.

6. Make a symbolic link in the cron job directory of your choice:
/etc/cron.hourly
/etc/cron.daily
/etc/cron.weekly
/etc/cron.monthly

(example: ln -s /backup/backupremotelly.sh /etc/cron.daily/backupremotelly.sh)

This way the backupremotelly.sh located in /backup will be run regularly by cron, sending the /backup contents to the remote server. Note: on Debian/Ubuntu, run-parts skips file names containing a dot, so there the link must be named without the .sh extension (ln -s /backup/backupremotelly.sh /etc/cron.daily/backupremotelly).
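If you need finer control over when the backup runs than the cron.* directories give you, a root crontab entry is an alternative. A sketch (the 03:30 run time and log path are just example values):

```shell
# Alternative scheduling: instead of the symlink, add a root crontab entry.
# Run "crontab -e" and add one line, e.g. daily at 03:30 with output logged:
#   30 3 * * * /bin/sh /backup/backupremotelly.sh >> /var/log/backupremotelly.log 2>&1
```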

SCRIPT IMPROVEMENT:

The script above uses a simple rsync -av to upload the backups, then deletes remote backups older than $deletehours hours on every run.
What if your backup folders and backup files are always named the same (no date/time stamp in their names)? In that case it may be a good idea to use rsync -au (update mode, which skips files that are already newer on the receiver) plus deleting only remote files not modified in, say, the last 30 days, instead of deleting all old files.
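A sketch of that variant, keeping the variable names from the script above. The remote commands are shown as comments because they need your real server details; below them, the same find-based pruning rule is demonstrated on a local temporary directory so you can see exactly what it deletes:

```shell
# Update-style sync: -u skips files that are already newer on the receiver.
#   rsync -au -e "ssh -p $destport" "$src" "$destusr@$destip:$dest"
# Prune by modification time with find instead of tmpwatch:
#   ssh -p "$destport" "$destusr@$destip" find "$dest" -type f -mtime +30 -delete

# Local demonstration of the pruning rule:
keepdays=30
demo=$(mktemp -d)
touch -d "40 days ago" "$demo/old-backup.tar.gz"   # simulated stale backup
touch "$demo/fresh-backup.tar.gz"                  # simulated recent backup
find "$demo" -type f -mtime +"$keepdays" -delete
ls "$demo"    # only fresh-backup.tar.gz should remain
```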

-------
If you have any fixes or advice, please kindly share. Thanks!