Here is a Linux bash script that lets anyone set up automatic, regular backups of their website files and MySQL database. Then there is a second script, which we set up on a remote server to download these backups so the site can be restored in case the primary hosting server goes offline.

These scripts should work on any Linux server (most hosting runs Linux), and I tested them on a cPanel hosting account.

SCRIPT 1 (a backup script which backs up the MySQL db each time it is run, and backs up the files approx. every 30th run)

#!/bin/bash

# VARIABLES ---------------
# (the values below are examples -- adjust every path and credential to your own account)

# path which should be backed up (website directory)
filesbckp_src_path=/home/MyUsername/public_html

# path where to store all backups (put a random number at the end of the directory name to decrease the chance of spiders downloading your data)
bckp_store_path=/home/MyUsername/backups_8347

# path to exclude from the files archive (here, the backup directory itself)
filesbckp_excludepath=$bckp_store_path

# delete backup files older than X days
daystokeep=8

# files backup filename
filesbckp_file_path=$bckp_store_path/files_bckp.tar.gz

# mysql backup filename
mysql_bckp_filename=mysql_bckp_$(date +"%d_%m_%Y").gz
mysql_file_path=$bckp_store_path/$mysql_bckp_filename

# mysql db credentials
mysql_dbuser=mydbuser
mysql_dbname=mydbname
mysql_pass=$(echo "mypassword187" | tr "1" "9")
# the "tr" above replaces every "1" with "9"; 9 is the correct digit and 1 is put into the phrase only so the plain password does not appear in the file
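The effect of the tr obfuscation can be checked directly; with the example password above, every "1" becomes a "9":

```shell
# tr substitutes characters one-for-one: every "1" in the input becomes "9"
echo "mypassword187" | tr "1" "9"
# prints: mypassword987
```

Note this only keeps the password out of a casual glance at the file; anyone who can read the script can still recover it, which is why restricting the script's file permissions matters.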



## COMMANDS ----------

# Delete backups older than $daystokeep days
find $bckp_store_path -type f -mtime +$daystokeep -delete

# MySQL backup itself
mysqldump -u $mysql_dbuser -p$mysql_pass $mysql_dbname | gzip -c > $mysql_file_path
chmod 644 $mysql_file_path

# Files backup itself (do the files backup approx. every 30 days, i.e. when a random number between 1 and 30 matches the value 5)
if [ "$(( ( RANDOM % 30 ) + 1 ))" == "5" ]; then
    rm -rf $bckp_store_path/*.tar.gz # delete the previous files backup archive, assuming it has the .tar.gz extension
    tar czf $filesbckp_file_path $filesbckp_src_path --exclude "$filesbckp_excludepath"
    chmod 644 $filesbckp_file_path
fi
So I added the above code into a script file on my hosting account (where my website is hosted), in the directory /home/MyUsername/, and then changed the script's mode to 600 (chmod 600) to make it harder for anyone to read your plain MySQL credentials.

Then I created a directory for the backups so that it matches the variable set in the script.

The backup script above should be run daily or less often, not more often, since the file naming is designed for daily backups (a second run on the same day would overwrite that day's MySQL dump).
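The "approximately every 30th run" condition can be sanity-checked on its own: RANDOM % 30 + 1 always lands in the range 1..30, so the comparison with 5 succeeds on roughly one run in thirty. A standalone sketch (not part of the backup script):

```shell
#!/bin/bash
# Draw the same kind of number the backup script uses and confirm its range
n=$(( ( RANDOM % 30 ) + 1 ))
if [ "$n" -ge 1 ] && [ "$n" -le 30 ]; then
    echo "in range: yes"
fi
```

Because the draw is random, the files backup does not land exactly every 30 days; over a year it averages about one archive per month.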

So we have the script created and the backup directory created; next, we create a .htaccess file in our backup directory with the line "Options All -Indexes" in it, to disallow anyone from listing the directory contents.

Then set up a cronjob from the hosting control panel.
The cronjob command in my case, for example: sh /home/MyUsername/
Make it run daily around midnight, and make sure that cronjob reports are sent to your email so you can see if cron is complaining or producing any errors.
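For control panels that accept raw crontab syntax, an entry could look like the sketch below. The script name backup.sh and the email address are assumptions; substitute your own.

```shell
# hypothetical crontab entry (script name backup.sh is an assumption; use yours)
# MAILTO makes cron email the job's output to you
MAILTO=me@example.com
# run at 00:05 every day
5 0 * * * /bin/sh /home/MyUsername/backup.sh
```
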

Make sure the hosting account has at least 40% free space, so there is always room for the backup archives alongside the site.
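A backup is only as good as its restore. A MySQL dump made this way can be restored with gunzip < mysql_bckp_DD_MM_YYYY.gz | mysql -u dbuser -p dbname, and the tar side can be rehearsed locally on throwaway data (all paths below are temporary directories, nothing touches the real site):

```shell
#!/bin/sh
# Rehearse the tar backup/restore cycle on throwaway data
src=$(mktemp -d)   # stands in for the website directory
dst=$(mktemp -d)   # stands in for the backup directory
echo "hello" > "$src/index.html"

tar czf "$dst/files_bckp.tar.gz" -C "$src" .   # back up
mkdir "$dst/restore"
tar xzf "$dst/files_bckp.tar.gz" -C "$dst/restore"   # restore

cat "$dst/restore/index.html"
# prints: hello
```

Rehearsing a restore once before you need it is the only way to know the archives are actually usable.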

SCRIPT 2 (located on an external server; downloads data from the website server regularly)

The next step is to set up a shell script on another server so we can download the backups to this external server automatically. If you don't have such an external server yet, order a cheap VPS.

#!/bin/sh

# Remove backups older than 60 days
find /backup/mywebsitename -mtime +60 -delete

# Sleep one hour so the backups on the source server have time to be created before we start downloading
# (check that this server's clock matches the source server's, or adjust the delay to make sure we do not download backups too early, before they are created)
sleep 3600

# DL today's MySQL backup only
# (the URLs below are placeholders; point them at your backup directory and the filenames produced by script 1)
wget -P /backup/mywebsitename/ --no-check-certificate "https://www.example.com/backups_8347/mysql_bckp_$(date +"%d_%m_%Y").gz"

# DL files backup
wget -P /backup/mywebsitename/ --no-check-certificate "https://www.example.com/backups_8347/files_bckp.tar.gz"
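The cleanup line above deletes by file age, which is easy to rehearse with back-dated files (GNU touch -d is assumed, as found on typical Linux VPSes):

```shell
#!/bin/sh
# Create one fresh and one 61-day-old file, then run the same cleanup expression
dir=$(mktemp -d)
touch "$dir/new.gz"
touch -d "61 days ago" "$dir/old.gz"

find "$dir" -type f -mtime +60 -delete

ls "$dir"
# prints: new.gz   (old.gz was deleted)
```

-mtime +60 matches files last modified more than 60 full days ago, so the fresh file survives and the back-dated one is removed.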
For example, this script will live in /backup/mywebsitename/
We will set it to be executed by cron, so add execute permissions to it: chmod 700 /backup/mywebsitename/
Then one can either:
A) symlink it into the cron directory, like: ln -s /backup/mywebsitename/ /etc/cron.daily/
B) set up a cronjob, for example in a hosting control panel or via the server CLI (crontab -e), then add the line:
1 1 * * * /bin/sh /backup/mywebsitename/
(1:01 a.m. every day, so the backups on the source server have time to be created after midnight)
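wget can leave a truncated file behind when a download is interrupted, so it is worth verifying each archive after downloading. gzip -t tests integrity without extracting; here is a self-contained sketch, with a temp directory standing in for /backup/mywebsitename:

```shell
#!/bin/sh
# Verify every downloaded .gz archive; a truncated or corrupt file fails gzip -t
dir=$(mktemp -d)
echo "data" | gzip > "$dir/good.gz"      # a valid archive
echo "not gzip at all" > "$dir/bad.gz"   # simulates a broken download

for f in "$dir"/*.gz; do
    if gzip -t "$f" 2>/dev/null; then
        echo "OK: $(basename "$f")"
    else
        echo "CORRUPT: $(basename "$f")"
    fi
done
# prints: CORRUPT: bad.gz
#         OK: good.gz
```

Since the cron job already emails its output, a CORRUPT line in the report tells you a download needs to be repeated.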

Do you have your own script, or do you have ideas? Please share.

If you need (paid) help setting up backup scripts, please send a request here. Please mention this webpage's URL and your hosting account login details.