Hi, I just want to share the script I created today.

It's a 100% automated way to regularly back up a MySQL database to a file and copy that file to an external server.

A Linux bash script backs up MySQL to a folder, compresses the dump, chmods it, moves it to a certain WWW directory, and keeps only files younger than X days. Then another script on an external server downloads the backups and likewise keeps only files younger than X days. This way, using cron, the MySQL database is backed up to the external Linux server 100% automatically.

Bash MySQL backup script on the source server

# backup file name: rtdm_mysql_bckp_ plus today's date
fname=rtdm_mysql_bckp_$(date +"%d_%m_%Y")

# the database password, stored obfuscated so the real value never appears
# verbatim in the script; tr swaps every 1 back to 9 at runtime
pass=$(echo "1ad1kcvrmlts" | tr "1" "9")

# dump the MySQL database to a file
mysqldump -u databaseusername -p"$pass" databasename > "$fname.sql"

# compress the .sql dump; gzip replaces fname.sql with fname.sql.gz
gzip -f "$fname.sql"

# make the archive readable for later download and move it to the backup folder
chmod 644 "$fname.sql.gz"
mv "$fname.sql.gz" /home/myusername/public_html/myfolder/sqlbckp/

# prune the backup folder: delete all backup files older than 14 days
find /home/myusername/public_html/myfolder/sqlbckp/ -type f -mtime +14 -delete

The backup script, described in other words:
First I set the variable "fname", the name of the backup file: "rtdm_mysql_bckp_" plus today's date, plus the extension. Example: rtdm_mysql_bckp_30_11_2013.sql.gz
Then I set the password variable, which hopefully never shows up in logs in plain form; the "tr" call replaces every 1 with 9 at runtime.
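That obfuscation step can be checked on its own (the stored string below is the one from the script; substitute your own):

```shell
# Decode the stored string at runtime: tr swaps every "1" back to "9"
stored="1ad1kcvrmlts"
pass=$(printf '%s' "$stored" | tr "1" "9")
echo "$pass"   # prints 9ad9kcvrmlts
```

This only hides the password from casual reading of the script and shell history; anyone who can read the script can decode it.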

Then I run mysqldump with my MySQL credentials.
Then I compress the .sql file into .sql.gz.
Then the plain .sql file goes away, keeping only the .sql.gz; I change its permissions to 644 so I can later download it from my website, and move it to the website folder. (I also blocked listing of that folder's contents with an .htaccess file containing: Options -Indexes)
Finally I use the "find" command to prune the backup folder: since I will set up a cron job to run this backup script every day, the folder would otherwise keep growing, so files older than 14 days are deleted.
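The retention rule can be tried safely on throwaway files before pointing it at real backups. A small sketch using a temporary directory and GNU touch -d to fake file ages:

```shell
# Simulate the 14-day retention rule in a scratch directory
dir=$(mktemp -d)
touch -d "20 days ago" "$dir/rtdm_mysql_bckp_old.sql.gz"   # past the cutoff
touch "$dir/rtdm_mysql_bckp_new.sql.gz"                    # fresh backup
find "$dir" -type f -mtime +14 -delete
ls "$dir"   # only rtdm_mysql_bckp_new.sql.gz survives
```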

How to automate backups?

Run the bash script regularly using cron, with the command: sh /address/to/the/script.sh

For example, I ran "crontab -e" and added: 2 4 * * * sh /home/myusername/sqlbckp.sh
This runs the backup every day at 04:02 (minute 2 of hour 4).
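For reference, the five cron fields in that line, plus a variant that appends output to a log so failures are visible (the log path is my own choice, not from the original setup):

```shell
# fields:  minute hour day-of-month month day-of-week  command
# 2 4 * * * sh /home/myusername/sqlbckp.sh
# same job, logging stdout and stderr:
# 2 4 * * * sh /home/myusername/sqlbckp.sh >> /home/myusername/sqlbckp.log 2>&1
```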

Downloading backups to the external server

You can then set up another bash script and cron job on an external server (off-site backups are MUCH better) that downloads the backup files using wget. Example script on the external server:

wget -P /path/to/mybackupdirectory/ http://mysite.com/myfolder/sqlbckp/rtdm_mysql_bckp_$(date +"%d_%m_%Y").sql.gz --no-check-certificate
find /path/to/mybackupdirectory/ -type f -mtime +90 -delete

What does it do? It downloads the file from the source server where the backups are made, then does maintenance with "find", deleting files older than 90 days (my SQL backup files are small and I have enough space, so I can afford that retention). Note that --no-check-certificate only has an effect when downloading over HTTPS.
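One possible improvement to the external script: only prune when the download actually succeeded, so a dead source server cannot slowly empty the backup directory. This is my own sketch, not part of the original post; fetch_and_prune and the FETCH override are invented names, and the paths/URL are the ones used above:

```shell
#!/bin/sh
# FETCH is the download command; overridable (e.g. with a stub when testing)
FETCH=${FETCH:-wget -q -P}

fetch_and_prune() {
    dir=$1
    url=$2
    if $FETCH "$dir" "$url"; then
        # download succeeded: safe to delete backups older than 90 days
        find "$dir" -type f -mtime +90 -delete
    else
        echo "download failed; keeping old backups in $dir" >&2
        return 1
    fi
}
```

Usage would be: fetch_and_prune /path/to/mybackupdirectory/ "http://mysite.com/myfolder/sqlbckp/rtdm_mysql_bckp_$(date +"%d_%m_%Y").sql.gz"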

Do you have a better script, or improvements to suggest?