When I was in charge of the web server for Kenzer and Company, we used a 7-day rolling backup system. This ensured that, in case of a problem, we had several days' worth of backups to fall back on.
This proved very helpful when we once lost our forum database right before the nightly backup job ran. That left the most recent backup file useless, so the previous night's run was used to restore the database.
/bin/rm -f /usr/home/website_name/backup/database_name.1.gz
/usr/local/bin/mysqldump -f -pdatabase_password --result-file=/usr/home/website_name/backup/database_name.1 -u database_username --databases database_name
gzip /usr/home/website_name/backup/database_name.1
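The lines above only refresh the newest slot. The full seven-day rotation could be sketched as below; this is a reconstruction rather than the original script, so the `rotate_backups` function name is an assumption and the paths are the same placeholders used throughout the article.

```shell
#!/bin/sh
# Sketch of the 7-day rolling rotation (reconstruction; names are placeholders).
rotate_backups() {
    dir=$1; db=$2
    rm -f "$dir/$db.7.gz"               # discard the oldest copy
    for i in 6 5 4 3 2 1; do            # shift .1 through .6 up one slot
        [ -f "$dir/$db.$i.gz" ] && mv "$dir/$db.$i.gz" "$dir/$db.$((i+1)).gz"
    done
}

# Nightly run: rotate, then dump tonight's copy into slot 1 and compress it.
# rotate_backups /usr/home/website_name/backup database_name
# /usr/local/bin/mysqldump -f -pdatabase_password -u database_username \
#     --result-file=/usr/home/website_name/backup/database_name.1 \
#     --databases database_name
# gzip /usr/home/website_name/backup/database_name.1
```

With the rotation in place you always have the last seven nights on disk, which is exactly what made the forum restore above possible.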
If you look at the code you will notice that the backup directory is not public_html, where your HTML files go. It is very important to keep your online backups in a directory that is not publicly accessible!
It is also important to download your backups after you make them, to guard against the server losing its hard drive. You could put a secondary drive on the server, map it to /perm_backups, and have this same cron job copy the file there as well. Either way, you need to keep your backups in at least one additional place to protect against a hard disk failure.
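That extra copy can be tacked onto the end of the same nightly job. A minimal sketch, assuming the second disk is mounted at /perm_backups as described; the `copy_backup` helper and the scp host are purely illustrative:

```shell
#!/bin/sh
# Copy a finished backup somewhere safer (sketch; destinations are examples).
copy_backup() {
    cp "$1" "$2"    # $1 = backup archive, $2 = directory on another disk
}

# At the end of the nightly job:
# copy_backup /usr/home/website_name/backup/database_name.1.gz /perm_backups
# Or push it to another machine entirely (host name is hypothetical):
# scp /usr/home/website_name/backup/database_name.1.gz user@otherhost:/backups/
```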
To use the above cron job, replace these placeholders with your own values:
- website_name - the name of your website directory on the server. If you run multiple sites on the same server, each one has its own directory.
- database_name - the name of the database.
- database_username - the database account used to run the backup. This account only needs read access to the database.
- database_password - the password for the account used to run the backup.
That is about it. Save the script under a name such as backup_job.sh and add it to your cron jobs. Yes, the file does need to be executable to work!
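For reference, the one-time setup might look like this. The script path and the 2:00 a.m. schedule are examples, not taken from the article; pick a time that suits your server.

```shell
# Make the script executable (required, as noted above):
# chmod +x /usr/home/website_name/backup_job.sh
#
# Then add it to cron with `crontab -e`; this example entry runs it
# nightly at 2:00 a.m. (fields: minute hour day-of-month month day-of-week):
# 0 2 * * * /usr/home/website_name/backup_job.sh
```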