Question

6 answers to this question

Recommended Posts

Posted
#!/bin/bash

usr="backup"
pw="yoursqldbpassword"
pw7z="your7zpassword"
dbs=(rathena board website)
date=$(date -I)

for db in "${dbs[@]}"
do
	mkdir -p "$db"                                  # one subdirectory per database
	filename="${db}/${db}_${date}.sql"
	echo "Dumping Database: $db ..."
	mysqldump -u "$usr" -p"$pw" "$db" > "$filename" # create a plain SQL dump
	tar cf - "$filename" | 7za a -si -p"$pw7z" $(unknown).tar.7z # compress and encrypt the dump
#	ncftpput -Vmu [email protected] -p ftp ftp.host.com / $(unknown).tar.7z # copy it to an ftp server
	rm "$filename"                                  # keep only the encrypted archive
done

I'm using something like this. I'd recommend adding a dedicated database user for backups with read-only access. 7zip is a good choice here because it handles both compression and encryption of the data.
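The read-only backup user can be set up roughly like this (a sketch, run once as a MySQL admin; note that mysqldump also needs LOCK TABLES by default, or you can use `--single-transaction` for InnoDB tables instead):

```shell
# Sketch: create the read-only "backup" user assumed by the script above.
# The password is the same placeholder as in the script.
mysql -u root -p <<'SQL'
CREATE USER 'backup'@'localhost' IDENTIFIED BY 'yoursqldbpassword';
GRANT SELECT, LOCK TABLES ON *.* TO 'backup'@'localhost';
FLUSH PRIVILEGES;
SQL
```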

To run it on a daily basis I'd recommend using a cronjob:

30 5 1 * * cd ~/backup && ./backup_full.sh > /dev/null 2>&1
30 5 2-31 * * cd ~/backup && ./backup.sh > /dev/null 2>&1

Mine is split into two parts: backup.sh is the script above and dumps everything except the log database; it runs every day except the first of the month. backup_full.sh also includes the log database and runs only once a month.

dbs=(rathena rathena_log board website) # dbs entry in backup_full.sh

Keep in mind: A backup on the same host will not help you at all if the hard drive is broken.
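One simple way to get the archives off the machine is rsync over SSH to a second host (a sketch; the user, host, and target path are placeholders I made up):

```shell
# Sketch: mirror the encrypted archives to another machine over SSH.
# "user@backuphost" and "/srv/backups/" are placeholder examples.
rsync -az ~/backup/*.tar.7z user@backuphost:/srv/backups/
```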

I'm also interested in other solutions, so keep on posting them :D
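For completeness, restoring one of these archives is just the reverse pipeline (a sketch; the archive name, database, and admin user are placeholder examples):

```shell
# Sketch: decrypt and extract the archive, then re-import the dump.
# The archive and database names are placeholders.
7za x -so -p"$pw7z" rathena_2024-01-01.tar.7z | tar xf -
mysql -u root -p rathena < rathena/rathena_2024-01-01.sql
```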

Posted

I use the following to grab both SQL and file-based backups:

#!/bin/bash

dbhost='127.0.0.1'
dbuser='user'
dbpass='pass'
dbname='db'
savepath='/home/backups'

date=$(date +%Y-%m-%d_%H%M)
month=$(date +%Y-%m)
filename="$savepath/$month/${dbname}_${date}.sql"
tarname="$savepath/$month/${dbname}_${date}.tar.gz"

# create a per-month folder on first run, readable only by this user
if [[ ! -d "$savepath/$month" ]]; then
	mkdir -p "$savepath/$month"
	chmod 700 "$savepath/$month"
fi

# dump the database and make the file read-only
mysqldump --opt --host="$dbhost" --user="$dbuser" --password="$dbpass" "$dbname" > "$filename"
chmod 400 "$filename"

# archive the web root alongside the dump
dirname='/var/www/htdocs'
tar -cvpzf "$tarname" "$dirname"
chmod 400 "$tarname"

This is ideal when you also have file-based logs (as with FluxCP), so everything gets backed up and saved to one location. I have another script running on a separate VPS that connects to this server and copies off the new backups; this script doesn't send them anywhere itself.
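Since the script writes into a new folder every month, the backups accumulate forever; a small prune step keeps disk usage bounded. A sketch (the `prune_backups` helper, the 30-day cutoff, and the default path are my own additions, not from the script above):

```shell
#!/bin/bash
# Sketch: delete backups older than 30 days and drop empty month folders.
# prune_backups and the 30-day cutoff are assumptions, not part of the original script.
prune_backups() {
	local savepath="${1:-/home/backups}"   # same savepath as the script above
	# -delete processes depth-first, so files go first, then the empty dirs below
	find "$savepath" -type f \( -name '*.sql' -o -name '*.tar.gz' \) -mtime +30 -delete
	find "$savepath" -mindepth 1 -type d -empty -delete
}
```

Run it from the same cron that runs the backup, e.g. right after the dump finishes.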

