Remote Server Backup (ncftp)

  • ncftp
  • ncftpput
  • database backup
  • shell script
  • remote server
  • remote backup sync
  • linux
We encounter the need to back up a web server's contents to a remote server on nearly every project we work on. Before you head straight into creating hundreds of pages of quality content and building large data-driven websites, it's important to have a solid data backup strategy. If you don't, you will find what many of us have found over the years: one morning you wake up, try to access your website and (eek!) nothing! So you contact your hosting company and (eek!) nothing again; no backups were taken in the last century and your data, as you knew it, is lost to the ages. That is why implementing a data backup strategy without delay should be a primary, not ancillary, concern of any serious developer or programmer.

If you are using a standard LAMP setup (i.e. Linux, Apache, MySQL, PHP/Perl), which is the most widely deployed web server configuration on the planet, you have a few data backup options:

If you are using cPanel or Plesk, you can 'Generate/Download a Full Backup' from the 'Backups' menu. However, if your site is over 100MB this gets very tiresome very quickly, particularly if you've been scared into backing up your site every day. While this option can facilitate direct transfer to a remote server, it still involves logging into cPanel on a daily/weekly/monthly basis, entering the credentials and clicking a button. I simply don't have the time/memory to do that; moreover, the closer a website gets to being fully automated, the better.

The other option is to write a small shell script that generates a backup of your database and your files and stores it directly on your web server. However, if you have a storage capacity limit, you will find these daily backups are of no help. Also, if for some reason your web hosting company goes down or your account is removed, having a backup of all your data on a server you can no longer access is positively useless. You could always download the backups via FTP every morning but, again, you probably don't have the time or inclination to do so.

Solution:

Construct a shell script to back up the data in all your databases, collect all the files on your server, 'tar' (tape archive) it all up and transfer it to a remote FTP server. The remote FTP server you are transferring your data to can be another server on which you host other websites, or it can be any other machine with a fixed IP address. Most likely though, you'll want to avail of the facilities of an online storage company like MyNetStorage, for example. There are many companies offering this, but these guys seem to have a perpetual offer of $4 a month for 5GB. Not bad, and well worth it if you're doing any serious web development.

There is one minor thing we must take into account in our script though. We only have 5GB to play with, so if we back up our 500MB site daily for 10 days, we're done! At least the site was safe for 10 days, eh? It's probably not necessary to maintain 10 backups of the site in our remote storage facility; we'll say 3 is enough. Note: 1 backup is most certainly not adequate, because if the server we are backing up from goes down in the middle of the transfer, the backup file on the remote server is left corrupted or incomplete.

Ok, enough of the theory, here's the script:

#!/bin/sh

# We are going to keep 3 backups on our remote server
# so we need to maintain a counter on the local side
# to tell us what the backup index is.
# Adjust these numbers to keep more or less backups.
# For example, to keep backups of the last 5 days
# change "$thisbackup -eq 3" to "$thisbackup -eq 5"

# Read the last index (defaults to 0 if counter.txt doesn't
# exist yet, e.g. on the very first run)
lastbackup=$(cat counter.txt 2>/dev/null || echo 0)
thisbackup=$(expr $lastbackup + 1)
if [ $thisbackup -eq 3 ]
then
echo 0 > counter.txt
else
echo $thisbackup > counter.txt
fi
echo "Backup Index Is $thisbackup"

# Today's date in epoch seconds (not actually used below, but
# handy if you'd prefer to timestamp the tar file name instead
# of using the rotating index)
mydate=$(date +%s)

# Database username and password
DBUSER=my_db_user
DBPASS=my_db_password

# Name of the tar file we are going to create
FILE=master_backup_$thisbackup.tar

# Name of the database dump file we are going to create
DBFILE=db_backup.sql

# Dump the contents of all databases to the database dump file
mysqldump --opt -u $DBUSER -p$DBPASS --all-databases > $DBFILE

# Create the tar file, adding in the contents of 'public_html'
# and the previously created database dump file
# (add the 'v' flag to tar if you want to see the full
# list of files as they are archived)
tar -cf $FILE $DBFILE public_html

# Remote host details, IP, USER AND PASSWD
# Excuse the irony of the remote host example
# value being the most local host possible!
HOST='127.0.0.1'
USER='my_remote_user'
PASSWD='my_remote_password'

# Execute the FTP PUT (STOR) command ensuring
# the user has permission to write to my_home_folder
ncftpput -u $USER -p $PASSWD $HOST /home/my_home_folder $FILE

# Delete the files we used so we don't end up backing
# up the backups if the script is inside the folder we're
# backing up... ;-)
rm $FILE
rm $DBFILE


There you go! To get the script to run in your environment, just change every variable whose value starts with my_ and change the value of "HOST" to the IP address of your remote server. Also, make sure your MySQL user has permission to access all databases on your server; this example also assumes your MySQL server is running on localhost (relative to where the script is running). After that, just set up a cron job to run the script as often as you like and voila, you now have a fully automated backup procedure.
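To give an idea, the cron entry might look something like the line below; remote_backup.sh is just a made-up name for the script and 2am is an arbitrary schedule, so adjust both to suit:

# Run 'crontab -e' and add a line like this to take the backup
# every night at 2am (fields: minute hour day month weekday).
# We cd to the home folder first because the script refers to
# counter.txt and public_html with relative paths.
0 2 * * * cd $HOME && /bin/sh remote_backup.sh >> backup.log 2>&1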

A note on a few of the external commands used in this script:

mysqldump
This is a fairly standard database utility for (you guessed it) dumping the contents of any number of databases/tables to a local file. The contents of this file can then be re-imported into MySQL by running the mysql command.
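As a quick illustration, re-importing the dump produced by the script is just a case of feeding the file back into the mysql client; the credentials and file name below are the example values from the script above:

# Re-import every database contained in the dump
# (--all-databases dumps include the CREATE DATABASE statements)
mysql -u my_db_user -pmy_db_password < db_backup.sql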

tar
Again, a standard utility for packaging up loads of files into a single file. tar literally stands for 'tape archive'.
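And for completeness, unpacking one of the backups during a restore looks like this (master_backup_1.tar being one of the rotating file names the script generates):

# Extract the archive into the current directory, recreating
# public_html and db_backup.sql; add 'v' for a file listing
tar -xf master_backup_1.tar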

ncftpput
If you run into any problems executing this script, a likely cause is that the ncftpput command doesn't exist on your server. That is indeed a problem. The command is by no means standard and, in my experience, there is probably only about a 30% chance of it already being installed on your server. The command is part of a larger library called ncftp, for which we have provided two downloads for your convenience. The first download is an RPM file for Red Hat Linux servers: upload the rpm file to your server and issue rpm -Uvvh ncftp-3.0.3-6.src.rpm. Once that has finished, you'll need to compile the source code into a binary: cd to the Red Hat sources directory, run ./configure and make, and you should be laughing. If that all sounds like far too much, just download the ncftp binaries, i.e. the second download. One of the binaries included in this library is the one of interest in the above article, i.e. the ncftpput binary.

NCFTP RPM File: ncftp-3.0.3-6.src.rpm
NCFTP Binaries: ncftp.tar.gz
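Whichever route you take, it's worth a quick manual test before handing things over to cron. Something along these lines (using the example host, user and folder from the script above) will confirm the binary is on your path and can write to your storage account:

# Confirm ncftpput is installed and on the PATH
which ncftpput

# One-off manual upload of a throwaway test file
echo "hello" > test.txt
ncftpput -u my_remote_user -p my_remote_password 127.0.0.1 /home/my_home_folder test.txt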