
A Guide to Linux: Backing up files on Arch Linux

April 9, 2014

You can regularly back up any files or directories on your Arch Linux machine to a USB storage device by creating a cron job (you may need to be root for these commands to work).
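The examples that follow write to /storage/websitebackup, which assumes your USB drive is already mounted at /storage. Adjust the paths to match your own setup; as a sketch, you could create the destination like this (the path is only an example):

```shell
# Backup destination used throughout this guide -- adjust to your own mount point.
BACKUP_DIR=/storage/websitebackup

# Create the destination if the mount point exists; otherwise warn.
if [ -d /storage ]; then
    mkdir -p "$BACKUP_DIR"
else
    echo "/storage not found -- mount your USB device there first"
fi
```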

Create a directory in /etc called cronscripts.

cd /etc
mkdir cronscripts

In that directory, create a script that copies the files and directories you want to back up to your external storage device.

cd cronscripts
vi websitebackup.sh

Add something like this to the file.

#!/bin/bash
tar cvzf /storage/websitebackup/websitebackup-$(date +%d-%m-%Y).tar.gz /srv/http
tar cvzf /storage/websitebackup/nginxbackup-$(date +%d-%m-%Y).tar.gz /etc/nginx
find /storage/websitebackup -maxdepth 1 -name '*.tar.gz' -mtime +30 -exec rm {} \;
  • The first line tells the system to run the script in the bash shell.
  • The second line uses tar with the c option (create), the v option (verbose, so it shows what it is doing), the z option (compress with gzip) and the f option (write to the named file). It then gives the path of the backup file to create, appending the date to the file name, followed by the directory or file you want copied. The example here creates a gzipped tape archive in the /storage/websitebackup directory called websitebackup-date.tar.gz (e.g. websitebackup-11-03-2014.tar.gz), containing the contents of /srv/http.
  • The third line does the same thing to back up the /etc/nginx directory.
  • The fourth line finds all tar.gz files in the /storage/websitebackup directory that are more than 30 days old and deletes them.

The whole thing gives us a backup of our website directory and the Nginx config files, and removes any backup older than 30 days.
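If you want to see what these commands do before pointing them at real data, you can run the same pattern against throwaway directories (the temporary paths here are purely illustrative):

```shell
#!/bin/bash
# Sandbox run of the backup pattern using temporary directories,
# so nothing outside /tmp is touched.
src=$(mktemp -d)    # stands in for /srv/http
dest=$(mktemp -d)   # stands in for /storage/websitebackup
echo "hello" > "$src/index.html"

# Date-stamped gzipped archive, as in the script above.
tar czf "$dest/websitebackup-$(date +%d-%m-%Y).tar.gz" "$src"

# Simulate an old backup, then prune anything older than 30 days,
# as the script's fourth line does.
touch -d "40 days ago" "$dest/websitebackup-old.tar.gz"
find "$dest" -name '*.tar.gz' -mtime +30 -exec rm {} \;

ls "$dest"   # the dated archive remains; the "old" one is gone
```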

Now make the script executable.

chmod 755 websitebackup.sh

At this point it is a good idea to test the script to make sure it works.

./websitebackup.sh
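A quick way to confirm the test run worked is to check that today's archives exist and can be read back. This assumes the paths from the script above; BACKUP_DIR is just an illustrative variable.

```shell
# Check that today's archives were written and are readable.
BACKUP_DIR=${BACKUP_DIR:-/storage/websitebackup}
TODAY=$(date +%d-%m-%Y)
for f in "$BACKUP_DIR/websitebackup-$TODAY.tar.gz" "$BACKUP_DIR/nginxbackup-$TODAY.tar.gz"; do
    if [ -f "$f" ] && tar tzf "$f" >/dev/null 2>&1; then
        echo "OK: $f"
    else
        echo "MISSING or unreadable: $f"
    fi
done
```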

Now we just need to make sure it runs once a day. To do this, edit the anacrontab file in /etc.

vi /etc/anacrontab

Add a line at the bottom of the file to tell anacron to run the script once a day.

@daily                  10                      websitebackup.daily             /bin/sh /etc/cronscripts/websitebackup.sh

The whole anacrontab file now looks like this.

# See anacron(8) and anacrontab(5) for details.
SHELL=/bin/sh
PATH=/sbin:/bin:/usr/sbin:/usr/bin
MAILTO=root
# the maximal random delay added to the base delay of the jobs
RANDOM_DELAY=45
# the jobs will be started during the following hours only
START_HOURS_RANGE=3-22
#period in days         delay in minutes        job-identifier                  command
1                       5                       cron.daily                      nice run-parts /etc/cron.daily
7                       25                      cron.weekly                     nice run-parts /etc/cron.weekly
@monthly                45                      cron.monthly                    nice run-parts /etc/cron.monthly
@daily                  10                      websitebackup.daily             /bin/sh /etc/cronscripts/websitebackup.sh

That’s it. The script will run once a day, backing up the site and the Nginx config files. If you ever need to restore the files, go to the backup directory and run something like the following to extract the archive into the current directory, then copy the files back into place.

tar xvzf websitebackup-11-03-2014.tar.gz
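Because tar strips the leading / when it archives, extraction recreates the tree under the current directory (e.g. srv/http/...), rather than writing over the live files. A sandboxed sketch of the round trip, with throwaway paths standing in for the real ones:

```shell
# Round-trip sketch: back up a file, restore it elsewhere, and compare.
src=$(mktemp -d); dest=$(mktemp -d); restore=$(mktemp -d)
echo "server config" > "$src/nginx.conf"

tar czf "$dest/backup.tar.gz" "$src"   # leading / is stripped inside the archive

cd "$restore"
tar xzf "$dest/backup.tar.gz"          # recreates the tree under the current directory

# The restored copy lives under the original path, minus its leading slash.
diff "$src/nginx.conf" "$restore$src/nginx.conf" && echo "restore matches"
```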