
Using rsync for backup

Discussion in 'General Linux Discussion' started by Daerandin, Dec 2, 2013.

  1. Daerandin

    Daerandin Well-Known Member

    Joined:
    Oct 18, 2013
    Messages:
    1,130
    Likes Received:
    243
    Trophy Points:
    63
    Location:
    Northern Norway
    I just wanted to share something I consider to be a very easy and good way of keeping backups using something called rsync. It is well known and should be available in official repos for most distros.

    If you have an external drive, or another partition that you use specifically for backups, you can easily make a full system backup with a single command. As long as the storage medium for your backup uses a Linux filesystem (such as ext4, NOT fat32 or ntfs), this will work. Everything in Linux is a file, and as long as file permissions and ownership are retained, the result is a functional backup that can be used to restore the full system to a previous point.

    The command is fairly simple, just make sure to run it as root (or with sudo):

    Code:
    rsync -aAXv --delete /* /path/to/backup/directory --exclude={/dev/*,/proc/*,/sys/*,/tmp/*,/run/*,/mnt/*,/media/*,/lost+found}
    It is very important that the location you back up to is inside one of the excluded directories, such as /mnt, /media or /run. If it is not, you must add the backup location to the excluded directories yourself, otherwise rsync will try to copy the backup into itself.
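
    For example, if your backup directory were somewhere like /backup (just a hypothetical path, substitute your own), you would add it to the exclusions like this:

    Code:
    rsync -aAXv --delete /* /backup --exclude={/dev/*,/proc/*,/sys/*,/tmp/*,/run/*,/mnt/*,/media/*,/lost+found,/backup/*}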

    This will retain all file permissions and ownership. The excluded directories are populated on boot, which is why you should not copy their contents when doing a backup, and lost+found is filesystem specific.

    This can be run again, pointing to the same backup directory, and it will simply update the backup to the current system state, which takes much less time than the initial backup. The --delete option is important if you want the backup to always mirror your current system state, as it removes files from the backup that no longer exist on your system.

    You can then easily grab lost files from the backup if needed, or even do a complete system restore at a later point by running the above command, but instead of

    Code:
    /* /path/to/backup/directory
    You swap their places so that it looks like this:

    Code:
    /path/to/backup/directory/* /
    And your full system will revert to the exact state it was in when the last backup was done.
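
    So if, for example, the backup lived on an external drive mounted at /mnt/backup (a made-up path, use your own), the full restore command would be:

    Code:
    rsync -aAXv --delete /mnt/backup/* / --exclude={/dev/*,/proc/*,/sys/*,/tmp/*,/run/*,/mnt/*,/media/*,/lost+found}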
  2. Gizmo

    Gizmo Chief Site Administrator Staff Member

    Joined:
    Dec 6, 2012
    Messages:
    2,230
    Likes Received:
    156
    Trophy Points:
    63
    Location:
    Webb City, Missouri
    I came across a script some years ago that used a combination of rsync and tar to do a full backup + versioning system so that you could actually restore a previous version of a file back to when you started your backups. It took advantage of rsync's ability to copy only the parts of the file that had actually changed. Let me see if I can find it again.

    Edit: This isn't the one I remember, but it seems very nice: https://github.com/schlomo/rbme
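
    For what it's worth, one common pattern for rsync-based versioning (not necessarily how that script, or rbme, does it) is --link-dest: each run goes into a dated directory, and files that have not changed are hard-linked against the previous run instead of copied, so only the changes take up extra space. A rough sketch, with made-up paths:

    Code:
    rsync -aAXv --delete --link-dest=/mnt/backup/latest /* /mnt/backup/$(date +%Y%m%d) --exclude={/dev/*,/proc/*,/sys/*,/tmp/*,/run/*,/mnt/*,/media/*,/lost+found}
    ln -sfn /mnt/backup/$(date +%Y%m%d) /mnt/backup/latest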
  3. Daerandin

    Daerandin Well-Known Member

    Joined:
    Oct 18, 2013
    Messages:
    1,130
    Likes Received:
    243
    Trophy Points:
    63
    Location:
    Northern Norway
    For things like this, I prefer to either do it manually or write my own script. Because of the rolling release nature of my distro, keeping old backups is not very useful. I run the backup weekly, as long as I have not had any issues with the current state of kernel and driver versions.

    But I can certainly see the use for backup + versioning with most other distros, especially on computers used for work or servers.

    I had not thought of involving tar in this, but I suppose one way would be to make a tarball after running rsync, using the current date as the name for the tarball. You could exclude the home folder if you are only interested in keeping a backup of the system and not specific user data, which would save space. Or you could exclude /home from just the tarball.
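
    Roughly something like this, with placeholder paths:

    Code:
    tar cvpzf /path/to/archives/$(date +%Y%m%d).tar.gz /path/to/backup/directory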

    You know, this really got me thinking. Up to now I've just used a .bashrc alias to run the rsync command, but I think I'll write a simple bash script to make the tarballs and also cap how many of them are kept.
  4. booman

    booman Grand High Exalted Mystic Emperor of Linux Gaming Staff Member

    Joined:
    Dec 17, 2012
    Messages:
    8,278
    Likes Received:
    614
    Trophy Points:
    113
    Location:
    Linux, Virginia
    Reminds me of Robocopy for Windows.
    I actually use FWBackups on my Fedora server. I don't use compressed folders for my backup, but instead do a full copy of every directory.
    Since I rarely have my server on, I only run it manually every once in a while.
    But it works perfectly!
    I copy to an internal 1 Terabyte drive.
  5. Daerandin

    Daerandin Well-Known Member

    Joined:
    Oct 18, 2013
    Messages:
    1,130
    Likes Received:
    243
    Trophy Points:
    63
    Location:
    Northern Norway
    If anyone is interested, I made a script to automate the process. After using rsync to do a full copy of your entire system, you will be asked if you want to create a versioned archive, and it will further ask if you want to include the home folder in the versioned archive.

    The script is by default set to leave only one old archive when you create a new one, but it can be modified to keep as many or as few as you'd like. I believe my comments should make it easy for people with no knowledge of bash scripts to make the changes they would need.

    I am personally quite inexperienced with bash scripting, so if anyone spots anything that could be done better, I'll happily take the critique.

    Code:
    #! /bin/bash
     
    # This script uses rsync to do full system backups. It is distributed under the WTFPL
    # In short it means you can do whatever you want with this script
    # This script is written with the assumption that you specify an existing DESTDIR which is where the backup will be copied to
    # And you must also specify an existing TARDIR which is where optional versioned archives will be put, you should not keep
    # any other files in the TARDIR as they will be deleted by this script
    #
    # To unpack the versioned archives, use 'tar xpvzf' so file and folder permissions and ownership is retained
     
    FILE="/tmp/out.$$"
    GREP="/bin/grep"
    # make sure only root can run this script
    if [ "$(id -u)" != "0" ]; then
      echo "This script must be run as root" 1>&2
      exit 1
    fi
    # modify these lines for your system
    DESTDIR=/path/to/backup/directory # directory the backup will copy to
    TARDIR=/path/to/archive/directory # directory optional archives will go into
    # testing if backup directory is available
    if [ ! -d "$DESTDIR" ]; then
        echo "The backup directory is not available."
        exit 1
    fi
    # doing backup
    START=$(date +%s)
    rsync -aAXv --delete /* $DESTDIR --exclude={/dev/*,/proc/*,/sys/*,/tmp/*,/run/*,/mnt/*,/media/*,/lost+found}
    FINISH=$(date +%s)
    while true; do
        echo "Total time: $(( ($FINISH-$START) / 60 )) minutes, $(( ($FINISH-$START) % 60 )) seconds."
        read -p "Would you like to create a versioned tarball? (y/n)" yn
        case $yn in
            [Yy]* )
                # the following command specifies how many old archives you want to keep before creating the new one
                # if you want to keep just one, leave it as it is
                # if you want to keep two, change '1,1d' to '1,2d' and so on
                # if you want to remove all older archives, remove the whole " | sed '1,1d' " part
                cd $TARDIR
                ls -t $TARDIR | sed '1,1d' | xargs -r rm  # -r: skip rm if there is nothing to delete
                while true; do
                    read -p "Include /home? (y/n)" ynn
                    case $ynn in
                        [Yy]* )
                            NAME=$(date +%Y%m%d-%H%M%S)
                            tar cvpzf $TARDIR/$NAME.tar.gz $DESTDIR
                            break
                            ;;
                        [Nn]* )
                            NAME=$(date +%Y%m%d-%H%M%S)
                            tar cvpzf $TARDIR/$NAME.tar.gz $DESTDIR --exclude=$DESTDIR/home
                            break
                            ;;
                    esac
                done
                break;;
            [Nn]* )
                break;;
        esac
    done
    To use this script, simply create a new file, which you can name 'backup.sh', then open it with any text editor, like gedit, and just copy my code into the file and save it.

    Then open a terminal in the directory you have the script, and type:

    chmod +x backup.sh

    This will allow the script to run.

    To run this script, open a terminal in the directory where you have it and type:

    sudo ./backup.sh

    This is of course assuming you named the file 'backup.sh'
  6. booman

    booman Grand High Exalted Mystic Emperor of Linux Gaming Staff Member

    Joined:
    Dec 17, 2012
    Messages:
    8,278
    Likes Received:
    614
    Trophy Points:
    113
    Location:
    Linux, Virginia
    Home page:
    Awesome, I wonder if this could be a guide?
    Maybe we should have a forum for Other Guides?

    Thank you for posting this script.
    You might want to show beginners how to save it with gedit as an .sh file.
  7. Daerandin

    Daerandin Well-Known Member

    Joined:
    Oct 18, 2013
    Messages:
    1,130
    Likes Received:
    243
    Trophy Points:
    63
    Location:
    Northern Norway
    Just made some modifications to the script to fix things I noticed were wrong. Permissions were not retained when extracting the archive; I did some googling, made the changes I found, and am testing them at the moment.

    Edit: All file and folder permissions are kept intact when extracting the versioned tarballs.
  8. Gizmo

    Gizmo Chief Site Administrator Staff Member

    Joined:
    Dec 6, 2012
    Messages:
    2,230
    Likes Received:
    156
    Trophy Points:
    63
    Location:
    Webb City, Missouri

    Where do you think the idea of Robocopy came from? MS got tired of Unix folks b****ing about not having rsync on Windows. :p
  9. booman

    booman Grand High Exalted Mystic Emperor of Linux Gaming Staff Member

    Joined:
    Dec 17, 2012
    Messages:
    8,278
    Likes Received:
    614
    Trophy Points:
    113
    Location:
    Linux, Virginia
    Somehow I knew you were going to say that! ;)
  10. ThunderRd

    ThunderRd Irreverent Query Chairman Staff Member

    Joined:
    Dec 17, 2012
    Messages:
    2,756
    Likes Received:
    87
    Trophy Points:
    48
    Location:
    Northern Thailand, the Land of Smiles
    I've tried several backup solutions with mixed results. My usual problem is that there always seem to be some permission issues that come to light. For some reason, it's usually sometime down the road, so I don't realize the issues exist right away. I'm too time-limited [read: lazy] to investigate exactly what they have been.

    I find that imaging the entire system disk once a week is the easiest for me. I use Clonezilla, as I know others here do as well, and it is absolutely flawless. I have never, not even once, had a problem with saving or restoring a CZ backup. Its only downside is that you have to boot into it, so it can't be done with your system running. OTOH, it's a bit-by-bit copy, so it includes the boot records, MBR, etc., not only the existing files. I keep it on a thumb drive, and it does the job on a 500 GB drive in about 10-15 minutes or so.
  11. Gizmo

    Gizmo Chief Site Administrator Staff Member

    Joined:
    Dec 6, 2012
    Messages:
    2,230
    Likes Received:
    156
    Trophy Points:
    63
    Location:
    Webb City, Missouri
    I just use dd. ;)
  12. ThunderRd

    ThunderRd Irreverent Query Chairman Staff Member

    Joined:
    Dec 17, 2012
    Messages:
    2,756
    Likes Received:
    87
    Trophy Points:
    48
    Location:
    Northern Thailand, the Land of Smiles
    I once played around with dd, figured out how to make an image the way I wanted to, but had problems with the restore syntax, so I gave up. Maybe it's time I tried again.
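
    For reference, the basic dd pattern is symmetric, you just have to be careful which way if= and of= point (the device and image paths here are only examples, and it should be run from a live medium):

    Code:
    dd if=/dev/sda of=/mnt/backup/sda.img bs=4M   # image the whole disk to a file
    dd if=/mnt/backup/sda.img of=/dev/sda bs=4M   # restore it (overwrites the entire disk!)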
  13. booman

    booman Grand High Exalted Mystic Emperor of Linux Gaming Staff Member

    Joined:
    Dec 17, 2012
    Messages:
    8,278
    Likes Received:
    614
    Trophy Points:
    113
    Location:
    Linux, Virginia
    I use Clonezilla at work! It's perfect for restoring Win XP!
    But the reason I don't use it at home is that I easily fill up 150 GB with games, so imaging them would take hours.
    Restoring would take hours.
    I guess I could install root on its own partition and home on its own partition so then I could image just root or just home.
    But home is where all my games and files are.

    Maybe if I made a partition for root, home and files?
    I don't know... Using my server for backing up games, movies & pictures seems the best way because it's all internal.
    I also back up to an external drive.
  14. Daerandin

    Daerandin Well-Known Member

    Joined:
    Oct 18, 2013
    Messages:
    1,130
    Likes Received:
    243
    Trophy Points:
    63
    Location:
    Northern Norway
    I have made a few modifications to the script. Specifically, it now handles hard links on your system properly, and I made the optional tar archives cleaner. I have also added a better explanation of what the script does and how it can be modified.

    Code:
    #! /bin/bash
    
    # This script uses rsync to do full system backups. It is distributed under the WTFPL
    # In short it means you can do whatever you want with this script
    # This script is written with the assumption that you specify an existing DESTDIR which is where the backup will be copied to
    # And you must also specify an existing TARDIR which is where optional versioned archives will be put, you should not keep
    # any other files in the TARDIR as they will be deleted by this script
    #
    # To unpack the versioned archives, use 'tar xvpzf' so file and folder permissions and ownership is retained, and remember to
    # extract the tar as ROOT! Since this script is run as root, ONLY root can extract the tarballs with correct permissions
    
    FILE="/tmp/out.$$"
    GREP="/bin/grep"
    # make sure only root can run this script
    if [ "$(id -u)" != "0" ]; then
      echo "This script must be run as root" 1>&2
      exit 1
    fi
    # modify these lines for your system
    DESTDIR=/path/to/backup/directory # directory the backup will copy to
    TARDIR=/path/to/archive/directory # directory optional archives will go into
    # testing if backup directory is available
    if [ ! -d "$DESTDIR" ]; then
        echo "The backup directory is not available."
        exit 1
    fi
    # doing backup
    START=$(date +%s)
    rsync -aAHXv --delete /* $DESTDIR --exclude={/dev/*,/proc/*,/sys/*,/tmp/*,/run/*,/mnt/*,/media/*,/lost+found}
    FINISH=$(date +%s)
    while true; do
        echo "Total time: $(( ($FINISH-$START) / 60 )) minutes, $(( ($FINISH-$START) % 60 )) seconds."
        read -p "Would you like to create a versioned tarball? (y/n)" yn
        case $yn in
            [Yy]* )
                # the following command specifies how many old archives you want to keep before creating the new one
                # if you want to keep just one, leave it as it is
                # if you want to keep two, change '1,1d' to '1,2d' and so on
                # if you want to remove all older archives, remove the whole " | sed '1,1d' " part
                cd $TARDIR
                ls -t $TARDIR | sed '1,1d' | xargs -r rm  # -r: skip rm if there is nothing to delete
                while true; do
                    read -p "Include /home? (y/n)" ynn
                    case $ynn in
                        [Yy]* )
                            cd $DESTDIR
                            NAME=$(date +%Y%m%d-%H%M%S)
                            tar cvpzf $TARDIR/$NAME.tar.gz ./
                            break
                            ;;
                        [Nn]* )
                            cd $DESTDIR
                            NAME=$(date +%Y%m%d-%H%M%S)
                            tar cvpzf $TARDIR/$NAME.tar.gz ./ --exclude=./home
                            break
                            ;;
                    esac
                done
                break;;
            [Nn]* )
                break;;
        esac
    done
    The ideal usage here is to have the DESTDIR and the TARDIR on an external drive. The most important thing is that they are in /mnt, /run or /media (which are usually the default locations where external drives are mounted); otherwise rsync will try to copy the backup into itself, and you would have to add them to the exclude list yourself. Personally I have an external drive with two folders, one I use as the DESTDIR and the other as the TARDIR.

    Do not use these folders for any other purpose, and do not put other files there, as the script will delete them. The DESTDIR is synced to your full system, meaning that any files there which do not exist on your system will be deleted. The TARDIR is used to store optional snapshots of your system as tar archives. When the script creates a new archive, by default it sorts the contents of the TARDIR and deletes everything but the most recent archive, so any other files there will be deleted or may interfere with the script.

    This script will take a long time the first time you run it, but on subsequent runs it will be much faster since it only updates what has changed since the last backup. If you run this weekly, it should be fairly quick. The optional tar archive, which you are asked whether you want to create, can be considered a frozen snapshot of your system. You are also asked if you wish to include /home in the tar archive. The contents of /home are not system-critical and as such are not really required in such a frozen snapshot. However, I still included the option to have /home in the tar archive if so desired.

    Whenever you choose to create a tar archive, it will first delete all but the most recent of the already existing archives. Then it will create a new one. This behavior can be changed in the script. I have commented where and what you must edit in the script to keep more archives, or even if you wish to keep no older archives.

    When you unpack a tar archive created by this script, it is very important that you do so as root. This is mentioned in the initial comments in the script, but I will mention it here as well for clarity. Since the script must be run as root, that means the tar archive is created as root. So only root can extract the contents and keep all permissions intact. The proper command then would be:

    sudo tar xvpzf /path/to/archive

    This script is really just an automation of commands you could easily run yourself to make backups; it is a very bare-bones backup script without any fancy features. However, it does do the job of creating a complete backup of your system with all symlinks, hard links and permissions intact. The backup can even be booted from (although that requires changes to your bootloader and to the fstab in the backup, so I will not go into it here).

    If you do not wish to include certain folders in the base backup, such as your /home, then you simply edit the line:

    rsync -aAHXv --delete /* $DESTDIR --exclude={/dev/*,/proc/*,/sys/*,/tmp/*,/run/*,/mnt/*,/media/*,/lost+found}

    and add what you do not wish to include in the backup. Maybe there is a folder in your home that you don't want in the backup; then you simply add its path to the exclude section, for example: /home/username/folder/*
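
    For instance, with that example path added, the line would become (the extra path is only an illustration):

    Code:
    rsync -aAHXv --delete /* $DESTDIR --exclude={/dev/*,/proc/*,/sys/*,/tmp/*,/run/*,/mnt/*,/media/*,/lost+found,/home/username/folder/*}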
    Last edited: Feb 24, 2014
  15. Aedan

    Aedan Administrator Staff Member

    Joined:
    May 10, 2013
    Messages:
    1,058
    Likes Received:
    4
    Trophy Points:
    38
    I just use dump... but then again, this is on a *BSD box.
