
A simple script to use either robocopy or xcopy to back up files

Under various circumstances, I've found it useful to cobble together a script to do a sync backup across the network from one Windows server to another. Usually this is for files only and is either a mirror or a daily full backup of data. Obviously there are some great backup tools available that make something like this largely unnecessary; however, this is quick, simple and gives you an email report of what has happened. The first example below uses robocopy (Robust File Copy), which is a very nice bit of kit indeed. It's a bit more useful than xcopy and handles larger numbers of files better. Don't get me wrong, I love xcopy, but it has its limitations. I use rsync a lot on Linux servers, and robocopy gives me many similar options for how I want to handle files.

The destination directory could be anything: another folder on the same PC, a removable disk, a mapped share, or even a straight UNC path, e.g. \\server\share. Flexibility is the key for this script - once the basic variables are right and you've decided whether to use robocopy or xcopy, off you go.
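For example, if the backups are going straight to a share on another machine, the destination variable is just the UNC path (the server and share names here are placeholders):

set dest="\\backupserver\backups\server1"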

So to the script:

Open Notepad and put this in - note which lines are comments and which variables you'll have to change:

Script start:
echo on
REM Set up some time variables
for /f "Tokens=1-4 Delims=/ " %%i in ('date /t') do set dt=%%l-%%k-%%j
for /f "Tokens=1" %%i in ('time /t') do set tm=-%%i
set tm=%tm::=-%
set dtt=%dt%%tm%
REM set up variables for log files, source and destination - change this variable
set log="C:\Users\owner\Documents\Scripts\Logs\%dt%.log"
REM local stuff to be backed up - change this variable
set src="c:\documents"
REM remote location to put backups - change this variable
set dest="I:\backups\server"
REM now for the actual work - change switches as required - explanation of switches is below.
robocopy %src% %dest% /E /Z /MIR /R:1 /LOG:%log%
REM I'd like to know how it went (this file can be big if there are a lot of files copied)
echo Backup Logs attached | blat - -subject "Sync Log Report for %dt%" -to "me@mydomain.com" -attach %log% -f user@domain.com

Use blat to send the email - grab it from www.blat.net (great program!). It sends an email with a subject line that will look like this:
Sync Log Report for 2012-02-10
and an attachment of your log file. You can add different things to this - for example I'll often use a [servername] tag after the date.
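As a rough sketch of that, using the built-in %COMPUTERNAME% variable rather than a hard-coded server name, the blat line becomes:

echo Backup Logs attached | blat - -subject "Sync Log Report for %dt% [%COMPUTERNAME%]" -to "me@mydomain.com" -attach %log% -f user@domain.com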

The robocopy switches used are:

  • /E = copy sub-directories, including empty ones
  • /Z = copy files in restartable mode (in case the network drops out or something similar)
  • /MIR = MIRror a directory tree (which is /E plus /PURGE)
  • /R:1 = number of retries on failed copies. It's best to set this - by default it's 1 million (!)
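One extra switch worth knowing while you're testing: /L makes robocopy list what it would do without copying or deleting anything, so you can sanity-check the variables and switches before the scheduler runs it for real:

robocopy %src% %dest% /E /Z /MIR /R:1 /L /LOG:%log%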
I run this from the Windows Task Scheduler and get a mirrored copy of the data files each night (a sample schtasks command is sketched after the xcopy notes below). It's quite a useful little tool. If you'd like to use xcopy instead, there are a few things to consider:
  • the src and dest variables need to have a trailing backslash and a wildcard
    • set src="c:\documents\*"
    • set dest="i:\backups\server\*"
  • and the command to insert would be:
    • xcopy %src% %dest% /C /D /E /H /Y > %log%
    • where the switches are:
      • /C = continue copying even if there are errors
      • /D = copies only files whose source is newer than the existing destination file
      • /E = copies directories and sub-directories (even if empty)
      • /H = copies hidden and system files
      • /Y = suppresses prompting to overwrite files
    • the > redirects xcopy's output to the log file named by the %log% variable we set earlier in the script, and blat then emails the resulting file out.
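For the scheduling mentioned above, the task can be created from the command line as well as through the Task Scheduler GUI. A minimal sketch, assuming the script has been saved as C:\Users\owner\Documents\Scripts\backup.bat (a made-up path) and should run nightly at 11pm - note that the exact /st time format varies slightly between Windows versions:

schtasks /create /tn "Nightly Sync Backup" /tr "C:\Users\owner\Documents\Scripts\backup.bat" /sc daily /st 23:00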
If you find this useful in any way, please let me know in the comments.
