
OTRS Restore Procedure and backup script

As I noted in my previous post, I managed to kill my OTRS install and, as usual, had to trawl around the net to remember how to restore it. In a nutshell:

# mysql -u root -p
mysql> drop database otrs;
mysql> create database otrs;
mysql> exit
# /opt/otrs/scripts/restore.pl -d path_to_backup /opt/otrs

You did back up, right?
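The manual steps above can be wrapped in a small script. A minimal sketch, with the caveat that the script name, the non-interactive mysql call, and the /opt/otrs paths are assumptions based on my setup, not something OTRS ships:

```shell
#!/bin/bash
# restore_otrs.sh - hypothetical wrapper around the manual restore steps above.
restore_otrs() {
    local backup="$1"
    if [ -z "$backup" ]; then
        echo "Usage: restore_otrs.sh /path/to/backup_dir" >&2
        return 1
    fi
    # Drop and recreate the otrs database (prompts for the MySQL root password)
    mysql -u root -p -e "DROP DATABASE IF EXISTS otrs; CREATE DATABASE otrs;" || return 1
    # Restore files and database from the backup directory
    /opt/otrs/scripts/restore.pl -d "$backup" /opt/otrs
}

# Run only when given a backup directory on the command line
[ -n "$1" ] && restore_otrs "$1"
```

Stop the OTRS daemons before running something like this, and test it against a scratch install first.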

Each night I run a script containing the following:

otrs_backup.sh


#!/bin/bash
# Variables below - change these to suit
NOW=$(date +"%Y-%m-%d_%H-%M") # matches the folder name OTRS uses for its backup
LOCAL=/root/backup # a local directory for OTRS to back up to
REMOTE="user@backupserver:~/backup/OTRS/" # remote backup dir - nfs share, ftp or cifs
/opt/otrs/scripts/backup.pl -d "$LOCAL" # OTRS internal backup (files and DB)
tar -cf "$LOCAL/$NOW.tar" -C "$LOCAL" "$NOW" # tar the backup folder - more efficient to copy over a network
gzip "$LOCAL/$NOW.tar"
rm -rf "${LOCAL:?}/$NOW" # tidy up
scp "$LOCAL/$NOW.tar.gz" "$REMOTE" # scp the archive to the remote directory


You may wish to run this from crontab after copying otrs_backup.sh to /usr/local/bin:

0 20 * * * /usr/local/bin/otrs_backup.sh

This will run at 8pm each night; you could run it more frequently if needed. OTRS databases with a lot of attachments get quite large though, so be mindful of that (I manage a couple that are 1GB and only 5 months old).
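Given how quickly those archives grow, it's worth pruning old ones on the backup host. A sketch using find, where the 30-day retention and the directory default are my own assumptions - adjust to suit:

```shell
#!/bin/bash
# prune_otrs_backups.sh - delete *.tar.gz backup archives older than a
# retention period. The 30-day default is an arbitrary assumption.
prune_backups() {
    local dir="$1" days="${2:-30}"
    [ -d "$dir" ] || return 1
    # -mtime +N matches files last modified more than N days ago
    find "$dir" -maxdepth 1 -name '*.tar.gz' -mtime +"$days" -print -delete
}

prune_backups "${1:-/root/backup}" "${2:-30}"
```

This could run from the same crontab, a few minutes after the backup job.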

Enjoy

Comments

  1. Hi, Is there any script for automatic daily restore also? We want our backup OTRS server to be updated daily.

  2. Hi Irini, yes I'd imagine you could use the built in restore command to push the backed up data to your secondary server. I'll put a script together and post it to the site today.

    Ryv


