
Experiences with Virtualisation - XenServer

I have been experimenting with different virtualisation technologies lately. At work I already run a Microsoft Hyper-V server hosting two Ubuntu servers: one runs OTRS (a ticketing system) and the other provides FTP and DNS. The host is a 1RU rackmount box with no hardware virtualisation support. It still runs reasonably well, but I've found that under high disk or network load the virtual machines grind to an absolute halt and I have to reset them. So I've begun expanding my horizons.

Recently I searched for, and found, some small form factor desktops with hardware virtualisation support - namely the HP Compaq dc5750 Small Form Factor, with an AMD Athlon(tm) 64 X2 Dual Core Processor 3800+. I bought two (for the princely sum of $9 each plus postage, so $130 (!) delivered) - they came with 1GB of RAM and an 80GB HDD each. I've upgraded the RAM in both to 4GB and have more on order. I also turfed the disk drives - one had a dodgy sector and the other was just plain dodgy. I slammed a couple of 200GB disks in and away we went.

XenServer is produced by Citrix, a company well known for its remote access solutions and, increasingly, for its virtual server products. I've used VMware on the desktop before and still use VMware Player there for various things, but I had not looked into XenServer. I started with the Live CD and was reasonably encouraged - the information it gave looked pretty good - so I decided to install the XenServer operating system on some machines and see how it went.

The install on my dc5750s went very smoothly - all hardware was detected and accounted for. The dc5750 supports AMD-V hardware virtualisation, so XenServer ran very happily. On the first server - xenserver1 (very imaginative naming) - I neglected to set the time server or enable NTP, and this came back to bite me later on. After the initial setup, I installed XenCenter on my Windows 7 notebook. It's a slick interface, and once I entered the IP of xenserver1 it detected the host without issue. My notebook and both servers are on a gigabit network, so it all runs pretty fast. I started the install on the second dc5750 (xenserver2 - more imagination there) while I created a pool in XenCenter and added xenserver1 to it as the master.
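I did the pool setup through XenCenter, but the same thing can be done from the `xe` command line on the hosts themselves. A minimal sketch - the IP address, password and pool name here are placeholders, not my actual setup:

```shell
# Run on xenserver2 to join the pool whose master is xenserver1
# (master IP and credentials are examples):
xe pool-join master-address=192.168.1.10 master-username=root master-password=secret

# Run on the master to give the pool a friendly name:
xe pool-param-set uuid=$(xe pool-list --minimal) name-label="lab-pool"
```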

After xenserver2 was installed I added it to the pool, and noticed that XenCenter wasn't reporting the RAM usage on the second server, even though both servers and XenCenter were fully updated. Kind of strange - then I got messages about the clocks not being synchronised. I went back and reset the NTP servers on both machines; it turned out that xenserver1 was an hour ahead, and once that was corrected both servers happily reported CPU, RAM, network and disk usage. So now to the installation of virtual machines - but where to put the virtual disks? Aha! I added a storage repository via NFS on our FreeNAS server - and although this caused some issues of its own until I sorted the NFS share out, eventually it was all good.
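For anyone hitting the same clock-skew problem, both fixes can be done from the host console. This is a sketch only - the NTP server, FreeNAS IP and export path below are placeholders for your own values:

```shell
# Fix the time sync: add an NTP server to dom0 and restart the daemon
echo "server 0.pool.ntp.org" >> /etc/ntp.conf
service ntpd restart
ntpq -p   # check the host is actually syncing against a peer

# Create the shared NFS storage repository for virtual disks
# (server IP and serverpath are examples):
xe sr-create name-label="FreeNAS NFS" shared=true type=nfs \
  content-type=user \
  device-config:server=192.168.1.20 \
  device-config:serverpath=/mnt/vmstore
```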

XenServer uses templates to create virtual machines. There is, naturally, a blank template for unsupported operating systems (like *BSD?!). I started with an Ubuntu Server 11.10 install - the template suggested RAM usage, disk size and so on, and I created the virtual machine very quickly. I had previously added a storage repository for ISO images, so I pointed the template at the appropriate ISO and declined to pin the VM to a particular XenServer host, opting to let it choose one with available resources. It chose xenserver2 and the installation began. I undocked the console so I could watch it and returned to XenCenter to watch the load and usage on the servers. I also started a Windows Server 2008 R2 installation from the template for the hell of it (I love Microsoft TechNet Direct). Again, I let the template set the server's configuration and allowed XenCenter to pick the host with available resources - it chose xenserver1 and the installation began.
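The same template-based creation works from the `xe` CLI if you prefer it to XenCenter. A rough sketch - the template name, VM name and ISO filename below are illustrative, so check `xe template-list` for what your version actually ships:

```shell
# See which templates this XenServer version provides:
xe template-list params=name-label

# Clone a template into a new VM (vm-install prints the new VM's UUID):
VM=$(xe vm-install template="Ubuntu Lucid Lynx 10.04 (64-bit)" \
     new-name-label="ubuntu-test")

# Attach the install ISO from the ISO library and boot the VM:
xe vm-cd-add vm=$VM cd-name="ubuntu-11.10-server-amd64.iso" device=3
xe vm-start uuid=$VM
```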

Both installs ran through their usual routines until the Ubuntu server reached the disk partitioning stage and stalled. The Windows 2008 R2 install ran perfectly: it detected all the hardware properly, and I installed the Xen tools without issue - the reporting detail in XenCenter improved markedly after that, showing individual CPU, RAM, network and disk usage. The install was actually pretty quick across the network (I was surprised, to say the least). After I restarted the Ubuntu install it ran again and finally completed. While this was happening I was updating the Windows 2008 R2 server, and I began a FreeBSD install in a new VM under the default template. It installed perfectly and, once again, the hardware was detected properly (the network card appeared as a RealTek device), leaving me with a fully functional FreeBSD 9.0-RELEASE server. Eventually the Ubuntu server finished installing too, and it worked properly.
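Once the Xen tools are running inside a guest, you can also confirm it from the host CLI rather than eyeballing XenCenter. A small sketch, with the VM's name-label as a placeholder:

```shell
# Look up the VM's UUID by name (the name-label is an example):
VM=$(xe vm-list name-label="win2008r2" --minimal)

# These parameters stay empty/false until the PV drivers report in:
xe vm-param-get uuid=$VM param-name=PV-drivers-version
xe vm-param-get uuid=$VM param-name=PV-drivers-up-to-date
```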

My initial impressions were good. The software was easy to understand, the virtual machines were easy to manipulate and work with, and hardware support in the virtual machines was solid. Over the next few weeks I'll continue testing them and record my impressions here. Then I'll take the disks out, install VMware's offering and test that too.

Microsoft's Hyper-V Server is not really the system I wish to run - while it's great for Microsoft products, they aren't the only operating systems we run (for a variety of reasons). I like to be able to deploy the OS best suited to the requirement, and I hate being locked in to anything - I really prefer to be flexible. I'll also cover the licensing costs as we go along - how much, and how it's all costed out. Stay tuned!

