I set up a large network from the ground up at the end of last year. Part of the design was to provide 99.99% availability 24/7, which is a significant requirement. However, using MS Windows Server 2012 R2 and a combination of DFS, Hyper-V replication and Live Migration, I managed it. Although replication seems to be the “new” backup strategy – look at Office 365 for a good example of this – I proposed that a traditional backup was also needed for historical restores and as a safety net. Offsite backups were not an option for a couple of very good reasons, plus the amount of data being backed up was around the 2TB mark at inception, rising to an estimated 5TB in a couple of years, so a couple of NAS boxes configured in a RAID 1 array were put on the network to host backups, with an RDX drive providing end-of-week full backups to be locked away. All looked good from a high-level perspective…
…and then came Veritas BackupExec. Now me and BackupExec have had a very rocky relationship, starting from the early days when it was Symantec BE. I was using LTO, SDLT and DDS tapes back then which was awfully long-winded when restoring data, but it was the industry norm, and so was BackupExec. Anyway, I’ve had bad experiences with Acronis of late, most notably their shitty customer service and flaky support for RDX carts, and other providers wouldn’t work with the requirements of the network and so BackupExec seemed to be the reluctant winner of “who gets my backup money”.
I installed the trial version of 14.2 on the servers and started to hit all kinds of problems, most notably:
- I kept getting an error stating that the SQL database was unavailable. It happened very suddenly after about five days of flaky operation. After an entire day troubleshooting SQL Express, reinstalling, and checking the services and the system account, it turned out that a Nov 16 Windows Server update causes BE to fail. Removing said update made BE work again. Is this Veritas’ fault or Microsoft’s? I blamed both.
- Services crashing. This is my main experience with BE, from as early as I can remember; services that stop for no obvious reason. I’ve never known a piece of software with so many issues that stop it from working. To really take the piss, Veritas must realise how much of a bag of shit BE is, because it provides a handy quick-button to the services. It’s probably the most-used interface button in the entire application.
- RDX drive lock-ups. I love RDX drives because they’re not tape, and they’re not flimsy USB drives. Like I mentioned before, Acronis got on my wrong side with their “we don’t support RDX as removable drives but we do state that we support it on our site”, and I did have an initially good experience with BE and RDX. However, BE would start to back up to RDX and then stop, get to the expiry time for the backup job, and then crash its services. I put some RDX carts aside thinking they were faulty, but diagnostics didn’t reveal any issues, and it happened on proven good drives. BE just doesn’t like writing continuously to RDX, it seems.
- NAS boxes. To get BE to back up to a NAS box (according to their support site), BE must run as a user account that has permissions on the target location. Pretty easy, right? It’s sysadmin 101 stuff, to be honest. However, in true BE form, it doesn’t work. I set up an account, gave it permissions to access the NAS box (which was joined to the domain), and could browse to the location as that user, yet BE could not talk to the location. I tried hosting a location on the DR server – no dice. Even granting Everyone permission on the NAS would not allow BE to write to it. As an aside, I have managed to get BE talking to a NAS location before, so I was surprised to encounter this issue.
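The whole permissions dance boils down to one question: can the account the BE services run as actually create a file at the target path? A minimal sketch of that sanity check in Python – the paths here are local stand-ins for illustration, not the real NAS share:

```python
import os
import tempfile
import uuid


def can_write_to(path: str) -> bool:
    """Probe write access by creating and then deleting a marker file.

    This is the same test a backup service account must pass against a
    NAS share before any backup job can succeed: run it while logged in
    (or impersonating) as that account, pointed at the share.
    """
    marker = os.path.join(path, f".write-probe-{uuid.uuid4().hex}")
    try:
        with open(marker, "w") as fh:
            fh.write("probe")
        os.remove(marker)
        return True
    except OSError:
        # Covers missing path, denied access, read-only filesystem, etc.
        return False


if __name__ == "__main__":
    # A throwaway local directory stands in for the NAS location here.
    target = tempfile.mkdtemp()
    print(can_write_to(target))
```

If this returns True for the service account but the application still can’t write there, the problem is in the application, not your permissions – which is exactly where I ended up with BE.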
- Licensing. BackupExec gives you a 60-day trial of “everything” BE can do, after which you need to pay up. It is very possible, nay certain, that you won’t want to buy all of the functionality that BE offers. In my instance, I wanted to back up Hyper-V machines running on the host where BE was installed, and in the trial version it worked VERY well. I was expecting to install the client on the VMs and point it at the host through the network, but no – BE can see what’s hosted in Hyper-V and back it up. I purchased BE with this backup functionality, received the keys, installed them – and received a 30-day trial for Hyper-V backups. I went to Veritas, who confirmed that the keys allow Hyper-V backup. My BE installation confirmed that I didn’t have this functionality. And so backwards and forwards we went, confirming what was supposed to be true and what was actually true.
Anyway, I am sure that if you have used BE in a production environment you’re not hearing anything new. The software is, and always has been, a house of cards. However, what’s the alternative? Acronis was my favourite but then started an aggressive licensing campaign, Windows Backup doesn’t support RDX drives, and CommVault Simpana is horrendously expensive and convoluted (but works exceptionally well. Seriously, if you can afford it then CommVault’s the way forward for the Enterprise). There is another alternative.
A lot of people have been using Microsoft Azure as a throwaway phrase to solve IT hosting issues. Want to reduce budget? Azure. Need to refresh your servers? Azure. Got a project that needs a platform? Azure Azure Azure. I don’t think many people actually know Azure in any real detail, either technically or financially. It is powerful but confusing to the lay person. It is also very expensive for SOHO customers that just need a server. A low-grade server running 2008 R2 costs £90 per month; the same server running Ubuntu costs £40. In comparable terms a good small server – the Dell T20 with 32GB of RAM, a Xeon CPU and a couple of SSDs – will cost about £800 with Server 2012 R2, but with no backup. Backup is one of Azure’s huge strengths, as replication and resiliency are incredibly easy to build into an Azure offering.
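To put those prices side by side, here is a back-of-the-envelope break-even calculation using only the figures quoted above. It deliberately ignores electricity, support contracts and hardware failure on the on-prem side, so treat it as illustrative rather than a proper TCO model:

```python
# Rough break-even between an Azure VM and a one-off on-prem Dell T20,
# using only the prices quoted in the text (GBP).
AZURE_WINDOWS_PER_MONTH = 90   # low-grade 2008 R2 VM
AZURE_UBUNTU_PER_MONTH = 40    # same VM running Ubuntu
ON_PREM_SERVER = 800           # Dell T20 with Server 2012 R2, bought outright

months_to_match_windows = ON_PREM_SERVER / AZURE_WINDOWS_PER_MONTH
months_to_match_ubuntu = ON_PREM_SERVER / AZURE_UBUNTU_PER_MONTH

print(f"Azure (Windows) matches the T20's price after "
      f"{months_to_match_windows:.1f} months")
print(f"Azure (Ubuntu) matches the T20's price after "
      f"{months_to_match_ubuntu:.1f} months")
```

In other words, the Windows VM overtakes the one-off server cost in under nine months, the Ubuntu one in twenty – which is why the built-in replication and resiliency have to be part of the value argument for Azure.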
The one feature of Azure that has got me on board with the Microsoft platform is Azure Backup. You can use Azure as a dumping ground for your data. In its simplest terms, you set up a backup “area” in Azure, download and install the client on your target machine, feed in a key to prove your identity, and then configure a schedule. The client will then send all the files to Azure and keep them there. It really is that simple to set up. If you want to do more complex backup operations, such as sending VMs and system states to Azure, then you will need to install the full version of Azure Backup which, to be fair, starts to feel a lot like BackupExec. For example, it’s a whopping 3GB of program that needs to be hosted on a non-DC server, although it could be argued that you could set this up on an Azure-hosted server (I spun up an on-prem VM in Hyper-V for the task). It needs good ol’ .NET 3.5 to be enabled in Roles and Features, which is so commonly required that MS should probably re-enable it as standard in Server. And I couldn’t get the full version to back up to Azure because it wouldn’t accept the key file from my Azure account, even though the Azure Backup “light” client did accept it. This seems to be a known problem for a lot of people at the moment.
Despite all of this, the backup service, for files only, works like a dream. It’s an off-site backup for your servers, and the costs seem extremely reasonable: 500GB of data transferred is apparently £18 per month, a bargain considering that I have been quoted £800 for 250GB of backup storage on a rival platform.
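Worked through per gigabyte, the gap is stark. One caveat: the rival quote’s billing period wasn’t stated, so the figures below compare it as a single quoted sum rather than assuming it’s monthly:

```python
# Per-gigabyte comparison of the two quotes above (GBP).
azure_monthly_cost = 18.0   # quoted for 500 GB, per month
azure_gb = 500
rival_cost = 800.0          # quoted for 250 GB; billing period unstated
rival_gb = 250

azure_per_gb_month = azure_monthly_cost / azure_gb
rival_per_gb = rival_cost / rival_gb

print(f"Azure: £{azure_per_gb_month:.3f} per GB per month")
print(f"Rival quote: £{rival_per_gb:.2f} per GB")
```

That’s roughly 3.6 pence per gigabyte per month on Azure against £3.20 per gigabyte on the rival quote – even if the rival figure covered several years, Azure still wins comfortably.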
Don’t take my word for it – try it yourself. Sign up for a free one-month Azure account here. They will ask for a credit or debit card, but not to take money – just to prevent accounts from being abused – and they give you £125 of services during that month.