This week, Amazon Web Services rocked the long-term, mission-critical data-backup market with the launch of Amazon Glacier, an extremely low-cost, purposely low-performance, high-durability cloud storage offering that’s gunning for enterprises still using tape for archival purposes.
ZDNet’s Jack Clark has an excellent overview of the complete Amazon Glacier value proposition, but it essentially breaks down like so: Amazon charges as little as a penny per gigabyte per month to shove data into the cloud. But if you need to get that data out, you’ll pay a retrieval fee, and your request will be treated as low-priority. Obviously, Glacier isn’t aimed at replacing the higher-performance Amazon S3, though Amazon is, by its own admission, cannibalizing some of its own business.
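To make that trade-off concrete, here’s a back-of-the-envelope sketch of Glacier-style economics in Python. The penny-per-gigabyte storage rate comes from the figure above; the retrieval fee is a hypothetical placeholder, since AWS’s actual retrieval pricing is tiered and considerably more complicated:

```python
# Back-of-the-envelope archive economics. The $0.01/GB-month storage rate
# is the "penny per gigabyte" figure cited above; the retrieval fee is a
# hypothetical illustrative number, NOT AWS's real (tiered) pricing.

STORAGE_PER_GB_MONTH = 0.01   # cents-per-gig storage rate
RETRIEVAL_PER_GB = 0.12       # hypothetical flat retrieval fee

def archive_cost(gb, months, gb_retrieved=0.0):
    """Rough cost of parking `gb` of backups for `months`,
    then pulling `gb_retrieved` of it back out."""
    return gb * STORAGE_PER_GB_MONTH * months + gb_retrieved * RETRIEVAL_PER_GB

# Parking 10 TB for a year is cheap...
print(f"Store 10 TB, 12 months: ${archive_cost(10_000, 12):,.2f}")
# ...but a full restore costs extra -- the trade-off versus S3.
print(f"Plus full restore:      ${archive_cost(10_000, 12, 10_000):,.2f}")
```

The point of the sketch: storage is priced to be nearly forgettable, while retrieval is where the meter runs, which is exactly backwards from how active storage like S3 is priced.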
But while all the chatter is around Amazon Glacier, it’s worth noting that AWS isn’t the only vendor or service provider trying to help enterprises and SMBs address the complexity and expense of physical-media (especially tape) backups. In keeping with AWS’ customary secrecy, Amazon says only that Glacier runs on “inexpensive commodity hardware,” so it’s possible Glacier itself runs on tape – we may never know.
Here’s a by-no-means-comprehensive list of three alternatives to Amazon Glacier.
Managed Tape Storage
Infinidisc is a good example of a managed cloud tape-backup service – a sort of hybrid between the Amazon Glacier model and the old guard. Files are uploaded to Infinidisc’s cloud facility through the “Infinidisc Appliance, through our secure web portal or via FTP,” according to the provider’s website.
The backups are kept on dedicated hardware infrastructure, and every tape is duplicated and sent to an off-site facility, all on Infinidisc’s end. The vault is mounted on your network as a fileserver to retrieve the backups as needed, and can be accessed from anywhere – or so goes Infinidisc’s pitch.
New-Generation Virtualization/Storage Arrays
Storage vendors like Dell, IBM and EMC are looking to solid-state drives (SSDs) to address the challenges of modern, data-driven applications and their vastly increased IOPS needs. At the same time, deduplication techniques are helping optimize data transfer for archival purposes.
Combining virtualization technologies with these new SSD storage arrays can yield storage environments that handle vast amounts of data with less complexity, lower operating expenses, and a generally smaller storage-infrastructure footprint. Recently, SiliconAngle wrote about an EMC case study in which a large Canadian school district’s adoption of EMC’s VNX unified storage and DataDomain deduplication technology boosted performance and dramatically cut costs compared with its legacy tape backup system.
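At its simplest, deduplication works by splitting data into blocks and storing each unique block only once, with backups kept as lists of block references. The minimal Python sketch below uses fixed-size blocks keyed by SHA-256 digest; real products like DataDomain use variable-length chunking and far more sophisticated indexing, so treat this purely as an illustration of the idea:

```python
import hashlib

def dedupe_store(data: bytes, store: dict, block_size: int = 4096):
    """Split `data` into fixed-size blocks and keep only one copy of each
    unique block, keyed by its SHA-256 digest. Returns the 'recipe' (an
    ordered list of digests) needed to reassemble the original stream."""
    recipe = []
    for i in range(0, len(data), block_size):
        block = data[i:i + block_size]
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)   # duplicate blocks are stored once
        recipe.append(digest)
    return recipe

def reassemble(recipe, store):
    """Rebuild the original byte stream from a recipe and the block store."""
    return b"".join(store[d] for d in recipe)

store = {}
backup = b"A" * 16384 + b"B" * 4096        # highly redundant sample data
recipe = dedupe_store(backup, store)
# Five blocks referenced, but only two unique blocks actually stored.
print(len(recipe), "blocks referenced,", len(store), "blocks stored")
assert reassemble(recipe, store) == backup
```

Backup streams are full of repeated blocks (unchanged files, repeated full backups), which is why dedup ratios for archival workloads can be dramatic.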
Unless you invest in services (a major part of EMC’s current storage push, for instance), you’ll have to manage that infrastructure yourself – unlike with Amazon Glacier. But it’s a way to retain control and cut costs at the same time.
Just Use Tapes
For the most cynical of the cynics: If you think cloud, virtualization and big data are fads, and you already have the infrastructure in place – and let’s face it, if you think this way, you probably do – by all means, keep using tapes.
Fair is fair – tapes are very much a known quantity, and they have the advantage that you can literally juggle hand-labeled backups of your organization’s mission-critical data if you so desire (I have). You may have to trust Symantec Backup Exec, but you don’t have to rely on costly virtualization licenses, and the only things that could keep you from your data are the apocalypse – or at least bank robbers getting into your local vault.
In other words, you lose a little flexibility and a lot of scalability. But if you have really stringent security needs, or are just a little more on the cautious side – well, if it ain’t broke, don’t fix it.