Tegile Systems, a maker of midrange unified hybrid storage arrays, says that traditional technologies don't deliver the functionality needed to optimize storage infrastructure for modern workloads. According to the company, mainstream solutions such as VMware's vStorage APIs for Array Integration (VAAI) and Windows Server 2012, which offer only limited support for hyperscale environments, simply don't cut it anymore.
Tegile stated in its findings that users require a “new class” of virtualization-aware storage arrays to realize the maximum return on their hardware investments. This approach offers more transparency and manageability than siloed alternatives.
The vendor touts that virtualization-aware storage enables admins to identify the VMs that demand the most I/O operations per second, the machines that achieve the best cache hit ratios, and the ones with the best deduplication rates. It lists five additional reasons why enterprises should adopt virtualization-aware storage:
Improved VM troubleshooting
The technology enables admins to correlate virtual machine and storage performance, a feature that can simplify troubleshooting a great deal.
Streamlined storage provisioning
Virtualization-aware arrays give the user visibility into how many IOPS are being consumed by specific workloads, a key metric that can be leveraged to streamline troubleshooting even further.
Simplified capacity planning
Tegile says that the transparency built into VM-aware storage eliminates a lot of the hassle associated with long-term capacity planning. By replacing manual monitoring and tracking with so-called “at-a-glance dashboards”, admins can take more immediate action.
Better capacity utilization
The company stresses that visibility is vital for users seeking to make the most of their raw disk space. The ability to understand deduplication and compression rates on a per-VM basis is an important step toward achieving that goal.
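As a rough illustration of why per-VM ratios matter, the physical capacity a VM actually consumes can be estimated from its logical footprint divided by the product of its deduplication and compression ratios. The sketch below assumes hypothetical VM names, ratios, and field names; it does not reflect any vendor's actual API.

```python
# Hypothetical per-VM stats as an array might report them.
# All names and numbers here are illustrative assumptions.
vms = [
    {"name": "web-01", "logical_gb": 200, "dedup_ratio": 2.5, "compression_ratio": 1.6},
    {"name": "db-01", "logical_gb": 500, "dedup_ratio": 1.2, "compression_ratio": 1.1},
]

def physical_usage_gb(vm):
    """Estimate raw disk consumed after dedup and compression are applied."""
    return vm["logical_gb"] / (vm["dedup_ratio"] * vm["compression_ratio"])

# Rank VMs by how much physical capacity they actually occupy.
for vm in sorted(vms, key=physical_usage_gb, reverse=True):
    print(f"{vm['name']}: {physical_usage_gb(vm):.1f} GB physical")
```

A VM with poor ratios (like the hypothetical db-01 above) consumes far more raw disk per logical gigabyte than one that deduplicates well, which is exactly the kind of insight per-VM visibility is meant to surface.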
Support for third-party tools
The final benefit of VM-aware storage is that it supports third-party tools that can introduce a great deal of added abstraction and integration to the environment.
Making storage smarter
More and more organizations are adopting storage virtualization. Tintri, an emerging provider of VM-aware, software-defined storage solutions, saw sales quadruple last year thanks to a surge in demand from enterprise users.
Tim Russel, the vice president of Data Lifecycle Ecosystem Solutions for NetApp, argues that virtualization is disrupting the storage paradigm because it’s accelerating the adoption of shared storage architectures. Unlike traditional information silos, this flash-driven model simplifies operations, increases cost efficiency and boosts overall business agility.
Chris Archinaco, product marketing manager at Avere, voiced a similar view in a recent interview with Dave Vellante and Jeff Frick on theCUBE. He stated that the “cloud is the only architecture that can deliver high performance” while discussing Amazon S3, the speedy object storage service that his firm is integrating into client organizations’ on-premise environments.
Cloud computing and flash are the two big enablers of this industry-wide transition to virtualized storage. In corporate datacenters, the performance and power-efficiency of solid-state memory make it possible to run mission-critical apps far more effectively than before, all the while creating new business opportunities for vendors.
Last month, IBM introduced an entire set of new flash solutions and announced that it will shell out $1 billion on solid-state R&D. SiliconAngle CEO John Furrier and Wikibon co-founders David Floyer and Dave Vellante stressed the importance of this move in a recent panel discussion, and emphasized that vendors must invest in flash innovation or risk falling behind the curve.
Latest posts by Maria Deutscher