Is Optimization BS? Where Does Optimization Really Belong?
Over the past five years the notion of capacity optimization has become a hot topic with storage vendors. And let's face it: since the vendors are hawking it, users are buying it, so awareness is growing on their side as well.
It started out with compression for tape in the late '70s and early '80s. The issue there was twofold. First, where does the compression take place to ensure high performance of the overall "system"? (I say "system" because I/Os per second matter. Every vendor publishes BS numbers, but what matters is whether the backup finishes before the backup window closes once it kicks off. That is all the IT guy cares about.) The second issue is the reliability of being able to get your data back.
Early versions of tape compression were done in software, which had a big impact on performance, sometimes making the system almost unusable. From a reliability perspective, the compression itself may have worked, but the inability to do it fast enough caused weird scenarios such as system hangs and crashes that left you without your data when you wanted it. That isn't really all that reliable.
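To make the performance point concrete, here is a minimal sketch (the payload size and compression level are arbitrary assumptions, not measurements from any vendor's product) showing that software compression is capped by whatever the host CPU can chew through, which is exactly the bottleneck dedicated silicon in the drive path avoids:

```python
import time
import zlib

# ~8 MB of highly compressible data standing in for a backup stream.
payload = (b"the same log line, over and over\n" * 4096) * 64

start = time.perf_counter()
compressed = zlib.compress(payload, level=9)  # compress on the host CPU
elapsed = time.perf_counter() - start

# The ratio is great, but the MB/s figure is the host CPU's ceiling:
# if the tape can stream faster than this, software compression is
# the thing holding the whole backup back.
ratio = len(payload) / len(compressed)
mb_per_sec = (len(payload) / 1_000_000) / elapsed
print(f"compression ratio {ratio:.1f}:1 at {mb_per_sec:.0f} MB/s on this CPU")
```

Run it on a loaded server and the MB/s number drops further, which is how you end up with the hangs and missed windows described above.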
The next scenario is data deduplication for backup. (We all know who is winning that ground war.) At a functional level, deduplication is a great fit for backup, first because a backup stream contains so much duplicate data. Second, vendors can control things such as performance and reliability because the functions take place in a closed system. Combine that with increasingly commoditized hardware, and it is actually pretty easy to build a high-performance system with good reliability. Keep in mind, however, that it is a closed system: these solutions rely on, at most, a small handful of backup applications to feed them data. There is a lot of control here.
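Why backup data dedupes so well can be shown in a few lines. This is a toy sketch of fixed-size-chunk deduplication (real products use variable-size chunking and far more engineering; the chunk size and helper names here are my own), where a stream full of repeated blocks collapses to a handful of unique chunks plus a recipe to rebuild it:

```python
import hashlib

def dedupe_chunks(data: bytes, chunk_size: int = 4096):
    """Split a stream into fixed-size chunks and store each unique
    chunk only once, keyed by its SHA-256 fingerprint."""
    store = {}    # fingerprint -> chunk bytes (stored once)
    recipe = []   # ordered fingerprints to rebuild the stream
    for i in range(0, len(data), chunk_size):
        chunk = data[i:i + chunk_size]
        fp = hashlib.sha256(chunk).hexdigest()
        store.setdefault(fp, chunk)
        recipe.append(fp)
    return store, recipe

def rebuild(store, recipe):
    """Reassemble the original stream from the unique chunks."""
    return b"".join(store[fp] for fp in recipe)

# A "backup" with heavy duplication: the same block repeated 100 times.
backup = b"A" * 4096 * 100 + b"B" * 4096
store, recipe = dedupe_chunks(backup)
print(len(backup), "bytes in, only", len(store), "unique chunks stored")
assert rebuild(store, recipe) == backup
```

Because last night's full backup looks almost identical to tonight's, the store barely grows from run to run, which is the whole economic argument for dedupe in the backup tier.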
Fast forward to 2010. The new "hot topic" is optimization for primary capacity. The rules haven't changed, by the way: users still demand performance and reliability (of their data). This raises the question: where does optimization really belong in order to maximize performance and reliability? The answer is simple: in silicon. Now, no vendor will tell you this because no one is doing it today (yes, some folks are doing offload engines, but none of them have any real enterprise traction). Instead, vendors will talk about optimization technology they have acquired and "integrated" into their firmware (fancy software). The reality is this WILL HAVE AN IMPACT ON PERFORMANCE! Yes, some people will say, "Yeah, but it's only 5%." To that I say: okay, what's next? If I let the array vendor get away with 5% today, what will they want to do tomorrow that impacts performance? And if I can't optimize efficiently, doesn't that affect the reliability of my data? This becomes a very slippery slope.
Lastly, I will remind people of a saying my grandfather told me and my father drilled into me: you get what you pay for. Aren't most vendors "giving it (optimization) away for free"? First, free means sacrifice. You WILL sacrifice something: performance, reliability, something. Second, and perhaps more importantly, do you really think they are giving it away for free? Do you really think these companies spent millions of dollars on R&D or M&A to hand their customers something for nothing? I am sure the story to Wall Street describes how these vendors are going to use optimization to make more money.
Let's break this down a bit. If the array vendor gives you more capacity per drive, then you buy fewer drives, right? If you buy fewer drives, then so does the array vendor, cutting their COGS. Are they passing that savings on to you? No. One more point: if the vendor can make a 1TB SATA drive look like a 2TB SATA drive, they can hold off on buying the newer, more expensive, less reliable drives until prices decline. That lets them give customers what they want on price and reliability while keeping margins higher.
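The margin math above can be sketched on the back of an envelope. Every number here is hypothetical (assumed drive cost, assumed sell price, assumed 2:1 optimization ratio), but the shape of the result is the point:

```python
# All figures are assumptions for illustration, not any vendor's real numbers.
drive_cost = 100.0    # vendor's COGS per 1TB SATA drive (assumed)
drive_price = 400.0   # what the customer pays per TB of capacity (assumed)
tb_needed = 100       # customer's capacity requirement

# Without optimization: 100 drives shipped, 100 drives' worth of COGS.
cogs_plain = tb_needed * drive_cost
revenue_plain = tb_needed * drive_price

# With "free" 2:1 optimization: half the drives shipped, but the
# customer still buys (and pays for) the same usable capacity.
cogs_opt = (tb_needed / 2) * drive_cost
revenue_opt = tb_needed * drive_price

margin_plain = revenue_plain - cogs_plain
margin_opt = revenue_opt - cogs_opt
print(f"margin without optimization:    ${margin_plain:,.0f}")
print(f"margin with 'free' optimization: ${margin_opt:,.0f}")
```

The "free" feature lowers the vendor's COGS while the customer's bill stays the same, so margins go up. That is where the R&D and M&A money comes back from.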
Putting optimization in silicon is hard, will cost a lot of money, and will take time, perhaps 18 months before it could be a reality. That said, it is the optimal place for it. The question is: will the appearance of giving it away for free create a situation where the solution never makes it to its rightful place, because the vendors can't get ahead of it?
{Editor Note: I have been emailing with an anonymous person who is a CTO in the industry and wants to contribute to the community here at SiliconANGLE. After vetting this person, it is clear that he/she has deep technical knowledge of the enterprise and service provider markets. My goal in publishing these posts is to drive conversation. I welcome the Secret CTO to the group and hope to see professional and raw commentary on the trends and companies that we cover here. If the Secret CTO becomes "over the top" or unprofessional in any way, he/she will be bounced out of the group.}