

Hardware is the backbone of efficient software.
Nevertheless, the constrained nature of the hardware sector makes affordability and accessibility a tall order. That is why democratization is needed, so that smaller organizations and startups can get started more easily, according to Brian Beeler (pictured), editor of StorageReview.com.
“You talked about democratization; I think this is so important, because I think there’s a ton of gatekeeping in this industry that I find really frustrating,” Beeler said. “Professionals that have been doing this for years will look at these infrastructures or storage or GPUs or servers and say, ‘Well, this is the way to do it with this software stack and this is the output and here’s what you get.’ If you’re not doing it that way, you’re doing it wrong, and I think it’s a little sickening, honestly.”
Beeler spoke with theCUBE industry analysts Savannah Peterson and David Nicholson at SC23, during an exclusive broadcast on theCUBE, SiliconANGLE Media’s livestreaming studio. They discussed the importance of accessibility and affordability in hardware, as well as why democratization is needed. (* Disclosure below.)
With affordability and accessibility being pressing issues in the hardware industry, the cloud is filling the void, according to Beeler. As a result, cloud-based graphics processing unit computing should be top of mind for enterprises.
“The cloud, of course, is very much a player,” he said. “We recently did some work with OVHcloud, who’s got what would be considered by this audience as sort of lower-end V100s in the cloud … it’s a whole different set of metrics than if you’re already succeeding and already have AI teams that can go in and leverage an Oracle cloud bare metal eight by H100.”
For the hardware field to go to the next level, accessibility is critical, according to Beeler. Even organizations starting out with a workstation and a couple of Nvidia A6000s should be able to get the ball rolling without worrying about a $50,000 to $60,000 hardware investment, since the artificial intelligence models that can be built today are better than the ones that have to wait for GPUs to show up, he added.
“In terms of the where question, can I get access to the hardware?” he said. “We were told just a week and a half ago that the wait for H100 systems, if you place your order today, is somewhere around 40 weeks. Not everyone needs H100, and Nvidia says, ‘Well look, we’ve got this shiny L40S; it’s available now.’ It’s opening up the market. I think so too for AMD, Intel and others with accelerators to come in with perhaps lower cost options that may not be quite so powerful.”
For hardware to be on its A game, serious emphasis should be placed on networking and storage, since GPU servers themselves are not particularly storage-heavy, Beeler pointed out.
“How do we fuel these GPUs?” he asked. “If I’m going to drop a million or a couple million dollars from an enterprise perspective into an enterprise data center to research or to use AI to help research or improve our businesses, all the infrastructure’s got to be there to support it.”
Here’s the complete video interview, part of SiliconANGLE’s and theCUBE’s coverage of SC23:
(* Disclosure: TheCUBE is a paid media partner for SC23. Neither Dell Technologies Inc., the main sponsor of theCUBE’s event coverage, nor other sponsors have editorial control over content on theCUBE or SiliconANGLE.)