Industry leaders pioneer AIOps era by integrating compute power with data
The rapid acceleration of AI in enterprise environments is bringing data sovereignty, location and processing challenges to the forefront. Companies are seeking to manage the complexities of enterprise AI data sovereignty within a globally distributed landscape.
Such issues are driving a shift from centralized cloud solutions to hybrid approaches, which help organizations address enterprise AI data sovereignty by keeping operations closer to where data resides. As digital borders blur, data fluidity has sparked global discussions on sovereignty, with companies like Dell Technologies Inc. prioritizing enterprise AI data sovereignty as crucial for maintaining control, compliance, and trust within AI-driven innovation.
“AI is transforming business at an unprecedented pace. Data centers must be designed from the ground up to handle AI’s speed and scale while new AI PCs are transforming productivity and collaboration,” said Jeff Clarke, vice chairman and chief operating officer of Dell, in a release. “What’s needed are new IT infrastructure and devices purpose-built to meet the specific demands of AI.”
This feature is part of SiliconANGLE Media’s exploration of high-performance computing advancements. Be sure to watch theCUBE’s analyst-led coverage of SC24 from November 19-21. (* Disclosure below.)
Companies prioritize on-prem and hybrid solutions to bring compute closer to data
The shift toward hybrid approaches has led companies to rethink their data strategies, prioritizing on-premises and hybrid solutions. Those solutions keep compute power close to the data, rather than relying only on cloud-based processing.
“Organizations are rushing to experiment with AI, but there are many challenges to achieving ROI. Data sovereignty issues, legal and compliance concerns and data quality are all top of mind,” theCUBE Research Chief Analyst Dave Vellante said in March. “Our research shows that companies are turning to industry leaders like Dell and Nvidia to help provide AI expertise and services to lower risk and get to ROI sooner.”
Companies can now bring AI operations directly to where data resides, which can reduce latency and improve security, among other benefits. There’s a shift from an era of training to an era of actualizing the benefit of AI, according to Savannah Peterson, principal analyst at theCUBE Research.
“The key to making AI real is inference. Meeting your data where it is — on-prem, in the cloud, at the edge, etc. — and communicating back to a central point of strategic truth is critical for companies looking to achieve the level of impact they desire with AIOps,” Peterson said.
The key is eliminating data silos swiftly and securely, according to Peterson. It’s less about where and more about what companies are doing.
Dell is not alone in this focus. Many other companies, including IBM Corp., see enterprise AI data sovereignty as essential for trust and compliance.
“With the rise of cloud computing and global data flows, the traditional boundaries between nations and jurisdictions are no longer sufficient to protect sensitive digital resources like data within a given geographical boundary,” IBM wrote in a recent blog. “By prioritizing data sovereignty, organizations can build trust with customers and stakeholders, enhance brand reputation and avoid costly legal and reputational consequences.”
Companies such as Dell, IBM and Hewlett Packard Enterprise Co. have been addressing these issues through specialized high-performance computing, networking and storage solutions, many of which will be detailed at the upcoming SC24 event. With that event on the radar, it’s worth taking a closer look at how industry leaders are pioneering a new era of AIOps and where enterprise AI data sovereignty is heading next.
Addressing data proximity and network demands
For Dell, the goal has been to integrate compute, storage and networking for seamless hybrid AIOps. Up until recently, almost all of the focus has been on doing gen AI in the cloud, according to Bob O’Donnell, president and chief analyst at TECHnalysis Research LLC.
“I think the story that Dell started to tell is, ‘Hey, wait a minute. Maybe we can do some of this sort of stuff on-prem.’ The basic message … is ‘Why move your data to the AI? Why not move the AI to your data?’ The vast majority of most organizations’ data is still behind their firewall, so it just makes logical sense to do that,” O’Donnell told theCUBE in May.
IBM has also been seeking to deliver future-ready storage solutions. IBM’s investments in infrastructure have positioned it as a leader in the race to harness the full potential of AI, according to John Furrier, theCUBE Research executive analyst.
“We’re at a generational shift here, both personnel, the kinds of computing that’s done,” he said. “IBM will talk about quantum and other conversations. You’ve got HPC and AI, you’ve got cloud-native developers, you’ve got gen-AI applications. AI infrastructure is the key here, and that’s the future.”
AI workloads have also been placing enormous demands on the back-end networks supporting them, according to Bob Laliberte, principal analyst at theCUBE Research. Networks must accommodate massive data traffic, providing up to 800 gigabits per second of throughput end-to-end.
“While organizations have traditionally used InfiniBand in HPC environments, and Nvidia has bundled InfiniBand with its solutions, enterprises prefer using Ethernet networks. In a recent survey, almost two-thirds of enterprises said they would prefer to use Ethernet over InfiniBand in gen AI environments,” Laliberte said, referencing a recent collaborative research project between theCUBE Research and ZK Research.
Respondents indicated their preference was based on already having Ethernet deployed in other parts of their network and having staff with the skill sets to support the environment, according to Laliberte. They also believed that RDMA over Converged Ethernet, or RoCE, can provide comparable performance.
“Vendors proposing Ethernet networks for back-end gen AI environments should publish Ethernet performance results and create validated blueprints or designs to help accelerate the adoption of these Ethernet-enabled AI environments,” Laliberte said.
Hybrid AI and partnerships key to overcoming hurdles
Though gen AI for enterprises is a major focus for companies, many don’t know where to start when it comes to implementation. That’s where hybrid AI comes into focus, according to O’Donnell.
“Just the same way we’ve seen hybrid cloud, there’s going to be hybrid AI, and the tools are becoming more widely available,” he said in May. “The deals that Dell announced with Hugging Face and Meta on leveraging open-source tools, which can be run locally, are very interesting. That notion of opening up this concept of hybrid AI and making it something that a lot of organizations can consider.”
In March, Dell also announced it was expanding its infrastructure portfolio with new Nvidia-powered AI platforms. The company said its new servers were compatible with Nvidia’s B200 Tensor Core graphics processing unit and added it was rolling out the machines alongside a data lake platform and upgrades to its storage portfolio.
There’s no one who can do it all by themselves, according to Laliberte. It’s going to be about how organizations and vendors can work together to provide an end-to-end solution.
“I like what Dell’s doing with the AI Factory. I think it’s a little bit of a misnomer when you think about the factory, you think maybe, ‘Hey, this is a big data center thing.’ But, in reality, they are talking about extending those AI capabilities across the entire enterprise, through workstations and laptops, out to the edge and at retail,” he said in May.
New era of enterprise AI data sovereignty
As organizations increasingly integrate AI into their operations, the evolution of AIOps and hybrid AI solutions appears set to redefine enterprise IT landscapes amid a new era of enterprise AI data sovereignty. There are still questions to answer, however: Given that the vast majority of most organizations’ data is still behind their firewall, how do companies move the AI to their data?
“Number one, you need the compute, you need the network, you need the storage, and you need the software and platforms,” O’Donnell said in May. “And then on top of that, you start to build in things like software applications designed to work with workstations, to do RAG [retrieval-augmented generation], which is this hugely important deal as companies start to fine-tune their existing models, and even AI PCs, where theoretically you’re going to eventually move them.”
Companies that put all of those pieces together can start to consider a hybrid AI model, which enables organizations to integrate on-prem, cloud and edge solutions for both operational efficiency and regulatory compliance.
Moving forward, collaboration between industry leaders and technology providers will be increasingly essential. As AIOps and hybrid AI continue to mature, innovations appear set to reshape enterprise AI and enable companies to unlock AI’s full potential in a new era of enterprise AI data sovereignty.
(* Disclosure: TheCUBE is a paid media partner for SC24. Neither Dell Technologies nor WekaIO Inc., premier sponsors of theCUBE’s event coverage, nor other sponsors have editorial control over content on theCUBE or SiliconANGLE.)