UPDATED 12:17 EDT / SEPTEMBER 11 2024

Cole Crawford, founder and CEO of Vapor IO, talks about scalable edge computing with theCUBE.

Vapor IO and Vast Data partnership accelerates AI deployment through scalable edge computing

Industries increasingly rely on scalable edge computing to meet the demands of real-time data processing and low-latency applications, and the technology is driving innovation and efficiency across a wide range of sectors.

Vapor IO Inc. recently forged an alliance with Vast Data Inc. to integrate Vast's AI data platform into Vapor IO's Zero Gap AI service. The collaboration aims to simplify and accelerate AI deployments, giving enterprises the flexibility to optimize for factors such as cost, latency and accuracy across diverse environments.
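To make that trade-off concrete, here is a minimal sketch of how a deployment planner might choose among edge, regional and centralized inference endpoints under a latency budget while weighing cost against accuracy. The endpoint names, numbers and scoring weights are hypothetical illustrations, not a Vapor IO or Vast Data API.

```python
# Hypothetical sketch: illustrates the cost/latency/accuracy trade-off
# described above. Names and figures are invented for illustration.
from dataclasses import dataclass

@dataclass
class Endpoint:
    name: str           # e.g., an edge site or a centralized cloud region
    latency_ms: float   # expected round-trip inference latency
    cost_per_1k: float  # dollars per 1,000 inferences
    accuracy: float     # expected model accuracy at this tier (0-1)

def pick_endpoint(endpoints, latency_budget_ms, cost_weight=0.5):
    """Pick the endpoint that meets the latency budget and best balances
    accuracy against cost. Weights are illustrative, not prescriptive."""
    eligible = [e for e in endpoints if e.latency_ms <= latency_budget_ms]
    if not eligible:
        raise ValueError("No endpoint satisfies the latency budget")
    # Higher accuracy is better; higher cost is worse.
    return max(eligible, key=lambda e: e.accuracy - cost_weight * e.cost_per_1k)

if __name__ == "__main__":
    candidates = [
        Endpoint("edge-metro-pop", latency_ms=8, cost_per_1k=0.40, accuracy=0.91),
        Endpoint("regional-dc", latency_ms=35, cost_per_1k=0.25, accuracy=0.93),
        Endpoint("central-cloud", latency_ms=120, cost_per_1k=0.15, accuracy=0.95),
    ]
    # A computer-vision workload with a 50 ms budget lands on an edge or regional site.
    print(pick_endpoint(candidates, latency_budget_ms=50).name)
```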

“AI is largely centralized … you’ve got big data centers, 500-billion-parameter [large language models] that we’re training with gigawatts of power,” said Cole Crawford, founder and chief executive officer of Vapor IO. “That’ll shift back out to being distributed … just in infrastructure, it seems to me that the fun times and transformative times shifts from infrastructure to social … and we’re very much in the infrastructure.”

During a live, on-the-set event at the New York Stock Exchange, Crawford spoke with John Furrier, executive analyst at theCUBE Research, for theCUBE, SiliconANGLE Media's livestreaming studio. They discussed the company's role in democratizing network infrastructure through the low-latency, distributed compute networks essential for edge computing and AI. The discussion was a lead-up to theCUBE and Vast's Enter the Cosmos event on Oct. 1 and Vast's Experience the Cosmos event, held Oct. 1-2.

Edge computing infrastructure: The foundation of modern enterprises

Traditional centralized cloud solutions often fall short when handling the sheer volume and complexity of data generated by modern enterprises across industries. Scalable edge computing offers a strong alternative, enabling organizations to manage data closer to its source and thereby gain the speed and capacity required for intricate data processing tasks.

“Clearly, they’re going to play a big role for a long time,” Crawford said, referring to centralized cloud solutions. “The question is how strategic are they when you need that low-latency experience? If you think about where these large real estate trusts are deploying big data centers, it’s about 12 U.S. markets.” He added that many markets remain underserved by these centralized solutions, highlighting the necessity of localized infrastructure that edge computing provides to meet various industries’ demands effectively.

Scalable edge computing has emerged as a critical component for enterprises, enabling them to meet increasingly complex data processing needs, according to Crawford. In sectors such as healthcare, where split-second decisions can save lives, the ability to process data at the edge rather than in centralized clouds is essential.

“We’re actually building the 5G network in Vegas for the medical district,” he said. “A lot of the economic development for the hospitals and cancer treatment centers and research centers that are coming into town, they’re going to have access to this network. And why is that powerful? That’s powerful because … you need horsepower and data and time.”

The transportation industry also relies on edge computing to manage real-time data from thousands of connected devices, ensuring safer and more efficient operations. This technological shift is providing innovative tools that improve services, according to Crawford.

“If you’re a big retail enterprise or transportation company, or a computer vision company, if you want U.S. scale and you want that same ubiquitous experience for that low-latency application, you can’t deploy it in those facilities because you don’t have that latency profile that I can give you,” he said.

Cole Crawford, founder and CEO of Vapor IO, talks with theCUBE about Vapor IO’s scalable edge computing solutions during a CUBE Conversation.

Vapor IO’s Cole Crawford is live on set at the NYSE, talking about the company’s pivotal role in reshaping edge AI computing.

Overcoming challenges in the path to scalable edge computing

The path to scalable edge computing is fraught with challenges, many of which aren’t immediately apparent, according to Crawford. Beyond the technical hurdles, companies must navigate significant regulatory barriers, including zoning laws. For example, in urban centers, where space is limited and regulations are complicated, deploying edge infrastructure requires careful planning and coordination with multiple stakeholders.

“The zoning and permitting process in San Diego … it took us four years,” he said. “We had to do some pretty incredible things to actually get that infrastructure into San Diego.”

Moreover, integrating new technologies with existing systems often requires careful consideration of factors such as latency, network topology and the unique requirements of each application. These complex challenges demand technical expertise and a strategic infrastructure development approach.

“It’s not that they can’t do what we’re doing; it’s just going to take them 10 years to get to where we started 10 years ago,” Crawford said. “The reality is if you want to put up a data center in Austin, Texas, it’s going to take time. Google’s a good example of this because they learned this with Google Fiber. It just takes time to do these things.”

Edge computing infrastructure redefines AI and data center strategies

As AI continues to integrate with various industries, edge computing is set to play an increasingly critical role in supporting these advanced workloads, according to Crawford. AI applications that require real-time processing and low-latency responses are particularly well-suited to edge environments. This shift is forcing data centers to rethink strategies to account for distributed computing and the need for more localized processing power, making edge computing infrastructure a cornerstone of future AI strategies.

“Remember AI, everything, needs to be centrally trained,” he said. “But central doesn’t need to be in the same physical location. Central can be a centralized logical network where that lab in Seattle is attached to the lab in Florida, and you have a high-speed interconnect for those two labs to be communicating in real-time with lots of AI-enabled data. This is what Vapor builds.”
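One way to see why the high-speed interconnect matters in that picture is a back-of-the-envelope estimate of how long a full gradient exchange between two distributed sites would take. The sketch below is illustrative only; the site names, link speed and model size are assumptions, not Vapor IO figures.

```python
# Hypothetical back-of-the-envelope sketch of the "logically centralized,
# physically distributed" training setup described above. The sites,
# link speed and model size are illustrative assumptions, not Vapor IO specs.
from dataclasses import dataclass

@dataclass
class Site:
    name: str
    accelerators: int

def sync_time_seconds(model_params_billions: float, link_gbps: float) -> float:
    """Rough time to exchange one full set of fp16 gradients between two sites."""
    gradient_bytes = model_params_billions * 1e9 * 2  # 2 bytes per fp16 value
    link_bytes_per_s = link_gbps * 1e9 / 8
    return gradient_bytes / link_bytes_per_s

if __name__ == "__main__":
    sites = [Site("seattle-lab", 512), Site("florida-lab", 512)]
    # A 70B-parameter model over a 400 Gbps link: roughly 2.8 s per full sync,
    # which is why dedicated high-speed interconnects between sites matter here.
    t = sync_time_seconds(model_params_billions=70, link_gbps=400)
    print(f"{sites[0].name} <-> {sites[1].name}: ~{t:.1f} s per gradient sync")
```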

These broader strategic shifts are reflected in the emerging use cases that demonstrate edge computing’s versatility. From edge nodes in phones and cars to localized data processing for real-time analytics, edge computing is reshaping how industries approach AI-driven technologies, according to Crawford. These applications highlight the importance of data sovereignty and the ability to keep data within specific regions, which is crucial for privacy and compliance.

“There’s a couple [of] use cases that are emerging,” he explained. “One is the idea of edge nodes devices. We all have phones; cars now are there. So that’s one pretty obvious. Two other non-obvious use cases is the notion of sovereignty, localization of data … third is this idea that AI governance is going to be a big deal.”
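To put the sovereignty and localization point in concrete terms, a minimal sketch might constrain which edge nodes are allowed to process a record based on where the data originated. The node names and regions below are invented for illustration; this is not an actual Vapor IO or Vast Data topology or API.

```python
# Hypothetical sketch of region-pinned processing for data sovereignty.
# Node names and regions are invented for illustration.
EDGE_NODES = {
    "chicago-pop": "US",
    "frankfurt-pop": "EU",
    "singapore-pop": "APAC",
}

def nodes_for_record(record_region: str) -> list[str]:
    """Return only the edge nodes allowed to process a record, keeping the
    data inside the jurisdiction where it was generated."""
    return [node for node, region in EDGE_NODES.items() if region == record_region]

if __name__ == "__main__":
    # An EU-origin record may only be processed on the EU-resident node.
    print(nodes_for_record("EU"))  # ['frankfurt-pop']
```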

Here’s the complete interview with Cole Crawford:

Image: SiliconANGLE/Bing
