Navel gazing serves IBM well for digital transformation best practices
While the urge to be data-driven is reaching a fever pitch in the modern world of business, actually implementing the concept is usually fraught with the challenges of reworked computing architectures and evolving corporate cultures. But the computing giants delivering the infrastructure for data-first businesses, such as IBM, are also looking to pave the way through their own internal transformations.
Ed Walsh (pictured, left), general manager of storage at IBM Systems, and Steven Eliuk (pictured, right), vice president of deep learning in IBM's Global Chief Data Office, provided a blueprint of an environment that has worked well for IBM, as well as the challenges other companies face.
“Everyone needs to be data-driven. Everyone wants to be data-driven. But it’s … really challenging for organizations,” Walsh said. “We’re being transparent about what we’re getting internally for our own transformation as IBM. Because, really, if we looked at this as a platform, it’s really an enterprise cognitive data platform that all of IBM uses on all our transformation work.”
Walsh and Eliuk spoke with Dave Vellante (@dvellante), host of theCUBE, SiliconANGLE Media’s mobile livestreaming studio, at the IBM Chief Data Officer Strategy Summit in San Francisco. They discussed the challenges of becoming data-driven and the kind of technology environment best suited to meeting those challenges. (* Disclosure below.)
Biggest data-driven challenges
Big Blue’s blueprint for being data-driven was inspired by the needs of Eliuk’s department at IBM, which required sufficient hardware and bandwidth. “That’s the beauty of working in this domain — is that I have those hundreds of use cases,” Eliuk said. “And it means that I’m hitting … low-latency requirements, bandwidth requirements, extensibility requirements because I have a huge number of headcount that I’m bringing on as well.”
The biggest challenges come from effectively curating, locating, and governing that data, as well as ensuring its quality, according to Eliuk.
So what is the fundamental aspect of the infrastructure that helps support this kind of network? At a high level, IBM uses a storage infrastructure built for AI workloads, closer to high-performance computing, which the company calls Elastic Storage Server, Walsh explained. “It’s a combination,” he said. “It’s a turnkey solution — half rack, full rack. But it can start very small and grow to the biggest supercomputers in the world, like what we’re doing in the national labs — like the largest top-five supercomputers in the world.”
IBM is positioned to pull this off thanks to its broad portfolio and its technology differentiation, according to Walsh and Eliuk. “And I think it’s moved from ‘trust me’ … to ‘show me,’” Eliuk said. “We’re able to show it now because we’re eating what we’re producing. So we’re showing. The cognitive blueprint — we’re using that effectively inside the organization.”
Watch the complete video interview below, and be sure to check out more of SiliconANGLE’s and theCUBE’s coverage of the IBM Chief Data Officer Strategy Summit. (* Disclosure: TheCUBE is a paid media partner for the IBM Chief Data Officer Strategy Summit. Neither IBM, the event sponsor, nor other sponsors have editorial control over content on theCUBE or SiliconANGLE.)
Photo: SiliconANGLE