UPDATED 15:33 EST / JANUARY 21 2026

Dan Diasio, global consulting AI leader at EY, and Hyong Kim, global and Americas TMT industry leader at EY, talk to theCUBE about AI-native transformation during CES 2026.

EY on why AI-native operating models are replacing bolt-on experiments

Ernst & Young Global Ltd. is seeing a clear split emerge as artificial intelligence moves deeper into the enterprise, with organizations either bolting AI onto existing processes or committing to an AI-native rethink of how work and decisions get done.

Instead of layering intelligence onto legacy workflows, companies are beginning to pull those workflows apart and rebuild them around outcomes, data and human judgment. That shift reflects what EY is encountering with clients as they move past scattered use cases and focus on operating models that can support AI at scale, according to Dan Diasio (pictured, left), EY global consulting AI leader.

“Over the last six months, we’re seeing a pretty dramatic pivot away from that idea that we can just add AI to the way that we do our work today,” Diasio said. “Instead, we need to really drop the 56 steps that we did in the past of how we did this work and start with a clean sheet of paper because AI fundamentally changes that operating system.”

Diasio and Hyong Kim (right), EY global and Americas TMT industry leader, spoke with Savannah Peterson and Rob Strechay at CES 2026, during an exclusive broadcast on theCUBE, SiliconANGLE Media’s livestreaming studio. They explored how organizations are shifting from bolt-on AI experiments to AI-native operating models that reshape workflows, roles and enterprise value creation. (* Disclosure below.)

AI-native requires rethinking how work gets done

Moving toward AI-native operations starts with abandoning the instinct to retrofit intelligence into legacy processes. Many organizations spent early cycles identifying hundreds of use cases, only to realize they were recreating the same workflows with AI bolted into individual steps. That approach often produced marginal gains but rarely unlocked meaningful transformation, Diasio explained.

“There was a lot of fascination of thinking about different ways to use this technology to be able to solve problems that we have today,” he said. “A lot of our clients were starting their exercise by identifying, what are all the ways that I can use this technology to be able to solve problems for me? That manifested itself in this definition of a use case, and they would get hundreds of different use cases. When you start to string those together, you’re actually doing the work the exact same way, you’re just bolting AI into a bunch of different steps and then hoping that you get some big, significant return on investment from that.”

True AI-native design requires stepping back and redefining outcomes rather than optimizing tasks. That mindset shift can be uncomfortable, especially for teams deeply familiar with how work has always been done. Expertise, paradoxically, can become a constraint when it anchors organizations to outdated operating assumptions, according to Kim.

“I think a piece to that is it’s almost human nature or you almost have to be anti-human nature of getting out of your comfort zone,” he said. “It’s no longer steps one through 10. It could be steps one through five plus this and that. Getting out of your comfort zone to reimagine how things need to happen is really what AI and agentic AI is about.”

As organizations rethink workflows, attention is increasingly shifting toward people and roles rather than just efficiency. AI is automating pieces of work, not entire jobs, which creates anxiety when framed as task replacement rather than role evolution. Reframing that narrative is central to adoption, Diasio noted.

“What AI is doing right now is automating a couple of these tasks,” he said. “We’ve created something inside of our organization that we call a Value Blueprint. There’s an effort to figure out how to train people to be able to occupy that job, but then people feel empowered that they see where their role is in the future and it’s not just something that is slowly withering and eroding away.”

Here’s the complete video interview, part of SiliconANGLE’s and theCUBE’s coverage of CES 2026:

(* Disclosure: EY sponsored this segment of theCUBE. Neither EY nor other sponsors have editorial control over content on theCUBE or SiliconANGLE.)

Photo: SiliconANGLE
