

AI in healthcare delivery is already showing considerable promise. However, with higher stakes than ever, concerns mount over data privacy, regulatory compliance, explainability and the human element in artificial intelligence-driven decision-making.
Google Cloud’s Aashima Gupta and Cognizant’s Ramaswamy Rajagopal talk with theCUBE about how AI in healthcare is transforming patient care, enhancing data security and reducing administrative burdens.
To assuage these concerns, enterprises must adopt and implement self-regulation, with internal best practices in effect ahead of external regulations, according to Aashima Gupta (pictured, left), global director of healthcare solutions at Google Cloud.
“Enterprise-grade privacy security is table stakes to build that trust,” she said. “From the regulation perspective, self-regulation is a necessity. Depending on the use cases, if you are building AI that touches patients [and] clinical diagnostics, we believe regulation will be very important. Our stance is that AI is too important to not be regulated.”
Gupta and Ramaswamy Rajagopal (right), vice president of healthcare strategy at Cognizant Technology Solutions Corp., spoke with theCUBE’s Rebecca Knight at theCUBE’s Coverage of Google Cloud at HIMSS25, during an exclusive broadcast on theCUBE, SiliconANGLE Media’s livestreaming studio. They discussed the need for responsible implementation of AI in healthcare. (* Disclosure below.)
Healthcare AI solutions rely heavily on data, making privacy and security paramount. AI models must be trained on accurate and diverse datasets because any shortcomings in data integrity can lead to erroneous outcomes, ultimately compromising patient safety. Furthermore, with regulations continuously evolving, organizations must be proactive rather than reactive in adapting their AI strategies to meet compliance standards, according to Rajagopal.
“Starting with data, it’s important for any of these solutions to have the right datasets to support model training and things like that,” he said. “If you don’t get your data right, you get into a lot of complexity when you start testing it and rolling it out. The second most important thing is regulation. The regulation continues to evolve, and you must prepare yourself for today and tomorrow.”
Healthcare operates in one of the most strictly regulated environments. Balancing the need for AI-driven efficiency with stringent privacy laws such as the Health Insurance Portability and Accountability Act is no easy feat. AI models that impact patient care must adhere to legal requirements to prevent unintended consequences, according to Gupta.
“I’m the industry co-chair of the Coalition for Health AI, and we have built a concept called a model card,” she said. “It’s like when you have a nutrition label on a yogurt, you see the ingredients on it. You need to be much more open and transparent about [data usage], and thousands of health systems across the globe are adopting that as a standard.”
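A model card works much like that nutrition label: a structured document that discloses what a model is for, what data it was trained on and where it should not be used. As a rough illustration only — the field names below are hypothetical examples, not the Coalition for Health AI’s actual schema — a minimal model card might be sketched in code like this:

```python
# Illustrative sketch of a minimal, hypothetical model card structure.
# Field names are examples for explanation, not CHAI's published format.
from dataclasses import dataclass, field
from typing import List

@dataclass
class ModelCard:
    model_name: str
    intended_use: str
    training_data_sources: List[str]
    evaluation_populations: List[str]
    known_limitations: List[str] = field(default_factory=list)

# Hypothetical example entry for a clinical decision-support model.
card = ModelCard(
    model_name="sepsis-risk-classifier",
    intended_use="Flag inpatients at elevated sepsis risk for clinician review",
    training_data_sources=["De-identified EHR records, 2018-2023"],
    evaluation_populations=["Adult inpatients across three U.S. health systems"],
    known_limitations=["Not validated for pediatric patients"],
)

print(card)
```

The point of such a card is not the format itself but the transparency it forces: anyone evaluating the model can see its ingredients before putting it in front of a patient.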
Organizations must anticipate future regulations by continuously evaluating their AI models for compliance. Companies should integrate change management strategies that ensure AI-driven decisions — such as prior authorizations — align with legal and ethical standards, according to Rajagopal.
“Don’t wait for somebody to tell you what to do, because one of the key aspects of how you make this work is through effective change management and communication and things like that,” he said. “You don’t want to be rejecting a prior authorization for some patient and then it bounces back into a problem by itself.”
Despite AI’s ability to process vast amounts of data, human involvement remains critical, especially in healthcare, where empathy plays a fundamental role. AI should support, not replace, human decision-making. Training AI models to incorporate empathetic elements is essential, but the final touch of human compassion must always be present, according to Rajagopal.
“When you’re building these tool sets to support those decision-making processes, you need to be able to get a little bit of an empathetic view alongside the scientific one, as well, especially [in the case of] a pediatric patient,” he said.
Here’s the complete video interview, part of SiliconANGLE’s and theCUBE’s Coverage of Google Cloud at HIMSS25:
(* Disclosure: TheCUBE is a paid media partner for theCUBE’s Coverage of Google Cloud at HIMSS25. Neither Google LLC, the sponsor of theCUBE’s event coverage, nor other sponsors have editorial control over content on theCUBE or SiliconANGLE.)