Data privacy measures such as the General Data Protection Regulation and the California Consumer Privacy Act are expanding both the definition of sensitive personal data and the protections around it. Anonymization efforts, though valiant, can only go so far.
“You can only manage what you measure, right?” said Hannah Sperling (pictured), business process intelligence, academic and research alliances at SAP SE. “But if everybody is afraid to touch sensitive data, we might not get to where we want to be. I’ve been getting into data anonymization procedures, because if we could render more workforce data usable, especially when it comes to increasing diversity in STEM or in technology jobs, we should really be letting the data speak.”
Sperling spoke with Lisa Martin, host of theCUBE, SiliconANGLE Media’s livestreaming studio, during the Women in Data Science (WiDS) event. They discussed data anonymization and the inherent bias of human-generated analysis.
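Anonymization procedures of the kind Sperling describes typically combine pseudonymization of direct identifiers with generalization of quasi-identifiers before any analysis is run. The Python sketch below is a generic illustration of that idea on a toy workforce dataset; the field names, salt and k-anonymity check are assumptions made for the example, not SAP’s actual procedure.

```python
# Illustrative sketch only: a toy anonymization pass over workforce records,
# combining pseudonymization (hashing direct identifiers) with generalization
# (coarsening quasi-identifiers such as age). Field names and thresholds are
# hypothetical, for illustration.
import hashlib
from collections import Counter

records = [
    {"employee_id": "E1001", "age": 29, "gender": "F", "role": "Data Scientist"},
    {"employee_id": "E1002", "age": 31, "gender": "F", "role": "Data Scientist"},
    {"employee_id": "E1003", "age": 45, "gender": "M", "role": "Engineer"},
    {"employee_id": "E1004", "age": 38, "gender": "M", "role": "Engineer"},
]

def pseudonymize(value: str, salt: str = "demo-salt") -> str:
    """Replace a direct identifier with a salted hash so records remain
    linkable across analyses without exposing the original ID."""
    return hashlib.sha256((salt + value).encode()).hexdigest()[:12]

def generalize_age(age: int) -> str:
    """Coarsen an exact age into a 10-year band (a quasi-identifier)."""
    low = (age // 10) * 10
    return f"{low}-{low + 9}"

def anonymize(rows):
    """Drop direct identifiers, keep only pseudonymized and generalized fields."""
    return [
        {
            "pseudo_id": pseudonymize(r["employee_id"]),
            "age_band": generalize_age(r["age"]),
            "gender": r["gender"],
            "role": r["role"],
        }
        for r in rows
    ]

def k_anonymity(rows, quasi_identifiers=("age_band", "gender", "role")):
    """Smallest group size over the quasi-identifier combination:
    the higher k is, the harder it is to single out an individual."""
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in rows)
    return min(groups.values())

anon = anonymize(records)
print(anon)
print("k =", k_anonymity(anon))
```

With this toy data every quasi-identifier combination is unique (k = 1), which is exactly the case where further generalization or suppression would be needed before the data could safely be allowed to “speak.”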
Taking the human factor out of analysis is not only idealistic but also the wrong path, according to Sperling. Since analysis is inherently backward-looking, built on data that reflects past human decisions, she believes the better model is to recognize those biases and adjust for them.
“I’m sometimes amazed at how many people still seem to think that data can be unbiased,” Sperling said. “The sooner that we realize that we need to take into account certain biases, the closer we’re going to get to something that represents reality better and might help us to change reality for the better as well.”
Lack of diversity in data science has perpetuated bias in artificial intelligence decisions, from soap dispensers that only recognize light-colored skin to decisions on hiring, financial applications and parole approvals.
“There is a big trend around explainability, interpretability in AI worldwide because awareness around those topics is increasing,” Sperling explained. “That will show you the blind spots that you may have, no matter how much you think about the context. We need to get better at including everybody; otherwise you’re always going to have a certain selection bias.”
Here’s the complete video interview, part of SiliconANGLE’s and theCUBE’s coverage of the Women in Data Science (WiDS) event: