Lack of diversity in data science perpetuates AI bias

Data privacy measures such as the General Data Protection Regulation and the California Consumer Privacy Act are expanding both the definition of sensitive personal data and the protections around it. Anonymization efforts, though valiant, can only go so far.

“You can only manage what you measure, right?” said Hannah Sperling, business process intelligence, academic and research alliances at SAP SE. “But if everybody is afraid to touch sensitive data, we might not get to where we want to be. I’ve been getting into data anonymization procedures, because if we could render more workforce data usable, especially when it comes to increasing diversity in STEM or in technology jobs, we should really be letting the data speak.”
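Sperling did not describe a specific procedure, but the kind of anonymization she refers to typically combines pseudonymization of direct identifiers with generalization of quasi-identifiers. The Python sketch below illustrates that idea on a hypothetical workforce record; the field names, the salted hashing and the age bands are assumptions made for the example, not an SAP method.

import hashlib

# Illustrative only: a hypothetical workforce record, not a real SAP schema.
SALT = "rotate-and-store-separately"  # assumption: the salt is kept outside the analytics store

def anonymize(record):
    """Pseudonymize the direct identifier and generalize quasi-identifiers."""
    out = dict(record)
    # Replace the employee ID with a salted hash so records stay linkable
    # for analysis without exposing the raw identifier.
    out["employee_id"] = hashlib.sha256(
        (SALT + str(record["employee_id"])).encode()
    ).hexdigest()[:16]
    # Generalize exact age into a 10-year band to lower re-identification risk.
    decade = (record["age"] // 10) * 10
    out["age"] = f"{decade}-{decade + 9}"
    # Drop free-text fields, which tend to leak identity.
    out.pop("notes", None)
    return out

print(anonymize({"employee_id": 4711, "age": 34, "gender": "f", "notes": "..."}))

Even with steps like these, re-identification risk is reduced rather than eliminated, which is the caveat behind the point that anonymization can only go so far.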

Sperling spoke with Lisa Martin, host of theCUBE, SiliconANGLE Media’s livestreaming studio, during the Women in Data Science (WiDS) event. They discussed data anonymization and the inherent bias of human-generated analysis.

Complete objectivity is logically impossible

Taking the human factor out of analysis is not only unrealistic, it is the wrong path, according to Sperling. Because analysis is inherently backward-looking, built on data that records past human decisions, she believes the better model is to recognize the biases baked into that data and adjust for them.

“I’m sometimes amazed at how many people still seem to think that data can be unbiased,” Sperling said. “The sooner that we realize that we need to take into account certain biases, the closer we’re going to get to something that represents reality better and might help us to change reality for the better as well.” 

Lack of diversity in data science has perpetuated bias in artificial intelligence systems, from automatic soap dispensers that fail to detect darker skin to algorithmic decisions on hiring, financial applications and parole approvals.

“There is a big trend around explainability, interpretability in AI worldwide because awareness around those topics is increasing,” Sperling explained. “That will show you the blind spots that you may have, no matter how much you think about the context. We need to get better at including everybody; otherwise you’re always going to have a certain selection bias.”
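The interview does not cover how such blind spots are measured. One simple check, sketched below with made-up numbers, is to compare a model’s positive-decision rates across groups; a large gap in this demographic-parity sense is a signal of possible selection bias, not proof of it.

from collections import defaultdict

def approval_rates(decisions):
    """decisions: iterable of (group, approved) pairs; returns the approval rate per group."""
    counts = defaultdict(lambda: [0, 0])  # group -> [approved, total]
    for group, approved in decisions:
        counts[group][0] += int(approved)
        counts[group][1] += 1
    return {g: approved / total for g, (approved, total) in counts.items()}

# Hypothetical decisions for two groups, purely for illustration.
decisions = [("A", True), ("A", True), ("A", False),
             ("B", True), ("B", False), ("B", False)]
rates = approval_rates(decisions)
gap = max(rates.values()) - min(rates.values())
print(rates, f"parity gap = {gap:.2f}")  # a wide gap is a prompt to investigate, not a verdict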

Here’s the complete video interview, part of SiliconANGLE’s and theCUBE’s coverage of the Women in Data Science (WiDS) event:

