UPDATED 13:46 EST / MARCH 04 2020

AI

Diverse teams help build less biased algorithms, says Stanford researcher

As powerful as the benefits of artificial intelligence are, using biased data and defective AI models can cause a lot of damage.

To address that growing issue, human values must be integrated into the entire data science process, according to Lucy Bernholz (pictured), senior research scholar and director of the Digital Civil Society Lab at Stanford University.

“[Values] shouldn’t be a separate topic of discussion,” she said. “We need this conversation about what we’re trying to build for, who we’re trying to protect, how we’re trying to recognize individual human agency, and that has to be built in throughout data science.”

Bernholz spoke with Sonia Tagare, host of theCUBE, SiliconANGLE Media’s mobile livestreaming studio, during the Women in Data Science conference in Stanford, California. They discussed the importance of values in data science, why diverse teams are necessary to build and analyze algorithms, and the work being done by the Digital Civil Society Lab.

Breaking the bias cycle

All data is biased because it is collected by people, according to Bernholz. “And we’re building the biases into the data science and then exporting those tools into bias[ed] systems,” she highlighted. “And guess what? Problems are getting worse. So, let’s stop doing that.”

When creating algorithms and analyzing them, data scientists need to make sure that they are considering all the different types of people in the data set and understanding those people in context, Bernholz explained.

“We know perfectly well that women of color face a different environment than white men; they don’t walk through the world in the same way,” she explained. “And it’s ridiculous to assume that your shopping algorithm isn’t going to affect that difference that they experience in the real world.”

Diverse teams of people are also needed to create the algorithms, as well as to manage the companies that decide whether and how to use them, she added.

“We need a different set of teaching mechanisms where people are actually trained to consider from the beginning what’s the intended positive, what’s the intended negative, and what [are] some likely negatives, and then decide how far they go down that path,” Bernholz concluded.

Here’s the complete video interview, part of SiliconANGLE’s and theCUBE’s coverage of the Women in Data Science conference:

Photo: SiliconANGLE
