UPDATED 09:35 EDT / MAY 06 2014

The Data Economy: White House report shines light on potential Big Data discrimination

A White House report released last week extols the benefits of Big Data analytics, including its potential to “make possible unexpected discoveries, innovations, and advancements in our quality of life.”

But the report’s authors, who include John Podesta, Counselor to the President and former White House Chief of Staff under President Bill Clinton, also rightly warn against Big Data’s potential use (or misuse) to further racial and economic discrimination. As stated in the report:

An important conclusion of this study is that big data technologies can cause societal harms beyond damages to privacy, such as discrimination against individuals and groups. This discrimination can be the inadvertent outcome of the way big data technologies are structured and used. It can also be the result of intent to prey on vulnerable classes.

The report cites the example of the City of Boston’s Street Bump mobile app. When downloaded onto a user’s smartphone, the mobile app senses when a driver hits a pothole and sends the information to the city’s public works department. During a pilot project with city workers, the city discovered a potential problem. As the report states:

“Because the poor and the elderly are less likely to carry smartphones or download the Street Bump app, its release could have the effect of systematically directing city services to wealthier neighborhoods populated by smartphone owners.”

Catching bias


Street Bump is a good example of the potential unintended consequences of Big Data analytics. Boston officials did not intend the mobile app to serve just the city’s wealthier citizens and, to their credit, they discovered this inherent bias before launching the app. Unfortunately, not all Big Data analytics practitioners are as diligent as Boston’s officials. And even when they are, the complexity and sheer volume of Big Data analytics workloads taking place at any given company make it increasingly difficult to spot potential discriminatory practices.
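To make the Street Bump lesson concrete, here is a minimal sketch of the kind of sampling-bias check a city could run before launch: compare per-neighborhood report rates against median income and flag a strong correlation. All column names and figures below are hypothetical, for illustration only.

```python
import pandas as pd

# Per-neighborhood data; the names and numbers are made up for illustration.
neighborhoods = pd.DataFrame({
    "neighborhood": ["A", "B", "C", "D"],
    "median_income": [32_000, 48_000, 71_000, 95_000],  # USD
    "road_miles": [120, 95, 80, 60],
    "app_reports": [40, 85, 160, 210],  # pothole reports from the app
})

# Normalize by road mileage so large neighborhoods don't dominate.
neighborhoods["reports_per_mile"] = (
    neighborhoods["app_reports"] / neighborhoods["road_miles"]
)

# A strong positive correlation between income and report rate suggests the
# app is over-sampling wealthier areas rather than measuring road condition.
corr = neighborhoods["median_income"].corr(neighborhoods["reports_per_mile"])
print(f"Income vs. reports-per-mile correlation: {corr:.2f}")
if corr > 0.5:
    print("Warning: reports skew toward wealthier neighborhoods; "
          "supplement with systematic surveys or city-vehicle data.")
```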

Consider a large enterprise, such as a multinational bank or consumer products company, with thousands of customer-facing interactions powered by Big Data analytics taking place every day. Identifying those that might abet discriminatory practices based on unintended biases in the data at hand or the algorithms run on the data is not a trivial task. It requires not just the technology and manpower necessary to identify such discriminatory practices, but the mindset and will to do so as well.

Consider the following. One of the easiest-to-understand Big Data analytics use cases involves reducing churn among high-value customers. Based on Big Data analysis, a retailer might, for example, offer the same product at a lower price to a high-value customer (i.e., one who spends more than average) at risk of churning than to a low-value customer at the same risk. This makes perfect business sense from the perspective of the retailer, who understandably wants to retain high-value customers and is less concerned with retaining low-value ones.
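As a concrete illustration, the targeting logic in such a retention program often reduces to a simple rule applied to scores produced by upstream value and churn models. The field names, thresholds, and figures in this sketch are entirely hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Customer:
    customer_id: str
    annual_spend: float  # output of an upstream customer-value model
    churn_risk: float    # output of an upstream churn model, 0.0 to 1.0

AVG_ANNUAL_SPEND = 1_200.0  # hypothetical portfolio average
CHURN_THRESHOLD = 0.7       # hypothetical "at risk" cutoff

def retention_discount(customer: Customer) -> float:
    """Return the discount rate offered to keep an at-risk customer."""
    if customer.churn_risk < CHURN_THRESHOLD:
        return 0.0  # not at risk: no offer
    # High-value at-risk customers get a deeper discount than low-value ones.
    return 0.15 if customer.annual_spend > AVG_ANNUAL_SPEND else 0.05

print(retention_discount(Customer("c1", 2_500.0, 0.85)))  # 0.15
print(retention_discount(Customer("c2", 600.0, 0.85)))    # 0.05
```

Note that the rule never mentions race, sex, or class. The risk is that annual spend (or the churn score itself) correlates with one of them, in which case a facially neutral rule quietly becomes tiered pricing along a protected line.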

Protecting the people


But if such a program (unintentionally or not) results in price discrimination based on race, sex, class or some other factor proscribed by law, the retailer could (and should) face legal consequences. It is the retailer’s responsibility to identify such discriminatory practices and put a stop to them even if they are the result of just one of thousands of Big Data analytics projects.

Of course, government regulators tasked with protecting the public interest also have a role to play. As the report recommends:

The federal government’s lead civil rights and consumer protection agencies, including the Department of Justice, the Federal Trade Commission, the Consumer Financial Protection Bureau, and the Equal Employment Opportunity Commission, should expand their technical expertise to be able to identify practices and outcomes facilitated by big data analytics that have a discriminatory impact on protected classes, and develop a plan for investigating and resolving violations of law in such cases. In assessing the potential concerns to address, the agencies may consider the classes of data, contexts of collection, and segments of the population that warrant particular attention, including for example genomic information or information about people with disabilities.

As for Big Data analytics practitioners, all new Big Data analytics projects (particularly those that affect customer-facing interactions) should include a proof-of-concept (PoC) phase designed specifically to test for inherent biases and unintended discriminatory practices. In most cases, the same analytics technologies used in such projects can be turned inward for this purpose. Further, enterprises with extensive Big Data practices should engage their internal legal departments to vet new projects for potentially discriminatory outcomes.
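One simple starting point for such a PoC bias check is the four-fifths (80 percent) rule long used in employment-selection guidelines: flag any group whose rate of favorable outcomes falls below 80 percent of the best-off group’s rate. Here is a minimal sketch, with hypothetical group labels and outcome data.

```python
from collections import Counter

# (group, received_offer) pairs as produced by a targeting pipeline;
# the groups and outcomes here are made up for illustration.
outcomes = [
    ("group_a", True), ("group_a", True), ("group_a", True), ("group_a", False),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", False),
]

totals = Counter(group for group, _ in outcomes)
favorable = Counter(group for group, got_offer in outcomes if got_offer)
rates = {group: favorable[group] / totals[group] for group in totals}

best_rate = max(rates.values())
for group, rate in sorted(rates.items()):
    ratio = rate / best_rate
    status = "POTENTIAL ADVERSE IMPACT" if ratio < 0.8 else "ok"
    print(f"{group}: favorable rate {rate:.0%}, ratio to best {ratio:.2f} -> {status}")
```

A ratio below 0.8 is not proof of unlawful discrimination, but it is exactly the kind of signal that should trigger the legal review described above.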

Big Data is indeed full of potential benefits, but the White House report is right to point out the risks of discriminatory practices as well.

photo credit: rbbaird via photopin cc
