People Clash over Govt. Use of “Pre-Cog” Data Tip-Offs

Minority Report is an oft-cited movie in discussions of emerging technology like touch screens and robots.  But what if I told you that the government really has pre-cognitive technology, sans the pre-cogs?  Unrealistic?  Think again.

According to a report from the US Department of Homeland Security, the agency has an ongoing project that collects data, including “video images, audio recordings, cardiovascular signals, pheromones, electrodermal activity, and respiratory measurements,” from test subjects, among them some DHS employees and volunteer civilians, to tip off law enforcement if a person is about to commit a crime.  The project is led by Robert Middleton Jr.

“The Science and Technology (S&T) Directorate Human Factors/Behavioral Sciences Division (HFD) Future Attribute Screening Technology (FAST) project is an initiative to develop a prototype screening facility containing a suite of real-time, non-invasive sensor technologies to detect cues indicative of mal-intent (the intent or desire to cause harm) rapidly, reliably, and remotely. The system will measure both physiological and behavioral signals to make probabilistic assessments of mal-intent based on sensor outputs and advanced fusion algorithms and measure indicators using culturally neutral and non-invasive sensors. FAST uses an established independent peer-review process to ensure objectivity and thoroughness in addressing all aspects of the program.”

EPIC, a public interest research center in Washington, D.C., established in 1994 to focus public attention on emerging civil liberties issues and to protect privacy, the First Amendment, and constitutional values, filed a Freedom of Information Act request asking the DHS for copies of the FAST Project Privacy Threshold Analysis and the record of that analysis’ review and approval by the DHS Privacy Officer.

The DHS obliged and gave EPIC copies of the requested documents, but much of the information had been redacted.  This only led the public to question DHS’ motives, and concern over public security and privacy rose.  The DHS assures the public that if this project is ever deployed, it will be used at airport checkpoints, border crossings, seaports, or major sporting events.  And in compliance with all ethical and regulatory requirements, aside from obtaining informed consent from all test subjects, researchers would also collect demographic information such as age, gender, occupation, and ethnicity; medical information; current medications; and substances used in the week prior to the start of the test, including caffeine, alcohol, tobacco, and others.

An article from CNET contains contradictory answers from DHS informing the public that FAST “does not store any personally-identifiable information (PII) from participants once the experiment is completed. The system is not designed to capture or store PII. Any information that is gathered is stored under an anonymous identifier and is only available to DHS as aggregated performance data. It is only used for laboratory protocol as we are doing research and development. It is gathered when people sign up as volunteers, not by the FAST system. If it were ever to be deployed, there would be no PII captured from people going through the system.” But the DHS Privacy Office said in a statement that FAST “is a privacy sensitive system,” which DHS defines as “any system that collects, uses, disseminates, or maintains” personally-identifiable information.

Even before pre-cogs begin their tenure with the government, issues arise over the legal use of data.  It’s a matter we should probably sort out before the psychics lend their ability to analyze future data, since a large pool of anonymous data can only be created by amassing and cleaning individual data.  It’s a predicament that has landed Google in hot water with the EU over its Maps service, and sent both Google and Apple before authorities to defend their use of location-based tracking.  And the issue will only gain attention as the government finds more use cases for the data collective we’re building across different locations, disciplines, and activities.  Big data analysis enables a unification of sorts for all the disparate data currently out there, promising to make sense of the big picture for the greater good.  The question remains, however: what compromises will we have to make as a people for that greater good?