The right to deceive: Do sensor trackers just make us better liars?


Sensor data tells on us, detailing our behaviors in the form of digital evidence. In today’s data-hype economy, digital evidence is the currency of truth, spent on real-time marketing and product research. It’s all to the benefit of big business, where demand for unfiltered truth in data is changing the very nature of the transaction. The Digital Transformation is cultivating a most intimate coexistence for company and consumer, where users lay vulnerable to algorithmic decisions regarding the services and rates they receive.

Unprecedented access to sensor data, alongside technological advances to store, transfer and analyze that data, has crafted a modern economy for usage-based business models. From co-op property insurance to personal car rides, a business’ ability to closely track consumer behavior means customers can scale the quality of their service with the price. The very concept of “you get what you pay for” has been virtualized, requiring a level of propinquity never before attempted between business and consumer.

As consumer data now influences countless business decisions, is it to consumers’ benefit to attempt gaming the system, or even avoiding the system altogether? Is the business world’s quest for truth merely encouraging consumers to get better at lying in the market surveys we call Instagram and Pinterest, and do consumers have a right to deceive, lest they incriminate themselves?

Truth: the double-edged sword

What happens when the truth backfires? I recently rolled my SUV into a large hedge, attempting a three-point turn on a dark and very narrow New Orleans street. The truth, in this case, cost me my safe driver discount, resulting in a significant rate hike. Including the deductible, the total earned by the insurance company is nearly double the price of simply paying for car repairs out of pocket. Had I lied and told the insurance company that the evening’s storm had dropped a tree branch on my car, the fault would not have been mine and my insurance rate would not have been hiked by 104 percent. The most damning evidence against me was the incidental truth, tarnishing the insurance company’s perception of my driving abilities.

While the insurance company is surely happy for my honesty, I’m a relatively rare case. A recent study found that over half of drivers lie on car insurance applications, both intentionally and unintentionally. Insurers have been in the business long enough to expect inaccurate data from consumers, designing systems of algorithms and human underwriters to check centralized databases for things like medical history and driving records.

Jing Ai, a professor of finance at the University of Hawaii-Manoa, and his team use statistical techniques to enhance insurance companies’ fraud detection software. These techniques “determine what factors contribute to fraud and how they contribute,” he explained in a NerdWallet article. “There are usually indicators, for example, whether there’s a police report or whether it happened at night. One factor doesn’t mean it’s fraudulent, but an incident that does have a police report is just a little bit less suspicious.”
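The indicator-based approach Ai describes can be sketched as a simple weighted scoring model. This is an illustration only: the factor names, weights and review threshold below are hypothetical, not taken from any real underwriting system.

```python
# Illustrative fraud-indicator scoring, loosely modeled on the kinds of
# factors mentioned above. All weights and thresholds are hypothetical.

INDICATOR_WEIGHTS = {
    "no_police_report": 0.30,            # no police report -> more suspicious
    "occurred_at_night": 0.15,
    "claim_soon_after_policy_start": 0.25,
    "prior_claims_history": 0.20,
}

def fraud_score(claim: dict) -> float:
    """Sum the weights of indicators present in the claim (0.0 = no flags)."""
    return sum(w for ind, w in INDICATOR_WEIGHTS.items() if claim.get(ind))

def needs_review(claim: dict, threshold: float = 0.4) -> bool:
    """Route high-scoring claims to a human underwriter."""
    return fraud_score(claim) >= threshold

claim = {"no_police_report": True, "occurred_at_night": True}
print(round(fraud_score(claim), 2))  # combined weight of the flagged indicators
print(needs_review(claim))
```

As Ai notes, no single factor proves fraud; a scheme like this only accumulates suspicion until a human is asked to look closer.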

So while insurance companies protect us from the unknown fates of accidents, floods and fires, they must at the same time protect themselves from the unpredictability of man’s free will. Yet even advanced data analytics can’t always foresee future outcomes with precision, maintaining a certain margin of error when it comes to calculating rate plans for the deceitful man. He’s a moving target. Effectively, algorithms can only take a business so far, and the usefulness of data sets can vary over time. For instance, recent changes in climate have skewed newer data, minimizing the predictive capabilities of historic data.

The right to deceive

Do consumers have a right to deceive, by way of not providing such a complete picture of their behavior and preferences? Could the act of algorithmic profiling violate a consumer’s freedom to not incriminate oneself? Originally, self-incrimination rights emerged as protection against confessions or statements elicited through torture tactics, in hopes of actually preserving the truth. Should similar rights be extended to digital disclosures in an effort to preserve truth in business data?

When it comes to life insurance, deceit is considered a fraudulent act, and insurance providers can deny a claim as a result. Indeed, a consumer can avoid self-incrimination all they want; they just can’t get caught in a lie. Similar to the U.S. judicial system, the responsibility rests with the plaintiff, or the business in this case, to gather evidence of fraud. The expectation of inaccurate data all but justifies today’s digital economy built on sensor data, web crawlers and centralized databases to fact-check consumer claims.

Thus a system of reward and punishment has emerged for many usage-based rate plans, particularly those dealing with liabilities. Companies possess an array of incentives for consumers willing to share more data (like waiving HIPAA rights in exchange for drug store discounts), and businesses may be well aware that the mere act of observing a consumer can affect their behavior. Studies show that people are less likely to lie when they know their words or actions are being recorded or put into writing. When undeniable evidence (a paper trail) is added to the equation, the benefits of lying are diminished.

Yet misguided incentives can have their own effect on human nature, and that is to encourage more and better lies. In some instances, the very algorithms designed to prevent fraud can fail, opening up opportunities for consumers to game the system. Businesses face real danger when rewards are misplaced, as Wells Fargo has so painfully learned after two million unauthorized bank accounts were created by employees incentivized to hit high sales targets.

The result is a vicious cycle of attempted behavior modification, as businesses react to consumer deceit and consumers react to changes in their services. Perhaps most frequently witnessed on the very social media platforms used to gather and distribute advertisement data, this business/consumer cycle has been forced to deal with spam, trolling and the disingenuous use of site features. Pinterest users, for example, have been known to spend hours pruning their “Follow” user subscriptions on the site, in hopes of tricking Pinterest algorithms into showing better recommended content on the homepage. Such actions not only skew Pinterest’s monetizable data, but force marketing-driven principles onto users who must prioritize content over camaraderie. Fortunately for users, Pinterest has since added an option to turn off recommended content in feeds.

A more disturbing instance of consumer deceit comes from AirBnB, a service that connects vacationers with residential rentals across the globe. A growing number of African American vacationers are reporting a need to omit details that may hint at their race, for the very real fear their rental applications will be wrongfully rejected. Consequently, a handful of alternatives are out to shrink AirBnB’s market share, all because the business-consumer data exchange has been severely compromised.

Good intentions

Genuine data exchange, initiated by the business, is a step towards putting consumer minds at ease and curbing the economic impact of deceit. The “why” matters when it comes to data collection, and this is where a business can encourage truthful data from consumers in order to make informed decisions that could ultimately save both parties even more money on usage-based rate plans.

Privacy by Design

Privacy by Design is one method to this end, initiating a consumer-first stance on data collection. Its principles will become a standard for social marketing, according to DataSift Inc. CEO Tim Barker. While it can’t be denied that consumers are now more willing to share personal information in exchange for commercial perks, they want brands to first ask for permission. Barker explained that to build trust, marketers should lead with privacy and ask consumers for input.

“Companies must handle personal data responsibly, that is a given. But they should also communicate and be transparent about why, what and how data is collected and used. Organizations should inform customers — and invite feedback about — the applications of personal data and explain the processes in place for privacy management,” Barker wrote in a statement to SiliconANGLE.

Skip the consumer

Consumers may also find themselves increasingly removed from the equation, reaping the benefits of consensual, indirect data collection without giving up so much privacy.

“Direct human input not only could, but should be eliminated altogether,” stated Nokia Growth Partners’ managing partner, Paul Asel. Commenting on the auto insurance sector’s eagerness to dispatch car-tracking sensors, he continued, “The key barrier … is less gathering accurate data than the ability to derive insight from relevant data ingested from multiple sources.”

This industry-wide problem has prompted demand for tools that contextualize disparate data, and the residential home, of all places, presents one of the most compelling cases for circumventing direct consumer interaction. ROC-Connect Inc.’s smart-home-as-a-service platform sees sensors for systems like water pipes as a way to protect home insurance providers and homeowners alike, by automating certain aspects of home monitoring and maintenance.

“Insurance companies aren’t always popular, but they don’t realize how much their brand could logically stretch into the Internet of Things,” explained Kevin Meagher, the SVP of business development at ROC-Connect. “Would you consider getting a smart home service if your property insurance provided a discount?”

“Big losses come from fire and flood, so what the insurance companies really want you to have is connected water shut off valves and fire detectors,” he went on. “How can insurance incentivize homeowners? If you had leak detectors, I believe you’ll start to see significant discounts, up to 20 percent. This is so easy to do, there’s no way this is not going to happen. It won’t be long before every fire sensor in the country will be linked directly to the fire station.”
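The kind of automated response Meagher describes amounts to a simple sensor-to-actuator loop. The sketch below is hypothetical: the class and function names are illustrative and not ROC-Connect’s actual API.

```python
# Hypothetical smart-home leak response: when a leak sensor trips, shut the
# water valve and record an event the insurer could later verify.
# Names and event shapes are illustrative only.

class WaterValve:
    def __init__(self):
        self.open = True

    def shut_off(self):
        self.open = False

def handle_sensor_event(event: dict, valve: WaterValve, log: list):
    """React to a sensor reading; only leak events trigger the shutoff."""
    if event.get("type") == "leak_detected":
        valve.shut_off()  # stop the water at the source
        log.append(f"leak at {event['location']}: valve closed")

valve, log = WaterValve(), []
handle_sensor_event({"type": "leak_detected", "location": "kitchen"}, valve, log)
print(valve.open)  # the valve is now closed
print(log)
```

The log is the interesting part for insurers: an automated, timestamped record of mitigation is exactly the kind of digital evidence that could justify the discounts Meagher predicts.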

More connected things, more business models

This incorporation of more endpoint devices will also mean more “things” are assigned costs in the business world, creating a dynamic and ever-changing IoT ecosystem. Today’s businesses will want flexible technology that responds in real time to IoT’s dynamic nature, with increased competition and alternative pricing models helping to balance data collection demands against services rendered.

“The right IoT monetization platform should be able to handle multiple packaging and pricing scenarios,” according to the findings of a recent THINKstrategies report. “These scenarios include subscription service and consumption/usage-based pricing schemes across a highly diverse population of product endpoints that can be activated and even changed rapidly based on various triggers.”
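The mixed subscription-plus-consumption pricing the report describes can be sketched in a few lines. The plan name, rates and allowance below are hypothetical, chosen only to show how the two pricing components combine.

```python
# Minimal sketch of a combined subscription + usage-based pricing scheme
# for an IoT endpoint, per the scenarios described above. All figures are
# hypothetical.
from dataclasses import dataclass

@dataclass
class Plan:
    base_fee: float       # flat monthly subscription component
    included_units: int   # usage covered by the base fee
    overage_rate: float   # price per unit beyond the allowance

def monthly_charge(plan: Plan, units_used: int) -> float:
    """Base fee plus per-unit charges for usage beyond the allowance."""
    overage = max(0, units_used - plan.included_units)
    return plan.base_fee + overage * plan.overage_rate

leak_sensor_plan = Plan(base_fee=5.00, included_units=1000, overage_rate=0.01)
print(round(monthly_charge(leak_sensor_plan, 1500), 2))  # base fee + 500 overage units
```

Swapping the numbers in `Plan` is all it takes to repackage the same endpoint under a different trigger or tier, which is the flexibility the report is calling for.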

Enabled by data, dynamic business models could ultimately give consumers more control over the type of data gathered, as well as when and where it’s collected. And there are companies out there that believe in genuine data exchange between businesses and consumers. Neura, for instance, employs an agile algorithm to study individual behavior without compromising users’ privacy. Acting as a digital bouncer, Neura opens and closes the door on data transactions as the consumer sees fit.

Calling out IoT companies for being too focused on building machines instead of experiences, Neura Inc. CEO Gilad Meiri said his company “strengthens the experience so a fitness tracker can now show cause and effect, and say ‘When you’re sleep deprived, jet-lagged or sleepy trying to drive, this [result] happens.’

“I wouldn’t say we’re replacing communication between humans and machines, but IoT today needs scripting. My grandmother isn’t a Silicon Valley engineer — she’s not going to script [the app]. But she will happily consume the result — the oven can tell her she left it on as she leaves the house.”

Specialized data sets provide another example of fresh business opportunities, optimizing vanity pricing by limiting the variables within the user pool. By focusing data collection on only the sources needed, companies can in fact become less discriminatory in their practices and rates, relying on actual data instead of generalized linear models.

It’s a market disruption we haven’t witnessed since the onset of the credit system, and the business world is long overdue for change, according to one entrepreneur. Root Insurance Company founder Alex Timm thinks data can solve the issues of discriminatory generalizations, where gaps in consumer data prevent a truly unbiased service rate. His is the first new insurance company America’s seen in a decade, servicing only good drivers.

“Root is trying to restore fairness by taking a data-driven approach to risk assessment, matching your rate to how you actually drive,” Timm explained in a recent interview. “To have many variables today is far less important [than the credit era]. When I have real data, I don’t have to use gender as a factor. Just because you have a low credit score doesn’t mean you’re a bad driver.”

“The algorithm itself isn’t innately biased – there’s more potential for bias if you’re sending [insurance] applications to a human being. Algorithms put everyone on an equal playing field. Just drive, and we’ll see how good of a driver you are. I’m sure there will be pieces of information consumers will be very protective of, but what is most important is that companies are up front, and to not be sneaky about it.”
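The “just drive, and we’ll see” approach can be sketched as a scoring loop over telematics events. To be clear, this is not Root’s actual model: the event names, penalties and rate multipliers below are invented for illustration.

```python
# Hypothetical telematics scoring: translate logged driving events into a
# usage-based rate multiplier. All event names, penalties and rates are
# illustrative only, not Root's actual model.
BASE_MONTHLY_RATE = 100.0

EVENT_PENALTIES = {
    "hard_brake": 2.0,
    "rapid_acceleration": 1.5,
    "phone_use_while_driving": 4.0,
}

def driving_score(events: list) -> float:
    """Start at 100 and subtract a penalty per logged event (floored at 0)."""
    score = 100.0 - sum(EVENT_PENALTIES.get(e, 0.0) for e in events)
    return max(0.0, score)

def monthly_rate(events: list) -> float:
    """Scale the base rate: a perfect score pays 80%, a zero score 160%."""
    multiplier = 1.6 - 0.8 * (driving_score(events) / 100.0)
    return BASE_MONTHLY_RATE * multiplier

events = ["hard_brake", "hard_brake", "phone_use_while_driving"]
print(driving_score(events))  # 92.0
```

Note what is absent from the model: gender, credit score, ZIP code. That absence is exactly Timm’s argument that observed behavior can displace demographic proxies.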

Do unto others

The golden rule holds different meanings for business and consumer, each party reacting to the actions of the other. Nevertheless, businesses need more data to create a complete picture of consumers and offer them better, more attractive services. And on the flipside, consumers need to provide more data to revel in the flexible customizability of usage-based service costs.  

Such a self-perpetuating cycle characterizes today’s circuit of data exchange, and business methods must find new ways to coexist with consumer principles. Service providers must remember man’s nature when training their algorithms, as these increasingly intelligent systems have the potential to impact much more than the bottom line.

The truth, even in data, is subjective, but a data-rich economy has room enough for every version of the truth.

Feature image by ChrisyJewell via photopin cc