Many businesses today find themselves locked in an arms race with competitors to see who can convert customer secrets into the most pennies. To try to win, they are building perfect digital dossiers, to use a phrase coined by Daniel Solove, massive data stores containing hundreds, if not thousands or tens of thousands, of facts about every member of our society. In my work, I've argued that these databases will grow to connect every individual to at least one closely guarded secret. This might be a secret about a medical condition, family history, or personal preference. It is a secret that, if revealed, would cause more than embarrassment or shame; it would lead to serious, concrete, devastating harm. And these companies are combining their data stores, which will give rise to a single, massive database. I call this the Database of Ruin. Once we have created this database, it is unlikely we will ever be able to tear it apart.
I have become convinced that my earlier, bleak predictions about the Database of Ruin were in fact understated, made before it was clear how Big Data would accelerate the problem. Consider the most famous recent example of big data's utility in invading personal privacy: Target's analytics team can determine which shoppers are pregnant, and even predict their delivery dates, by detecting subtle shifts in purchasing habits. This is only one of countless similarly invasive Big Data efforts being pursued. In the absence of intervention, companies will soon know things about us that we do not even know about ourselves. This is the exciting possibility of Big Data, but for privacy, it is a recipe for disaster.
If we stick to our current path, the Database of Ruin will become an inevitable fixture of our future landscape, one littered with lives ruined by the exploitation of data assembled for profit. But we can chart a different course. I think our brightest engineers can develop innovative privacy-enhancing technologies that enable new techniques for data analytics while minimizing costs to privacy. I hope that public institutions and industry, through self-regulation, will devise ways to better balance the burdens on privacy against the benefits of Big Data. If nothing else, I anticipate that society will slowly develop new norms for engaging with the massive amount of information collected about us, creating informal rules governing when and how it is appropriate to release, collect, and use data, the way minors have learned to speak and listen carefully on social networks.
But every one of these correctives requires the same thing: time. We need to slow things down, to give our institutions, individuals, and processes the time they need to find new and better solutions. The only way we will buy this time is if companies learn to say "no" to some of the privacy-invading innovations they're pursuing. Executives should require those who work for them to justify new invasions of privacy against a heavy burden, weighing them not only against the financial upside, but also against the potential costs to individuals, society, and the firm's reputation. Companies should do this not only as a matter of good corporate social responsibility, but also because it will likely square with the government's recommendations for protecting privacy, which seem to advise caution and deliberation, under the banner of "context."
The FTC report offers three broad recommendations: Privacy by Design, Simplified Choice for Businesses and Consumers, and Greater Transparency. In discussing the second recommendation — a call for simplified and more transparent choice — the FTC suggests a carve-out: "Companies do not need to provide choice before collecting and using consumer data for practices that are consistent with the context of the transaction or the company's relationship with the consumer, or are required or specifically authorized by law." Under this standard, the FTC explains, it might be "consistent with the context" for a company in a direct business relationship with a customer to use that customer's information to deliver ads for its other services, but it might be inconsistent with the context — thus requiring notice and choice — to sell that information to third-party advertisers.
Similarly, the White House white paper defines a "Consumer Privacy Bill of Rights," which would protect, among other things, "Respect for Context." "Consumers have a right to expect that companies will collect, use, and disclose personal data in ways that are consistent with the context in which consumers provide the data," the paper explains.
These parallel pronouncements mean that companies that deal with personal information (meaning all companies, really) need to focus much more often than they have on the history of privacy practices in their industries. Although neither report defines in depth what it means by the word "context," to me the message seems to be: do not push the privacy envelope. Companies that use personal information in ways that go well beyond the practices of their competitors risk crossing the line from responsible steward to reckless abuser of consumer privacy.
The lesson is plain: compete vigorously and beat your competitors in every legitimate way, except when it comes to privacy invasion. Too many companies have learned this lesson the hard way, launching invasive new services that have triggered class action lawsuits, Congressional inquiries, and media firestorms. These companies knew that they were treading where others had feared to go. This may have felt like an exciting opportunity. It should have felt instead like perilous risk-taking, because it meant hurtling beyond the contextual borderlands defined by past practice.