The Path to Protecting Privacy

Regulators continue to encourage privacy protection, but the public response is mostly "So what?" How should we defend privacy in the digital world?

Expanded and improved privacy protection regulations such as the GDPR and CCPA are being introduced in more liberal countries and states around the globe, driving widespread and often costly data governance initiatives. Is it worth the investment? Despite consistently high survey responses that privacy is important, the general public makes limited use of the rights granted by such legislation.

A comprehensive 2018 literature review of this privacy paradox confirmed that although privacy is important, "most users rarely make an effort to protect [personal] data actively and often even give it away voluntarily." The researchers suggested that users engage in a "privacy calculus" that trades disclosure of personal information for perceived benefits.

Privacy calculus is a form of rational choice theory in which people make decisions by identifying pros and cons, assessing their impacts and probability, calculating a score for each outcome, and choosing the mathematical winner. My 2016 Upside series, "How Do You Make Decisions?", explains why this theory is overly complex and an improbable account of how people actually decide. Privacy calculus seems equally unlikely.

To call the public's position on privacy a paradox is misleading, as is the suggestion that users give personal data away voluntarily. I believe a simpler explanation applies. Among the general public, personal utility and limited data literacy provide profitable ground for corporate data collection and use, as well as for state bodies with vested interests in data harvesting. Privacy does not appear in this formulation. However, these three phenomena (personal utility, limited data literacy, and the institutional appetite for data) collide dangerously in the data-rich depths of our increasingly digital world, with significant consequences for privacy in society as a whole.

The Path to "Un-privacy" Is Paved with Reasonable Intentions

People acquire apps because they want or need the function on offer. This is personal utility, and it is blindingly obvious. Sometimes that function works only by sharing personal data. Even an antediluvian landline phone works only if personal data (such as customer name and address) is linked to each phone number and shared. An arguably small loss of privacy was inevitable but went unremarked. A physical phone book increases the utility of the phone by adding addresses to distinguish between all those John Smiths. It further decreases privacy, although few worried when the first phone books were printed. Turning the phone book into a database increases utility again, but the reverse searching it makes possible ramps up privacy concerns.
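
To make that shift concrete, here is a minimal Python sketch with entirely hypothetical directory entries. The forward listing is exactly what a printed phone book contains; once the same records sit in a database, inverting them into a reverse lookup is a one-line operation.

    # Hypothetical phone book entries: (name, address) -> number.
    forward_directory = {
        ("John Smith", "12 Oak Street"): "555-0101",
        ("John Smith", "98 Elm Avenue"): "555-0102",
        ("Mary Jones", "7 Mill Lane"): "555-0147",
    }

    # The printed book supports only this direction: know the person, find the number.
    print(forward_directory[("John Smith", "98 Elm Avenue")])  # 555-0102

    # In a database, the same records can be inverted in one line, so an
    # unknown caller's number now reveals a name and a home address.
    reverse_directory = {number: person for person, number in forward_directory.items()}
    print(reverse_directory["555-0101"])  # ('John Smith', '12 Oak Street')

Nothing new has been collected here; the additional loss of privacy comes entirely from how the existing data can now be queried.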

Data experts can easily grasp this path of decreasing privacy. For the general public, the focus is entirely on the function and utility of having a phone. Privacy is not a consideration. Ordinary people unwittingly -- rather than voluntarily -- hand over personal data with limited understanding of its privacy implications.

Smartphones, apps, and sensors gather vast amounts of personal data to offer functions of varied and enormous personal utility. In many cases, these functions do not need to share data to deliver value. Consider baby tracking apps. To allay the anxiety of inexperienced parents, the apps record what goes into their babies' mouths, what comes out in their diapers, and every movement, breath, and heartbeat in between. They offer an extensive array of information, charts, and alerts -- a dashboard to reassure parents that their babies' behavior, health, and development are "normal."

Many of these apps upload extensive personal data and, in line with privacy regulations, obtain user consent to do this; the apps also outline the multiple ways data may be used or shared. Parents have no choice but to agree to these terms if they want the app. Possible uses include supporting medical research or improving pediatric services. One app claims to have "probably the largest dataset of sleep ever collected." The opportunities to use such data over the babies' lifetimes are considerable, an expectation that drove Forbes to feature this app in its 2016 list of the next billion-dollar startups. Data reuse is the overriding driver for profit, especially when the apps are free. The privacy of the parents, and especially of the infant subjects whose personal data is collected, gets minimal attention.

Shoshana Zuboff's latest book, "The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power," reveals the full scope of the challenge. Unlike traditional businesses, whose growth was built on physical resources (such as steel or oil), digital businesses' only raw material is the personal data they collect from their users (and, sometimes, their children or friends). Privacy concerns stand squarely in the way of sweating the asset. Capitalism's history of over-exploiting its raw resources and resisting regulation should serve as fair warning of what to expect when it comes to privacy.

Nor should the gathering and use of personally sourced data by governments be ignored. It is easy to point the finger at China, but other authoritarian regimes are already following in Chinese footsteps toward a full surveillance society. Western governments are also far from blameless when it comes to self-protection through abuses of privacy. The most common rationale is to protect the public from criminals and terrorists. However, the charm of endless pools of personal data in which to fish is as attractive to bureaucrats and politicians as to capitalists.

In Defense of Privacy

So, with the public disengaged, corporations complying only to avoid penalties, and governments surveilling in secret, protecting privacy is a challenge that likely exceeds the remit and power of regulators. Should we therefore declare privacy dead, as Sun Microsystems CEO Scott McNealy did 20 years ago? That may be the path of least resistance but, in my opinion, it is the wrong road. The consequences today are far more severe than two decades ago.

In our brave new digital world, every action we take and every message we send feeds another few kilobytes into the bottomless big data vaults of recorded history. Our emotions, revealed in facial and micro-expressions, are already fair game. If (or when) brain-computer interfaces and thought identification technology advance sufficiently, our most intimate thoughts may well join them.

Where are freedom of thought, novelty of opinion, and self-determination in a world where everything is seen, judged, and used to some unknowable end by some faceless algorithm? How can democracy survive when alternative views are instantly branded as dissent, when freedom fighters are labelled terrorists and dissenters burned as heretics? We may already have lost many battles, but we cannot preserve our humanity if we do not win the war.

The answers must lie with those who acquire or gather personal behavior and opinion data at an individual level for analytical use, both internally and through further distribution. Yes, I am suggesting that only the data poachers can become privacy's gamekeepers.

In particular, data management in these organizations must be on the front line of defending privacy, especially in strongly profit-driven or state-security projects, because data management professionals are best equipped to understand the data, its relationships, and the implications of linking it together. Just as financial institutions have independent audit functions staffed by accountants, personal data collectors must implement independent privacy audit functions staffed by senior data management experts.

Strong ethical oversight, rigorous data protection and obfuscation measures, and restrictive data collection and storage rules must be defined and enforced by the privacy audit function of all data collectors on all personally sourced data if we are to avoid becoming even more of a surveillance society than we already are.
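
As one illustration of what such obfuscation measures might look like in practice, consider this minimal Python sketch: direct identifiers are replaced with keyed-hash pseudonyms before records reach the analytics environment, so analysts work with stable tokens rather than names. The field names, key handling, and sample record here are assumptions for illustration, not a prescription.

    import hashlib
    import hmac

    # Illustrative secret key; in practice it would be managed and rotated
    # outside the analytics environment.
    SECRET_KEY = b"keep-this-out-of-the-analytics-environment"

    def pseudonymize(identifier: str) -> str:
        """Return a stable, non-reversible token for a personal identifier."""
        return hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

    # Hypothetical record from a baby tracking app.
    record = {"parent_name": "Jane Doe", "infant_id": "baby-42", "sleep_hours": 11.5}

    safe_record = {
        "parent_token": pseudonymize(record["parent_name"]),
        "infant_token": pseudonymize(record["infant_id"]),
        "sleep_hours": record["sleep_hours"],  # the analytical value is retained
    }
    print(safe_record)

The point is not the specific mechanism but that such controls can be specified, implemented, and audited by the data management function itself rather than left to regulators after the fact.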
