Data Privacy: Can You Keep a Secret?
By Timo Elliott | 11 min read
The Web is worldwide, but data privacy depends very much upon where you live.
In Germany, for example, data privacy is an issue that citizens take seriously. The country’s laws are some of the most stringent in the world and are based on a long-standing aversion to sharing data with both business and government entities.
In fact, the German state of Hesse passed the world’s first data protection law in 1970, and (then West) Germany’s federal data protection law came into effect in 1978.
Germans long ago realized that their personal safety and self-determination included their data. Consumers around the world are slowly coming to the same realization while our global dependence on distributed computing forces a worldwide reckoning of privacy and data ethics.
Consumers now see that many “We protect your data” statements aren’t designed with their welfare in mind or aren’t backed up with the firepower to make that happen. In 2019, 1,314,388 records were breached in the United States, according to Privacy Rights Clearinghouse, bringing the all-time total to 11,613,626,078 records. (Because state laws around breach reporting differ, Privacy Rights Clearinghouse notes, these numbers probably understate the true totals.)
As business increasingly crosses borders, the companies with the highest degree of data ethics will gain a competitive advantage. The first step toward that goal is instituting data ethics policies that satisfy customer expectations without dampening innovation.
On your mark, get set, go.
A change years in the making
In the years following the passage of Germany’s 1978 data privacy law, few other countries followed suit – leaving Germany a model island of privacy in an otherwise no-holds-barred data-collection world.
It could have continued that way for the foreseeable future if not for a series of data attacks, breaches, and intentional data harvesting that didn’t sit well with consumers, consumer advocates, and, eventually, regulators.
But, of course, the regulators would have had little sway were it not for a major shift in how customers think about privacy. Consumers are better informed and savvier about security, privacy, and data rights; they’re demanding more transparency and accountability; and they’re weighing the trade-offs between usability and risk.
Part of this shift has been prompted by the advent of products like Internet of Things (IoT) wearables, connected home gadgets, and facial recognition and biometric technologies. These products have made consumers more aware of the risks of data collection and of how their personal data has been, or can be, used. Now corporate customers are thinking along the same lines when evaluating products in the B2B world.
The market is paying attention: no company wants to be the poster child for privacy neglect. The pendulum has swung toward privacy, and companies must now show customers that they’re taking it seriously.
Privacy as a competitive advantage
Privacy begins with trust. Trust is now the most powerful currency of the digital economy, and its constituent elements are data protection and privacy. Together, they must be part of every company’s operational DNA.
The implementation of the General Data Protection Regulation (GDPR) in May 2018 threw a spotlight on global privacy and forced companies to reevaluate their data practices, because the regulation is backed by real firepower: penalties of up to €20 million or 4% of annual global revenue, whichever is greater.
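As a back-of-the-envelope illustration (and not legal advice), that penalty cap reduces to a one-line rule: take the greater of the two figures. A minimal sketch in Python:

```python
def gdpr_max_fine(annual_global_revenue_eur: float) -> float:
    """Upper bound for the most serious GDPR infringements:
    the greater of EUR 20 million or 4% of annual global revenue."""
    return max(20_000_000, 0.04 * annual_global_revenue_eur)

# For a company with EUR 2 billion in annual global revenue,
# the 4% tier applies: EUR 80 million, not EUR 20 million.
print(f"{gdpr_max_fine(2_000_000_000):,.0f}")  # 80,000,000
```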
Depending on their security maturity, companies have found the GDPR either an easy adaptation or a heavy burden. The regulation has also prompted countries outside the EU, and their citizens, to take a serious look at their own data protection laws. And although the GDPR is a European law, it affects more than just Europe-based companies and EU citizens. If your competitor is GDPR compliant and you’re not, guess who’s going to win business?
The GDPR is, in effect, very close to a certification (more on certifications later). Companies that are GDPR compliant show that they take privacy and security seriously and that their privacy policies are accessible and understandable to the general public, so customers (not just the in-house legal department) know what they’re signing up for.
However, there have been reports of many companies ignoring the GDPR. A study from early 2020 found that only 11.8% of consent management platforms (CMPs) conform to GDPR requirements; worse, popular CMPs are often designed to nudge users into consenting (see: dark pattern design). Within a year and a half of the GDPR taking effect, companies had been hit with £100 million in fines – and counting (investigations and fines are the responsibility of individual EU member states).
Companies earn trust by being transparent. Every company needs a “trust model,” and there are some basic principles for developing one: incorporate security into software development from the very beginning, create strict rules about who can access data and for what reason, and establish consistent monitoring.
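To make the second and third principles concrete, here is a minimal, hypothetical sketch of purpose-based access control with an audit trail. The policy table, role names, and helper function are illustrative assumptions, not any vendor’s API:

```python
import logging
from datetime import datetime, timezone

logging.basicConfig(level=logging.INFO)
audit_log = logging.getLogger("audit")

# Hypothetical policy: which role may access which fields, for what purpose.
ACCESS_POLICY = {
    ("support_agent", "resolve_ticket"): {"email", "order_history"},
    ("analyst", "aggregate_reporting"): {"order_history"},
}

def request_access(role: str, purpose: str, fields: set[str]) -> bool:
    """Allow access only if every requested field is permitted for this
    role/purpose pair; write an audit record for every attempt."""
    allowed = ACCESS_POLICY.get((role, purpose), set())
    granted = fields <= allowed  # subset check: all fields must be allowed
    audit_log.info(
        "%s role=%s purpose=%s fields=%s granted=%s",
        datetime.now(timezone.utc).isoformat(), role, purpose,
        sorted(fields), granted,
    )
    return granted

request_access("support_agent", "resolve_ticket", {"email"})  # True
request_access("analyst", "resolve_ticket", {"email"})        # False
```

The design point is that every access attempt – granted or denied – leaves a monitorable record, so the access rules can actually be enforced and reviewed.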
Toward an ethical foundation
Codes of ethics for data already exist, like that of the Association for Computing Machinery. “The question is whether those high-level statements and principles actually provide meaningful guidance because they’re often difficult to translate to day-to-day thinking as technology products are being built,” says Solon Barocas, assistant professor in the Department of Information Science at Cornell University in New York. “But high-level statements have value because they signal that the community itself values these things.”
Barocas teaches a data science ethics course, one of a wave of such courses that have recently appeared at universities across the United States. The core idea is to train the next generation of data scientists to incorporate these principles into their work from the get-go. Globally ambitious companies, recognizing a competitive edge, will be looking for this kind of training when hiring data scientists.
The tech industry has existing practices and policies, Barocas says, but they haven’t been particularly well conveyed outside of the industry. That’s beginning to change, however.
How to create a data ethics policy
Data has become a tool for building trust with customers.
What does a data ethics policy look like? Here are six categories, adapted from the guidelines created by DataEthics.
- Human first: Data is always borrowed, never owned. Individuals’ interests, rights, and well-being are prioritized. Individuals benefit from companies’ use of their data. Systems emphasize privacy by design.
- Individual control: Individuals have primary control over, and should be fully aware of, how their data is collected, used, and kept.
- Transparency: Companies must be transparent in how and where data is stored and must be able and willing to explain the artificial intelligence and algorithms they use.
- Behavioral design: Companies must not try to influence customers’ behavior in ways that run counter to customers’ interests.
- Accountability: Responsibility for protecting personal data should extend to every business agent involved in processing it.
- Equality: Companies must work actively to ensure that data is used without bias.
Done right, data ethics isn’t anticompetitive but is a competitive advantage, says Pernille Tranberg, co-founder of Copenhagen-based DataEthics and co-author of the book Data Ethics: The New Competitive Advantage.
But we should beware of “ethics washing” and “privacy washing,” she says. Think back to the early days of environmental sustainability, when some companies took serious steps to improve their ecological impact – while plenty of others were accused of “greenwashing,” not backing up the talk with action. This is a real danger, and it’s already happening, says Tranberg.
The answer is developing third-party certification and verification schemes like the ones for the environmental movement. “A lot of companies say, ‘We really anonymize data, we can’t go back and identify you,’” says Tranberg. “That’s very cool if they really do it, but they need to have somebody to check that independently.”
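Tranberg’s caution is easy to demonstrate. A common shortcut – replacing identifiers with hashes – is pseudonymization, not anonymization, because anyone with a list of plausible identifiers can reverse it. The sketch below is purely illustrative, not drawn from any audited system:

```python
import hashlib

def pseudonymize(email: str) -> str:
    """Replace an email with its SHA-256 hash -- this is NOT anonymization."""
    return hashlib.sha256(email.encode()).hexdigest()

token = pseudonymize("alice@example.com")

# An attacker with a list of plausible emails can re-identify the token
# by hashing each candidate and comparing -- a simple dictionary attack.
candidates = ["bob@example.com", "alice@example.com", "carol@example.com"]
matches = [c for c in candidates if pseudonymize(c) == token]
print(matches)  # ['alice@example.com'] -- the "anonymous" token is reversed
```

This is exactly the kind of claim an independent verifier would test.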
Companies are also recognizing the value of independent certifications for data ethics and privacy. “I already see some companies that are asking for it,” says Tranberg.
She thinks it’s likely that we’ll start with many small certification schemes; as those mature and grow into larger organizations, governments will need to become involved. One example is the European Privacy Seal (EuroPriSe), a privacy certification for IT products. The (surprise) German organization started in 2009 as an EU-funded project and is now a global certification.
Companies that are transparent will thrive. Those that don’t do this will have a much harder time in the future.
- One.Thing.Less Co-Founder James Aschberger
You can take it with you
But it’s not enough to assure customers that their data is being handled with kid gloves. Soon, customers will expect to be able to take their data wherever they want. Article 20 of the GDPR lays out the rules for data portability – the ability to move data from one platform to another.
Although many big companies that depend on data might not appreciate Article 20, it’s intended to increase competition by opening the field to other platforms. It also gives consumers the power to express their displeasure with a particular company by taking their data elsewhere.
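In practice, Article 20 entitles individuals to receive their data in a structured, commonly used, machine-readable format – typically JSON or CSV. A minimal sketch of such an export, with hypothetical field names, might look like this:

```python
import json
from datetime import date

def export_user_data(user_record: dict) -> str:
    """Serialize a user's data as JSON -- a structured, commonly used,
    machine-readable format, as GDPR Article 20 requires -- so it can be
    handed to the user or transmitted to another provider."""
    return json.dumps(user_record, indent=2, default=str)

# Hypothetical user record; a real export would pull from live systems.
print(export_user_data({
    "profile": {"name": "Alice", "email": "alice@example.com"},
    "orders": [{"id": 1, "date": date(2020, 1, 15), "total_eur": 42.50}],
    "consents": {"marketing_email": False, "analytics": True},
}))
```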
Achieving portability could spark new ideas about data monetization, and startups are already appearing with products to do just that. Several new models are emerging.
New models for data management
Inspired by opportunity, outrage, or concern – or all three – plenty of ideas aim to change how data is managed and monetized. If any of these ideas take off, we’ll experience a massive change in the data economy.
- Personal data brokers: Think financial adviser but for personal data. Upending the passive role of the consumer, the (maybe) up-and-coming industry promises to help consumers manage and monetize their data.
- Self-sovereign identity: First conceived in the 1970s, a self-sovereign identity – unlike today’s digital identities, which are scattered across providers – is central, singular, and created and maintained by the individual, who can store and share it (or not) at will. Calling itself a “decentralized, global public utility,” Sovrin.org provides an infrastructure for self-sovereign identities.
- Personal data marketplace: Personal data marketplaces aim to let individuals sell their data for cash (or, in some cases, cryptocurrency) by brokering the sales with buyers. One startup among many, Universal Basic Data Income, lets users sell anonymized personal data collected through an app called Digi.me. The idea of selling personal data isn’t new, however, and the question remains whether a critical mass of people will sign up for these marketplaces.
- Data labor unions: The book Radical Markets: Uprooting Capitalism and Democracy for a Just Society by Eric A. Posner, a professor at the University of Chicago Law School, and E. Glen Weyl, a senior researcher at Microsoft, posits that data is a form of labor and suggests creating data labor unions that would enable fair compensation for data.
- Personal data storage: From the famed Sir Tim Berners-Lee, creator of the World Wide Web, comes Solid, a decentralized, open-source Web platform. A Solid Pod stores personal information, and its owner decides who gets access to it; personal data can then be shared across apps (a toy sketch of this owner-controlled model follows this list).
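To make the owner-controlled storage model concrete, here is a toy, in-memory “pod” in which the owner grants and revokes per-app read access. It is loosely inspired by the Solid idea but is an illustrative assumption, not Solid’s actual API:

```python
class PersonalDataPod:
    """Toy in-memory 'pod': the owner stores data and controls
    which apps may read it. Loosely inspired by Solid; not its API."""

    def __init__(self):
        self._data: dict[str, str] = {}
        self._grants: dict[str, set[str]] = {}  # app -> readable keys

    def store(self, key: str, value: str) -> None:
        self._data[key] = value

    def grant(self, app: str, key: str) -> None:
        self._grants.setdefault(app, set()).add(key)

    def revoke(self, app: str, key: str) -> None:
        self._grants.get(app, set()).discard(key)

    def read(self, app: str, key: str) -> str:
        if key not in self._grants.get(app, set()):
            raise PermissionError(f"{app} may not read {key}")
        return self._data[key]

pod = PersonalDataPod()
pod.store("email", "alice@example.com")
pod.grant("calendar-app", "email")
print(pod.read("calendar-app", "email"))  # allowed while the grant stands
pod.revoke("calendar-app", "email")
# pod.read("calendar-app", "email") would now raise PermissionError
```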
How can everyone win?
Consumers want to see benefits from the use of their data, and companies want to grow. The two aren’t mutually exclusive.
Providing transparency can smooth the path to consent. A company that provides consumers with a clear understanding of how their personal data is used and secured, and that makes clear how consumers can benefit from sharing their data, is more likely to win approval for that data use.
Switzerland-based startup One.Thing.Less wants to take the work out of privacy for both consumers and organizations. Its app, which facilitates communication about data privacy and use policies, creates an opportunity for companies to engage their customers, says co-founder James Aschberger – a win for both parties. Making the data relationship clear and simple will be another competitive advantage: it gives consumers some peace of mind and signals that a corporation isn’t hiding its real intentions behind a wall of terms-of-use agreements that few actually read.
But a future in which individuals are in complete control of their own data isn’t a given. It’s not clear that people are better at managing privacy than companies are, says Cornell’s Barocas. The idea already has a long history, beginning in the 1990s, and an equally long track record of failure.
One.Thing.Less’s Aschberger isn’t convinced that there’s enough money in personal data for it to ever become a substantial source of income for individuals. But he does think that we’re moving toward a future that’s more collaborative, allowing individuals to control their data and provide, if they so choose, more accurate data. And companies might not have much choice in the matter.
“It will become very interesting because I believe that those companies that are transparent and that give people enough reasons to trust them will thrive,” Aschberger says. “Those that don’t do this will have a much harder time in the future.”