The World Economic Forum, a nonprofit best known for its annual gathering of global business and political elites at Davos, has made data privacy a key focus in recent years. While researching The End of Anonymity, one of our 10 Trends for 2014 and Beyond, we talked to Bill Hoffman, who spearheads the forum’s cross-industry initiative on personal data. Hoffman discussed some of the privacy approaches businesses should embrace, the rising notion of personal data as currency, and why arguing that “Privacy is dead” is a cop-out.

The collection of personal data is becoming a bigger concern for consumers, but marketers see it as a huge opportunity to better understand their customers. How do we start balancing the concerns or needs of both parties?

It’s an opportunity to create new value, to deliver services that are more personalized, to unlock this new asset class of data. From my vantage point, what initially started as an entrepreneurial idea, a notion that a few evangelists held, has spread. There’s also a growing recognition that along with the upside come risks of reputational harm—that when there’s a breach or a lapse of stewardship in the way data is managed, there is a very significant downside.

One of the things I’ve noticed is the heightened awareness on both sides of what a tightrope this is and the need to be more accountable—recognizing that silver bullets, one-size-fits-all solutions, aren’t going to be effective over the long term. And then you have to work out, how can you be accountable in a world where things are constantly changing?

The thinking around “How do we start to establish principle-based governing systems?” is maturing. We hear from a number of people who have been approaching it from an infrastructure and technology perspective. So if we can begin to answer the question “Who has a copy of your data, and where is your data?” that will be a big step forward.

Almost every day now there are disclosures of data being used in a manner in which it wasn’t originally intended. The notion of surveillance is also broadening. I don’t think there’s anyone who denies there is growing sensitivity toward that, though there’s this notion that the harms are intangible. In general it’s also created the beginnings of a stronger voice for accountability in how data is used, given that it’s just a much more complex and complicated world.

Does data grow more valuable as more is created, or does it become less valuable?

It’s grains of sand, right? The potential for more informed, more granular, better decisions—better insights—clearly can be a function of how much access you have, how many raw bits you feed into the equation. In and of itself, data is inert. We need to push the analytical end a little further in the context of what decisions are being rendered with it—what happens to an individual versus the originating source of the data. We are at the very early stages. Capturing data is really tough to limit; if we focus more on the usage, that will give us better governance and a better way to define appropriate uses.
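
To make the usage-focused idea concrete, here is a minimal sketch of governance enforced at the point of use rather than at collection. It is an illustration of the concept only, not anything the Forum has specified; the Record structure and all names are invented:

```python
# Hypothetical sketch: governance applied at the point of *use*, not collection.
# Each record carries the purposes its subject agreed to; any attempt to use
# the data is checked against that list before the payload is released.

from dataclasses import dataclass

@dataclass
class Record:
    subject_id: str
    payload: dict
    permitted_uses: frozenset = frozenset()  # e.g. {"billing", "support"}

def use(record: Record, purpose: str) -> dict:
    """Release the payload only for a purpose the subject consented to."""
    if purpose not in record.permitted_uses:
        raise PermissionError(
            f"'{purpose}' is not a permitted use for subject {record.subject_id}"
        )
    return record.payload

record = Record("user-42", {"email": "a@example.com"},
                frozenset({"billing", "support"}))
use(record, "billing")        # OK: a consented use
# use(record, "ad-targeting") # raises PermissionError: unconsented use
```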

What are some concepts that companies and brands should think about?

Overall, the economics of data is largely captured on the forward transfer. The secondary and tertiary leverage is in the reselling of data that was at some point generated by a particular individual and has some type of connection to them. Being able to capture data and sell it to whomever you want, and, from my point of view, hiding behind very long, complex user agreements that people don’t really read, continues to offload all the risk onto the individual. Systemically, you’re somewhat rolling the dice. Something [bad] could happen. And there could be a significant backlash.

I click “I agree” on countless websites. I have no idea where the data is going. I may or may not be happy with how the data is being used. That’s where this notion of user engagement and having a richer sense of what consent means comes into play. It’s how the value is going to be created, but within a given use. When you go from one context to another, that’s when you have to look at what the permissions are—the agreements that all the various stakeholders had when the data was originally collected and developed.
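
One way to picture that richer sense of consent is a machine-readable consent receipt that travels with the data and records who may use it, for what, in which context, and until when. The sketch below is hypothetical, not an existing standard; all field names are invented:

```python
# Hypothetical sketch: a "consent receipt" that travels with the data.
# Moving the data into a new context is not covered by the original grant
# and would require a new receipt.

from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass(frozen=True)
class ConsentReceipt:
    subject_id: str      # who the data is about
    holder: str          # who collected it
    context: str         # the context it was collected in
    purposes: tuple      # uses granted within that context
    expires: datetime    # consent is time-bound, not open-ended

def allows(receipt: ConsentReceipt, context: str, purpose: str) -> bool:
    """A use is valid only in the original context, for a granted purpose."""
    return (context == receipt.context
            and purpose in receipt.purposes
            and datetime.utcnow() < receipt.expires)

receipt = ConsentReceipt("user-42", "acme-retail", "retail-loyalty",
                         ("recommendations",),
                         datetime.utcnow() + timedelta(days=365))
allows(receipt, "retail-loyalty", "recommendations")     # True
allows(receipt, "insurance-pricing", "recommendations")  # False: new context
```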

In terms of where businesses can start to focus, the first thing is transparency: being able to provide the tools and the means so that individuals have greater agency and a richer understanding of the consequences. Where does the data flow, who else gets it, and what do they do with it?

Does that require establishing some generally accepted definitions of what is acceptable and what isn’t acceptable, in terms of personal data?

Yes, and then you can back that up with contract law. When you swipe a credit card, that retailer plugs into a very complex set of rules. At the end of the day, it’s a trusted system. People use their cards around the world. As a model, at least from a structural point of view, that might be something to build on: you can start to look at standardized agreements and the leveraging of interoperable contracts.

Startups like Personal.com, the Respect Network and Reputation.com are based on the notion that one way to navigate this complicated world is to take control of your data. Is this where we’re headed?

We’re still at the early phases, and so who knows. One of the dynamics that is encouraging, especially in the U.K., is the amount of specific data that’s being handed back to people, so there’s some supply that’s being shared. In the U.K., you can literally ask any business to give you a copy of your data. So at least some of those mechanisms are starting to fall into place. Now, how that becomes something unique and differentiated from other ways to deliver service, we’re still figuring out.

In terms of the dynamics and the model, there’s also an infrastructure aspect to delivering these services that’s maturing. Having the API in place and having personal data clouds that really do have the security, the encryption, is important, as is making sure there’s a compelling offer from the supply side. And there’s also a timing aspect: Are there enough personal data stores—enough people who have data they would like to collectively express in the market? Is that big enough and better than what you can get through some of the traditional data brokerage houses? It’s going to take a little while to play out.
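
As a rough illustration of that infrastructure, here is a toy personal data store that encrypts attributes at rest and releases them only against an explicit grant from the owner. It is a sketch under stated assumptions, not any vendor’s actual API; it leans on the third-party Python cryptography package for the encryption:

```python
# Hypothetical sketch of a personal data store: encrypted at rest, with
# reads gated on explicit grants from the owner. Illustrative only.
# Requires the third-party "cryptography" package.

from cryptography.fernet import Fernet
import json

class PersonalDataStore:
    def __init__(self):
        self._fernet = Fernet(Fernet.generate_key())
        self._data = {}       # attribute name -> encrypted bytes
        self._grants = set()  # (requester, attribute) pairs the owner approved

    def put(self, attribute: str, value) -> None:
        """Owner writes an attribute; it is encrypted before storage."""
        self._data[attribute] = self._fernet.encrypt(json.dumps(value).encode())

    def grant(self, requester: str, attribute: str) -> None:
        """Owner explicitly allows one requester to read one attribute."""
        self._grants.add((requester, attribute))

    def read(self, requester: str, attribute: str):
        """Decrypt and release only if a matching grant exists."""
        if (requester, attribute) not in self._grants:
            raise PermissionError(f"{requester} has no grant for '{attribute}'")
        return json.loads(self._fernet.decrypt(self._data[attribute]))

store = PersonalDataStore()
store.put("postcode", "SW1A 1AA")
store.grant("energy-supplier", "postcode")
store.read("energy-supplier", "postcode")  # "SW1A 1AA"
# store.read("ad-network", "postcode")     # PermissionError: no grant
```

The design point is simply that decryption is coupled to a grant, so possessing the store does not by itself confer access to what is in it.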

More companies are presenting clear incentives to opt in, whether it’s joining a loyalty program or getting a discount for liking them on Facebook. They’re essentially saying, If you give us a little data, we’ll give you something extra.

You’re right. That concept of reciprocity is starting to kick in. It’s going to be a richer marketplace than maybe people initially understood. We work closely with a business based in Boston called Jana. They focus on emerging economies, and their database gives them access to millions of individuals. In exchange for answering surveys and taking questions, the individual gets airtime on their cellphone. I find that users-and-payers type of relationship very interesting, in that the relationship begins at a different starting point.

What are the consequences of an individual opting out of the digital universe?

You don’t get as much. On the other hand, there’s an incumbent interest in advancing the narrative that the Internet breaks and dies if we respect privacy. I don’t buy that. There’s ample opportunity where the trust, the transparency, the control can be expressed in the marketplace, and all the great innovation that’s come over the last 20 years can continue. The economics of the forward transfer of data, where it’s impossible to know who has your data, where your data is and how it’s being used—I don’t think that’s absolutely essential for the next 20 years of innovation on the Internet.

Conventional wisdom used to be that the younger generation doesn’t care as much about privacy, but that seems to be changing?

The other notion is that people are growing up with a much more sophisticated use of the tools in the way they express multiple personae on the Internet, particularly as you watch those who have grown up with Facebook enter the workforce. The number of people deleting profiles and changing their settings is an expression of the sophistication of the marketplace.

I would also say that the term “privacy” has been used for decades as a very convenient rhetorical tool, but by design people used it to conflate a variety of different harms. “Privacy is dead. Get over it”—that’s a cheap answer. We can do a lot better. There are a lot of technical innovations that need to happen. How do we solve this dilemma of being highly anonymous in some contexts and highly authenticated in others? Right now we’re stuck in an either/or world, and that’s just not going to be a sustainable strategy in our new hyper-connected world.

Are there some best practices that companies should be following?

It depends. There are fair information practice principles that have been around for 30 years. The anchor point continues to be, Are you a good steward? Another best practice is, Do you have the means and capabilities to uphold and enforce your business’s rules of the road for how data is managed? A variety of industry sectors have been using highly personal data at scale for a number of years, and they’ve got very good codes of conduct.

I think more and more leaders—executives, government leaders—are starting to look at innovation in data practices through the eyes of their customers.

Has there been any grassroots backlash to the upsurge in data collection, sort of an Occupy Data movement?

There’s a site called WeTheData.com. They’ve got a very nice populist approach to collectively expressing the demand: At least give us a copy of our data. And that’s the beginnings of that conversation. In the U.K., the government-led dialogue about giving people back a copy of their data, and about generating macroeconomic benefits through better and broader access to it, is where there’s a strong public awareness of what the potential could be.

Do you think that’s something that’s going to transpire in the U.S. as well?

There’s a relatively good chance, and generally the efficiencies being addressed in the U.K. clearly apply in the U.S., as well as in a lot of other countries.

How can companies start to be more mindful in their data practices?

One of the ways people can be mindful is by appreciating that individuals are not just consumers. They’re producers of data, and as a producer of data, it becomes a different type of relationship. If people feel there was a fair value exchange, that begins to build a healthy dialogue with individuals, and it frames the opportunity: how you can establish that relationship and trust, and then build out value propositions that work.

Is there a line we’ll be able to draw where we might feel a bit more secure about certain information and maybe less secure about some other information, creating a tiered system of privacy clearances?

That’s exactly right. We don’t have the tools right now to get to those tiers. We lack the means to implement, in a more granular way, individuals’ own perceptions of where the harms come in. Being able to dial up or dial down how I want to share data in a given context is part of the challenge.
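
A toy sketch of what such a dial could look like: each context gets a sharing tier the individual can raise or lower, and an attribute is released only if the context’s current tier clears it. The tier names and sensitivities here are invented for illustration:

```python
# Hypothetical sketch: per-context privacy tiers the individual can dial
# up or down. Every attribute carries a minimum tier; a context only sees
# attributes at or below its current dial setting.

from enum import IntEnum

class Tier(IntEnum):
    ANONYMOUS = 0     # nothing identifying
    PSEUDONYMOUS = 1  # stable handle, no real identity
    IDENTIFIED = 2    # full identity

# Minimum tier required to release each attribute (illustrative values).
SENSITIVITY = {"interests": Tier.ANONYMOUS,
               "username": Tier.PSEUDONYMOUS,
               "home_address": Tier.IDENTIFIED}

# The individual's current dial setting per context.
dials = {"news-site": Tier.ANONYMOUS, "bank": Tier.IDENTIFIED}

def visible(context: str) -> list:
    """Attributes this context may see at its current dial setting."""
    level = dials.get(context, Tier.ANONYMOUS)  # default to most private
    return [attr for attr, needed in SENSITIVITY.items() if needed <= level]

print(visible("news-site"))  # ['interests']
print(visible("bank"))       # ['interests', 'username', 'home_address']
```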

The other structural change is the ability of any one individual to blast and tweet [private information] around the world. Before, for me to tell a secret about you, I had to have unique means to make that widely known. Now, we all have access to arguably billions of people. It’s easier for me to spread a secret than it has ever been before. And that’s where the social norms as well as the technical elements are in their early phases, but ultimately we’ll start to develop these norms, these codes, these ways of managing it.

There obviously is a dark side to advanced data collection.

Yes, data algorithms can be used to deny or grant access to services. We should start to think about what some of the mechanisms of transparency are that could make those algorithms and those outcomes more effectively understood. Right now the notion of transparency gets bundled in with consent. You could start to see, particularly as the Internet of Things expands, why certain doors literally open for some people and not for others.
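
One transparency mechanism this points toward is logging every algorithmic grant-or-deny decision with the inputs and rule version that produced it, so the outcome can later be explained and contested. A minimal, hypothetical sketch:

```python
# Hypothetical sketch: an append-only audit record for each algorithmic
# access decision, so that "why did this door open for X and not Y?"
# has an answerable, inspectable trail.

from dataclasses import dataclass, asdict
from datetime import datetime
import json

@dataclass(frozen=True)
class DecisionRecord:
    subject_id: str
    service: str     # what was granted or denied
    inputs: dict     # the data points the algorithm actually used
    rule: str        # identifier/version of the rule or model applied
    outcome: str     # "granted" or "denied"
    timestamp: str

audit_log = []

def decide(subject_id: str, service: str, score: float) -> bool:
    """Toy scoring rule; the point is the record, not the rule itself."""
    granted = score >= 0.7
    audit_log.append(DecisionRecord(
        subject_id, service,
        inputs={"score": score},
        rule="threshold-v1 (score >= 0.7)",
        outcome="granted" if granted else "denied",
        timestamp=datetime.utcnow().isoformat()))
    return granted

decide("user-42", "premium-lounge", 0.65)  # denied, but the reason is on file
print(json.dumps([asdict(r) for r in audit_log], indent=2))
```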

From a marketing perspective, it ultimately comes down to trust, right?

Yes. Another element is, Can we make access to the data more equitable? And how do you start to balance out access to the algorithms? In some instances, why did individual X get a certain treatment and individual Y didn’t? The notion of consequences for computationally driven decisions is where I would anticipate a coming set of public discussions.