We seek to be the centrists in the privacy debate.

As outlined in The End of Anonymity, one of our 10 Trends for 2014, tracking and leveraging consumer data is becoming a critical skill for brands at the same time that privacy concerns are on the rise. In researching how marketers can successfully navigate this terrain, we talked to Jules Polonetsky of the Future of Privacy Forum, a Washington, D.C.-based think tank that aims to advance responsible data practices. Polonetsky has served as chief privacy officer at both AOL and DoubleClick, and worked in public service, including a stint as consumer affairs commissioner for New York City. Among other things, he discussed why certain technologies seem “creepy” and how to make them feel less so, and how savvy businesses can approach highly “nuanced and complicated” data-collection issues.

Can you tell me a bit about the Future of Privacy Forum—what’s its role in the conversation around privacy issues?

We try to be the pragmatic, responsible voice of data protection. Half of our board are chief privacy officers—from Facebook, Google, Microsoft, Yahoo!, Intel, eBay and Procter & Gamble, and from app developers, startups, data companies and consumer brands. The other half are academics and privacy advocates.

We seek to be the centrists in the privacy debate—we are optimistic and enthusiastic about new technologies and how they can add value to the world and improve the lives of individuals. But we’re also deeply aware of the downsides of technology and data use, and we take on projects in areas like data in education, marketing use of data, social media, the smart home, smart grid, smart devices, Big Data.

We just did a project with most of the companies that do retail analytics—that provide stores with reports showing how customers walk through the store, how long wait times are in line, how many people come into a store and never shop, how shoppers navigate the aisles. The companies do it by tracking the Wi-Fi or Bluetooth address of your cellphone, and a number of them have been criticized for doing that without appropriate privacy controls. So we, together with a number of leading retailers, organized the industry, and then, together with Sen. [Charles] Schumer, announced a code of conduct that those companies have agreed to follow.

Our theory was, look, this can be useful. It’s valuable for stores. But if consumers don’t know what’s going on and policymakers think the data is being misused, you can understand why there is concern. So let’s get ahead of it and lay out a good path forward, and we can update it to take into account new risks or new data uses as they come about.

Being the centrist in this privacy debate seems like a big challenge, given how many new technologies have implications for privacy.

Well, there are so many new ways that we’re using data, often for valuable and helpful purposes, but every time we have a new type of data used in a new way, there’s also a potential misuse. The world is becoming more complicated. We’re learning more about how to tackle problems, and most of the answers to the problems of today’s world involve data—how do I measure what is working?

Sometimes I [may be] using data crunching to measure how cellphone users in a country with famine move throughout the country seeking food or water, and I can track their phones and tell you that there’s going to be a terrible need for food. So we can save lives. Cars will be less likely to crash as they become aware of each other—but, yeah, that means somebody knows where a car is. The fact that we can find all the world’s information at our fingertips is incredibly powerful, but it also means I might look you up and research things about you.

The data needs to be there. It needs to be available, and we need rules sometimes that say, just because you can do it doesn’t mean you should do it. And then sometimes we need laws that protect things, and sometimes we need new technologies. I’m optimistic that we’re going to have technology solutions. This [secure-text app] Frankly—you send the text and it disappears in a number of seconds. It’s not 1,000 percent perfect; the NSA might be able to figure out how to get it. But it sets the tone for communication—that this is not intended to be forever.

We’re starting to see a bit of a technical pushback: The pushback against the NSA by the big tech companies—they’re encrypting things, they’re making it more difficult for outsiders to snoop in. That’s probably overdue, but it’s well worth doing now. And then you’ll see apps like Snapchat and Frankly and others. Or you’ll see browser controls. Mobile devices are already building in new protections. The latest version of iOS 7 doesn’t let apps grab certain data that they used to be able to grab. So I think these things evolve.

These days you hear the word “creepy” a lot to describe data-collection practices.

I have a paper coming out in the Yale Journal of Law and Technology called “A Theory of Creepy.” So the question is, how do you know, right? How should a company know what is going to turn out to be creepy? Creepy is sometimes in the eye of the beholder, and what was creepy yesterday sometimes is not creepy anymore.

When Kodak cameras were first invented and you could go and take a picture in the street, this was considered an outrageous privacy intrusion. There were actually efforts in Washington to ban Kodakers from public parks because of the idea that now someone could snap your face in the street and put it in the newspaper. We got used to it. Now we’d never dream of banning cameras, but we have people reacting to Google Glass—“Oh, my God. Someone will take a picture of you with the camera on their head.”

So we have these cycles where we have new technology that uses data in some new ways. We react to it, and then we adjust. But that adjustment process can be very challenging. If I did a background check on my nanny, you would say, “That’s a responsible parent.” If I checked out the driving records of the parents who drive my kid to school in his carpool, you’d say, “That’s a little weird.” But why, right?

Well, there’s a norm there. We have to think these things through. Companies also have to think these things through. There’s always going to be some new actors who rush quickly and who create a backlash because of silly or sometimes dangerous things they do. And then the rest of us try to think through the rules, and sometimes we need a law to prevent the bad practice, and sometimes we need self-regulation, and sometimes we simply need good manners.

It’s becoming really important for companies to figure out what exactly is creepy. As you’re saying, it’s not necessarily all that logical.

It’s a number of different factors. Sometimes it’s when you’re surprised to learn that something can be done that you didn’t know about. This retail location technology I mentioned—when people read stories that stores can track your phone just by the fact that you’re walking through the store with your Wi-Fi on, the reaction was “Huh, that’s creepy, I didn’t know they could do it.” So often it can be simply something that’s happening behind the scenes that you didn’t expect. Sometimes, once it’s exposed, sunlight is the best disinfectant, and in the light of day the practice turns out to be innocuous.

Sometimes it’s consumer expectations. You go to Amazon and they track everything you do, and you kind of like it. You don’t have to read the privacy policy. They sort of say, We think you might like this because everyone else like you likes this. So without any sort of disclosure or any privacy conversation, they just told you they track you, they track everybody, they analyze you, they analyze everybody, and here’s what they would like to make money selling you. They set the tone that this is a tailored and personalized experience, and it’s done for you and for them to sell you things and make money.

Yet when Orbitz was revealed to have been tailoring its results to show people who use Macs fancier vacations, there was a big backlash, because they hadn’t set the tone that “This is an experience where we’re trying to find what you want.” Instead [the reaction] was, “Ah, you’re trying to sell me something I maybe don’t want because of what you’re assuming about me.”

Our argument is that data today is too important to be left to a privacy policy or a legal disclosure. Data is a feature very often, and we need to design consumer experiences that help users navigate the data features. It’s a decision I need to make to optimize my experience or not. So sometimes it gets called privacy, but more often than not it ought to be, Is this being done for my benefit in an open and transparent way, or is this something that’s being exposed because it’s being done in secret, and maybe it’s being done to get something over on me and get me to buy something I don’t want? Those can be the same experience sometimes. But by framing this appropriately and by designing a UI that enables transparency, I can convey an empowering message.

Do you think one of the obstacles is that people are often afraid of the wrong things?

I don’t think there’s a right or a wrong thing, because I do think that some degree of privacy concern can be psychological, and that doesn’t mean it’s not valid. If I have a feeling of risk or of concern, it’s not always rational. Look, we worry more about our safety on a plane than we do when we drive, even though the statisticians tell us it’s riskier to drive. We have to recognize that that’s how consumers feel, and privacy is not always hyper-rational—again, I’m going to let Amazon have a lot of data, but I’m going to clear the cookies of some little blog that probably can’t do anything with the data anyway. One company is giving me a feeling of control, and another has acted in a way that has put me on edge.

Target was criticized when it sent coupons that alerted a father that his daughter was pregnant before she had told him herself. Because the algorithm understood. I’d argue that if the algorithm were so smart that it could be sensitive to consumer expectations, be designed to be surreptitious and somehow sense that certain categories are more sensitive than others, that would be even creepier. I’m comforted by the awkwardness of the algorithm sometimes—“If you buy these certain things, then you want these certain things.” It’s blunt, and it’s obvious, and it’s robotic.

It’s when the machines start managing our privacy expectations in some nuanced way that lets them have the data but not let us know that judgments are being made—I worry about that. So we call for transparency of the algorithm. It’s less about, Show me the data. The data is lots of bytes, and it’s probably meaningless to me. Tell me what the algorithm is doing: What sort of judgments are being made? Are you marketing? Are you discriminating? Let me have a chance at fixing it and making it right.

Nordstrom was criticized after it posted signs revealing that people were being tracked by Euclid’s retail-analytics tools. What could they have done differently?

The challenge that Nordstrom and Euclid ran into is that they were first movers with a technology that was still unknown to the general public. The only way one can advance a technology like that without creating a backlash is to be really out front in declaring, announcing, educating and promoting.

One thing they might have done is to initially make this available only for their premium customers: “We always want to know when our premium customers are in the store because we want to treat you first-class. We invite you to get into this program if you apply for it.” And then after promoting and announcing, rolling it out to a broader audience.

Disney is doing a very interesting job as they’ve rolled out their MagicBands. [They’re saying,] “First, we’re going to give it to an invited audience, and the people who get to stay at Disney hotels will get it. And here are all the additional [perks]—you can reserve your lunch and it’ll be ready as you approach, etc.” Yet there’s also analytics data, and they’ll know more about who uses their park. But they’re leading with the value. I’ve started seeing kids who were part of the initial wave, and they don’t want to take it off. They’re still wearing it months later.

The process of introducing a new technology is one that used to be managed. Let’s take another example. When Gmail launched, they were going to scan your emails. Some critics didn’t like that. There was a proposal to ban that in California. And Google initially made Gmail available only by invitation. Then you could invite others. Finally it rolled out en masse. They were very sophisticated in doing that. You need to be very careful about your rollout process when you have a new technology. You need to announce and debate and explain what you’re doing. Nordstrom put up a sign and worked with a company that had some very visible and good privacy practices, but it was still a technology that was unknown and where the rules weren’t clear.

By organizing all the companies and by inviting advocates and the Federal Trade Commission and Senator Schumer, we said, “Hello, there’s this new technology, and it helps stores know this and it helps them know that. And here’s what could happen wrong, and we’re against the wrong. And here’s a way to right now opt out of it in your home before you go anywhere.” Our hope is that by announcing the data and announcing the possible concerns and announcing how we’re dealing with the concerns, we’ve provided a path. Consent sometimes is the right option, and sometimes it doesn’t work when it’s new and people don’t know what [the technology] is.

Imagine if, when Google’s vehicles that map the world rolled by your house, they put out a flyer saying, “We’ll be photographing your house and street, and would you agree that we can log the MAC address that is being beamed into the street by your Wi-Fi router? Because doing so will let us build a global database that will help build location services, and your phone will have access to this database and your GPS will be better.” People would have been, like, “What? Someone call the police.” Well, they did it, right? And every time I turn on my phone, it looks for the GPS, and it’s assisted by the fact that it looks at these local Wi-Fi routers, and now we get annoyed if it’s not working. So there are times where it’s a matter of educating, explaining, promoting, so as to help set consumer expectations.

Do consumer attitudes vary widely across generations?

There’s definitely a difference across generations, but it’s not the case that the younger generation doesn’t care about privacy. You often hear that said. They do care about privacy, but they look at it a bit differently. They are more concerned about things being private from their parents, from their employer, from their college admissions officer or from the creepy adult who doesn’t belong in their business. They are a little less concerned about cookies or corporations. They’re cynical about that.

They ignore the ads. It’s hard to sell them things. They expect a degree of commercialism and tune it out. The older generation is still provoked by commercialism or data collection where they don’t expect it. The younger generation grew up using free services, free games, free Facebook, and so it’s hard to sell them stuff. If they think you’re selling them, they tune out. But they’re highly attuned to the wrong people being in their business.

Adults are a little more focused on the commercial marketing and sometimes less artful than the young people about managing their personal data. Every teen worth his salt knows that when he chats on Facebook, the entire audience sees it. So they may go to Twitter to have a closed group or even an open group where only their pals are following them. So we see teens and younger people doing a lot to manage their audience. Adults sometimes don’t understand that. “I’m friends with you on Facebook, so I can comment and discuss things as you’re chatting with your other teen friends.” No! [Teens have] learned those rules. They know there are places to engage and places not, and just because you can doesn’t mean you should.

It often goes back to that idea of “Just because you can…”

Corporations aren’t always as sophisticated about those nuances, and that’s where we have these disruptions when companies think they have a relationship with us and they overstep. There are a lot of companies that think they have relationships with us. They wish me happy Thanksgiving and they wish me happy new year, and I’m thinking, “Are you my friend? Why are you sending me an email to wish me happy anything?”

That doesn’t mean you can’t manage and build a relationship. I did a TEDx talk where I tried to use food and sex as models for how companies should learn to, in a nuanced way, manage permissions in areas where the rules aren’t clear.

We come back to the idea of what’s creepy and how to steer clear of that zone.

The boundaries aren’t always clear, and navigating those boundaries can be really difficult, and different audiences have different expectations. What might be cool on Twitter might not be OK on Facebook, and what might be cool online might seem jarring in the physical world and might be welcomed by one person yet rejected by another. So it’s complicated, but frankly, so is life.

If I go into certain bars, there’s a different vibe. And in some bar if you went up to someone and said, “Hi, can I introduce myself?” you’d be punched and somebody would call the police, and in another bar it would be expected because of the scene there. [It also depends whether it’s] the beginning of the night or the end of the night when everybody has had a few drinks, a bar that’s a pickup bar or a bar that’s a straitlaced, business type of bar.

Life is nuanced and complicated, and managing relationships is nuanced and complicated. Data is about relationships very often.

Do you think businesses are becoming more upfront in talking about privacy, more sensitive to the issue?

Savvy businesses are becoming smarter about engaging users as partners in how data is used. The businesses that talk privacy are having a hard time, because consumers don’t want to talk privacy; they want to talk about how the data is used. They don’t want to hear, “We respect your privacy.” They want, “Here is what we’re doing, and here are the choices you have, and here’s what our point is.” Straightforward. We find they’re very skeptical when people make privacy claims.

Some companies think the answer to privacy concern is more time talking about privacy and promising users privacy. And others recognize that the future is much more likely to be, “Here’s the data we have, and here’s how we’re going to try to use it to serve you or sell you. We’d like to know you better, and you can be in control of what happens.” In my view, that’s a better path, because inevitably if you promise privacy, you may end up falling short. You’re much better off being upfront, being transparent and putting the consumer in control so you can serve them and profit from that relationship, supporting their needs.

At a time when the data is in my home because of smart devices, and it’s in my pocket, and it’s in my glasses, and it’s around my wrist tracking my steps, and I have a Wi-Fi scale sending my weight to an app—at a time when companies are entering my physicality, my home, my body, my clothing, my personal space—that’s a very intimate relationship. And you will only succeed if I feel in control of what’s going on and I feel that what’s being done is to my benefit.

I don’t think users will tolerate what they’ve accepted online, where they’re not really sure if they’re being tracked—maybe they should clear their cookies, and maybe they ought to download some software that will protect them. You can’t have people nervous about whether their home devices are tracking them inappropriately. We need to make that obvious and transparent.

Do you think we’ll see a lot more best practices or codes of conduct that lay out some basic principles?

I hope so. We plan on doing a lot more of it. The Obama administration has proposed a consumer bill of rights. That consumer bill of rights is supposed to be a high-level bill, with the details filled in by self-regulatory codes of conduct.

In the APAC region, there’s been a lot of work done to show that companies can be certified as following accountable practices across the entire region. So that part of the world has made a big investment in certifying that you comply with a code of conduct that shows you meet baseline practices across many countries.

Can you talk about some of the main topics you’re looking at beyond retail analytics and smart devices?

We’re looking at education, because that’s an area where data can be used in very helpful ways to help measure how kids perform, to understand their weaknesses and where they need help, to personalize learning, to understand which schools are working—but it’s also an area where data can be concerning. Is it in the cloud? Who can access it? Are kids going to be redlined? How are MOOCs going to use their data? You can also see misuses.

We have a connected-cars working group with a number of leading auto companies and other third parties dealing with the data being used in automobiles with apps, with location.

We launched a site called www.allaboutdnt.com—all about “Do Not Track,” which tries to help users understand what’s going on in this whole debate around browsers and cookies and tracking. And we’ve been writing about the benefits and risks of Big Data.

For 2014, what are some of the main topics you think will be dominating the discussion of privacy in the consumer realm?

Wearable technology, clearly, whether it’s Google Glass, Fitbit or the Nike FuelBand—the data and the sensors that are going to be collecting data about me physically but are not subject to the health laws. Facial recognition is going to be a big debate, whether it’s stores using it to detect customers, or Facebook and Google, or government use. Big Data will continue to be a debate as large-scale data collection is put to new uses and new benefits and new risks are identified.

The NSA/Snowden issues will give rise to a focus on, what are companies doing with data? Right now the focus is on what the government is doing and what the government is getting. But the pendulum at some point will swing back. There will be increased scrutiny and calls for transparency around business use of data as well.