Netflix Tweet: "To the 53 people who've watched A Christmas Prince every day for the past 18 days: Who hurt you?"

As marketers, we’re hungry for data. We want it to improve the flow of leads and sales. We want the CEO to look at the ROI number and feel we’ve met and exceeded the goal.

The world, however, is changing quickly. And the marketing profession is, to a growing extent, leading that charge. If we are inventing that future, we should put some thought into what we want that future to be.

Fifteen years ago the internet was flush with friendly, earnest people and businesses excited to do wicked cool stuff and share it with the world. The spammers and scammers were there selling their little blue pills, but most of us could safely ignore them. Behavioral ad targeting was considered a distasteful and somewhat shady practice, until Google bought DoubleClick in 2007.

But it’s not like that anymore. We are in a race to find out everything we can, and to use it to the greatest extent we can, to convince and convert people into customers. Which is fine, as long as we consider the longer-term trajectory of our actions. Which is fine, as long as we ask ourselves whether marketers, and the companies they represent, have an ethical obligation with regard to data and privacy.

When you saw that Netflix tweet did you laugh? I did. I also know it doesn’t mean Netflix knows who those 53 people are — but it can probably find out. We reveal so much of ourselves in seemingly innocuous transactions. What does and what should that mean to us and our profession?

Consumers Don’t Know What They’re Giving Away

My husband LOVES that he can unlock his phone with his thumbprint. There is no way I’m handing my biometric information over to Apple, despite its so-far positive reputation for defending privacy. Its position could change.

Remember when we figured out that Chrome could record everything we said near our computer? Now the iPhone X wants my face. iRobot’s Roomba now maps your house while it sucks up dust, and the company plans to sell those maps. Your auto insurance company wants you to put a tracker in your car so you can save a bit on your coverage. Amazon’s Alexa and Google Home literally invite unseen and unexamined people and algorithms into our homes to listen to our every word.

And you don’t need to have “smart” gadgets to be tracked at shocking levels of detail. Recent studies show that 79 percent of websites globally are tracking your information and sharing it with as many as 10 other companies.

Cambridge Analytica’s psychographic targeting demonstrates how companies can psychoanalyze us and craft intentionally manipulative messages at massive scale. They can use that data in unlimited ways. And if someone breaks in and steals that data, the companies holding it bear little responsibility — just ask Equifax.

Does that mean all of these companies are doing horrible things? No. It means they could. Because there is little to stop them from doing so. There are few — if any — practical limits on what companies can do with our data. There is also very little meaningful mainstream debate in our industry on what we think we should be doing. We do a little more every year, until suddenly what was considered shady is accepted with a shrug. Where is that likely to lead?

Looking Forward

We have reached a data precipice where the companies we deal with may well know more about us than we know ourselves. Our habits, interests and demographics, our physical, biological, psychological and emotional characteristics are all for sale. Our conversations are out there. Our financial information is out there. Our relationships are out there.

While the internet briefly tilted the commercial power relationship in favor of consumers, it has radically shifted back in favor of corporations (and governments). Information is power, and corporations have lots of it.

The European Union has had data and privacy protections in place for several years, and has now adopted the EU General Data Protection Regulation for all businesses offering goods and services to citizens of the EU. The GDPR has strict requirements about obtaining clear consent to collect or process any personally identifiable data and will enforce these requirements with significant fines for companies in breach of these rules.

The US, however, has no cohesive and comprehensive set of laws or regulations at the federal level. Certain things, like medical information, are covered by HIPAA — but only in specific contexts, like doctors, pharmacies and insurance companies. HIPAA is silent on any medical or health information you might reveal on Facebook, or to your fitness tracker, or through a DNA ancestry test. Many state laws attempt to prevent “deceptive” practices, and companies can get into trouble for breaking their own privacy policies, but no single law regulates data collection or use in the US. In fact, the Trump administration rolled back the FCC privacy rules that would have required customer consent for an ISP to use and share its customers’ personal information.

There’s an opportunity here to innovate and do great things for brands and for consumers and society, but the power imbalance is now so large that it invites abuse.

So, as a marketing professional, how does that make you feel? Are we in the service business or the exploitation business? Are YOU in the service or exploitation business? Where is the line? Few members of our profession want to win a race to the bottom — into an unethical marketplace where individuals have no rights or privacy. That means there is a grand opportunity for leadership on these issues, and we should be there, leading and encouraging the debate. The first step is to have the discussion.

3 Discussions to Have

We need to encourage and elevate thinking and debate about three core data issues that define the balance of power between businesses and the people they hope to serve.

Accountability

If you collect lots of information — or even just a little — should you and your company have some kind of responsibility for that information? Some accountability if it is stolen or otherwise misused? What responsibility should the companies you deal with have over that information?

Exploitation

What if you figure out how to identify people at imminent risk of heart failure based on their social media activity or typing speed? Would you use that to save their lives or to make a billion dollars? Both? Do you have the right?

What if you discovered a way to take those Roomba-mapped houses and use that information to sell each customer an additional smoke detector or cable box or Alexa unit? What if you just want to sell targeted ads for new furniture? What if the data reveals people’s vulnerabilities — do you prey on them?

What should be the rules here? Does free enterprise require an invitation to use your data? What will you, personally, within your organization, do about it? What if you want to stop seeing all of these ads? Do you have a right to use Facebook and not be bombarded by manipulative or otherwise unwelcome content?

Transparency

Do you think your customers should know what information you collect about them and how you use it? Do you think they should have a say in that relationship? Do you think that a 40-page legal terms-of-service document is a meaningful way to be open and honest with your customers? Do you think that open and honest is a thing? If you were making the rules, what would you want the rules to be? As a business person? Or as a person?

What are you, business person, willing to do to make it happen?

The New Relationship

There is a new relationship emerging between people and business, and we are the ones who will influence and build that relationship.

A key question for marketing leaders: do we really want to invest in trust and customer relationships? Or are we content to seek control instead? The EU’s new rules insist on clarity and transparency. They stop short of defining exploitation.

Our country needs a similar comprehensive, universal set of rules that all companies that interact with us online must conform to. These rules should be simple. They should not restrict innovation, but, in this wildly imbalanced environment, they should lean toward the consumer’s benefit. They should begin with absolute clarity and transparency about what data is collected and how it is used. They should go further to ensure that people are not paying a premium for essential goods and services if they opt out — like at the grocery store or for auto insurance.

In our existing regulatory vacuum and technological data race, there is a grand opportunity for marketing to take a leading role in defining the future. How can we drive both innovation and data rights?

We are the ones with the knowledge to have the debate. We are the ones with the potentially conflicting priorities. We are also the creative ones. The brand builders and the relationship builders. It is for us to lead this discussion.

Ask yourselves where you should stand on transparency, accountability and exploitation and what steps you believe your organization and our industry should take. What trade-offs, if any, are you willing to make?

The best is yet to come.