Facebook CEO Mark Zuckerberg meets European Parliament President Antonio Tajani at the European Parliament PHOTO: Shutterstock

It feels like every week a news item emerges that could serve as a case study in ethics. A company's poor decision, once exposed to the light of day by the press, looks shockingly bad. The ethical choice in most cases should have been obvious, but it clearly wasn't the one made.

This week, as in many weeks in 2018, the case study comes from Facebook. Facebook collects a lot of data. It has an impressive social graph for its members. It can analyze communication patterns and determine our moods, the strengths of our relationships, and our tolerance for our crazy uncle. Most, if not all, of Facebook's ethical lapses involve its handling of this data. The most recent one involved allowing other large firms access to that data.

With Great Power …

Large technology firms, such as Facebook, Apple and Google, hold a lot of information about everyone. Even if you don't use their products, your appearance in someone's contact list reveals a lot about you. This information is what they bundle up and use to sell products.

When done right, they sell targeted advertising and access without revealing your deepest secrets. When done wrong, they end up selling you.

Your firm is likely not a large firm, but you still want to know more about your customers. You probably don’t have a social graph, but you know which ads and emails have caught their eye and which parts of your site they frequent. Depending on the size and breadth of your marketing effort, you may know quite a lot about a person.

What should you do with that information?

Related Article: Marketers, Data Collection and the E-Word: Ethics

… Comes Great Responsibility

The first answer is to protect it. The more data you collect, the more you need to secure it. Last month I wrote about balancing security with forward progress. If your forward progress involves collecting more data, then it should also involve better protection for that data.

When it comes to collecting data, remember you are trying to achieve a goal. The raw data is not the goal. If you sell toys, you are trying to determine what kind of toys the customer might want today and in the future. A customer profile may outline what type of kids (e.g. boy vs. girl, active vs. sedentary) he buys for and how many kids he has. That itself can be a challenge, as he could be buying for multiple kids or one very eclectic child.

What needs to be stored is the result. Not every action needs to be recorded in perpetuity; each action sways the profile. Sure, knowing he looked at a product without buying it has value for a month as you try to close the deal. Six months later, however, all you need to know is that he was looking at toys for a 5-year-old girl who may now be 6.
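One way to picture this "store the result, not the raw data" idea is a routine that folds each event into a summary profile and then discards any event older than a retention window. This is a minimal sketch; the field names and the 30-day window are illustrative assumptions, not anything a particular retailer has published:

```python
from datetime import datetime, timedelta

# Illustrative retention window: keep raw events only while they are
# recent enough to act on, e.g. to close a pending sale.
RETENTION = timedelta(days=30)

def fold_events_into_profile(profile, events, now):
    """Fold raw events into a stored summary, then drop stale events.

    The summary (interest tags and counts) is the result worth keeping;
    the raw event log is only held for the retention window.
    """
    fresh = []
    for ev in events:
        tag = ev["tag"]  # e.g. "toys-age-5"
        profile["interests"][tag] = profile["interests"].get(tag, 0) + 1
        if now - ev["at"] <= RETENTION:
            fresh.append(ev)
    return profile, fresh
```

The profile keeps the durable insight ("shops for a 5-year-old") while the raw clickstream ages out, which is the aggregate-only retention the article argues for.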

Related Article: How Much Information Security Is Enough?

Using Data Smartly

Consider Amazon. I have no insight into how it manages its customer data beyond what is publicly available. All I know is that 1) it tends not to make headlines for failing to protect data and 2) it can reliably tell me what I've purchased.

I’ve been shopping on Amazon for quite some time. I can go back and look at my three orders from 1999. They were all books. I'm good with Amazon retaining that history. After all, if I want to peruse my past orders, I should be able to see them.

But what else does it do with my data?

It most likely has built a general picture of who I am and what my hobbies are based on my orders and browsing history. This can be tricky as sometimes I buy gifts for people who don’t share my interests. Still, Amazon can work up a clear picture as gifts are often identified as such during checkout. It can also strike anomalous items bought during the holidays. This creates an evolving picture that updates over time to target me with ads.

Additionally, every purchase should go into its analytics tools showing some guy in Virginia bought these items. It can perform analysis around those in my demographic group to determine trends, both short and long-term. It doesn't need to know more than my zip code and my estimated family make-up for these calculations because it is looking at trends. It can then take these results and use them to target specific demographic groups.

Amazon can support the targeting of ads from retail partners without revealing my data. Partners should make the request based on filters and then be shown an estimate of how many people may be reached and the cost. If the target audience is too small or large, the filters could be adjusted before committing to the ad buy.
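The mechanics of that kind of filter-based estimate can be sketched in a few lines of Python. Everything here is an illustrative assumption (the field names, the minimum-audience floor), not a description of any real ad platform's API; the point is that the partner only ever receives a count, never the records:

```python
# Illustrative floor: refuse to quote audiences so small that a very
# narrow filter could be used to confirm facts about a specific person.
MIN_AUDIENCE = 1000

def estimate_audience(customers, filters):
    """Return only the size of the matching audience, never the records.

    `filters` maps field names to required values, e.g. {"zip": "22180"}.
    """
    count = sum(
        all(c.get(field) == value for field, value in filters.items())
        for c in customers
    )
    return count if count >= MIN_AUDIENCE else 0
```

A partner can tighten or loosen the filters and watch the estimate change before committing to the buy, yet at no point does any individual record cross the boundary. Facebook's failure was exactly that boundary: it handed over the records themselves.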

At no point should the partner have access to the data. It should never know I was targeted unless I click on the ad. Facebook failed in this aspect. It allowed direct access to the data. It also appears to have failed in another area, by falsely reporting the number of people its ads reached — but that is an ethics lesson from earlier this year.

Related Article: Worried About the Amazon Effect? Let Data Show You the Way

Do the Right Thing

Every market is different. Car dealers will need to understand my buying behavior over a longer period of time than a toy store. The same principles hold though:

  • The more information you have, the greater your responsibility to protect it.
  • Behavioral information has an expiration date and should not be stored indefinitely except in aggregate.
  • At no point should any specific information be shared with others unless it is clearly agreed to by your customers.
  • Customers are permitted to change their minds.

The last two tie into the privacy regulations we've seen emerge in multiple countries and regions. Unfortunately, that kind of informed consent doesn't seem to exist as a functional reality at Facebook. Burying a sentence in a several-thousand-word terms of service doesn't count, especially when the revised version is presented years after people joined the service.

Facebook's actions were an ethical failure on many levels. The Golden Rule tells us to treat others as we would want to be treated ourselves. Facebook failed to observe this rule. It never put itself in its users' shoes and considered how it would feel to have its own data shared. And while Mark Zuckerberg received a brief taste of how that might feel during the congressional hearings, was it enough to change course?