Most of what’s been written about big data and data analytics -- and there’s been a lot written -- accentuates the positives and the possibilities. It highlights the ability to use insights gleaned from data to make faster, smarter business decisions. It talks about how companies can use big data to drive the development of new and improved products and services capable of improving life for customers. It examines the myriad ways data analysis can be used to improve the quality and delivery of healthcare, facilitate a better learning experience for students, and help the world proactively prepare for disasters.
None of this is untrue. Big data analytics can indeed be immensely powerful. But as the saying goes, with great power comes great responsibility. As we turn the calendar to October, a month famous for the scary, there’s no better time to talk about the dark side of big data. If not handled properly, ethically and procedurally, big data can get pretty creepy, pretty quickly.
Perhaps the most important thing to understand about the dark side of big data is that creepiness is in the eye of the consumer, not the company. If your customers find your use of data to be creepy, well, your use of data is creepy. Simple as that.
So how do we implement rules or guidelines to avoid creepy behavior? Computer algorithms, machine learning and analytics can’t determine what is creepy -- at least not yet. That means that as a company, it’s your responsibility to ensure that your use of big data doesn’t cross the line.
With that in mind, let’s take a look at some of the over-the-line scenarios you’ll most want to avoid.
The 'Stalker' Experience
Everyone knows that their data is being collected, analyzed and used, and to an extent, they’re “OK” with it. Emphasis on to an extent. Receiving the occasional special offer, coupon or purchase suggestion is one thing. But when a company and its offers seem to follow you everywhere you go, and seem to track your every online movement, the line has likely been crossed.
Again, there’s no actual, physical, tangible “line,” so to speak -- only the line that exists in the minds of your customers. If they think it’s been crossed, then it’s been crossed. I’ve had people tell me they would seriously consider paying money if it meant eliminating a company’s advertising from their life for good. That’s not the kind of experience you want to create for your customers, it’s not the kind of reputation you want for your company, and most importantly, it’s not the way you win new business.
The 'In Poor Taste' Experience
Targeted marketing based on customer behaviors and online activity is a reality. It’s not going away, and the consumer public doesn’t expect it to go away. But they do expect it to be done in good taste.
Let’s take the example of a person who has just lost a family member. He runs an online search trying to find a nearby funeral home. The next thing he knows, he’s bombarded by offers pertaining to caskets, gravestones, floral arrangements and more. Yes, those are things he might need. But they’re also constant reminders of the painful fact that he just lost a loved one, and he’d have every right to feel as though these companies are targeting him in poor taste, with no regard for the grief he’s experiencing.
Again, there may be nothing wrong with the marketing execution itself, but is this the type of experience you want to create for your customers?
The 'Outright Misuse of Data' Experience
The dark side of big data is clearly at its darkest when companies flagrantly misuse data. Misuse of data takes many forms, but it can be generally understood as any instance in which an organization uses data for a reason other than the one for which it was disclosed. There are obvious instances of misuse, such as an insurance company that collects blood pressure data for an industry study and then uses that same data to raise the premiums of at-risk customers.
But there are other not so obvious misuses, such as when data disclosed for a specific reason is combined with other data -- be it public data, or private data disclosed to a completely separate entity for a completely different reason -- to reveal something that an individual had no intention of disclosing.
A major retailer ran into this very issue: combining disparate pieces of data revealed that a given customer was pregnant, even though that customer never intended to disclose her pregnancy and never expressed a desire to receive offers for pregnancy-related products.
The Golden Rule: Do No Harm
The challenge organizations face in ensuring that data usage never crosses the “line” is that there’s no clear blueprint to follow. Few written standards and guidelines exist. Few best practices applicable across verticals have been outlined. At the end of the day, though, the customer is always right, which means the onus is on the company.
It’s the company’s responsibility to ensure data usage always remains moral and ethical, and never crosses those creepy lines. In other words, it’s the company’s responsibility to ensure that no harm is done. Consider that the golden rule of big data. Instilling that mindset throughout your organization is the first step on the way to an ethical big data practice.
From there, be proactive and take the lead. No formal company policy exists policing the use of big data? Write one. No moral code or set of ethical standards exists outlining your company’s expectations with respect to the use of customer data? Draft one up. Make it clear that there are standards your company expects its employees to uphold. Make it clear that use of data for purposes other than those expressly consented to by the discloser will not be tolerated. Make it clear that you expect no harm to be done.
Most of the dialogue around big data analytics revolves around its vast potential to change the world for the better. It’s incumbent on us, the users and keepers of big data, to keep it that way.