The Gist
- Data privacy. Marketers are among the largest generators and users of customer data in organizations.
- Data responsibility. They must, therefore, take responsibility to protect the privacy of their customers' data.
- Data risks. Despite their best efforts, even seemingly innocuous marketing activities can land them in trouble if they don't internalize the responsibility to protect data at every step.
Marketers have gotten the message loud and clear: They have to be more accountable for customer data privacy, given that they are among the organization's biggest creators and consumers of customer data. The good news is that privacy is increasingly core to business operations across verticals. Investment in privacy continues to rise, with the average enterprise privacy budget up 13% to $2.7 million in 2022.
Despite their best efforts, however, marketers are most at risk of inadvertent privacy violations because they are at the forefront of collecting, processing and activating customer data in a complex, multichannel environment.
As a marketer, are you exposing yourself to consumer data privacy violations without even realizing it? Here are 10 areas where you may be vulnerable to seemingly minor cracks in privacy compliance that may lead to major consequences for the business and customers.
1. Privacy Risks With Social Media Marketing
The EU’s General Data Protection Regulation (GDPR) and a growing number of US state privacy laws are becoming more specific about rules that can leave social media campaigns vulnerable to under-the-radar data privacy violations.
For instance, remarketing to website visitors on social media platforms often requires specific opt-ins, especially in certain geographies, industries and age groups. GDPR’s “right to be forgotten” includes erasure from external social media groups and pages as well. Marketers working with third parties for brand collaborations or influencer marketing are accountable for how partners use customer data gathered in joint social media campaigns.
All of these are potential risks that may go unnoticed until it's too late.
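As an illustration, here is a minimal Python sketch of a consent gate a team might run before syncing website visitors into a social remarketing audience. The record fields, region list and age threshold are hypothetical placeholders, not legal guidance, and a real pipeline would feed the result into the platform's own audience APIs.

```python
from dataclasses import dataclass

# Hypothetical visitor record; field names are illustrative only.
@dataclass
class Visitor:
    user_id: str
    country: str           # ISO country code captured at consent time
    age: int
    remarketing_opt_in: bool

# Illustrative list of jurisdictions treated here as requiring an explicit remarketing opt-in.
EXPLICIT_OPT_IN_REGIONS = {"DE", "FR", "IT", "ES", "NL"}
MIN_AGE = 16  # assumed conservative threshold; GDPR lets member states set 13-16

def eligible_for_remarketing(v: Visitor) -> bool:
    """Return True only if the visitor may be added to a social remarketing audience."""
    if v.age < MIN_AGE:
        return False
    if v.country in EXPLICIT_OPT_IN_REGIONS and not v.remarketing_opt_in:
        return False
    return True

def build_audience(visitors: list[Visitor]) -> list[str]:
    # Only consented, age-eligible visitors make it into the audience export.
    return [v.user_id for v in visitors if eligible_for_remarketing(v)]

sample = [
    Visitor("u1", "DE", 34, remarketing_opt_in=False),  # excluded: no explicit opt-in in DE
    Visitor("u2", "US", 29, remarketing_opt_in=False),  # included under this sketch's rules
    Visitor("u3", "FR", 15, remarketing_opt_in=True),   # excluded: under the age threshold
]
print(build_audience(sample))  # ['u2']
```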
Related Article: The State of Consumer Data Privacy Legislation in 2023
2. The Perils of Email Marketing
Email marketing often involves transferring or sharing data with third parties such as email service providers, ad tech software platforms or media agencies, and data vendors. While these transfers are necessary to improve targeting and personalization, the partners' data privacy compliance standards must be airtight.
Christine Frohlich, head of data governance at Verisk Marketing Solutions, recommends regular and thorough audits of data collection, storage and disposal across marketing and sales. While periodic data deletion and opt-in refreshes help avoid risks, not all marketers have documented data retention schedules. Worse, many teams lack the systems to ensure that customer unsubscribes, opt-downs and preference changes are immediately reflected across all email lists and systems.
Any of these seemingly minor lapses could lead to data privacy violations originating from a long-forgotten list sitting in a dusty corner of your CRM.
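Here is a minimal sketch, in Python, of the central-suppression pattern described above. The in-memory lists and function names are stand-ins; in practice each suppression call would go to the corresponding ESP or CRM API, and the audit log would live in a durable store.

```python
from datetime import datetime, timezone

# Hypothetical in-memory stand-ins for email lists held in different systems.
email_lists = {
    "newsletter": {"a@example.com", "b@example.com"},
    "product_updates": {"a@example.com", "c@example.com"},
    "win_back_2019": {"a@example.com"},  # the "long-forgotten list" problem
}

# Central suppression list with an audit trail of when each opt-out was recorded.
suppression_log: dict[str, str] = {}

def process_unsubscribe(email: str) -> None:
    """Remove the address from every list and record the opt-out once, centrally."""
    for members in email_lists.values():
        members.discard(email)  # no-op if the address isn't on a given list
    suppression_log[email] = datetime.now(timezone.utc).isoformat()

def can_email(email: str) -> bool:
    # Check the central suppression list before any send, regardless of source list.
    return email not in suppression_log

process_unsubscribe("a@example.com")
print(can_email("a@example.com"))    # False
print(email_lists["win_back_2019"])  # set() -- the forgotten list is clean too
```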
3. Privacy Unknowns in the Evolving Programmatic Ad Tech Space
The rapid shift to “server-side” ad tech, after more than two decades of mainly browser- or device-side technologies like HTTP, JavaScript, HTML, mobile SDKs and browser/device APIs, will require significant time and upskilling for marketers and agencies to master, noted Myles Younger, head of innovation and insights at U of Digital. That includes learning about the privacy implications of new server-side ad tech.
Shrinking third-party data has also pushed marketers and advertisers toward data clean rooms. They are promising because they allow brands to merge or match their first-party data with another brand's first-party data in a safe environment, one that requires neither moving the data nor exchanging personally identifiable information (PII).
But there are potential cracks that may lead to privacy violations. First, there is no guarantee that each participant's data set has been collected in a compliant manner. Next, different privacy regulations may apply to the various participating industries, jeopardizing all participants. And even with clean room interoperability, an issue the IAB Tech Lab's new technical standards aim to address, the potential to reverse engineer anonymized data back to an individual through correlation, however slim, still exists, suggests Christopher Penn of AI-powered data and analytics agency Trust Insights.
As brands try to gather first-party data from multiple channels and enrich it across sources, including data clean rooms, the points of exposure have multiplied, said Jayesh Easwaramony, founder of Spectra Global, which helps marketers and publishers convert data into revenue with first-party data platforms. Two common exposure points are form fills that happen outside digital channels, say in a retail environment, and contests where ownership of privacy obligations becomes fragmented. From a technology standpoint, he added, data teams should build strong privacy tech to support data sourcing and processing across channels and flag vulnerabilities before the data is used.
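To make the clean room idea concrete, here is a simplified Python sketch of a hashed-identifier match in which each brand contributes only one-way hashes and only an aggregate overlap is released. It is illustrative only: real clean rooms add key management, access controls and privacy thresholds well beyond a shared salt and a minimum count.

```python
import hashlib

def hash_email(email: str, salt: str = "shared-campaign-salt") -> str:
    """One-way hash so raw emails never leave either brand's environment."""
    normalized = email.strip().lower()
    return hashlib.sha256((salt + normalized).encode()).hexdigest()

# Each brand hashes its own first-party list before contributing it.
brand_a = {hash_email(e) for e in ["ann@example.com", "bob@example.com", "cam@example.com"]}
brand_b = {hash_email(e) for e in ["bob@example.com", "dee@example.com"]}

overlap = brand_a & brand_b

# Only an aggregate is released; a minimum-count threshold limits re-identification risk.
MIN_REPORTABLE = 50  # illustrative threshold
if len(overlap) >= MIN_REPORTABLE:
    print(f"Matched audience size: {len(overlap)}")
else:
    print("Overlap below reporting threshold; no result released")
```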
Related Article: Balancing Customer Data Privacy and Usefulness
4. The Use of AI in Digital Advertising and Marketing
Digital advertising and marketing at scale requires large data sets for segmenting, targeting and predictive analytics. But anytime you use AI, cautions Jake Moskowitz, founder of ad tech strategy and thought leadership consulting company Persuasion Art, there is an inherent risk of bias created by the training data.
Marketers using AI-powered targeting and measurement models should confirm the right to use consumer data for training the AI and put checks in place to ensure training doesn't produce unfair biases, such as one group being favored over another in targeting or offers.
Testing the accuracy of AI models segment by segment should be the default practice for both segmentation/targeting and measurement use cases. A simple first step, he suggested, is to require vendors to prove they have done this level of bias testing.
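Here is a minimal sketch, in Python with made-up data, of what segment-by-segment accuracy checking might look like. The segments, records and the 15-point flagging threshold are all hypothetical; the point is simply that a single blended accuracy number can hide a poorly served segment.

```python
from collections import defaultdict

# Hypothetical scored records: (segment, model_prediction, actual_outcome)
scored = [
    ("18-24", 1, 1), ("18-24", 0, 1), ("18-24", 1, 1),
    ("25-34", 1, 1), ("25-34", 1, 1), ("25-34", 0, 0),
    ("55+",   0, 1), ("55+",   0, 1), ("55+",   1, 1),
]

def accuracy_by_segment(rows):
    """Compute model accuracy separately for each segment instead of one blended number."""
    hits, totals = defaultdict(int), defaultdict(int)
    for segment, predicted, actual in rows:
        totals[segment] += 1
        hits[segment] += int(predicted == actual)
    return {seg: hits[seg] / totals[seg] for seg in totals}

per_segment = accuracy_by_segment(scored)
overall = sum(p == a for _, p, a in scored) / len(scored)

print(f"Overall accuracy: {overall:.2f}")
for segment, acc in per_segment.items():
    flag = "  <-- review for bias" if acc < overall - 0.15 else ""
    print(f"{segment}: {acc:.2f}{flag}")
```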
That said, Moskowitz added, using AI for breakthrough outcomes requires a mindset shift to reimagine what the technology can do. Rather than using it to replicate old ways, albeit more efficiently, he proposes using AI in fundamentally new ways that embrace current realities like more regulation, more customer awareness and more technology. Lookalike targeting, for instance, is a good example of where AI can be used to improve upon familiar models. But lookalike targeting requires IDs. If IDs are going away and there's more anonymous inventory in the ecosystem, we need a mindset shift to consider new use cases for AI, such as using it to find value in anonymous inventory, he concluded.
5. Confusion and Obfuscation With ‘Dark Patterns’
Nearly half (46%) of US consumers feel they cannot adequately protect their data because they don’t understand exactly what organizations are collecting and doing with their data, despite transparency and clarity of purpose being tenets of the privacy-first culture.
What makes it “hard for customers to understand”? The answer, as per the California Privacy Rights Act (CPRA), is “dark patterns” — “a user interface designed to manipulate, subvert or impair user autonomy, decision-making, or choice.”
Simply put, dark patterns include designing banners where opt-out buttons are hard to find or select, using misleading or overly technical language to exasperate users into submission, pre-selecting checkboxes that favor the company over the customer, and disguising the true purpose of the data being collected.
Privacy laws do not deem consent obtained this way as true consent. That leaves marketers who choose this route (you know who you are!) vulnerable to violations and lawsuits, especially as state laws get stricter and penalties stiffer.
6. Data Anonymization Is Not a Privacy Silver Bullet
Anonymizing, randomizing, noising or masking data for analytics in the cloud are necessary practices, but they do not guarantee compliance or security. Marketers must be cognizant that anonymized big data sets, even without PII, can still be cracked by expert data scientists (or hackers) who stitch together characteristics to identify specific data subjects.
This is a particularly vexing problem for marketers in sensitive and regulated industries like healthcare and financial services. Marketers in a hurry to gather insights may move big data sets to the cloud using external tools and vendors, but if they do so without consulting central IT and security on the right techniques and tools, they create privacy risks.
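A quick way to see the re-identification risk is to measure k-anonymity: the size of the smallest group of records that share the same combination of quasi-identifiers. The sketch below, with invented records and an illustrative threshold, shows how a data set with no names or emails can still single out individuals.

```python
from collections import Counter

# Hypothetical "anonymized" records: no names or emails, but quasi-identifiers remain.
records = [
    {"zip": "94107", "birth_year": 1984, "gender": "F", "diagnosis_code": "E11"},
    {"zip": "94107", "birth_year": 1991, "gender": "M", "diagnosis_code": "J45"},
    {"zip": "94107", "birth_year": 1984, "gender": "F", "diagnosis_code": "I10"},
    {"zip": "10001", "birth_year": 1975, "gender": "M", "diagnosis_code": "E11"},
]

QUASI_IDENTIFIERS = ("zip", "birth_year", "gender")

def k_anonymity(rows, keys):
    """Smallest group size sharing the same quasi-identifier combination.
    A value of 1 means at least one record is uniquely identifiable from those attributes alone."""
    groups = Counter(tuple(r[k] for k in keys) for r in rows)
    return min(groups.values())

k = k_anonymity(records, QUASI_IDENTIFIERS)
print(f"k-anonymity = {k}")
if k < 5:  # illustrative threshold; agree on one with security and legal teams
    print("Re-identification risk: quasi-identifier combinations single out individuals")
```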
7. DSAR at Scale
Rights similar to the GDPR’s data subject access request (DSAR) are appearing in various forms in emerging US state laws. They give customers the right to access, correct, transfer or delete their personal information from all applicable company systems, databases and geographies.
Handling DSARs manually is costly and error-prone. Marketers need systems in place to automate responses to DSARs across all applicable data sets and sources in a timely, cost-effective manner, or risk creating a compliance gap.
Unfortunately, data silos and fragmentation across systems mean not many marketers are prepared to handle DSAR requests seamlessly and within the timeline specified.
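A minimal sketch of what automating a deletion-type DSAR across systems could look like follows. The connector functions are hypothetical stand-ins for real CRM, email platform and analytics APIs; the point is the fan-out to every known system plus an auditable record of what succeeded and what did not.

```python
from datetime import datetime, timezone

# Hypothetical connectors; each would call the real system's deletion or suppression API.
def delete_from_crm(email: str) -> None:
    print(f"CRM: deleted {email}")

def delete_from_email_platform(email: str) -> None:
    print(f"Email platform: suppressed {email}")

def delete_from_analytics(email: str) -> None:
    print(f"Analytics: purged events for {email}")

CONNECTORS = [delete_from_crm, delete_from_email_platform, delete_from_analytics]

def handle_deletion_request(email: str) -> dict:
    """Fan the request out to every known system and keep an auditable record."""
    results = {}
    for connector in CONNECTORS:
        try:
            connector(email)
            results[connector.__name__] = "ok"
        except Exception as exc:  # a failed system is a compliance gap, not a silent skip
            results[connector.__name__] = f"failed: {exc}"
    results["completed_at"] = datetime.now(timezone.utc).isoformat()
    return results

print(handle_deletion_request("a@example.com"))
```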
8. Transferring Data to Partners and Vendors
Marketers are responsible for any user data they collect, through its entire lifecycle, period. That holds not just on their own digital properties but also once the data has left their premises or passed out of their direct control.
Decentralized marketing teams with complex martech and ad tech stacks can't avoid sharing data with third-party cloud-based software as a service (SaaS) vendors. But they need clarity on where vendors host data, the vendors' ability to demonstrate compliance in every applicable geography, and a clear line of sight into the path their data travels. Ask central IT and legal for a checklist of questions to ask partners about the privacy and security of data being transferred or processed.
In short, marketers can transfer the data, but not responsibility for data privacy.
9. Not Separating Data Privacy From Data Security
Though joined at the hip, privacy and security are not the same thing. Data security is about ensuring only authorized and legal access to data, whereas data privacy refers to the data owner's right to choose how their data is collected, stored, processed and even discarded. While chief information security officers (CISOs) or chief technology officers (CTOs) are often accountable for data security, research shows the chain of command to enforce data privacy policies is often diffuse, and execution is often decentralized across teams in different geographies.
To avoid the associated risks, marketers should take responsibility for consumer data privacy and engage with CISOs, CTOs and legal counsel to build a clear chain of command and ensure marketing and sales workflows don’t create inadvertent privacy vulnerabilities.
10. An Inadequate Privacy Tech Stack
Marketers responsible for the brand’s privacy will need the right tools and platforms to walk the talk.
- A consent management platform to ensure geographically applicable opt-in at each internal and external touchpoint, including social media and private social communities such as Discord or Slack.
- A robust preference management platform to power true messaging relevance and contextuality at each stage of the consumer's life.
- Access to the IAB Tech Lab's Global Privacy Platform (GPP V1), a recent framework that enables digital advertising stakeholders (advertisers, publishers and technology vendors) to leverage customer preferences, reduce the cost of managing privacy compliance and mitigate privacy risks.
- Other must-haves, according to Frohlich, include a cookie management tool with global privacy functionality; a data discovery tool such as Securiti.ai to help enterprises control complex security, privacy and compliance risks; and a data lineage tool such as Informatica to connect, unify and democratize data in service of business outcomes.
Trust, Not Data, Is the New Oil
“With data privacy laws varying by region, marketers could easily overlook minor items that cause major risks. For example, records in contact lists sent via email or Slack will not get discovered in ‘right to forget’ system scans. Not providing B2B clients with adequate data privacy options or not preparing for Global Privacy browser settings are common,” Frohlich warned.
The lesson is that consumer data privacy is evolving too fast to act on each new change in each geography for each element of your marketing mix in a fragmented manner. A proactive, top-down, privacy-first, privacy-by-design, data minimization culture just makes more sense.