User-generated content can be both a marketer's dream and a marketer's nightmare. PHOTO: veeterzy

For many marketers, user-generated content (UGC) is a dream come true — and it’s not hard to understand why. One reason is obvious: Free content frees up time and dollars for other campaigns.

Another reason is its effectiveness:

  • User-generated content increases brand engagement by 28 percent.
  • Users are twice as likely to share user-generated content as content generated by a brand.
  • 50 percent of consumers trust user-generated content over content published on the brand’s own website.
  • User-generated content can significantly boost SEO.

Sounds like a win-win situation, right? Well, it can be. But it can also be your worst nightmare — if you don’t do it right.

The Legal Risks of User-Generated Content

The digital revolution completely disrupted legal issues that were thought to be settled. In 1998, Congress responded by passing the Digital Millennium Copyright Act, which strengthened some copyright protections while simultaneously providing some immunity to service providers that play only a passive role in any violations. YouTube, for example, can’t be held responsible when users post copyrighted movies — as long as it removes the content promptly upon notification. Similarly, the Communications Decency Act protects service providers from liability for the actions of users, especially when it comes to defamation.

However, there are still a lot of gray areas — and gray areas are inherently risky. Here are the six that keep me up at night:

1. Ownership and Licensing

This addresses the issue of whether a brand violates copyright law by sharing user-generated content. The answer? It depends on things like implied consent, explicit consent, licensing, hashtags and so on. The bottom line is that, just because somebody tags your brand in a photo, you don’t necessarily have permission to use that photo.

2. Privacy

If a photographer posts an image under a Creative Commons license, a brand could understandably conclude that it has all the permission it needs to use the image. But what if there are other people, especially minors, in the photo? While the photographer can choose to give up ownership rights to the photo itself, they can’t give up the privacy rights of the people in the photo. That’s up to the individuals in question — or, if they’re minors, their parents.

Chang v. Virgin Mobile provides a perfect example of this conundrum. Fifteen-year-old Alison Chang was photographed participating in a church car wash. The photographer posted the photo on Flickr. Virgin Mobile later used the photograph in an ad campaign. The company argued that the photograph had been posted under a Creative Commons license that allowed commercial use. And that was true — but a license to the photo didn’t translate into permission to use the teenager’s image, which wasn’t the photographer’s to grant.

3. Monetization

Even users who freely submit content may have second thoughts when a brand uses that content to generate a profit. A user who thought they were just participating in a hashtag campaign on Twitter might not be thrilled to find their content at the center of a huge ad campaign.

4. Harassment and Hate Speech

Theoretically, the Communications Decency Act protects brands from being held liable for content posted by users. When you dig down to the specifics, however, things get a little murkier. Does that immunity still apply if the brand is aware of the offensive content? Is a general awareness that users “might” post offensive content enough to cancel out immunity, or does it require a specific complaint and takedown request? What kind of time frame is involved? And, conversely, can a brand be subject to discrimination accusations when it takes down one user’s content but not content from another user?

5. False Statements

Sometimes, users post inaccurate statements about a brand’s competitors. Is the brand responsible? Maybe. Subway filed a lawsuit against Quiznos after the latter sponsored a contest in which users submitted “Subway vs. Quiznos” videos. The case was complicated by the fact that, while Quiznos didn’t generate the content itself, they did actively solicit it. The parties ended up settling out of court.

6. Hijacking Campaigns

There’s always the possibility that users will hijack a brand’s campaign, harming the brand’s image rather than enhancing it. McDonald’s, for instance, wanted to recognize the farmers who grow the food used in its products. The initial hashtag, #MeetTheFarmers, was innocuous. Somewhere along the line, however, the hashtag changed to #McDStories. The end result was a stream of posts about users’ experiences with McDonald’s — most of which were unflattering. McDonald’s pulled the campaign after a mere two hours, but the hashtag continued to circulate.

The bottom line is that, as valuable as user-generated content can be, it has to be weighed against the corresponding risks, which can be substantial. And, unless you want frontline employees making that decision every day, you have to create digital policies addressing the potential issues.

Creating Digital Policies to Govern UGC

The legal risks of user-generated content are high. And then there are the “soft” costs — like the amount of time employees spend discussing whether they should share a particular piece of content or take down a user’s comment. Clear, easily accessible policies about user-generated content are the best way to offset both of those risks. Here are some things to consider:

  • Should you require explicit consent before sharing a piece of content, or can consent be assumed?
  • How, and to what extent, will you moderate the content users post on your website? How often will you check? What criteria will you use to determine whether to remove a post, and how quickly will that be done?
  • Where is the line between playing an active role vs. a passive role, and do your employees understand it?

These are just a few of the questions digital policies should address regarding user-generated content. I recommend that my clients retain legal counsel for liability advice and a digital policy expert to help them weave their way through issues that aren’t quite so cut-and-dried.