What strategies can online communities use to deal with fake news? PHOTO: The Public Domain Review

The impact of "fake news" continues to make headlines as Facebook and other social networks struggle to combat this unexpected player in the election and beyond.

While the number of fake news creators is climbing, the problem is not new — and not restricted to fake news.

Internet trolls, who at times seem to dominate online communities and ruin the opportunity for genuine sharing, plague larger networks as well.

Just last month, the Internet Movie Database (IMDb) announced it was shutting down the popular message boards on its lively online community because of internet trolls. Many of the site's most loyal users wondered whether there would be any reason to keep visiting.

So how do sites that rely on these communities to keep users coming back also ensure a positive experience for those users?

By their very nature and goals, smaller private online communities are often more keenly aware of the need to control the problem before it leads to discord among constituents. Because of this, they tend to be ahead of it.

So we asked community leaders from a variety of private networks for their thoughts on the challenges of fake news, as well as any lessons they’ve gleaned from running their own communities.

From the way private communities monitor message boards and forums for accuracy to the steps they take to promote diverse opinions, are there lessons that could be applied to larger networks? Here are their thoughts.

Vanessa DiMauro, CEO of Leader Networks

DiMauro heads Leader Networks, a Belmont, Mass.-based strategic research and consulting firm. As an executive advisor, speaker and educator, she helps organizations drive top-line growth through digital strategy design and thoughtful, scalable execution, find better ways to meet and exceed customer expectations, accelerate business processes, and provide better customer experiences.

One of the many benefits of professional communities is that fake news is less of a problem, thanks to the nature of the discourse and the traceability of community members, who tend to shy away from political or general-interest topics.

So you can take that and apply the same good judgment and rules to larger communities: know your source of information, rely on trusted outlets, question the information, and if you see something, say something. Now, more than ever, the accountability around information is shared, and community members, whether on a small neighborhood listserv or Facebook, need to play a role in stamping out misinformation.

Fake news can slip through the finest of editorial review processes, and as community members, we all share a responsibility for what information we spread.

Lindsay Starke, Senior Community Manager at Higher Logic

As senior community manager at Higher Logic, Lindsay Starke serves as the company's resident tactical community expert. Her specialty is bridging gaps and providing clients with engaging content and strategic initiatives that help them to achieve their digital goals while maintaining excellent user experience and customer service.

Online communities, just like the communities our ancestors formed to hunt mammoths millennia ago, are governed by a social contract and behavioral norms. These rules are reinforced, challenged and adapted by the members of that community — for better and for worse.

In traditional, top-down forms of communication, editors and other media arbiters can determine what is and is not considered true, and this information filters down to consumers, with some limited flow back up (comments sections usually do not inform editorial decisions).

With the advent of online communities and other digital communication technology, news has become more democratic. With the ability to transmit information globally carried, quite literally, in the palms of our hands, “news” can be created and disseminated by anyone, as the post-election stories about Facebook prove.

Of course, with collective voice also comes collective intelligence. We remind ourselves not to use Wikipedia as a primary source, but many of us still go there as a first source.

The social contract among Wikipedia editors helps govern the site and maintain a general level of agreement on the veracity of its content. So too with other online communities: as participants model behavior and norms to others, the group begins to benefit from the wisdom of crowds.

We become our own quality control.

Online communities are not free from fake news (Reddit has quite a few subreddits dedicated to reinforcing truly foul stories), but we have processes for managing them. To paraphrase the most democratic of superheroes: with distributed power comes distributed responsibility.
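One common form of "distributed responsibility" is threshold-based flagging: a post reported by enough distinct members is hidden pending moderator review. The sketch below is a hypothetical illustration of that idea, not any particular platform's implementation; the class names and the threshold value are invented for the example.

```python
# Illustrative sketch of community-driven flagging: once enough distinct
# members report a post, it is hidden pending moderator review.
# FLAG_THRESHOLD is arbitrary here; real platforms often weight flags
# by the reputation of the member doing the flagging.

FLAG_THRESHOLD = 3

class Post:
    def __init__(self, body: str):
        self.body = body
        self.flaggers: set[str] = set()  # distinct users who reported it

    def flag(self, username: str) -> None:
        # A set ensures one member cannot hide a post by flagging repeatedly.
        self.flaggers.add(username)

    @property
    def hidden(self) -> bool:
        return len(self.flaggers) >= FLAG_THRESHOLD
```

Because flags are counted per distinct member, a single determined user cannot censor content alone — the decision stays with the crowd.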

David Spinks, CEO of CMX Hub

David Spinks is the founder of CMX, the hub for the community industry. Forever fascinated by how technology can bring people together, he's been studying and building online communities since he was 13 years old. Before founding CMX, Spinks cofounded Feast (acquired) and BlogDash, and has built communities for Zaarly, LeWeb, and Udemy.

In thinking about this, you have to first acknowledge that what works for smaller communities won’t necessarily scale to the size of large social networks.

That said, there are things that could work towards a solution. One of the things that works in smaller communities is reputation, a critical part of any community platform.

In larger networks, you could give a reputation score to every account. This would motivate users to create and share only good-quality, accurate content.

For people who post fake content on a regular basis, their reputation would take a hit. For those who contribute quality content and work to flag fake content, their reputation would improve.
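Spinks's reputation idea can be sketched in a few lines. Everything here — the action names, the point values, the trust threshold — is a hypothetical illustration, assuming some upstream process has already classified content as accurate or fake:

```python
# Minimal sketch of an account-reputation score: accurate contributions
# and helpful flagging raise the score, posting fake content lowers it.
# The weights and threshold are invented for illustration.

from dataclasses import dataclass, field

@dataclass
class Account:
    name: str
    reputation: int = 0
    history: list = field(default_factory=list)

def record_action(account: Account, action: str) -> None:
    """Adjust reputation based on a community-verified action."""
    deltas = {
        "posted_accurate": +2,  # content verified as accurate by peers
        "flagged_fake": +1,     # helped identify misinformation
        "posted_fake": -5,      # content confirmed fake; steep penalty
    }
    account.reputation += deltas.get(action, 0)
    account.history.append(action)

def is_trusted(account: Account, threshold: int = 5) -> bool:
    """Above the threshold, content might get wider reach; below, limits."""
    return account.reputation >= threshold
```

Note the asymmetry: one confirmed fake post costs more than two accurate ones earn, so habitual offenders sink quickly while occasional mistakes are recoverable.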

Richard Millington, Founder and Managing Director, FeverBee

Richard Millington is the founder of FeverBee, author of Buzzing Communities, and a frequent speaker at online community events around the world. Richard's unique focus is on cracking the 'social code' behind successful social groups. FeverBee's approach combines cutting-edge social psychology with advanced data insights and a library of repeatable case studies.

Social media platforms shouldn’t be surprised by the problems they’re facing today. Most of the problems are as old as online communities themselves.

The impact may seem bigger today (e.g. accusations of impacting election results), but the problems (community members spreading false news) are very familiar to those of us who manage online communities. Resolve the problems and you remove the negative impact.

The most common problems are spam, abuse, privacy attacks, declining trust, radicalization, participation inequality, lack of good filtering, growing demands upon technology, and finding good managers for communities.

Of course the bigger any group becomes, the more it will encounter these problems. Not because they're doing anything wrong — they're simply bigger.

Every platform has just three tools to resolve these problems. It can change the technology (how the community is hosted), the processes (how it is managed), or the people (who manages it and who is allowed in). The magic is in how you configure those resources.

Every problem is resolved by placing some form of restrictions upon members. You usually restrict what members can do or what members can see. You might begin with a utopian vision to let members do what they want, but that will rapidly become a 4chan-style cesspit of illegal and immoral activities.

The solution is deciding where your community is going to sit on that continuum and applying this consistently. If you think members should be allowed to share information which isn't true (i.e. you believe individual liberty trumps accurate group knowledge), that's fine. But communicate that often and consistently.

Communicate the benefits of that, too. If you don't believe members should be allowed to share false information, then you need to say so and determine the process: who decides what's fake, what the criteria are, and how it can scale.

The social media platforms today seem to be reacting to problems haphazardly instead of proactively establishing their values and where they have decided to be on the continuum. Not everyone will agree with your place on the continuum, but they shouldn't be surprised by the decisions you make based upon it.