
Thirteen years ago this month, my colleague Atle Skjekkeland and I decided we needed to figure out exactly what Facebook was all about. At the time, Facebook had 58 million monthly users, almost all high school and college kids, and was in the process of opening up access to non-educational domains. And so we tentatively set up accounts, with only each other as “friends” (which seemed like a weird concept at the time), and waited to see what would happen. The resultant family drama is described here: “My Daughter is Horrified: ‘You’re on Facebook??!!!’”

Facebook and Me: What a Long, Strange Trip It's Been

As you can tell from my first three posts, I had a very rich and deep understanding of the potential of the Facebook platform and the nuances of status updates.

Screenshot: my first three Facebook status updates

It’s been a long Facebook journey for me since those initial three highly incisive status updates. I’ve thoroughly enjoyed the long-lost connections Facebook has facilitated, but my enthusiasm has been tempered by the many bumps along the way:

  • The “interesting” structure of the initial IPO, which essentially granted Mark Zuckerberg unprecedented power over the company.
  • The abandonment of the initial goal of connecting friends and family in favor of creating one of the world’s most massive marketing engines.
  • The massive scaling to over a billion global users, and the aggressive neutralizing of potential competitors like Instagram and WhatsApp along the way by simply gobbling them up.
  • The often cavalier treatment of privacy, topped off by the Cambridge Analytica fiasco, and the use of that data to disrupt the 2016 elections.
  • The 2016 BuzzFeed report showing that false news stories outperformed real news on the platform.
  • The use of the social network by Myanmar's military officials in 2018 to incite genocide against the Muslim Rohingya minority.
  • The $5 billion FTC fine for privacy violations.

After the 2016 elections, Facebook and I took a trial separation. I just couldn’t take all the incessant political screaming that seemed to substitute for conversation, or the feeling that I was being gamed — not just by Russian or Chinese bots, but by sophisticated political analytics of all stripes, foreign and domestic, Democratic and Republican alike.

Related Article: Facebook, A Case Study in Ethics

Dear Facebook, It's Not Me, It's You

I gradually drifted back into the Facebook fold in the last few years, especially to Instagram. But now, I think it’s time to formalize the divorce papers. I suppose the capper in all of this is the likelihood of continued shenanigans — from both foreign and domestic actors — in the 2020 elections and the underlying threat this represents to our democracy. And the response from on high?

  1. Facebook won't police misleading political ads and statements made on the platform.
  2. Facebook will register 4 million new voters.
  3. Facebook will allow users to turn off all social issue, electoral or political ads from candidates, Super PACs or other organizations that have the “Paid for by” political disclaimer on them.

Not exactly inspiring.

Coders have a saying for errors that turn out to have unintended positive consequences, and for the after-the-fact documentation claiming those bugs were intended all along: “It’s not a bug, it’s a feature.” I suspect the “feature” Facebook has helped introduce to our political system — ever narrower and more self-reinforcing silos of fellow believers — is not an unintended bug. It’s at the core.

In its 2019 Annual Report, Facebook notes,

We generate substantially all of our revenue from advertising. The loss of marketers, or reduction in spending by marketers, could seriously harm our business …. Marketers will not continue to do business with us, or they will reduce the budgets they are willing to commit to us, if we do not deliver ads in an effective manner, or if they do not believe that their investment in advertising with us will generate a competitive return relative to other alternatives.

If Facebook were just doing all this to sell widgets, perhaps it wouldn’t matter. But in the process of massive scaling, it has also become a primary source of news and the filter through which ideas are disseminated. And the results have not been good.

Related Article: Marketers, Data Collection and the E-Word: Ethics

Facebook's Fundamental Design Flaw

In the Pew Research Center’s American Trends Panel survey of more than 5,000 people last year, 51% said “inaccurate news” and “one-sided news” on social media are “very big problems.” Eighty-eight percent said social media companies tend to favor news organizations that “produce attention-grabbing articles” and 84% said they favor organizations “with lots of social media followers.” Only 34% said the companies favor news organizations with “high reporting standards” and only 18% said they favor those “whose coverage is politically neutral.”

Massive political segmentation, driven by advertising dollars and ever-richer layers of personal data, is not a small bug that can be fixed by reforms or reconfigured as a feature. It’s a fundamental design flaw.

Yaël Eisenstat, a visiting fellow at Cornell Tech in the Digital Life Initiative and a former elections integrity head at Facebook, CIA officer and White House adviser, offered this assessment in the Washington Post:

... it’s clear that tinkering around the margins of advertising policies won’t fix the most serious issues. The real problem is that Facebook profits partly by amplifying lies and selling dangerous targeting tools that allow political operatives to engage in a new level of information warfare. Its business model exploits our data to let advertisers aim at us, showing each of us a different version of the truth and manipulating us with hyper-customized ads — ads that as of this fall can contain blatantly false and debunked information if they’re run by a political campaign. As long as Facebook prioritizes profit over healthy discourse, it can’t avoid damaging democracy.

What now? Ideas welcome.