Facebook has helped sow doubts about COVID-19 vaccines

Mar 17, 2021

If you were running the most influential social media platform in the world and small groups on your website kept demonstrating an outsized ability to warp attitudes about Covid-19 and other pivotal issues, how would you respond?

A) Study the behavior.

B) Ban the behavior.

C) Ignore the behavior.

D) Count your money.

Mark Zuckerberg, the founder and steward of Facebook Inc., has generally opted for a mix of A, C and D. Too often, he has avoided choosing B. The latest test of whether he might chart a new course arrives courtesy of Washington Post reporter Elizabeth Dwoskin, who got her hands on internal Facebook documents showing that the company has been studying what shapes its users’ doubts about Covid-19 vaccines in the U.S.

Facebook divided its American users into 638 segments of various sizes and discovered that just 10 of those segments produced half of all the “vaccine hesitancy” content on its platform. Within the segments whose users were most resistant to shots, half the anti-vaccine content came from just several dozen Facebook members. In other words, a tiny fraction of Facebook’s users managed to seed widespread anxiety about vaccines, thanks to the ease with which some ideas, no matter how bonkers or dangerous, go viral.

Moreover, as Dwoskin noted, Facebook’s research found “early evidence of significant overlap between communities that are skeptical of vaccines and those affiliated with QAnon, a sprawling set of baseless claims that has radicalized its followers and been associated with violent crimes.”

The Federal Bureau of Investigation considers QAnon a domestic terrorism threat. Its adherents have a record of fomenting Covid-19 denialism and myths about the 2020 presidential election being rigged. As Dwoskin has reported previously, QAnon acolytes also have diligently spread a host of ludicrous ideas about vaccines — including that they’re bioweapons meant to alter people’s genetic structure so the planet can be depopulated and controlled by governments in league with pharmaceutical companies.

But Facebook’s research also revealed that vaccine skepticism wasn’t limited to QAnon supporters. Doubt had spread to many other Facebook communities unaffiliated with the movement. All of which poses new challenges for Zuckerberg, who has long asserted that Facebook is a technological platform, not a publisher, and should largely operate as a freewheeling buffet for myriad voices and perspectives.

Zuckerberg has set some limits around what Facebook deems acceptable. A few years ago the company issued its moderators a list of “community standards,” including definitions of hate speech, violent rhetoric, sexual exploitation and other content that the site would not allow. Early last year, it banned misinformation about the coronavirus. In December, it prohibited false or misleading statements about vaccines. On Jan. 7, a day after Donald Trump’s followers attacked the Capitol, Zuckerberg banned the former president from Facebook to prevent him from using the site “to provoke further violence.”

The Trump ban followed years of criticism that Zuckerberg had allowed the ex-president to run rampant on the site, and months of pressure from civil rights groups and Facebook’s own employees to take action in the wake of the George Floyd protests. Although he cracked down on Trump, Zuckerberg took no action against Steve Bannon, a former Trump adviser, after Bannon called for the beheading of two U.S. officials late last year. The site also famously served as a playground for Russian disinformation campaigns during the 2016 election.

“We have specific rules around how many times you need to violate certain policies before we will deactivate your account completely,” Zuckerberg told Facebook employees after he decided not to shut down Bannon. “While the offenses here, I think, came close to crossing that line, they clearly did not cross the line.”

At best, Zuckerberg has been inconsistent in his efforts to limit abuses. At worst, he’s turned a blind eye to glaring problems, allowing abuses to flourish. That’s not only because he has to navigate nuances about what constitutes harmful actions and language on the platform. It’s also because Facebook’s business model thrives on the very tribalism and polarization that encourage abuses and disinformation to take hold. “Everything the company does and chooses not to do flows from a single motivation: Zuckerberg’s relentless desire for growth,” noted a recent article in MIT Technology Review that examined why Facebook has been unwilling to more aggressively curtail hate speech and lies on its platforms.

In vaccine hesitancy, Zuckerberg faces the latest version of this conundrum. Social media has become a go-to resource for people looking for reliable information about health care, and because Facebook controls four of the world’s top five platforms — with an audience of more than two billion users — its decisions influence the effectiveness of public health campaigns.

Some of the anti-vaccination messaging Facebook has studied doesn’t violate its rules governing incendiary language. Much of the content is framed as doubt or concern about vaccines rather than presented as flagrant misinformation. But the documents the Washington Post obtained also indicate that Facebook recognizes the messaging is still a problem. “We’re concerned that harm from non-violating content may be substantial,” the documents said.

Common sense may rule the day here. Despite the garbage fires being ignited on Facebook, enough people may want to get vaccinated against Covid-19 that the general population will be protected from the naysayers.

The realities confronting Zuckerberg aren’t going away, however. As this episode should remind him, Facebook isn’t merely a self-sustaining machine that gives the world a place to interact, shop, scheme and disrupt. Like the newspapers it has eclipsed, Facebook is also a gatekeeper.