A Blog by Jonathan Low


Apr 6, 2021

How Facebook Built the Perfect Platform For Covid Conspiracies

Engagement through groups reinforced misinformation and conspiracy theory spread. 

And also generated more advertising income for Facebook. JL 

Sarah Frier and Sarah Kopit report in Bloomberg BusinessWeek:

At the heart of Facebook’s misinformation is the design of Facebook itself. To build community meant highlighting Groups, so people would bond over shared interests. This change also helped Facebook boost revenue by pushing more content into people’s feeds. Now anything shared in a group they belong to ends up in their feed, creating more space for ads. Facebook’s algorithm recommended groups to its users based on what people with similar interests tended to like. Facebook might suggest one about vaccine harm. And so users found communities, which propelled them deeper into misinformation rabbit holes.

Kaleese Williams had mostly stayed off Facebook and Instagram before Covid-19 hit. But during the lockdown, the 37-year-old was stuck on her north Texas farm with her husband, their 3-year-old, and their chickens and goats. She was also cut off from a source of income. Williams sells essential oils for a multilevel marketing company in Utah called Young Living. She’d normally set up booths at conferences and other events, making a little money while socializing with passersby. “Quarantine is not a whole lot of fun,” Williams says. “So I started thinking, ‘What would be so wrong with me sharing on social media?’ ” Her plan was to take her essential oils business on Instagram, where she could sell to people she met there.

Williams decided to splurge on an online course called Ready Set Gram Pro. It promised to help her build a “highly engaged” community on the photo sharing app that would “generate consistent leads and sales.” By watching web tutorials and participating in Zoom sessions, she learned tricks to attract potential customers to her profile—for instance, by commenting on the posts of popular wellness influencers.

As she built her following to more than 1,000 users, she became engrossed by Instagram, especially the parts of the app dedicated to natural living. Williams was already averse to traditional medicine after feeling bullied during a bout of cancer in 2017, during which she says her doctor failed to disclose that a treatment she underwent could cause infertility. Now she was spending more and more time consuming information about different forms of alternative medicine such as naturopathy and functional medicine.

That’s where she first started to read about the Covid-19 vaccine. She came across posts based on unfounded rumors that claimed the Pfizer Inc. and Moderna Inc. shots were toxic, caused adverse reactions, and might have infertility risks. Before long she became convinced that the Food and Drug Administration-approved vaccines, which have few side effects and are almost entirely effective at preventing hospitalization or death from Covid, were not for her. “It’s scary,” she says. “I believe in the immune system. I do not believe in vaccine-induced herd immunity.”

You’d think during the worst pandemic in a century virtually everyone would be desperate to get their hands on a vaccine that promises to help them get their life back. But you’d be underestimating the power of Facebook and Instagram to provide all the necessary tools for anti-vaccine activists and other wellness hucksters to suck in converts. Over the years, these opportunists have cultivated a strategy optimized for the social era. They drip anti-science skepticism into Facebook groups and Instagram stories and posts, where algorithms reward content that elicits strong emotional reactions, further amplifying the misinformation.

These social media influencers, legitimized by their sizable follower counts, had a full year to sow doubt about Covid vaccines before Facebook took significant action. They’ve exploited public confusion and mixed messaging from government and health officials on everything from masks to vaccine side effects and safety. Facebook’s official stance is that it doesn’t ban posts unless they “cause imminent harm”—a threshold the social network claims vaccine misinformation only crossed months into a global inoculation campaign.

Even as hesitancy persists and anti-vaccine lies continue to circulate online, Chief Executive Officer Mark Zuckerberg staunchly defends Facebook Inc.’s actions. His critics argue the company still hasn’t done enough. “The content that your websites are still promoting, still recommending, and still sharing is one of the biggest reasons people are refusing the vaccine,” Pennsylvania Representative Mike Doyle, a Democrat, said at a March 25 congressional hearing with Zuckerberg and his fellow social media CEOs. “And things haven’t changed.”

In October 2020 a group of wellness gurus with social media followings in the millions gathered virtually to discuss a historic opportunity. The world was months away from the Covid-19 immunization effort, with several vaccine makers signaling that they’d soon be seeking emergency use authorization from the FDA. These vaccine skeptics saw an opening to push a counternarrative.

In a series of discussions with the vibe of a sales conference, the speakers talked up the promise of the coming months. “All of the truths that we’ve been trying to broadcast for many, many years, there are people hearing it,” said Robert F. Kennedy Jr., son of RFK, and a leading vaccine conspiracy theorist. “Those seeds are landing on very fertile ground.”

For years, activists—some with medical credentials, some with none at all—had attracted followings, especially among moms of small children, by claiming, falsely, that routine measles and mumps shots can cause autism and other maladies. Although the vast majority of Americans have ignored this and continue to get their inoculations, measles, which the U.S. Centers for Disease Control and Prevention declared eradicated from the U.S. two decades ago, has made a comeback in recent years. Even slight decreases in vaccination rates can chip away at the herd immunity needed to keep certain viruses at bay, and in 2019, the U.S. saw a 300% increase in measles cases. Among the outbreaks’ causes: “misinformation in the communities about safety” of the shots, according to the CDC.

With Covid, adults, not kids, were the first to be eligible for the shot. Still, vaccine skeptics targeted a group whose fears they knew well: young women. Last fall the groups started circulating on Facebook and Instagram a now-deleted blog post of unknown origin citing two doctors with an incorrect but frightening headline: “Head of Pfizer Research: Covid Vaccine Is Female Sterilization.” It falsely claimed that the vaccine contained a spike protein that could block the creation of a placenta and make women infertile.

This claim was false, but it compounded real uncertainty. Pfizer and Moderna hadn’t yet specifically tested their vaccines on pregnant or breastfeeding women, and the FDA’s emergency use authorization doesn’t cover pregnancy. The guidance from the American College of Obstetricians and Gynecologists only goes as far as saying that “vaccines should not be withheld from pregnant or lactating individuals.” Even so, by February, more than 30,000 pregnant women had signed up for a U.S. government monitoring program after getting Covid shots, and so far there have been no red flags. More recent studies have found the vaccines are not only effective on pregnant women, but they also pass antibodies on to their newborns. And because pregnant women are at greater risk of dying of Covid, many doctors are recommending they get the shot anyway. “Women are confused,” says Lori Metz, a licensed clinical social worker in New York who specializes in fertility. “Your doctor may say one thing, and then you read a blog that starts to pull on all these other fears.”

This gray area was fertile terrain for anti-vaccine activists. In December, Del Bigtree, founder of the Informed Consent Action Network, shared the false post about sterilization with hundreds of thousands of his followers on Facebook and Instagram. The blog was subsequently shared on Facebook more than 25,000 times. “I am seeing this EVERYWHERE!” a woman named Emily wrote along with a screenshot of the fake Pfizer blog, which can still be found circulating on Facebook in multiple languages. “I am starting to believe this.” Commenters flooded her post with more (completely false) “evidence” backing up the claim. Some said it proved another debunked conspiracy: that the Covid vaccine is part of the worldwide Bill Gates-funded depopulation effort.

The effects of this disinformation are already showing up in survey data. Of people who say they are not likely to get a vaccine, more than half of women in the U.S. are concerned about side effects, compared with 44% of men, according to a U.S. Census Bureau survey from Mar. 3 to Mar. 15. Many already-eligible women are turning down the shot, according to surveys and interviews with over a dozen people. More than a third of nurses, a group that skews heavily female and was among those first offered shots, aren’t confident the Covid-19 vaccine is safe and effective, according to the American Nurses Foundation. And a Washington Post-Kaiser Family Foundation poll in March found 18% of health-care workers don’t plan on getting vaccinated.

The high rates of refusal and hesitancy among health-care workers are an alarming bellwether. The vaccines are largely considered the country’s best shot at ending a pandemic that has killed more than half a million Americans and has caused a global financial crisis. Epidemiologists estimate vaccinating 70% to 85% of the U.S. population would trigger the herd immunity needed for a return to normalcy. If even high-risk front-line workers who’ve seen the devastation of Covid firsthand don’t want the shots, experts worry not enough of the general population will get theirs either, allowing the virus to continue circulating.

Up against the internet memes and anecdotes, Pfizer has offered scientific jargon. “It has been incorrectly suggested that Covid-19 vaccines will cause infertility because of a very short amino acid sequence in the spike protein of SARS-CoV-2 virus that is shared with the placental protein, syncytin-1,” it said in response to a news report on the rumor. “The sequence, however, is too short—four shared amino acids—to plausibly give rise to autoimmunity.” It was perfectly accurate, but the misinformation was infinitely more shareable, says Karen Kornbluh, director of the German Marshall Fund of the United States Digital Innovation and Democracy Initiative. “The folks who are supporting the science have to get better at telling the story,” she says.

In the first months of the Covid pandemic, Zuckerberg went through the motions of positioning himself, and by extension Facebook, as a source of good, science-based information. He hosted Anthony Fauci, the country’s top infectious disease official, several times for live question-and-answer sessions and had his company develop a Covid-19 page with information on social distancing, testing, and masks.

As for all that vaccine misinformation circulating on his platform, Zuckerberg as recently as September said he didn’t think it was appropriate for his company to take most of it down. “If someone is pointing out a case where a vaccine caused harm or that they’re worried about it—you know, that’s a difficult thing to say from my perspective that you shouldn’t be allowed to express at all,” he told news site Axios. “There is a fine line between an important level of high energy around an important issue and something that can kind of tilt over into causing harm.”

By Feb. 8, almost two months after vaccinations began in the U.S. and a year since the Covid-19 crisis started, Zuckerberg reversed himself and decided that misinformation was, in fact, causing harm. By that point, online skepticism was manifesting in real-world decisions to not take the shot. Facebook declared that Instagram accounts and Facebook groups that repeatedly shared false information about vaccines would be banned and that those advocating against vaccines would become less prominent in search results—a move that critics had urged for years. Kennedy, Bigtree, and other big names lost their access to Facebook. But many others didn’t. In a test search for “vaccine” on Instagram a few days after the announcement, the majority of the top 20 accounts offered up by the platform were explicitly vaccine-skeptical. The sixth in the list was called @antivaxxknowthefacts (profile: “Inject Veggies Not Vaccines”). The eighth was called @anti.vaccine. The 12th, listed right before @covidvaccineinjury and @anti_vaccine_4_life, was @vaccinefreedom, the account of the 54,000-follower National Vaccine Information Center, the same group that sold tickets to the October conference where Kennedy spoke.

Facebook told Bloomberg Businessweek that since February it’s removed 2 million pieces of anti-vaccine content that violated its policies. But by then many of these conspiracies had already made their way to people such as Williams on her north Texas farm, and they continue to circulate in ways Facebook’s automated cleanup tools can’t as easily find, such as via screenshots, in comments, and in group messages.

Even worse, misinformation seeded by vaccine skeptics on Facebook was spreading in the real world—even into nursing schools. A nursing student from Houston says her clinical professor proudly declared to the class that she wasn’t getting the vaccine. The person administering vaccines on campus was abstaining, too. One 28-year-old nurse in Southfield, Mich., says she declined her first opportunity for the vaccine because she’s trying to have a baby. She’d seen the claims online, and “even if you’re in the medical field,” she says, “you just don’t know.”

Monika Bickert, a content policy executive at Facebook, defended the timing of the move on a call with reporters in February. She said the World Health Organization, the CDC, and other public health experts advised the social network to take stronger action, because misinformation was convincing people not to get the shot. That was the proof Facebook needed that content on its site was causing “real world harm.” This was the same standard the social network used during the 2020 U.S. presidential election, when it refused to deactivate #StopTheSteal groups that were spreading the lie that President Donald Trump had won the election. Instead, Facebook stuck a note on their posts that Joe Biden had really won the race. On Jan. 6, violent rioters stormed the U.S. Capitol—an attack partially planned on the social network. Only then did Facebook start banning #StopTheSteal groups and suspend Trump’s account.

Facebook’s approach of allowing false content about vaccines to stay online and merely fact-checking it may be even less effectual than its attempts to rein in political misinformation and incitement. People who have already bought into conspiracies are unlikely to be swayed by a misinformation label, and the labels are easily ignored. Instagram users now see a pop-up when they search “vaccine” asking if they really want to see the results. It’s readily dismissed with a tap.

At the heart of Facebook’s misinformation problems is the design of Facebook itself. After the 2016 election, as a response to criticism of Facebook’s growing role in political polarization, the company made a crucial change to the way its platform worked. Zuckerberg announced a new mission statement: To build community and bring the world closer together. That meant highlighting Groups, so people would bond over shared interests, instead of bickering about news. This change also helped Facebook boost revenue, by pushing more content into people’s feeds. Before, users mostly saw what their friends or friends of friends posted; now anything shared in a group they belong to ends up in their feed, too, creating more space for ads.

Facebook’s algorithm recommended groups to its users based on what people with similar interests tended to also like. So if you joined a group about vegan cooking, Facebook might recommend a group about natural medicine. If you joined that one, then Facebook might suggest another one about vaccine harm. And so users found communities, some of which propelled them deeper into misinformation rabbit holes. The social network uses similar mechanisms to personalize recommendations for Instagram accounts to follow. “Once you find somebody,” Williams says, “you’re able to find another.”

At the same time, Facebook and Instagram also highlighted what the company called “meaningful” conversations—posts that generated lots of comments very quickly. The change, meant to highlight pregnancy, engagement announcements, and other big life events, also boosted controversial, surprising, or scary content that stirred up debate, including anti-vaccine posts. Well-meaning people trying to debunk vaccine misinformation in the comments have instead helped it go viral by signaling to Facebook’s algorithm that it should push the posts into more people’s feeds.
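The perverse incentive in that paragraph can be shown with a toy ranking heuristic: if reach scales with comment velocity, a post that draws fast, angry debate outranks a calm one, and every debunking comment raises the score further. This is an assumed, simplified model for illustration, not Facebook’s real ranking formula.

```python
def engagement_score(comments, minutes_since_post, base_reach=1.0):
    """Toy feed-ranking heuristic: boost reach by comment velocity.
    The formula cannot tell a supportive comment from a debunking one."""
    velocity = comments / max(minutes_since_post, 1)
    return base_reach * (1 + velocity)

quiet_post = engagement_score(comments=5, minutes_since_post=60)
# A false post drawing heated debate, including well-meaning rebuttals:
contested_post = engagement_score(comments=120, minutes_since_post=60)
print(contested_post > quiet_post)  # → True
```

Under any velocity-style metric like this, replying to misinformation to correct it still counts as engagement, which is why the doctors quoted later in the piece stopped linking to false posts directly.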

Instagram was fertile ground in its own way. Health is one of the fastest-growing categories, and some of the biggest names on the platform live in a genre broadly known as wellness. A subgenre of those influencers, such as the ones Williams was drawn to, offer up pseudoscientific strategies for healthy living. Benign-sounding lifestyle choices (plant-based diets, detox teas) are served up to users next to questionable medical advice. When Covid hit, for example, some of these influencers claimed a healthy diet, exercise, and whatever supplements they were hawking were the best ways to avoid catching the virus.

Behind these claims, often, is a profit motive. The big names in the anti-vaccine movement, Bigtree and Kennedy among them, make money off speaking engagements, online webinars, or the sale of supplements. One Instagrammer sells $15 packs of stickers that say, “Vaccines can cause injury and death” and “I will never get a Covid-19 vaccine.” Another hawks what she claims are Covid-busting workouts. A third, who says she is a naturopathic doctor, charges $295 for a vaccine consultation, or $49.97 for a webinar, and gives discounts on supplements and air purifiers to rid one’s environment of “toxins.” A legal disclaimer at the bottom of her website says the information is “for educational and informational purposes and is NOT medical advice.” She has no disclaimer on her Instagram page.

Talk to doctors who’ve spent the last year in the Covid wards, and they’ll say they’re spending more and more time talking their patients out of something wildly untrue they read on Facebook. “I’ve had countless patients tell me that Covid isn’t real and that it’s no worse than the flu,” says Ryan Marino, a medical toxicologist in Cleveland. Now people tell him they don’t want the vaccine either—and not just young women; anti-vaccine activists have spread lies among other vulnerable groups, particularly Black communities. Marino is part of an informal cadre of health-care workers who spend what free time they have on social media attempting to fight bad information with good. Marino’s medium of choice is Twitter, but there are doctors like him on Facebook, Instagram, and TikTok, too.

This is exactly the type of behavior Facebook says will overpower the vaccine misinformation rampant on its platforms. “Research shows that the best way to combat vaccine hesitancy is to connect people to reliable information from health experts,” says Facebook spokesman Kevin McAlister, pointing to the company’s Covid-19 Health Information Center. Studies have found that straight facts do little to shift opinions; personal stories from known sources work much better. At first Danielle Belardo thought, much like Facebook, that sharing science-based information would do the trick. When the virus started spreading in March, Belardo, then a cardiology fellow in Philadelphia, was reassigned to the Covid wards. She’d spend her days attending to patients, some on ventilators, others barely able to breathe on their own—all without the proper protective gear for herself. “It was really rough,” she says. “I was going to work, seeing the virus, seeing the death, and then coming home to see a ton of misinformation online.”

She already had a healthy Instagram following from posting about plant-based nutrition. At the beginning of the pandemic, she decided to use her platform to debunk false viral Covid-19 claims. Her early posts were technical, scientific, and aimed at setting the facts straight. Whenever she got harassment and angry comments—which was often—she’d reply to the critics. “I was doing it wrong, and a lot of physicians were doing it wrong,” she says. “We were highlighting posts that were false and linking right to them, directing higher traffic to those posts, boosting them in the algorithm.”

Belardo, who’s now director of cardiology at the Institute of Plant-Based Medicine in Newport Beach, Calif., no longer engages directly with lies, because that only calls further attention to them by the logic of Facebook and Instagram. Instead she tries to share things that she knows will play well: memes, selfies, personal stories, and Q&A posts. She’s been relatively successful. Her follower count has grown by tens of thousands of people. But her posts still draw plenty of anti-vaccine commenters, even though she blocks anyone who harasses her.

Marino says he, too, is flooded with harassment and death threats—and not only online. People have shown up at his workplace; others have called the hospital where he works, trying to get him fired. “I’ve had patients accuse me of profiting off Covid, of only testing people because I get paid to make a diagnosis,” he says. “Meanwhile the biggest key figures in the anti-vax movement have all done very well financially.”

Despite their success building bigger followings, efforts like Belardo’s and Marino’s have yet to reach Williams. The algorithms show people more of what they want to see. “To be completely honest with you, I haven’t seen doctors recommending the shot,” Williams says. And even if she did, she says she wouldn’t believe them.

