A Blog by Jonathan Low

 

Jun 13, 2021

Did American Individualism Stymie the Country's Pandemic Response?

Yes, the celebration of and emphasis on individual behavior, which has been a key tenet of American exceptionalism for two hundred years, reduced the capacity for public or collective (!) action.

And while that individualism was originally meant to promote the primacy of business interests, it showed its negative impact in how the pandemic spread, affecting all, but not equally. JL

Ed Yong reports in The Atlantic:

Infectious diseases are always collective problems because they are infectious. An individual's choices ripple outward to affect cities, countries, and continents; one sick person can seed a hemisphere's worth of cases. Each person's odds of falling ill depend on the choices of everyone around them—and on societal factors that lie beyond their control. Public health became more individualistic. Epidemiologists began to see health in terms of personal traits. They became focused on finding "risk factors" that make individuals more vulnerable to disease, as if the causes of sickness play out purely across the boundaries of a person's skin.

During a pandemic, no one's health is fully in their own hands. No field should understand that more deeply than public health, a discipline distinct from medicine. Whereas doctors and nurses treat sick individuals in front of them, public-health practitioners work to prevent sickness in entire populations. They are expected to think big. They know that infectious diseases are always collective problems because they are infectious. An individual's choices can ripple outward to affect cities, countries, and continents; one sick person can seed a hemisphere's worth of cases. In turn, each person's odds of falling ill depend on the choices of everyone around them—and on societal factors, such as poverty and discrimination, that lie beyond their control.

Across 15 agonizing months, the COVID-19 pandemic repeatedly confirmed these central concepts. Many essential workers, who held hourly-wage jobs with no paid sick leave, were unable to isolate themselves for fear of losing their livelihood. Prisons and nursing homes, whose residents have little autonomy, became hot spots for the worst outbreaks. People in Black and Latino communities that were underserved by the existing health system were disproportionately infected and killed by the new coronavirus, and now have among the lowest vaccination rates in the country.

Perhaps that’s why so many public-health experts were disquieted when, on May 13, the CDC announced that fully vaccinated Americans no longer needed to wear masks in most indoor places. “The move today was really to talk about individuals and what individuals are safe doing,” Rochelle Walensky, the agency’s director, told PBS NewsHour. “We really want to empower people to take this responsibility into their own hands.” Walensky later used similar language on Twitter: “Your health is in your hands,” she wrote.

Framing one’s health as a matter of personal choice “is fundamentally against the very notion of public health,” Aparna Nair, a historian and an anthropologist of public health at the University of Oklahoma, told me. “For that to come from one of the most powerful voices in public health today … I was taken aback.” (The CDC did not respond to a request for comment.) It was especially surprising coming from a new administration. Donald Trump was a manifestation of America’s id—an unempathetic narcissist who talked about dominating the virus through personal strength while leaving states and citizens to fend for themselves. Joe Biden, by contrast, took COVID-19 seriously from the off, committed to ensuring an equitable pandemic response, and promised to invest $7.4 billion in strengthening America’s chronically underfunded public-health workforce. And yet, the same peal of individualism that rang in his predecessor’s words still echoes in his. “The rule is very simple: Get vaccinated or wear a mask until you do,” Biden said after the CDC announced its new guidance. “The choice is yours.”

From its founding, the United States has cultivated a national mythos around the capacity of individuals to pull themselves up by their bootstraps, ostensibly by their own merits. This particular strain of individualism, which valorizes independence and prizes personal freedom, transcends administrations. It has also repeatedly hamstrung America’s pandemic response. It explains why the U.S. focused so intensely on preserving its hospital capacity instead of on measures that would have saved people from even needing a hospital. It explains why so many Americans refused to act for the collective good, whether by masking up or isolating themselves. And it explains why the CDC, despite being the nation’s top public-health agency, issued guidelines that focused on the freedoms that vaccinated people might enjoy. The move signaled to people with the newfound privilege of immunity that they were liberated from the pandemic’s collective problem. It also hinted to those who were still vulnerable that their challenges are now theirs alone and, worse still, that their lingering risk was somehow their fault. (“If you’re not vaccinated, that, again, is taking your responsibility for your own health into your own hands,” Walensky said.)

Neither is true. About half of Americans have yet to receive a single vaccine dose; for many of them, lack of access, not hesitancy, is the problem. The pandemic, meanwhile, is still just that—a pandemic, which is raging furiously around much of the world, and which still threatens large swaths of highly vaccinated countries, including some of their most vulnerable citizens. It is still a collective problem, whether or not Americans are willing to treat it as such.

Individualism can be costly in a pandemic. It represents one end of a cultural spectrum with collectivism at the other—independence versus interdependence, "me first" versus "we first." These qualities can be measured by surveying attitudes in a particular community, or by assessing factors such as the proportion of people who live, work, or commute alone. Two studies found that more strongly individualistic countries tended to rack up more COVID-19 cases and deaths. A third suggested that more individualistic people (from the U.S., U.K., and other nations) were less likely to practice social distancing. A fourth showed that mask wearing was more common in more collectivist countries, U.S. states, and U.S. counties—a trend that held after accounting for factors including political affiliation, wealth, and the pandemic's severity. These correlative studies all have limitations, but across them, a consistent pattern emerges—one supported by a closer look at the U.S. response.

“From the very beginning, I’ve thought that the way we’ve dealt with the pandemic reflects our narrow focus on the individual,” Camara Jones, a social epidemiologist at Morehouse School of Medicine, told me. Testing, for instance, relied on slow PCR-based tests to diagnose COVID-19 in individual patients. This approach makes intuitive sense—if you’re sick, you need to know why—but it cannot address the problem of “where the virus actually is in the population, and how to stop it,” Jones said. Instead, the U.S. could have widely distributed rapid antigen tests so that people could regularly screen themselves irrespective of symptoms, catch infections early, and isolate themselves when they were still contagious. Several sports leagues successfully used rapid tests in exactly this way, but they were never broadly deployed, despite months of pleading from experts.

The U.S. also largely ignored other measures that could have protected entire communities, such as better ventilation, high-filtration masks for essential workers, free accommodation for people who needed to isolate themselves, and sick-pay policies. As the country focused single-mindedly on a vaccine endgame, and Operation Warp Speed sped ahead, collective protections were left in the dust. And as vaccines were developed, the primary measure of their success was whether they prevented symptomatic disease in individuals.

Vaccines, of course, can be a collective solution to infectious disease, especially if enough people are immune that outbreaks end on their own. And even if the U.S. does not achieve herd immunity, vaccines will offer a measure of collective protection. As well as preventing infections—severe and mild, symptomatic and asymptomatic, vanilla and variant—they also clearly make people less likely to spread the virus to one another. In the rare event that fully vaccinated people get breakthrough infections, these tend to be milder and shorter (as recently seen among the New York Yankees); they also involve lower viral loads. "The available evidence strongly suggests that vaccines decrease the transmission potential of vaccine recipients who become infected with SARS-CoV-2 by at least half," wrote three researchers in a recent review. Another team estimated that a single dose of Moderna's vaccine "reduces the potential for transmission by at least 61 percent, possibly considerably more."

Even if people get their shots purely to protect themselves, they also indirectly protect their communities. In Israel and the U.S., rising proportions of immunized adults led to plummeting case numbers among children, even though the latter are too young to be vaccinated themselves. “For people who do not get vaccinated and remain vulnerable, their risk is still greatly reduced by the immunity around them,” Justin Lessler, an epidemiologist at Johns Hopkins, told me.

There’s a catch, though. Unvaccinated people are not randomly distributed. They tend to cluster together, socially and geographically, enabling the emergence of localized COVID-19 outbreaks. Partly, these clusters exist because vaccine skepticism grows within cultural and political divides, and spreads through social networks. But they also exist because decades of systemic racism have pushed communities of color into poor neighborhoods and low-paying jobs, making it harder for them to access health care in general, and now vaccines in particular.

“This rhetoric of personal responsibility seems to be tied to the notion that everyone in America who wants to be vaccinated can get a vaccine: You walk to your nearest Walgreens and get your shot,” Gavin Yamey, a global-health expert at Duke, told me. “The reality is very different.” People who live in poor communities might not be near vaccination sites, or have transportation options for reaching one. Those working in hourly jobs might be unable to take time off to visit a clinic, or to recover from side effects. Those who lack internet access or regular health-care providers might struggle to schedule appointments. Predictably, the new pockets of immune vulnerability map onto old pockets of social vulnerability.

According to a Kaiser Family Foundation survey, a third of unvaccinated Hispanic adults want a vaccine as soon as possible—twice the proportion of unvaccinated whites. But 52 percent of this eager group were worried that they might need to miss work because of the reputed side effects, and 43 percent feared that getting vaccinated could jeopardize their immigration status or their families’. Unsurprisingly then, among the states that track racial data for vaccinations, just 32 percent of Hispanic Americans had received at least one dose by May 24, compared with 43 percent of white people. The proportion of at least partly vaccinated Black people was lower still, at 29 percent. And as Lola Fadulu and Dan Keating reported in The Washington Post, Black people now account for 82 percent of COVID-19 cases in Washington, D.C., up from 46 percent at the end of last year. The vaccines have begun to quench the pandemic inferno, but the remaining flames are still burning through the same communities that have already been disproportionately scorched by COVID-19—and by a much older legacy of poor health care.

For unvaccinated people, the pandemic’s collective problem not only persists, but could deepen. “We’re entering a time when younger children are going to be the biggest unvaccinated population around,” Lessler told me. Overall, children are unlikely to have severe infections, but that low individual risk is still heightened by social factors; it is telling that more than 75 percent of the children who have died from COVID-19 were Black, Hispanic, or Native American. And when schools reopen for in-person classes, children can still spread the virus to their families and communities. “Schools play this fairly unique role in life,” Lessler said. “They’re places where a lot of communities get connected up, and they give the virus the ability, even if there’s not much transmission happening, to make its way from one pocket of unvaccinated people to another.”

Schools aren't helpless. Lessler has shown that they can reduce the risk of seeding community outbreaks by combining several protective measures, such as regular symptom screenings and masks for teachers, tying their use to community incidence. But he worries that schools might instead pull back on such measures, whether in reaction to the CDC's new guidance or because of complacency about an apparently waning pandemic. He worries, too, that such complacency may become widespread. Yes, vaccines substantially lower the odds that people will spread the virus, but those nonzero odds will creep upward if other protective measures are widely abandoned. The onset of cooler weather in the fall might increase them further. So might the arrival of new variants.

The Alpha variant of the new coronavirus (B.1.1.7, now the most common U.S. lineage) can already spread more easily than the original virus. The Delta variant (B.1.617.2, which has raised concerns after becoming dominant in the U.K. and India) could be more transmissible still. An assessment from the U.K. suggests that a single vaccine dose is less protective against Delta than its predecessors, although two doses are still largely effective. For now, vaccines are still beating the variants. But the variants are pummeling the unvaccinated.

“My biggest concern is that those who are unvaccinated will have a false sense of safety and security as cases drop this summer,” says Joseph Allen, who directs Harvard’s Healthy Buildings program. “It might feel like the threat has fully diminished if this is in the news less often, but if you’re unvaccinated and you catch this virus, your risk is still high.” Or perhaps higher: In the U.S., unvaccinated people might be less likely to encounter someone infectious. But on each such encounter, their odds of catching COVID-19 are now greater than they were last year.

When leaders signal to vaccinated people that they can tap out of the collective problem, that problem is shunted onto a smaller and already overlooked swath of society. And they do so myopically. The longer rich societies ignore the vulnerable among them, and the longer rich nations neglect countries that have barely begun to vaccinate their citizens, the more chances SARS-CoV-2 has to evolve into variants that spread even faster than Delta, or—the worst-case scenario—that finally smash through the vaccines’ protection. The virus thrives on time. “The longer we allow the pandemic to rage, the less protected we’ll be,” Morehouse’s Camara Jones says. “I think we’re being a bit smug about how well protected we are.”

Ian Mackay, a virologist at the University of Queensland, famously imagined pandemic defenses as layers of Swiss cheese. Each layer has holes, but when combined, they can block a virus. In Mackay's model, vaccines were the last layer of many. But the U.S. has prematurely stripped the others away, including many of the most effective ones. A virus can evolve around a vaccine, but it cannot evolve to teleport across open spaces or punch its way through a mask. And yet, the country is going all in on vaccines, even though 48 percent of Americans still haven't had their first dose, and despite the possibility that it might fall short of herd immunity. Instead of asking, "How do we end the pandemic?" it seems to be asking, "What level of risk can we tolerate?" Or perhaps, "Who gets to tolerate that risk?"

Consider what happened in May, after the CDC announced that fully vaccinated people no longer needed to wear masks in most indoor places. Almost immediately, several states lifted their mask mandate. At least 24 have now done so, as have many retailers including Walmart, McDonald’s, Starbucks, Trader Joe’s, and Costco, which now rely on the honor system. The speed of these changes was surprising. When The New York Times surveyed 570 epidemiologists a few weeks before the announcement, 95 percent of them predicted that Americans would need to continue wearing masks indoors for at least half a year.

Some public-health experts have defended the CDC’s new guidance, for at least four reasons. They say that the CDC correctly followed the science, that its new rules allow for more flexibility, that it correctly read the pulse of a fatigued nation, and that it may have encouraged vaccination (although Walensky has denied that this was the CDC’s intention). In sum, vaccinated people should know that they are safe, and act accordingly. By contrast, others feel that the CDC abrogated one of its primary responsibilities: to coordinate safety across the entire population.

In the strictest sense, the CDC's guidance is accurate; vaccinated people are very unlikely to be infected with COVID-19, even without a mask. "You can't expect the CDC to not share their scientific assessment because the implications have problems," Ashish Jha, who heads the Brown University School of Public Health, told me. "They have to share it." Harvard's Joseph Allen agrees, and notes that the agency clearly stated that unvaccinated people should continue wearing masks indoors. And having some flexibility is useful. "You can't have 150 million people who are vaccinated and ready to get back to some semblance of what they're used to, and not have this tension in the country," he told me. The new guidelines also move the U.S. away from top-down mandates, recognizing that "decisions are rightly shifting to the local level and individual organizations," Allen wrote in The Washington Post. If some organizations and states pulled their mask mandates too early, he told me, "that's an issue not with the CDC but with how people are acting based on its guidance."

It’s true, too, that the CDC is in a difficult position. It had emerged from a year of muzzling and interference from the Trump administration, and was operating in a climate of polarization and public fatigue. “When agencies are putting out recommendations that people aren’t following, that undermines their credibility,” Jha told me. “The CDC, as a public-health agency, must be sensitive to where the public is.” And by May, “there was a sense that mask mandates were starting to topple.”

But that problem—that collective behavior was starting to change against collective interest—shows the weaknesses of the CDC’s decisions. “Science doesn’t stand outside of society,” Cecília Tomori, an anthropologist and a public-health scholar at Johns Hopkins, told me. “You can’t just ‘focus on the science’ in the abstract,” and especially not when you’re a federal agency whose guidance has been heavily politicized from the get-go. In that context, it was evident that the new guidance “would send a cultural message that we don’t need masks anymore,” Tomori said. Anticipating those reactions “is squarely within the expertise of public health,” she added, and the CDC could have clarified how its guidelines should be implemented. It could have tied the lifting of mask mandates to specific levels of vaccination, or the arrival of worker protections. Absent that clarity, and with no way for businesses to even verify who is vaccinated, a mass demasking was inevitable. “If you’re blaming the public for not understanding the guidance—wow,” Duke’s Gavin Yamey said. “If people have misunderstood your guidance, your guidance was poor and confusing.”

Meanwhile, the idea that the new guidance led to more vaccinations is likely wrong. “I’ve overseen close to 10,000 people being vaccinated, and I’ve yet to hear ‘I can take the mask off’ as a reason,” Theresa Chapple-McGruder, a local-health-department director, told me. Although visits to the site vaccines.gov spiked after the CDC’s announcement, actual vaccination rates increased only among children ages 12 to 15, who had become eligible the day before. Meanwhile, a KFF survey showed that 85 percent of unvaccinated adults felt that the new guidance didn’t change their vaccination plans. Only 10 percent said they were more likely to get vaccinated, while 4 percent said they were less likely. Vaccination rates are stuck on a plateau.

Creating incentives for vaccination is vital; treating the removal of an important protective measure as an incentive is folly. The latter implicitly supports the individualistic narrative that masks are oppressive burdens “that people need to get away from to get back to ‘normal,’” Rhea Boyd, a pediatrician and public-health advocate from the Bay Area, told me. In fact, they are an incredibly cheap, simple, and effective means of collective protection. “The pandemic made clear that the world is vulnerable to infectious disease and we should normalize the idea of precaution, as we see in other countries that have faced similar epidemics,” Boyd said. “But recommendations like this say, This is something we put behind us, rather than something we put in our back pocket.”

Collective action is not impossible for a highly individualistic country; after all, a majority of Americans used and supported masks. But such action erodes in the absence of leadership. In the U.S., only the federal government has the power and financial freedom to define and defend the collective good at the broad scales necessary to fight a pandemic. “Local public health depends on guidance from the federal level,” Chapple-McGruder said. “We don’t make local policies that fly in the face of national guidance.” Indeed, the CDC’s guidance prompted some local leaders to abandon sensible strategies: North Carolina’s governor had planned to lift COVID-19 restrictions after two-thirds of the state had been vaccinated, but did so the day after the CDC’s announcement, when only 41 percent had received their first dose. Meanwhile, Iowa and Texas joined Florida in preventing cities, counties, schools, or local institutions from issuing mask mandates. Rather than ushering in an era of flexibility, the CDC has arguably triggered a chain of buck-passing, wherein responsibility for one’s health is once again shunted all the way back to individuals. “Often, Let everyone decide for themselves is the easiest policy decision to make, but it’s a decision that facilitates spread of COVID in vulnerable communities,” Julia Raifman, a health-policy researcher at Boston University, told me.

The CDC’s own website lists the 10 essential public-health services—a set of foundational duties arranged in a colorful wheel. And at the center of that wheel, uniting and underpinning everything else, is equity—a commitment to “protect and promote the health of all people in all communities.” The CDC’s critics say that it has abandoned this central tenet of public health. Instead, its guidelines centered people who had the easiest and earliest access to vaccines, while overlooking the most vulnerable groups. These include immunocompromised people, for whom the shots may be less effective; essential workers, whose jobs place them in prolonged contact with others; and Black and Latino people, who are among the most likely to die of COVID-19 and the least likely to have been vaccinated.

During a pandemic, "someone taking all the personal responsibility in the world may still be affected by a lack of coordinated safety," Raifman said. "They may be vaccinated but less protected because they are immunosuppressed and get the disease working in a grocery store amidst unmasked people. They may have a child who cannot be vaccinated, and miss work if that child gets COVID." As Eleanor Murray, an epidemiologist at Boston University, said on Twitter, "Don't tell me it's 'safe'; tell me what level of death or disability you are implicitly choosing to accept." When Rochelle Walensky said, "It's safe for vaccinated people to take off their masks," she was accurate, but left unaddressed other, deeper questions: How much additive burden is a country willing to foist upon people who already carry their disproportionate share? What is America's goal—to end the pandemic, or to suppress it to a level where it mostly plagues communities that privileged individuals can ignore?

“When you’re facing an epidemic, the responsibility of public health is to protect everybody, but those made vulnerable first,” Boyd, the pediatrician, told me. “If you have protection, the CDC is glad for you, but their role is not the same for you. Their role is to keep those most at risk of infection and death from exposure.”

America is especially prone to the allure of individualism. But that same temptation has swayed the entire public-health field throughout its history. The debate about the CDC’s guidance is just the latest step in a centuries-old dance to define the very causes of disease.

In the early 19th century, European researchers such as Louis-René Villermé and Rudolf Virchow correctly recognized that disease epidemics were tied to societal conditions like poverty, poor sanitation, squalid housing, and dangerous jobs. They understood that these factors explain why some people become sick and others don’t. But this perspective slowly receded as the 19th century gave way to the 20th.

During those decades, researchers confirmed that microscopic germs cause infectious diseases, that occupational exposures to certain chemicals can cause cancers, that vitamin deficiencies can lead to nutritional disorders like scurvy, and that genetic differences can lead to physical variations among people. “Here … was a world in which disease was caused by germs, carcinogens, vitamin deficiencies, and genes,” wrote the epidemiologist Anthony J. McMichael in his classic 1999 paper, “Prisoners of the Proximate.” Public health itself became more individualistic. Epidemiologists began to see health largely in terms of personal traits and exposures. They became focused on finding “risk factors” that make individuals more vulnerable to disease, as if the causes of sickness play out purely across the boundaries of a person’s skin.

“The fault is not in doing such studies, but in only doing such studies,” McMichael wrote. Liver cirrhosis, for example, is caused by alcohol, but a person’s drinking behavior is influenced by their culture, occupation, and exposure to advertising or peer pressure. The distribution of individual risk factors—the spread of germs, the availability of nutritious food, one’s exposure to carcinogens—is always profoundly shaped by cultural and historical forces, and by inequities of race and class. “Yet modern epidemiology has largely ignored these issues of wider context,” McMichael wrote.

“The field has moved forward since then,” Nancy Krieger, a social epidemiologist at Harvard told me. Epidemiology is rediscovering its social side, fueled by new generations of researchers who don’t come from traditional biomedical backgrounds. “When I started out in the mid-1980s, there were virtually no sessions [at academic conferences] about class, racism, and health in the U.S.” Krieger said. “Now they’re commonplace.” But these connections have yet to fully penetrate the wider zeitgeist, where they are still eclipsed by the rhetoric of personal choice: Eat better. Exercise more. Your health is in your hands.

This is the context in which today’s CDC operates, and against which its choices must be understood. The CDC represents a field that has only recently begun to rebalance itself after long being skewed toward individualism. And the CDC remains a public-health agency in one of the most individualistic countries in the world. Its mission exists in tension with its environment. Its choice to resist that tension or yield to it affects not only America’s fate, but also the soul of public health—what it is and what it stands for, whom it serves and whom it abandons.
