So what would get someone to change their mind about a false belief that is deeply tied to their identity?
“Probably nothing,” Tavris says. “I mean that seriously.”
But of course there are areas where facts can make a difference. Some people are simply mistaken, or are motivated to believe something false without treasuring that false belief like a crown jewel.
“Personally my own theory is that there’s a slide that happens,” McIntyre says. “This is why we need to teach critical thinking, and this is why we need to push back against false beliefs, because there are some people who are still redeemable, who haven’t made that full slide into denialism yet. I think once they’ve hit denial, they’re too far gone and there’s not a lot you can do to save them.”
There are small things that could help. One recent study suggests that people can be “inoculated” against misinformation. In the study, a message about the overwhelming scientific consensus on climate change included a warning that “some politically motivated groups use misleading tactics to try to convince the public that there is a lot of disagreement among scientists.” The idea is that warning people that this misinformation is out there makes them more resistant to it when they encounter it later. And in the study, at least, it worked.
While there’s no erasing humans’ tribal tendencies, muddying the waters of partisanship could make people more open to changing their minds. “We know people are less biased if they see that policies are supported by a mix of people from each party,” Jerit says. “It doesn’t seem like that’s very likely to happen in this contemporary period, but even to the extent that they see within party disagreement, I think that is meaningful. Anything that's breaking this pattern where you see these two parties acting as homogeneous blocks, there’s evidence that motivated reasoning decreases in these contexts.”
It’s also possible to at least imagine a media environment that’s less hospitable to fake news and selective exposure than our current one, which relies so heavily on people’s social-media networks.
I asked Manjoo what a less fake-newsy media environment might look like.
“I think we need to get to an information environment where sharing is slowed down,” Manjoo says. “A really good example of this is Snapchat. Everything disappears after a day—you can’t have some lingering thing that gets bigger and bigger.”
Facebook is apparently interested in copying some of Snapchat’s features, including the disappearing messages. “I think that would reduce virality, and then you could imagine that would perhaps cut down on sharing false information,” Manjoo says. But, he cautions: “Things must be particularly bad if you’re looking at Snapchat for reasons of hope.”
So much of how people view the world has nothing to do with facts. That doesn’t mean truth is doomed, or even that people can’t change their minds. But what all this does suggest is that, no matter how strong the evidence, it stands little chance of changing someone’s mind if they really don’t want to believe what it says. They have to change their own.
As previously noted, Daniel Shaw ultimately left Siddha Yoga. But it took a long time. “Before that [New Yorker] article came out,” he says, “I started to learn about what was going to be in that article, and the minute I heard it is the minute I left that group, because immediately it all clicked together. But it had taken at least five years of this growing unease and doubt, which I didn’t want to know about or face.”
If people are going to be open-minded, it seems most likely to happen in group interactions. As Manjoo noted in his book, when the U.S. government was trying to get people to eat organ meat during World War II (you know, to save the good stuff for our boys), researchers found that when housewives had a group discussion about it, rather than just listening to a nutritionist blather on about what a good idea it was, they were five times more likely to actually cook up some organs. And groups are usually better at coming up with the correct answers to reasoning tasks than individuals are.
Of course, the wisdom of groups is probably diminished when everyone in the group already agrees.
“One real advantage of group reasoning is that you get critical feedback,” McIntyre says. “If you’re in a silo, you don’t get critical feedback, you just get applause.”
But if change is going to happen at all, it’ll have to be “on a person-to-person level,” Shaw says.
He tells me about a patient of his, whose family is involved in “an extremely fundamentalist Christian group. [The patient] has come to see a lot of problems with the ideology and maintains a relationship with his family in which he tries to discuss in a loving and compassionate way some of these issues,” Shaw says. “He is patient and persistent, and he chips away, and he may succeed eventually.”
“But are they going to listen to a [news] feature about why they’re wrong? I don’t think so.”
When someone does change their mind, it will probably be more like the slow creep of Shaw’s disillusionment with his guru. He left “the way most people do: Sort of like death by a thousand cuts.”