A Blog by Jonathan Low

 

Jan 26, 2018

Is Facebook's Plan To Rate News Sources On Trustworthiness Naive - or Cynical?

Facebook has acknowledged that it may have encouraged extreme partisanship - and may even pose a threat to democracy.

To propose that the solution is giving already alienated users a vote on which outlets are most trustworthy suggests that Facebook may not yet be ready to acknowledge its own responsibilities - and their consequences. JL


Will Oremus reports in Slate:

A Pew Research survey found that Fox News ranked as the go-to news outlet for a plurality of Americans during the 2016 election. As described, Facebook’s system might well rate Fox News as more trustworthy than the New York Times. That might sound OK to Trump voters, but it’s an opinion that would be shared by hardly anyone with a grasp of journalism. (And) a ratio of trust to familiarity rewards partisan outlets that are familiar to people of one political bent or another.
How do you fight fake news, misinformation, and sensationalism on Facebook? One way might be to try to somehow distinguish between reputable and sketchy news sources within the news feed. On Friday, Facebook announced that it will attempt just that—fraught as the endeavor is certain to be.
It’s a risky stand for a company that has long resisted any action that could open it to claims of political bias—especially at a time when the credibility of mainstream news sources has become a polarizing issue in itself. And, at least philosophically, it seems like a step in the right direction. Now we have to hope that Facebook’s implementation turns out to be a lot more thoughtful and nuanced than CEO Mark Zuckerberg makes it sound.
Starting next week, the company will test a change to the news feed ranking system that aims to prioritize links to “broadly trusted” sources of news over those from, well, less-broadly trusted sources. And how will it determine that? Here’s how Zuckerberg explained it:
The hard question we’ve struggled with is how to decide what news sources are broadly trusted in a world with so much division. We could try to make that decision ourselves, but that’s not something we’re comfortable with. We considered asking outside experts, which would take the decision out of our hands but would likely not solve the objectivity problem. Or we could ask you—the community—and have your feedback determine the ranking.
We decided that having the community determine which sources are broadly trusted would be most objective.
Specifically, Facebook will begin asking respondents to its ongoing user surveys to tell it whether they trust various news sources. From that data, it will rank news outlets based on the ratio of the number of people who trust the source to the number who are familiar with it.
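Facebook hasn't published a formula, but the mechanism Zuckerberg describes boils down to a single number per outlet. Here is a minimal sketch, in Python, of how such a ranking could be computed from yes-or-no survey answers; the data layout and field names are invented for illustration, not Facebook's actual system.

```python
from collections import Counter

def trust_ratio(responses):
    """Score each outlet as (respondents who trust it) / (respondents familiar with it).

    `responses` is a list of dicts like
    {"outlet": "Example News", "familiar": True, "trust": True},
    one per survey answer. The structure is hypothetical.
    """
    familiar = Counter()
    trusted = Counter()
    for r in responses:
        if r["familiar"]:
            familiar[r["outlet"]] += 1
            if r["trust"]:
                trusted[r["outlet"]] += 1
    # Only outlets that at least one respondent recognizes get a score.
    return {outlet: trusted[outlet] / n for outlet, n in familiar.items()}
```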
This is the first of a series of changes, Facebook said, intended to make sure that the news people see on the social network is “high quality.” This tweak is designed to address trustworthiness; subsequent changes will aim to prioritize news that’s “informative” and “local.”
At first blush, it looks like Facebook is doing exactly what I and other critics have long been calling for it to do: acknowledge that its algorithm plays a crucial role in determining what news people read, and take some responsibility for its profound effects on the media and the spread of information. It’s about time, right?
Except that, based on its announcement, Facebook’s approach to a notoriously difficult problem—figuring out which media to trust—appears to be painfully simplistic and naïve.
Let’s acknowledge first that it’s possible the company has a more sophisticated system in mind than Zuckerberg let on in his announcement.
Facebook has become much better at communicating changes to its algorithm over the years, but it’s still careful in most cases to avoid tipping its full hand. It wants to give high-level guidance to good-faith actors without handing hackers, clickbaiters, and trolls a detailed blueprint of just how to exploit the system. The company’s head of news feed, Adam Mosseri, told the Wall Street Journal that “no one signal that we use is perfect,” which could be read either as a commitment to future improvements or a copout. Let’s hope it’s the former, and that Facebook’s trustworthiness rankings will ultimately rely on something more than a pair of yes-or-no survey questions.
Otherwise, it’s all too easy to imagine how “a ratio of those who trust the source to those who are familiar with it”—Zuckerberg’s description of the survey data’s output—could go grievously awry.
President Trump has spent much of his campaign and presidency trying to persuade larger and larger swaths of the American public that the nation’s serious, established news sources are actually “fake news.” A Pew Research survey last year found that Fox News ranked as the go-to news outlet for a plurality of Americans during the 2016 election. As described, then, Facebook’s system might well rate Fox News as more trustworthy than the New York Times. That might sound OK to Trump voters, but it’s an opinion that would be shared by hardly anyone with a strong grasp of how journalism works.
Another problem: A simple ratio of trust to familiarity seems destined to reward deeply partisan outlets that are familiar mostly to people of one political bent or another. If more conservatives than liberals are familiar with, say, the Independent Journal Review, then it might rank much higher on Facebook’s list than would a publication with a more diverse readership. (Knife twist: The success and proliferation of such partisan outlets, on all sides of the political spectrum, is in large part due to Facebook and its algorithm’s long-standing bias toward content that plays on gut emotions such as anger and solidarity.)
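To see why, plug invented numbers into the sketch above: an outlet that only its own small audience recognizes comes out looking more "trustworthy" than one the whole country knows but only half trusts.

```python
# Hypothetical survey results, continuing the trust_ratio sketch above.
# 10 people recognize the niche outlet, and all 10 trust it.
# 100 people recognize the national paper, but only 50 trust it.
niche_score = 10 / 10       # 1.0
national_score = 50 / 100   # 0.5
print(niche_score > national_score)  # True: the narrowly familiar outlet ranks higher
```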
This is what happens, of course, when you try to optimize for subjective qualities such as “trustworthiness” while remaining deeply committed to “objectivity”—a favorite word of Facebook’s higher-ups when it comes to anything resembling an editorial judgment. Terrified of exhibiting any kind of human value judgment, the company throws up its hands and holds a popularity contest.
The frustrating thing is you can see exactly why Facebook acts this way. Remember what happened when Gizmodo revealed that the social media company employed human news editors who in some cases might have had—gasp—political leanings? It dumped them in favor of an incompetent algorithm faster than you can say “but algorithms have biases too.”
The best hope at this point might be that the rankings prove to be so self-evidently screwed up in early testing that the company’s product managers are forced to confront the system’s flaws before it ever reaches a full public rollout. One obvious way to improve it would be to add a layer of human oversight to the algorithm’s raw output; another might be to develop more illuminating survey questions than just, “Are you familiar with this outlet?” and “Do you trust it?”
Facebook has taken an important step toward accountability by making credible news an explicit goal of the news feed algorithm. But to judge by Zuckerberg’s announcement Friday, it’s still a long way from figuring out how to achieve it.
