A Blog by Jonathan Low

 

Sep 30, 2017

Facebook Can Absolutely Control Its Algorithm

Impotence is a convenient excuse, but not a sustainable strategy. JL

Erin Griffith reports in Wired:

“We’re just a platform” is a convenient way to avoid taking full responsibility for an increasingly serious set of problems. Facebook has repeatedly shown it can police content on its platform, particularly when doing so affects its $27 billion business. Facebook’s algorithm is determined by data, and it’s based on what users want. Changes to the News Feed algorithm are in the name of getting users increasingly addicted to Facebook. So far, fake news has proven to be addictive; in the last year Facebook’s user base and revenue have grown by 17% and 47%, respectively.
There is a narrative emerging around Facebook that implies the social-networking giant cannot prevent the spread of fake news, Russian political messages, and ads targeting hate groups on its platform. Like Dr. Frankenstein, Facebook created a monster! The algorithms have already won! Facebook’s sheer scale means it could never successfully police the gazillion pieces of content its 2 billion users create and share every second. We are doomed to live in a dystopian post-truth world of propaganda, dark ads, and artificial intelligence.
This echoes Facebook’s own defense against the rising backlash it faces. If Facebook is beholden to algorithms, it cannot be held fully responsible for the activity on its network. Announcing new tools last week to police things like Russia’s purchase of political ads, CEO Mark Zuckerberg said Facebook could do better, but tried to set expectations: “I’m not going to sit here and tell you that we’re going to catch all bad content in our system.” He framed his reasoning in terms of freedom of speech: “We don’t check what people say before they say it, and frankly, I don’t think society should want us to.”
There’s a small problem with this argument: Facebook has repeatedly shown it can police content on its platform, particularly when doing so affects its $27 billion business. “We’re just a platform” is a convenient way to avoid taking full responsibility for an increasingly serious set of problems. In 2011, when Facebook decided that games from companies such as Zynga were disrupting the way people used Facebook, it limited how many messages gaming companies could send to Facebook’s users. The dominance of gaming—and Zynga—on Facebook immediately declined.
In 2012, when independent content apps like SocialCam and Viddy began annoying users, Facebook began demoting content related to them. Their usage dropped, prompting each company to sell and ultimately shut down.
In 2013, when Facebook decided that the curiosity-gap headlines and clickbait articles offered by viral websites like Upworthy (“the fastest growing media site of all time”) and ViralNova were wearing thin, it changed the algorithm for News Feed, the main river of content for Facebook users. Traffic to ViralNova and Upworthy dropped dramatically.
On matters of taste, Facebook has long banned nudity from its platform, “because some audiences within our global community may be sensitive to this type of content.” On the advertising side, the company has a clearly defined system to ensure that alcohol ads comply with various national regulations, as former product manager Antonio Garcia Martinez recently wrote for WIRED.
It won’t be as easy for Facebook to ferret out sophisticated “bad content” as it was to demote FarmVille notifications. For example, the Russian campaign purchased ads that both supported and criticized the Black Lives Matter movement, according to the Washington Post. Martinez, the former product manager, describes the problem as “playing whack-a-mole.”
Facebook did not respond to a request for comment. But the company has shown it can tackle complicated political situations. The company has complied with requests from leaders of Vietnam and other countries to censor content critical of those governments. Facebook reportedly created a censorship tool that suppresses posts for users in certain geographies as a way to potentially work with the Chinese government. The company has reportedly used the same technology it uses to identify copyrighted videos to identify and remove ISIS recruitment material.
At a 2015 conference, Facebook head of product Chris Cox repeatedly denied that changes to Facebook’s content algorithm were subjective: Facebook does not have a “lever” or “dial” that it uses to control what content its users see in their News Feeds, he argued.
“I know people imagine we have this room with dials,” he said. “There’s no levers, no dials. What we have is features we can build that do a good job of predicting what kinds of things people are going to like and what kind of things they aren’t.” Cox added that changes to the News Feed algorithm are “in the interest of people saying what they care about and what they don’t care about.”
In other words, Facebook’s algorithm is determined by data, and it’s based on what users want. Changes to the News Feed algorithm are in the name of getting users increasingly addicted to Facebook.
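Cox’s framing can be made concrete. Below is a minimal, hypothetical sketch in Python of what “features that predict what kinds of things people are going to like” looks like as a ranking function. Nothing here is Facebook’s actual code; the post names, features, weights, and numbers are all invented for illustration.

    # A minimal, hypothetical sketch of feature-based feed ranking as Cox
    # describes it: no single "dial," just models that predict per-post
    # engagement, summed into a score and sorted.
    from dataclasses import dataclass

    @dataclass
    class Post:
        post_id: str
        p_click: float    # predicted probability the user clicks
        p_comment: float  # predicted probability the user comments
        p_share: float    # predicted probability the user shares

    def rank_feed(posts: list[Post]) -> list[Post]:
        # Weighted sum of predicted engagement; the weights are illustrative.
        def score(p: Post) -> float:
            return 1.0 * p.p_click + 2.0 * p.p_comment + 3.0 * p.p_share
        return sorted(posts, key=score, reverse=True)

    feed = [
        Post("local_news", 0.30, 0.05, 0.02),
        Post("friend_photo", 0.60, 0.20, 0.10),
        Post("viral_hoax", 0.80, 0.40, 0.35),
    ]
    for post in rank_feed(feed):
        print(post.post_id)
    # Prints viral_hoax first: an objective trained only on engagement
    # promotes whatever users click and share, true or not.

The point of the sketch is that “no dials” does not mean “no control”: changing the weights in the score, or adding a demotion term for a category of content, is exactly the kind of adjustment Facebook made when it throttled Zynga notifications and clickbait headlines.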
So far, fake news has proven to be addictive; in the last year Facebook’s user base and revenue have grown by 17% and 47%, respectively. In the days after the presidential election, Zuckerberg dismissed the notion that fake news influenced the campaign as a “pretty crazy idea.” He even brushed off a personal warning from President Obama about the problem, according to the Washington Post.
Now, Facebook faces pressure on numerous fronts: Special counsel Robert Mueller and Congressional committees are investigating the Russian-sponsored ads; democracy activists want to corral fake news; and there are potential antitrust and privacy regulations in Europe. Facebook now acknowledges its role in the election and is cooperating with investigators.
But none of that goes as far as taking full responsibility for what happens on Facebook’s platform. As one half of a global digital advertising duopoly with a market value of $473 billion, Facebook has great power. The responsibility that comes with it is a work in progress.
