A Blog by Jonathan Low

 

Mar 14, 2018

The Day the Algorithm Died

Changing the algorithm is not going to solve the problem of the internet's failed utopian vision. JL

Rob Howard reports in Newco Shift:

Social media operates based on the Silicon Valley hypothesis that if ideas are distributed freely, the most valuable will rise to the top. Perhaps we wouldn’t still be defending this failed experiment if it weren’t so profitable. Facebook’s problems can’t be solved with more data or better code. They’re simply the most potent and alarming example of the fact that the Internet has failed as a public forum. As long as we trust software to shape our interaction with the world, life will be a disappointing, chaotic, infinite scroll.
When in doubt, blame the robots. As Facebook has fallen from grace and struggled to reconcile its role in spreading propaganda and stoking political anger, the company has proposed a familiar solution:
If the algorithm has failed, let’s just build a better algorithm.
It’s a noble goal for the next hackathon. As a mechanism for real change, however, the focus on the software misses the point.
Facebook’s problems can’t be solved with more data or better code. They’re simply the most potent and alarming example of the fact that the Internet has failed as a public forum.
Not long ago, the scientists and software developers who pioneered the World Wide Web thought it would democratize publishing and usher in a more open, educated and thoughtful chapter of history. But while the Internet and its offshoot technologies have improved society and daily life in many ways, they have been an unmitigated disaster for the way we communicate and learn.
It feels good to blame Facebook, but the crisis is evident in every nook and cranny of the web. The Internet is crawling with normal, everyday humans who transform into vicious, nihilistic psychopaths the moment they’re granted even a thin veil of anonymity in a comment thread. This was the nature of online communication in 1995, when astronomer and early adopter Clifford Stoll lamented in Newsweek:
“The cacophony more closely resembles citizens band radio, complete with handles, harassment, and anonymous threats. When most everyone shouts, few listen.”
Twenty-three years of software advancement later, his words still ring true.
Despite Facebook’s efforts to make its platform less anonymous than its predecessors, it is painfully clear that it still feels good to pile on the hate when you’re tapping away on your keyboard or phone, comfortably distant from the consequences of your words on the real person reading them many fiber-optic cables away. Facebook, along with most of its social media competitors, operates in part based on the Silicon Valley hypothesis that if all ideas are distributed freely, the most valuable ones will rise to the top. Instead of validating that seemingly uncontroversial point, however, our collective experience online has proven it wrong. The marketplace of ideas has become an environment in which there is zero barrier to transmitting vitriol to millions, largely negating the Internet’s egalitarian, utopian goals.
“I thought once everybody could speak freely and exchange information and ideas, the world is automatically going to be a better place,” Evan Williams, a founder of Twitter, Blogger and Medium, said in a 2017 interview with The New York Times. “I was wrong about that.”
Perhaps we wouldn’t still be defending this failed experiment if it weren’t so stunningly profitable. Facebook didn’t invent advertising, but it has scaled the many flaws of that business model to an unprecedented magnitude. The runaway financial success of highly targeted, pay-per-view ads has warped software design into a competition to build the most addictive digital slot machine, masquerading as social engagement. The longer we stay hooked, the more advertising we see, and the more eyeballs can be sold for a fraction of a penny apiece. This is why you get an e-mail if you don’t sign in for a few days. This is why your apps throttle your notifications, slowly distributing the likes on your vacation photos over the course of minutes rather than seconds. It’s an elaborate manipulation to keep you coming back for more.
It’s also why Jeffrey Hammerbacher, a former Facebook engineer, pined for something more in a 2011 interview with Businessweek.
“The best minds of my generation are thinking about how to make people click ads,” he said. “That sucks.”
Like Facebook’s other innovations, the advertising platform is valuable for individuals but disastrous at scale. The company’s wealth of demographic and behavioral data helps small business owners find their niche audience, but it is also a gold mine for amateur propagandists. The platform earns money from engagement, and guess which ads and articles are the most engaging? It’s not the calm, thoughtful, balanced ones.
And that brings us to the news. Veteran journalists spent the web’s first decade tentatively dipping their toes in the new technology, rightfully skeptical of its supposed virtues. They felt the sting of getting scooped by fly-by-night blogs with low budgets and lower standards. Then, slowly but surely, they hopped on board. By the time Facebook became a significant source for news, the major publications knew better than to risk being left behind.
The rush of traffic from Facebook has been beneficial for the publications’ bottom lines. However, that has been accompanied by an avalanche of meaningless and disorienting social feedback — a thoughtless like, a forgotten share — that has pushed even the best publications toward clickbait and sensationalism. According to the analytics, that’s what people like. The result is that writers and editors have far less leeway to focus on what they believe is valuable, because the ultimate arbiter of success and prestige is the fleeting gratification of a page view. More news, faster news and trendier news paved the road to victory.
The source of this problem can’t be found in a data center. The algorithm has failed because we are collectively seeking knowledge and human connection via the impersonal interface of the Internet, and then feeling angry and confused when we come up empty-handed.
There is no software that can force commenters to engage in respectful debate. There is no app to eliminate the immense conflicts of interest and perverse incentives of pay-per-view advertising sales. There is no subroutine to stop news organizations from competing in a race to the editorial bottom, seduced by clickbait and lusting for attention at any cost.
No matter how well we code, no matter how convincingly we simulate and augment reality, our brains and bodies still know that what we experience on a screen is, in an important but ambiguous way, not real. The people aren’t really there, so we hate them. The approval isn’t real, so it is never truly satisfying. And if we manage to make a friend online, we can’t help but fantasize about how different it might be to meet IRL. (Yes, there’s an acronym for In Real Life.)
Tweak the algorithm all you want. It will never be a worthy substitute for a good book, a healthy debate or an honest friendship. As long as we trust software to shape our interaction with the world, life will be a disappointing, chaotic, infinite scroll.
