A Blog by Jonathan Low


Dec 29, 2019

The Platform Concept Spurred Tech Firms' Growth. Now It May Undo Them

Tech companies have been able to evade responsibility for what appears on their platforms, claiming that they merely provide a means for others to connect.

That era of self-serving denial is ending - by popular demand. JL

Jason Dean and Katherine Bindley report in the Wall Street Journal:

Everyone in tech—and many beyond—wants to be a platform business. Companies built around that concept - Facebook, Google’s YouTube, Twitter, Amazon, Uber, and Airbnb - gained billions of users and created hundreds of billions of dollars of market value. (But) how much responsibility tech companies have for their platforms has become one of the defining issues of our era. The model is changing. Responsibility for what happens on platforms is no longer an afterthought. Scalability comes with constraints. “The age of regulation of platforms is upon us.” Companies have to “take more responsibility for the stuff on our platforms.”
Tech companies like to boast of their life-changing products and mind-blowing innovations, but the word that most defines the industry’s boom over the past two decades is more mundane. “Platform.”
Seemingly everyone in tech—and many beyond—wants to be a platform business. Uber Technologies Inc. used the term 747 times in its IPO prospectus in May. We Co., parent of troubled office lessor WeWork, boasted of its global platform throughout the filing for its failed IPO. Peloton Interactive Inc., which makes networked exercise equipment, calls itself “the largest interactive fitness platform in the world.” Even Beyond Meat Inc. likes to talk of its “plant-based-product platforms.”
Behind the mania for a seemingly humdrum term is a business concept that, in varied incarnations across the internet, became a springboard for enormous growth and wealth. Companies built around that concept—from Facebook Inc., Google’s YouTube, and Twitter Inc. to Amazon.com Inc., Uber, and Airbnb Inc.—collectively gained billions of users and created hundreds of billions of dollars of market value by deploying software systems to connect content creators with viewers, sellers with buyers, drivers with riders, hosts with guests. These companies wield power with little historical precedent over how people communicate, what they know and see and the ways they shop and get around.
Aided by a hands-off regulatory approach in the U.S., such companies prioritized adding suppliers and users as quickly as possible—and often shrugged off the constraints, controls and costs shouldered by more traditional companies in the sectors they were seeking to disrupt. “We’re a platform,” goes the refrain, not a media company, or seller, or service provider, or car-service operator.
That framework is now under pressure on multiple fronts. Users, lawmakers and regulators in the U.S. and Europe have assailed Facebook, YouTube, and other social-media platforms for doing too little as bigots, bullies and foreign propagandists inundated their platforms with abuse and disinformation. Amazon is taking fire for lax controls over its marketplace, and Uber is battling a California law that could require gig-economy companies to treat drivers as employees.
Political leaders from House Speaker Nancy Pelosi to Attorney General William Barr have called for more regulation of tech platforms. “We’ve heard widespread concerns from consumers, businesses and entrepreneurs, including about stagnated innovation, high prices, lack of choice, privacy, transparency, and public safety,” Mr. Barr told the National Association of Attorneys General this month, explaining the Justice Department’s broad review of leading online platforms.
How much responsibility tech companies have for their platforms has become one of the defining issues of our era. How the question is resolved will have ramifications for decades to come across business, consumer welfare and social discourse.
For most of their history, the tech companies were so focused on the challenges of fast growth that tending to what was on their platforms wasn’t as urgent, says Geoffrey Parker, a Dartmouth College professor of engineering and co-author of a book about the rise of such businesses called “Platform Revolution.”
“Now,” he says, “it’s exploding in their face.”
Twenty-six fateful words
The platforms wouldn’t exist as we know them without a piece of 1996 legislation called Section 230 of the Communications Decency Act, which largely exempts online services from responsibility for third-party content.
Ironically, Section 230 began as an effort to make content moderation easier. It followed a pair of defamation lawsuits with opposite outcomes in the early 1990s against two of the companies then called “internet portals,” which hosted newsletters, forums and chat rooms featuring user contributions.
In the first case, a federal court found CompuServe not liable for allegedly defamatory claims one newsletter on its platform made about a competitor. CompuServe was considered a distributor: Much like a library or a bookstore, it couldn’t be held responsible for every bit of content it carried, the court found.
A few years later, an anonymous post to a bulletin board on the Prodigy portal alleged fraud at Stratton Oakmont, a Long Island, N.Y., brokerage later featured in Martin Scorsese’s film “The Wolf of Wall Street.” Prodigy operated differently from CompuServe, with content guidelines and moderators who sometimes removed posts. Stratton Oakmont sued Prodigy, arguing that those traits meant it was responsible for the fraud claim on its platform. In 1995, a New York state court agreed.
That alarmed some lawmakers, who feared that punishing tech companies that moderated some, but not all, the content they carried would hobble the development of the internet. Two of them, Sen. Ron Wyden, Democrat of Oregon, and Chris Cox, then a Republican representative from California, proposed the legislation that became Section 230, which included the clause: “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”
At the time, no one imagined what the internet would become. Amazon wasn’t yet two years old, and Yahoo and eBay were also infants. Google would be incorporated two years later, Facebook six years after that, and Uber and Airbnb not for more than a decade after the law passed.
But that string of words ended up giving tech companies “incredibly broad immunity” that was essential to the growth of the platforms we use today, says Jeff Kosseff, author of a book about Section 230 titled “The Twenty-Six Words That Created the Internet.”
Get big fast
The law dovetailed with developments in technology and strategy. The spread of faster network connections and more computing power—especially after the advent of the smartphone—added more people to an ever more pervasive internet.
Leveraging that power, entrepreneurs seized on business models whose allure for users grew the more people used them, says Prof. Parker of Dartmouth—more content attracted more users, who made the platform more attractive to content creators, who drew more users.
Those factors intertwined with another key trait: The platforms’ rapid growth often required relatively little added cost—especially with the legal shield provided by Section 230. They were, in Silicon Valley parlance, highly scalable.
These qualities—which worked for social-network friends, sellers and buyers, drivers and riders, and so on—geared the companies toward rapid growth. And venture capitalists and entrepreneurs saw many of these markets as winner-take-all, making it essential to get bigger faster than the competition.
“You really are in survival mode,” says Tim Kendall, who from 2006 to 2010 was director of monetization at Facebook; he later was president of the online image-sharing company Pinterest Inc. He says that when he joined Facebook, it was far behind Myspace in users and it wasn’t clear it would make it. “When you’re in survival mode, you’re not compelled to zoom out and think about society and think about the higher-order implications, because you want to live to fight another day.”
Executives maintained this mind-set even as their companies grew into behemoths. Facebook settled into a persistent pattern over the past decade—long after it came to dominate social media—of launching some new product or feature only to apologize later after pushback about content, privacy and other issues. Uber, dueling with rival Lyft Inc., raced into city after city before regulators could adapt.
“Every bone in your body tells you that you’re David,” says Mr. Kendall, who now runs a startup called Moment that helps people reduce the time they spend on their smartphones. “And none of these companies know when they become Goliath. They almost are incapable of knowing that they’re Goliath, until they get smacked so hard by a regulator or a government or a fine.”
The backlash
The smacks are coming from several directions of late, and the platform companies are responding with both defiance and attempts at conciliation.
The flood of misinformation during the 2016 U.S. presidential election focused unprecedented public scrutiny on what responsibilities Facebook, Twitter, YouTube and other social platforms should take for the content they carried. Facebook Chief Executive Mark Zuckerberg, Twitter Chief Executive Jack Dorsey and other executives, hauled into congressional hearings, have pledged to do more. Facebook says it has been hiring thousands of new workers to beef up monitoring and safety on its platform—an effort made more complex, and costly, by the need to adapt its measures to the myriad markets where it operates world-wide.
Amazon says it spent $400 million last year to deal with counterfeit and unsafe items, and is prepared to spend billions more in coming years, after rapid growth in the number of outside vendors on its marketplace brought a surge of such products.
Users and investors, too, are proving warier of the platforms’ defenses.
The Wall Street Journal reported in March that Care.com Inc., the nation’s largest online marketplace for babysitters and other caregivers, provided only limited vetting of its caregivers, sometimes with tragic results. The company responded that it was a platform, and like other platforms it didn’t generally verify the information posted by users. Care.com’s shares plunged in the wake of the report, its CEO resigned, and, this month, it agreed to a sale to IAC/InterActiveCorp for around 60% of its earlier valuation.
Meanwhile, courts and lawmakers are setting new limits on the leeway afforded by Section 230. Last year, Congress overwhelmingly passed legislation removing immunity for online businesses that have facilitated the online sex business. The legislation, a response to a surge in online prostitution on sites including the now closed Backpage.com, was strongly opposed by internet industry groups, which warned that it could erode Section 230 protections.
In March, a federal appeals court in California rejected an effort by Airbnb and rival HomeAway, a unit of Expedia Group Inc., to invoke Section 230 as a defense against a Santa Monica, Calif., ordinance requiring them to ensure that local listings on their platforms complied with city rules.
Airbnb also is stepping up safety measures on its platform after issues including a deadly shooting during an out-of-control house party at a property rented out on Airbnb. The company had to “take more responsibility for the stuff on our platform,” Chief Executive Brian Chesky said about the changes last month. “This has been a gradual, maybe too gradual, transition for our industry.”
The battles over platform responsibility go beyond content. California passed a law set to take effect on Jan. 1 that could compel Uber and rivals to treat their drivers as employees entitled to benefits including minimum wage and paid sick days. Uber has said it doesn’t have to change its practices because of the law, and that it isn’t a ride-hailing company but a “technology platform for several different types of digital marketplaces.”
The platform businesses likely aren’t going anywhere. They’ve amassed many millions of users with services that connect people in useful new ways. Despite the backlash, many of them continue to grow.
But the model is clearly changing. Responsibility for what happens on platforms is no longer an afterthought. Scalability comes with constraints. Says Prof. Parker: “The age of regulation of platforms is upon us.”