A Blog by Jonathan Low

 

Oct 15, 2019

How Pinterest Built One of the Most Successful Algorithms in Tech



Focus on what is working now - and in the future - not in the past. JL

Will Oremus reports in OneZero:

For the 50 million new users who join Pinterest each year, the code that powers Pinterest’s feed is similar in kind to that which underpins Facebook, YouTube, or TikTok. It’s the core product of a $15 billion company — the only one among a crop of tech unicorns whose stock has consistently traded above its IPO price. The company will roll out a feature to address its algorithm’s most visible flaw: its tendency to draw the wrong conclusions from users’ past behavior, and pollute their feeds with stuff they don’t want to see anymore.



Like most social networks, Pinterest was built on assumptions and biases. Unlike most social networks, Pinterest admits it.
From the start, you tell the company how to profile you. The service asks two personal questions when you register — your age and gender — and how you answer them shapes everything that happens next. Based on your responses, along with your language, region, and bits of your browsing history, Pinterest chooses an array of topic categories it thinks you might be interested in and asks you to pick at least five.
Tell Pinterest you’re a woman in your thirties, and your suggested interests will include “Makeup,” “Hair Tutorials,” “Workout Plans,” and “Dinner Recipes.” Tell it you’re a man in your thirties, and you’ll get some very different choices: “Woodworking,” “Funny Pictures,” “Survival Skills,” and “Gaming.” Or you can type your own response into a “Non-Binary” selection — it allows you to input anything — and you’ll get a set of gender-neutral-ish options like “Animals,” “Home Decor,” “Hairstyles” for women, “Men’s Hairstyles,” and “Coffin Nails.”
Once you’ve made your picks, Pinterest’s machine learning software crafts a home feed full of images, or “pins,” that it predicts will appeal to you. This is a crucial moment: Pinterest says its internal data shows that if people see pins they like right away, there’s a good chance they’ll become active users, returning to the site regularly for fresh content related to their interests, viewing ads tailored to those interests, and curating their own “boards” of related pins. If people fail to find anything that interests them at first glance, they may never come back.
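In recommendation-system terms, this first feed is a cold-start problem: the algorithm has no activity history yet, so the sign-up topic picks have to stand in for it. Below is a minimal sketch of what that first ranking pass could look like; the data structures, weights, and function names are illustrative assumptions, not Pinterest’s actual code.

```python
# Hypothetical cold-start ranking sketch: a brand-new user has no activity,
# so candidate pins are scored purely by overlap with the topics picked at
# sign-up, with popularity only breaking ties. All names and weights invented.

from dataclasses import dataclass

@dataclass
class Pin:
    pin_id: str
    topics: set[str]      # topic labels attached to the pin
    popularity: float     # e.g., normalized save rate across all users, 0..1

def cold_start_feed(picked_topics: set[str], candidates: list[Pin], k: int = 25) -> list[Pin]:
    """Rank candidate pins for a new user from their topic picks alone."""
    def score(pin: Pin) -> float:
        topic_match = len(pin.topics & picked_topics)   # how many picked topics the pin covers
        return topic_match + 0.1 * pin.popularity        # popularity as a tiebreaker
    return sorted(candidates, key=score, reverse=True)[:k]

feed = cold_start_feed(
    picked_topics={"Woodworking", "Gaming", "Coffee"},
    candidates=[
        Pin("p1", {"Woodworking", "DIY"}, 0.8),
        Pin("p2", {"Dinner Recipes"}, 0.9),
        Pin("p3", {"Gaming"}, 0.4),
    ],
)
print([p.pin_id for p in feed])   # p1 and p3 rank above the unrelated p2
```

Once a user starts saving and clicking, behavioral signals like the ones described later in this piece would quickly outweigh those initial picks.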
For the 50 million new users who join Pinterest each year, the sign-up process is the first taste of one of Silicon Valley’s most successful yet least scrutinized algorithms. The code that powers Pinterest’s home feed, search results, and notifications — determining what images and ideas users see at every turn — is similar in kind to that which underpins Facebook’s News Feed, YouTube’s recommendations, or TikTok’s For You page. It’s the core product of a $15 billion company that successfully went public this year — the only one among a crop of tech unicorns like Uber, Lyft, and Slack whose stock has consistently traded above its IPO price.
Behind the scenes, however, Pinterest’s engineers and executives are grappling with the same kinds of tensions that have caused trouble elsewhere. The company’s leaders say they want to map a different route to success in Silicon Valley, one that’s less meteoric and more humane. But in its first year as a public company, it faces a pivotal challenge: How to grow beyond a user base that has historically skewed toward white, suburban women without alienating loyalists, stereotyping newcomers, or potentially allowing for the spread of misinformation and radicalization.

The company will roll out a feature designed to address perhaps its algorithm’s most visible flaw: its tendency to draw the wrong conclusions from users’ past behavior, and pollute their feeds with stuff they don’t want to see anymore — like wedding dresses for a user who broke off her engagement, or nursery decor for a user who suffered a miscarriage. The feature, which Pinterest is calling the Home Feed Tuner, will let users review and manually edit their activity history and interests, essentially telling the algorithm what to remember and what to forget.
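Pinterest hasn’t published how the Home Feed Tuner works internally, but mechanically, “telling the algorithm what to forget” could be as simple as filtering excluded interests out of the activity history before it reaches the recommender. The sketch below assumes exactly that; the event format and helper function are hypothetical.

```python
# Illustrative sketch (not Pinterest's implementation): excluded interests are
# dropped from the activity log before interest inference, so past actions tied
# to them no longer influence the feed.

def effective_history(activity_log: list[dict], excluded_interests: set[str]) -> list[dict]:
    """Return only the past actions the user still wants used for recommendations."""
    return [
        event for event in activity_log
        if not (set(event["topics"]) & excluded_interests)   # hide events tied to muted interests
    ]

history = [
    {"pin": "wedding-dress-123", "topics": ["Wedding Dresses"]},
    {"pin": "succulent-planter-9", "topics": ["Gardening"]},
]
print(effective_history(history, excluded_interests={"Wedding Dresses"}))
# only the gardening event remains as input to the recommender
```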
It’s a feature that Pinterest expects will reduce complaints and raise satisfaction among a small subset of power users. But it will do little to help the site expand, and could even reduce engagement for those who use it by limiting the information available to the algorithm. It’s the kind of trade-off the company says it’s willing to make, especially since early tests showed no significant drop-off in user activity.



Pinterest plans to announce a new feature titled “Tune Your Home Feed” that lets you tell its algorithm which of your interests and behaviors you want used for future recommendations, and which ones you don’t. Credit: Pinterest

Other trade-offs are proving trickier, however, like how to understand users deeply enough to keep them coming back for more, without boring them, boxing them in, or creeping them out.
“Users don’t want to be pigeonholed,” says Candice Morgan, the company’s head of inclusion and diversity. She commissioned a study earlier this year to understand how Pinterest could better serve users from backgrounds that the platform underrepresents. “They don’t want us to guess what they’re going to like based on their demography,” she adds.
And yet, Pinterest does guess what they’re going to like based on their demography, at least in their first moments after sign-up. If it didn’t, some portion of users would decide Pinterest isn’t for them.
Then there are troubles that have plagued higher-profile social networks: viral misinformation, radicalization, offensive images and memes, spam, and shady sites trying to game the algorithm for profit, all of which Pinterest deals with to one degree or another. Here the company has taken a different approach than rival platforms: embrace bias, limit virality, and become something of an anti-social network.
So far, it’s working.





Founded in 2010 by three young male tech workers at Mountain View’s Hacker Dojo, Pinterest struggled at first to gain traction as a general-interest platform for sharing collections of images. That changed when Iowan co-founder Ben Silbermann attended a conference for female bloggers and influencers, who took to it instantly. The site bloomed across networks of women and suburbanites who found it ideal for sharing recipes, style tips, DIY projects, and home decor ideas.
Those early users shaped the site’s trajectory. As the company’s engineers followed the social media template by developing personalization algorithms that learned from users’ behavior, it was their interests and patterns of activity that the software absorbed. But relying too much on the specific data generated by these early users led to some problems. For example, you might stumble across a board full of wedding dresses in which the models are all white.
Initially, the home feed showed an assortment of the most popular pins from all users, based on the boards they followed, which was perfect for attracting like-minded newcomers, but not for diversifying the site’s appeal. “There was this misconception among men that Pinterest was just something women use for beauty content,” Morgan says, “even though a lot of the content was gender-neutral.”
“Would users want to proactively provide more about themselves to increase personalization? We found the answer is no — they just want the product to work for everyone.”
Over the years, Pinterest had to redesign its systems and retrain its algorithms to better identify and target different types of users and map their interests. Hence the question about gender when you sign up, the topic picker that gives the algorithm an initial sense of what you’re into, and the perhaps slightly intrusive (though industry standard) use of browser data that can tell Pinterest whether you’ve visited the site before and how you arrived there.
The question about language and region, for example, has helped Pinterest reach audiences outside the United States, who had previously complained that the platform “felt foreign to them from the moment they signed up.” Well over half of Pinterest’s users now come from outside the United States, which is in line with other social networks of its size. In some ways, those users are helping to point the way to a more inclusive Pinterest: In Japan, for instance, the company reports that men are as likely to become active users as women after visiting the site for the first time.



From the start, your experience of Pinterest is shaped by your choice of gender. Above are the interests it suggests for a new user who identifies as male and in his thirties. Credit: Pinterest

But dicing users into ever finer subgroups carries its own risks, especially for groups that have historically been underrepresented on the site. Internal data might tell you that welcoming male users with a bunch of macho images boosts activation rates. What it might not tell you is that some subset of male users is turned off, or even offended, by the implicit assumption that they’re into “man caves” or pictures of “beautiful celebrities” who are all women.
Pinterest is working on ways to help users see themselves in the product. In January, the company rolled out one of the first products to spring from a diversity initiative helmed by Morgan and Omar Seyal, Pinterest’s head of core product: a palette selector that lets you filter beauty results based on your skin tone.
It’s an admirable first step, but not a perfect one, according to the company’s research. “We wanted to understand, would users want to proactively provide more about themselves to increase personalization? We found the answer is no — they just want the product to work for everyone,” Morgan says.





Pinterest has never attracted as much media scrutiny as the likes of Twitter and Facebook, but that doesn’t mean it’s immune to the problems that have caused scandals elsewhere. One of its notable critics is Mike Caulfield, a media literacy and online communications expert at Washington State University Vancouver. In 2017, he went looking for political culture on Pinterest, and what he found was just about as ugly as what you’d expect on any other social platform. There were boards full of fake news, ethnic stereotypes, and QAnon conspiracy theories.
Caulfield argued that Pinterest’s aggressive recommendation algorithm, coupled with its reliance on user-created “boards” of related images, can turn a user’s feed into a hate-filled cesspool within minutes. “After just 14 minutes of browsing, a new user with some questions about vaccines could move from pins on ‘How to Make the Perfect Egg’ to something out of the Infowarverse,” Caulfield wrote.
Part of the problem, as explained by Middlebury College’s Amy Collier, is that spammers game Pinterest’s algorithm by putting viral political memes on the same board as, say, T-shirts they want to sell. When users engage with the memes, the algorithm shows them other items from the same board, on the theory that they might also be of interest. Eventually, it shows them the T-shirts, some fraction of them buys one, and the spammer profits.
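That exploit targets an ordinary collaborative-filtering pattern: pins that share a board are treated as related. A hedged sketch of that co-occurrence logic, with invented data structures, shows how a T-shirt ad can ride along with a viral meme; it is a simplification, not Pinterest’s actual recommender.

```python
# Sketch of board-based co-occurrence recommendation, the mechanism Collier
# describes spammers exploiting: engaging with one pin makes every other pin
# saved to the same boards a candidate. Data structures are invented.

from collections import Counter

def related_by_board(engaged_pin: str, board_members: dict[str, set[str]], k: int = 5) -> list[str]:
    """Recommend pins that share a board with the pin the user engaged with."""
    counts: Counter = Counter()
    for board, pins in board_members.items():
        if engaged_pin in pins:
            for other in pins - {engaged_pin}:
                counts[other] += 1            # more shared boards, stronger candidate
    return [pin for pin, _ in counts.most_common(k)]

boards = {
    "political-memes": {"viral-meme-1", "viral-meme-2", "tshirt-ad-1"},
    "shop-merch":      {"tshirt-ad-1", "tshirt-ad-2"},
}
print(related_by_board("viral-meme-1", boards))
# the spammer's t-shirt ad surfaces because it sits on the same board as the meme
```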
Caulfield says he’s accustomed to tech companies ignoring his critiques or getting defensive. So Pinterest’s reaction surprised him: They thanked him for highlighting the problem and invited him to meet with company executives and share ideas for how to solve it. And then, at least on the anti-vaxx issue, they followed through.
In August, Pinterest changed how its search engine treats queries about vaccines. Rather than surfacing the most popular vaccine-related pins, Pinterest said it would now show only pins from major health organizations, such as the World Health Organization and the CDC. Caulfield applauded the move, which amounted to a more decisive stand than most other platforms have taken. It showed that the company was willing to override its own software to address systemic problems that the algorithm alone can’t solve.
To what extent that approach will scale to all of the other problems that face a platform with 300 million users remains to be seen. But Pinterest seems to be willing to find out.
“The reality is, tech companies can’t do everything on Earth.”





The conventional wisdom among social media companies is that you can’t put too much of the onus on users to personalize their own feeds. Facebook ascended to near-global dominance by building a News Feed algorithm that knows better than users themselves what they’re likely to click on. Instagram and Twitter resisted algorithmic feeds for years, but both eventually embraced automation and saw their user bases and financial fortunes rise. Every action you take further refines the engagement optimization machine, and giving users access to its levers would only gum up the works.
Pinterest, like other social platforms, judges itself against metrics like monthly active users and activation rate, as a January blog post by its chief growth engineer makes clear. And historically, its algorithm has been no less relentless in honing users’ feeds to show them more and more of what they’ve engaged with in the past. The familiar criticisms apply: optimizing for engagement can lead to mindless or addictive scrolling, and it may also trap users in filter bubbles suffused with misinformation (or worse).
But what if optimizing engagement isn’t your ultimate goal? That’s a question some other social networks, such as Facebook and Twitter, have recently begun to ask, as they toy with more qualitative goals such as “time well spent” and “healthy conversations,” respectively. And it’s one that Seyal, Pinterest’s head of core product, says paved the way for the new feature the company is rolling out this week.
One of Pinterest users’ top complaints for years has been a lack of control over what its algorithm shows them, Seyal says. “You’d click on something, and your whole feed becomes that.” The question was how to solve it without putting the algorithm’s efficacy at risk. “Every person who runs a feed for an online platform will say, ‘Oh, yeah, we tried to make it more controllable. But when we tried to launch it, it dropped top-line engagement.’”
Eventually, Seyal says he decided that was the wrong question altogether. Instead, he told the engineers tasked with addressing the user-control problem that they didn’t have to worry about the effects on engagement. Their only job was to find a fix that would reduce the number of user complaints about the feed overcorrecting in response to their behavior.
The result of that project was “Tune Your Home Feed,” which Pinterest has already made available to some users. In allowing users to tweak how the algorithm responds to each of their actions, Pinterest will offer a level of customization that relatively few will care to employ. But Seyal says it became apparent in testing that those users overlapped heavily with the ones making the complaints. They also turned out to be some of Pinterest’s most loyal fans. And after all that, testing has yet to show any significant impact on engagement.
Now Seyal sees it as a lesson. “This is a call to other platforms to open up for their users. It’s a hard problem, but one people are increasingly craving good solutions for.”





Pinterest is now giving users more control, but like any social network that relies on algorithmically driven recommendations, it ultimately relies on a kind of bias. Unlike its peers, Pinterest welcomes it — as long as it’s the right kind.
“We are at the end of the day a user-generated-content platform,” Seyal says. “We can’t understand every single thing on the way in. We do have spammers, we do have people who want to use the platform to distribute what would be negative content. Some of it is adversarial.”
What the company can do to mitigate these problems, he says, is to look carefully at the types of content its system tends to amplify, and adjust the algorithm’s parameters to prioritize some over others.
For instance, Pinterest’s algorithm treats “saves” of a given pin as a much stronger positive signal than clicks. “People don’t really save an inflammatory article about the president, but they do save an outfit they want to buy in the future. So we’re biasing toward those types of interactions, and biasing away from interactions with your friends.”
Biasing away from interactions between friends might seem like an odd approach for a social media site. But Pinterest says it’s part of how the company has mitigated problems like harassment and viral propaganda. “Ultimately, we don’t see disinformation campaigns like other platforms do, because the algorithm just doesn’t reward it,” says Malorie Lucich, Pinterest’s head of product communications. “When you’re trying to massively spam or confuse people, you probably want that content to hit the ‘front page,’ and that’s just not going to happen as easily on Pinterest.”
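Taken together, those comments describe a weighted scoring scheme in which the type of interaction matters as much as the volume. The following sketch uses made-up weights purely to illustrate the idea of biasing toward saves and away from friend-driven interactions; it is not Pinterest’s actual model.

```python
# Minimal sketch of signal weighting as described above: saves count for much
# more than clicks, and friend-driven interactions are discounted. The exact
# weights are invented for illustration only.

SIGNAL_WEIGHTS = {
    "save": 5.0,          # strong intent: "I want this later"
    "click": 1.0,
    "friend_repin": 0.2,  # biased away from purely social interactions
}

def engagement_score(events: list[tuple[str, str]]) -> dict[str, float]:
    """Aggregate weighted engagement per pin from (pin_id, signal) events."""
    scores: dict[str, float] = {}
    for pin_id, signal in events:
        scores[pin_id] = scores.get(pin_id, 0.0) + SIGNAL_WEIGHTS.get(signal, 0.0)
    return scores

events = [("outfit-42", "save"), ("hot-take-7", "click"), ("hot-take-7", "friend_repin")]
print(engagement_score(events))   # the saved outfit far outranks the clicked article
```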
Even within the category of clicks, the company’s software treats clicks to what it considers “high-quality” sites as more valuable than clicks to other sites. Whenever Pinterest tests a change to the algorithm, Seyal says, it looks at how that change affects outbound traffic to a hand-chosen index of reputable sites that are focused on topics such as lifestyle, fashion, and home decor. (And whereas Facebook has tried to establish “trusted sources” by surveying users, Pinterest admits it relies on old-fashioned, subjective, human judgment.) If the change sends less traffic to those sites and more to other sites, the product team will investigate why that is. It might be a sign that the change has opened up a loophole for less reputable sites to game the algorithm.
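In experimentation terms, that hand-chosen index of reputable sites works as a guardrail metric. Assuming an experiment logs outbound clicks per domain, a check like the sketch below could flag ranking changes that shift traffic toward less reputable destinations; the domain list, threshold, and function names here are hypothetical.

```python
# Sketch of a guardrail check on outbound traffic: if a ranking experiment sends
# a noticeably smaller share of clicks to a curated list of reputable sites than
# the control does, flag it for investigation. Allowlist and tolerance invented.

REPUTABLE_DOMAINS = {"example-lifestyle.com", "example-fashion.com"}   # hypothetical allowlist

def reputable_share(outbound_clicks: dict[str, int]) -> float:
    """Fraction of outbound clicks going to the curated reputable-site list."""
    total = sum(outbound_clicks.values())
    good = sum(n for domain, n in outbound_clicks.items() if domain in REPUTABLE_DOMAINS)
    return good / total if total else 0.0

def flag_regression(control: dict[str, int], treatment: dict[str, int], tolerance: float = 0.02) -> bool:
    """True if the experiment sends noticeably less traffic to reputable sites."""
    return reputable_share(treatment) < reputable_share(control) - tolerance

control = {"example-lifestyle.com": 900, "sketchy-aggregator.biz": 100}
treatment = {"example-lifestyle.com": 700, "sketchy-aggregator.biz": 300}
print(flag_regression(control, treatment))   # True: investigate the ranking change
```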
That type of intentional modification of the software is something all major social platforms do, whether they acknowledge it or not. Critics such as Zeynep Tufekci make a persuasive case that many of social media’s problems flow from their reluctance to own up to the fundamental biases in their algorithms: not in favor of liberal politics, or conservative politics, but eyeballs — more and more eyeballs.
Companies like Google, Amazon, Facebook, and Uber are famous for their boundless ambitions and seemingly limitless growth. Seyal says Pinterest has big ideas, too. “We could get so much better at doing what we do: new formats, new kinds of interactions, things other than pins.” He believes the future of Pinterest’s algorithm involves not only reflecting users’ tastes and styles, but helping to shape them, the way top fashion brands do. He looks to Spotify’s human-curated playlists, such as the influential RapCaviar, as a model.
But then he pauses and backtracks. The key for Pinterest as it grows, he says, is to remember its own limitations. “I think we want to only be good at what we can be good at. If you want to have every user spend every moment in your product, there’s kind of a lack of humility there. The reality is, tech companies can’t do everything on Earth.”
