A Blog by Jonathan Low

 

Nov 22, 2017

Google Has Picked An Answer For You: Too Bad So Many Are Wrong

While Google gets most answers right on a percentage basis, in absolute terms millions are wrong. The reason has to do with its algorithm and its sourcing: as the data below show, the share of answers drawn from less authoritative, but more remunerative, sources is rising. JL


Jack Nicas reports in the Wall Street Journal:

Google’s search engine answered 74.3% of 5,000 questions, and on those answers it had a 97.4% accuracy rate. Both percentages are higher than services from Amazon.com Inc., Apple Inc. and Microsoft Corp. Yet since Google handles trillions of queries a year, even a 2.6% error rate suggests Google serves billions of answers a year that are incomplete, irrelevant or wrong.
Google became the world’s go-to source of information by ranking billions of links from millions of sources. Now, for many queries, the internet giant is presenting itself as the authority on truth by promoting a single search result as the answer.
To the question “Does money buy happiness?” Google recently highlighted a result that stated: “There is enough scientific research to prove” it.
“Who are the worst CEOs of all time?” Google answered with the names and photos of 11 chief executives, including Gordon Bethune of Continental Airlines and Robert Nardelli of Home Depot Inc.
Sometimes, Google’s response depends on how the question is asked. For “Should abortion be legal?” Google cited a South African news site saying, “It is not the place of government to legislate against woman’s choices.”
When asked, “Should abortion be illegal?” it promoted an answer from obscure clickbait site listland.com stating, “Abortion is murder.”

[Chart: "Answer Engine." Google is reshaping its search results to directly answer users' questions rather than directing them to other sources. The chart plots the percentage of searches using new formats (featured snippets, knowledge cards and "People also ask" boxes) on a 0% to 30% scale, from July 2015 to February 2017.
Note: Stone Temple ran the same nearly 1.5 million queries in the given months and studied the results to see changes in Google's response.
Source: Stone Temple Consulting]
The promoted answers, called featured snippets, are outlined in boxes above other results and presented in larger type, often with images. Google’s voice assistant sometimes reads them aloud. They give Google’s secret algorithms even greater power to shape public opinion, given that surveys show people consider search engines their most-trusted source of information, over traditional media or social media.
Google typically lists the source below the answer—or credits the source first when reading an answer aloud—but not always. The worst-CEOs list was unsourced. “That’s the dumbest bunch of shit I’ve ever seen,” Mr. Bethune said in an interview. Mr. Nardelli declined to comment.
Also unsourced was an inaccurate answer that said former President Barack Obama and Rep. Peter King (R., NY) are Muslim members of Congress.
Google’s featured answers are feeding a raging global debate about the ability of Silicon Valley companies to influence society. Google and other internet giants are under intensifying scrutiny over the power of their products and their vulnerability to bias or manipulation.
Facebook Inc. was criticized for enabling the spread of false news reports during the 2016 presidential election, and Google has been called out for promoting discredited conspiracy theories, including about recent mass shootings in Nevada and Texas. Executives from Google, Facebook and Twitter Inc. were called before Congress in recent weeks to testify about Russia-backed accounts that used their platforms to sow misinformation. The companies say they are addressing the issues.
Google spokeswoman Susan Cadrecha said the company’s goal isn’t to do the thinking for users but “to help you find relevant information quickly and easily.” She added, “We encourage users to understand the full context by clicking through to the source.”
Featured snippets are “generated algorithmically and [are] a reflection of what people are searching for and what’s available on the web,” the company said in an April blog post. “This can sometimes lead to results that are unexpected, inaccurate or offensive.”
Google, a unit of Alphabet Inc., handles almost all internet searches. Featured snippets appear on about 40% of results for searches formed as questions, according to a July study completed for The Wall Street Journal by search-data firm SEMrush.
An algorithm chooses featured snippets from websites in part by how closely they appear to satisfy a user’s question, factoring in Google’s measure of a source’s authority and its ranking in the search results.
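As a rough illustration only, the selection logic described above can be pictured as a scoring function that weighs how closely a candidate passage matches the question against a measure of the site's authority and its position in the ordinary results. Google's actual system is proprietary; the names, weights and threshold in this Python sketch are invented for illustration.

# Hypothetical sketch of the selection logic described above.
# Google's real system is proprietary; the names, weights and
# similarity measure here are illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class Candidate:
    url: str
    passage: str          # text that might be shown as the snippet
    query_match: float    # 0-1: how closely the passage matches the question
    authority: float      # 0-1: proxy for the source's authority
    search_rank: int      # position of the page in the ordinary results

def snippet_score(c: Candidate) -> float:
    """Combine match, authority and rank into one score (illustrative weights)."""
    rank_bonus = 1.0 / c.search_rank          # higher-ranked pages score a bit better
    return 0.6 * c.query_match + 0.3 * c.authority + 0.1 * rank_bonus

def pick_featured_snippet(candidates):
    """Return the best-scoring candidate, or None if nothing scores well enough."""
    if not candidates:
        return None
    best = max(candidates, key=snippet_score)
    return best if snippet_score(best) > 0.5 else None   # arbitrary cutoff

A scheme like this also hints at why framing matters: a page whose wording mirrors the question ("Is milk bad for you?") scores well on the match term regardless of how balanced its content is.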
By answering questions directly, Google aims to make the search engine more appealing to users and the advertisers that chase them. The answers’ real estate is so attractive that there is a budding marketing industry around tailoring content so it becomes a featured snippet. (There is even a featured snippet for “how to get a featured snippet.”)
Digital-marketing firm Stone Temple Consulting, which tracked nearly 1.5 million searches, found that as Google expanded the use of featured snippets, it relied more often on less authoritative sources, such as purveyors of top-10 lists and gossipy clickbait.
Those issues have spurred an internal debate on Google’s search team over how much they should meddle with the featured answers, which the group believes carry greater weight with users than typical search results, according to a former manager on the team.
University of North Carolina professor Zeynep Tufekci, who studies technology’s effect on society, said Google shouldn’t put its trusted seal of approval on answers it isn’t certain are accurate. “For them to wield their algorithm like this is very worrisome,” she said. “This is how people learn about the world.”
Gummi Hafsteinsson, who oversees Google’s virtual assistant, said in an interview that teams of Google employees try to weed out inaccurate answers, but that the answers overall help inform because they are almost always right. “The kinds of things we can answer are unbelievable,” he said. No one at Google writes an answer for “how to remove a red stain from a carpet,” he said, but Google’s algorithm finds a solution on the web and serves it to users.
“I think the benefit is tremendously big,” he said. “It’s always a balancing act in terms of quality.”
A study this year by Stone Temple, a prominent analyst of the industry, showed Google’s search engine answered 74.3% of 5,000 questions, and on those answers it had a 97.4% accuracy rate. Both percentages are higher than services from Amazon.com Inc., Apple Inc. and Microsoft Corp.
Yet since Google handles trillions of queries a year, even a 2.6% error rate suggests Google serves billions of answers a year that are incomplete, irrelevant or wrong.
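A back-of-the-envelope calculation makes the scale concrete. The 2.6% error rate is from the Stone Temple study; the annual query volume and the share of queries that receive a direct answer below are assumptions chosen only to show the order of magnitude.

# Back-of-envelope check of the "billions of wrong answers" claim.
# Assumptions (not from the article): roughly 2 trillion queries per year,
# and 10-40% of queries receiving a direct answer.
queries_per_year = 2_000_000_000_000
error_rate = 0.026                      # 2.6% of answers judged wrong in the study

for answered_share in (0.10, 0.40):
    wrong = queries_per_year * answered_share * error_rate
    print(f"{answered_share:.0%} answered -> ~{wrong/1e9:.0f} billion wrong answers")
# Prints roughly 5 and 21 billion, consistent with "billions" a year.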
An Amazon spokeswoman said the company answers most questions with information from trusted third parties and that it is careful with answers on sensitive topics. An Apple spokeswoman said the company generally answers only questions for which it has clear factual answers. Microsoft said it aims to offer answers on its Bing search engine and Cortana virtual assistant that are “relevant, balanced and trustworthy.”
Google launched in 1998 and quickly attracted users. Its simple lists of blue links distilled the internet’s immense content, ranked by an algorithm based on how often websites cited each other.
In 2012, Google began answering basic factual queries with so-called knowledge cards, drawn from an internal encyclopedia called the knowledge graph that has more than a billion entries based on sources including Wikipedia and the Central Intelligence Agency’s World Factbook. For questions with clear answers, such as, “How tall is Shaq?” the knowledge graph proved reliable in helping users quickly find information. But it could handle only a small fraction of questions.
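The contrast with featured snippets is worth spelling out: a knowledge card answers from a curated store of structured, sourced facts, while a snippet extracts a passage from whichever page the algorithm favors. The toy lookup below is purely illustrative and is not Google's data model.

# Toy illustration of a knowledge-graph-style lookup: curated, structured
# facts with known sources. Not Google's data model, just a sketch of why
# such answers are reliable but cover only a narrow set of questions.
from typing import Optional

KNOWLEDGE_GRAPH = {
    ("Shaquille O'Neal", "height"): ("7 ft 1 in (2.16 m)", "Wikipedia"),
    ("France", "capital"): ("Paris", "CIA World Factbook"),
}

def knowledge_card(entity: str, attribute: str) -> Optional[str]:
    """Return a sourced answer if the fact is in the curated store, else None."""
    fact = KNOWLEDGE_GRAPH.get((entity, attribute))
    if fact is None:
        return None                     # fall back to ordinary search results
    value, source = fact
    return f"{value} (source: {source})"

print(knowledge_card("Shaquille O'Neal", "height"))   # curated fact: reliable
print(knowledge_card("milk", "is it good for you"))   # not in the graph: None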
At the time, the company was developing Google Glass, a wearable computer resembling eyeglasses that placed a tiny screen in the user’s peripheral vision. The screen couldn’t display a webpage or scroll through lists, but it could answer questions.
About five engineers from the several thousand employees on the search team began developing a question-answer system that pulled answers from a wide range of internet sources, according to the former search manager.
Google Glass was a flop, but the search team saw the value of the answer system and made it a focus, the person said.
That proved prescient. Today, Google is locked in a race with four other U.S. tech giants—Apple, Microsoft, Amazon and Facebook—to win users with more intelligent services, including virtual assistants, that they hope will make them more central to users’ lives, and open new opportunities to sell ads and products.
The assistants are often built into devices with smaller screens or no screens at all, including smartphones, watches and voice-controlled speakers like the Amazon Echo and Google Home, where long lists of ranked sources are impractical.
“Searching on a mobile device is very different from a desktop computer. Speed and simplicity really matter,” Alphabet Chairman Eric Schmidt said in 2014, the year Google launched featured snippets. “It’s why the best answer is quite literally the answer.”
With many tech companies betting virtual assistants are the future of computing, question-answer systems are likely to become more common, and wield more influence, in the years ahead.
When Google debuted featured snippets, they appeared on roughly one of every 1,000 searches, the former manager said. The team tweaked the algorithm to “squeeze a little bit more out of it” and increased the number of snippets by roughly 5% to 10% each month, the person said. Subjects expanded to include health, law, business, politics and religion.
Stone Temple’s data indicate Google over time has pulled more answers from less reliable sources. The firm judged the sources using third-party ratings that approximate how Google measures a site’s authority.
Answers generated from sources with a 90 or higher rating, such as Wikipedia or the Journal, dropped to less than 40% of featured snippets in February from more than 60% in July 2015. Answers from sites rated below 40, including blogs and clickbait sites, rose to 15% from 6.5% over the period.
Sites with low rankings can generate unreliable answers. To the query “Why are Komodo dragons endangered?” the featured answer was volcanoes, fire and tourism. The source? A Canadian elementary school student’s report posted online. Komodo dragons aren’t endangered.
Because Google’s algorithm seeks answers that closely match users’ questions, its responses often reflect how a question is framed. That can lead to different answers to similar questions and reinforce users’ existing biases.
A recent search for “Is milk good for you?” yielded an answer from a health site saying, “Milk can be good for the bones because it provides vitamin D and calcium.”
A separate search for “Is milk bad for you” featured an answer saying, “Calcium from animal milk is not absorbed as well as that from plant-based sources, and it can be accompanied by a number of dangerous health problems.” That came from People for the Ethical Treatment of Animals, a group that advocates against any consumption of animal products.
Jackson Miller, owner of a Nashville, Tenn., resale shop, said Google provided the incorrect dates of Tennessee’s tax-free weekend this summer, one of the busiest weekends for retailers.
Mr. Miller said he suspects some customers were fooled, because everyone trusts Google. “You treat it as a primary source,” he said. “It’s like, ‘Google says it, so it must be true.’ ”
Google is constantly changing its search results, so such results only appear some of the time. In the April blog post, Google said it returned “offensive or clearly misleading content” for one in every 400 queries. It said it would improve its algorithm and make it easier for users and employees to flag problem results.
The post came after several inaccurate and unsavory featured snippets got attention on social media, including results saying that several past U.S. presidents were Ku Klux Klan members and that women are evil.
[Chart: "Digital Smarts." Google's search engine answers more questions, at a higher accuracy rate, than other virtual assistants (Google Search, Google Assistant, Microsoft's Cortana, Apple's Siri and Amazon's Alexa), measured by the share of questions answered and the share of answers that were complete and correct.
Note: Based on 5,000 questions, tested in February and March this year. Wrong answers include responses analysts determined were incomplete, irrelevant or inaccurate.
Source: Stone Temple Consulting]
Ms. Cadrecha of Google told the Journal this week that the company recently changed its algorithm to limit featured snippets on sensitive topics, such as religion and politics.
Mr. Hafsteinsson of Google said the system is designed to avoid unanswerable queries, but while a question’s subjectivity may be obvious to humans, “it might not be to the algorithms.”
Meanwhile, Google has expanded another element—“People also ask” boxes—that serves up answers to questions similar to a given search. The product appears to rely on the same algorithm as featured snippets and can push misleading information on topics users weren’t even searching for.
To a search for “Are people born evil,” a box suggested the question, “Can a person be born homosexual?” Google, citing procon.org, a website that presents differing opinions on controversial topics, answered that while “many ex-gays” say they were born gay, “the reality is that no scientific evidence has established a genetic cause for homosexuality.”
In February, Google included “People also ask” boxes in 16.3% of its search results, up from 1.4% a year earlier, according to Stone Temple data.
Promoting such answers suggests “they’ve given it their stamp of approval, to say this is the one versus these are the 10,” said Pete Meyers, an analyst who studies Google results for the marketing-analytics firm Moz Inc. “People generally trust Google, but now these answers aren’t coming from a trusted source.”
