A Blog by Jonathan Low

 

Sep 20, 2019

Why Algorithms Encode the Subjectivities of Their Human Designers

The notion that algorithms, data, and technology are somehow objective and unbiased, neutral in all they survey, has become a matter of faith for those disillusioned with their human colleagues.

But the reality is that algorithms, the data from which they are derived, and the devices on which they are devised or reported are infused with the education, training, and belief systems of their creators. JL


Sidney Fussell reports in The Atlantic:

Algorithms interpret millions of data points, and the exact path from input to conclusion can be difficult to make plain. But the effects are clear. This is a powerful asymmetry: Anyone can notice a change in search results, but it’s difficult to prove what caused it. That gives algorithm designers deniability. Because of their opacity, algorithms can privilege or discriminate without their creators designing them to do so. Algorithms provide “strategic ignorance.” Try as companies might to minimize personal accountability, it is humans who build, train, and deploy algorithms. Human subjectivities are encoded every step of the way.



In aviation, the black box is a source of knowledge: a receptacle of crucial data that might help investigators understand catastrophe. In technology, a black box is the absence of knowledge: a catchall term describing an algorithmic system with mechanics its creators can’t—or won’t—explain. Algorithms, in this telling, are unknowable, uncontrollable, and independent of human oversight, even as they promote extremist content, make decisions affecting our health, or act in potential violation of antitrust law.
In investigative reports and international courts, Amazon, Google, and other tech platforms have been accused of tweaking their search algorithms to boost their own profits and sidestep antitrust regulations. Each company denies interfering with its respective search algorithm, and because of the murky mechanics of how search works, proving the allegations is nearly impossible.
Amazon allegedly adjusted its search algorithm to prioritize private-label products, which earn the company a higher profit, over items from competitors, The Wall Street Journal reported Monday. According to the Journal, Amazon reworked its search tool so that when results are ordered by relevance, items that are more profitable to Amazon appear ahead of items that might be more popular or relevant but would make it less money. These include the “Amazon Basics” line, which spans paper towels, batteries, and clothing. Amazon’s private-label business represents only 1 percent of its overall retail sales, but unnamed engineers told the Journal they were pressured by the private-label sales team to emphasize these products in search; they said they protested the change but were ignored.
Amazon admits it takes profitability into account when displaying products, but it denies reworking its search tool to privilege its own products over the competition. In an emailed statement to The Atlantic, an Amazon spokesperson wrote, “The fact is that we have not changed the criteria we use to rank search results to include profitability. We feature the products customers will want, regardless of whether they are our own brands or products offered by our selling partners. As any store would do, we consider the profitability of the products we list and feature on the site, but it is just one metric and not in any way a key driver of what we show customers.”



Amazon’s carefully worded denial shows how difficult it is to prove that a search algorithm exhibits favoritism or discrimination. Amazon need not include a profitability metric in its formula to increase the profitability of its business. The Journal’s report alleges that Amazon’s search algorithms instead take into account proxies for profitability. The company makes more money from certain products than others, and if the more lucrative products have something in common (maybe where they’re sold or how they’re packaged), that shared variable could theoretically scale with profitability, so ranking on it would raise profits without ever naming them. Engineers told the Journal that Amazon’s review committee wouldn’t approve any changes to the algorithm unless they also increased profitability.
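To make the proxy mechanism concrete, here is a minimal sketch in Python. Everything in it is invented for illustration (the item names, the fulfilled_in_house flag, and the weights are assumptions, not Amazon's actual formula); the point is only that a score can omit profit entirely and still sort by it whenever one of its inputs happens to correlate with margin.

    # Hypothetical ranking sketch. No "profit" term appears in the score,
    # but the fulfilled_in_house flag happens to correlate with margin.
    items = [
        {"name": "Brand X batteries", "relevance": 0.94, "fulfilled_in_house": 0, "margin": 0.08},
        {"name": "Basics batteries",  "relevance": 0.90, "fulfilled_in_house": 1, "margin": 0.25},
        {"name": "Brand Y batteries", "relevance": 0.92, "fulfilled_in_house": 0, "margin": 0.10},
    ]

    def score(item):
        # The formula "knows" only relevance and a fulfillment flag.
        return 0.8 * item["relevance"] + 0.2 * item["fulfilled_in_house"]

    # Sorting by that score still puts the high-margin item first,
    # even though it is the least relevant of the three.
    for item in sorted(items, key=score, reverse=True):
        print(item["name"], round(score(item), 3), "margin:", item["margin"])

An auditor reading score() would find nothing about profit, which is precisely the deniability described above: the favoritism lives in the choice of inputs, not in the text of the formula.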
Amazon is incredibly good at making money, which is why the possibility that it’s privileging its own products in its search tool is ringing alarm bells. Amazon controls the marketing, manufacturing, and distribution of its private-label products as well as the marketplace they’re sold on. Rigging the search algorithm, if the allegations are true, would only compound its advantages.
Online marketplaces have lately been under renewed scrutiny from regulators, both in the U.S. and abroad, as part of a larger conversation about breaking up tech companies and imposing stiffer penalties for anticompetitive behavior. In 2017, the European Union fined Google’s parent company, Alphabet, $2.7 billion for allegedly pushing independent comparison-shopping sites to the second or third page of search results while displaying its own widget at the very top of results. Merchants pay Google when users click on the ads in that widget. And earlier this month, The New York Times reported that Apple reworked its App Store algorithm to unfairly privilege its own apps, suggesting Apple Music over Spotify when users searched for “music,” for example. (Apple denied this to the Times and, in an emailed statement to The Atlantic, reiterated that its algorithm takes a host of factors into account.)
Algorithms interpret potentially millions of data points, and the exact path from input to conclusion can be difficult to make plain. But the effects are clear. This is a very powerful asymmetry: Anyone can notice a change in search results, but it’s extremely difficult to prove what caused it. That gives algorithm designers immense deniability.

In 2016, for example, Bloomberg reported that Amazon Prime was much less likely to offer same-day service to predominantly black and Latino neighborhoods in Boston and New York. The algorithm that determines eligible neighborhoods, Amazon explained, was designed to estimate the cost of providing Prime, based on an area’s distance from warehouses, number of Prime members, and so on. That explanation, Bloomberg reported, was a shield for the human designers’ choice to ignore how race and poverty correlate with housing, and how that inequality is replicated in Amazon’s products.
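The same proxy dynamic is easy to reproduce in miniature. In the sketch below, a purely cost-based eligibility rule never consults demographics; the ZIP codes, distances, member counts, weights, and threshold are all invented for illustration, not Amazon's real model. But if distance to a warehouse and Prime-member density track historical patterns of segregation, the rule's output will track them too.

    # Hypothetical eligibility sketch. The rule sees only cost inputs,
    # never race, yet those inputs can mirror residential segregation.
    # All values below are invented for illustration.
    areas = [
        # (zip_code, miles_to_warehouse, prime_members_per_1000)
        ("00001", 4.0, 120),
        ("00002", 6.5, 40),   # farther out, fewer existing members
        ("00003", 3.5, 95),
    ]

    def same_day_eligible(miles, members_per_1000):
        # Estimated cost rises with distance and falls with member density;
        # serve an area only when the estimate clears a fixed threshold.
        estimated_cost = 2.0 * miles - 0.05 * members_per_1000
        return estimated_cost < 5.0

    for zip_code, miles, members in areas:
        status = "eligible" if same_day_eligible(miles, members) else "excluded"
        print(zip_code, status)

Nothing in same_day_eligible() mentions race, but because its inputs sit downstream of housing inequality, the exclusions land where that inequality already lives.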
Because of their opacity, algorithms can privilege or discriminate without their creators designing them to do so, or even being aware of it. Algorithms provide those in power with what’s been termed “strategic ignorance”—essentially, an excuse that arises when it’s convenient for the powerful not to know something. The antitrust movement is built on trying to locate humans somewhere within Big Tech’s enormous, corporatized systems. Try as companies might to minimize personal accountability, it is humans who build, train, and deploy algorithms. Human biases and subjectivities are encoded every step of the way.
