A Blog by Jonathan Low

 

Apr 20, 2019

Digital Designers Are Creating 'Dark Patterns' To Mislead Consumers Into Buying

It's all part of the psychological power of 'choice architecture' increasingly driving digital behavior. JL

Brian Fung reports in the Washington Post:

“Choice architecture,” or the way in which choices are presented to consumers, can shape their subsequent behavior. “Dark patterns” (are) ways in which Web designers steer users toward completing transactions, such as signing up for an email newsletter, making a purchase or consenting to the collection or sharing of personal information. The rise of dark patterns reflects how tech companies have increasingly turned human psychology into a moneymaking tool at the expense of consumers’ ability to make informed choices.
Two U.S. senators unveiled new legislation Tuesday targeting what they say are deceptive tricks, employed by websites and tech companies, that are designed to mislead or confuse Internet users into giving away their rights and choices as consumers.
The bill is another salvo in a widening congressional effort to rein in the tech industry, whose data breaches and other privacy mishaps have prompted calls for tougher regulation of Silicon Valley.
The legislation, known as the DETOUR Act and introduced by Sens. Mark Warner (D-Va.) and Deb Fischer (R-Neb.), zeroes in on a phenomenon known as “dark patterns”: The various ways in which Web designers subtly steer users toward completing certain transactions, such as signing up for an email newsletter, making a purchase or consenting to the collection or sharing of personal information.
The rise of dark patterns reflects how tech companies have increasingly turned human psychology into a moneymaking tool — at the expense of consumers’ ability to make truly informed choices, Fischer said in a statement.
“Misleading prompts to just click the ‘OK’ button can often transfer your contacts, messages, browsing activity, photos, or location information without you even realizing it,” she said.
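To make the pattern concrete, here is a minimal, hypothetical sketch of the kind of consent prompt Fischer describes. It is written for illustration only and is not taken from the article or from any real product: the data-sharing boxes are pre-checked, the "OK" button is prominent, and the decline path is a small, vaguely labeled link.

```typescript
// Hypothetical "dark pattern" consent dialog, for illustration only.
// The sharing options default to ON, accepting is visually prominent,
// and declining is buried in a de-emphasized link.

interface ConsentChoices {
  shareContacts: boolean;
  shareLocation: boolean;
}

function renderConsentDialog(onDone: (c: ConsentChoices) => void): void {
  const dialog = document.createElement("div");
  dialog.innerHTML = `
    <p>Help us improve your experience!</p>
    <label><input type="checkbox" id="contacts" checked> Sync my contacts</label>
    <label><input type="checkbox" id="location" checked> Share my location</label>
    <button id="ok" style="font-size:1.2em">OK</button>
    <a href="#" id="decline" style="font-size:0.7em;color:#999">other options</a>
  `;
  document.body.appendChild(dialog);

  // A user who clicks "OK" without reading consents to everything,
  // because both boxes start out checked.
  dialog.querySelector<HTMLButtonElement>("#ok")!.onclick = () => {
    onDone({
      shareContacts: dialog.querySelector<HTMLInputElement>("#contacts")!.checked,
      shareLocation: dialog.querySelector<HTMLInputElement>("#location")!.checked,
    });
  };

  // The decline path is intentionally harder to notice and vaguely labeled;
  // this asymmetry is what the bill characterizes as impairing user choice.
  dialog.querySelector<HTMLAnchorElement>("#decline")!.onclick = () =>
    onDone({ shareContacts: false, shareLocation: false });
}
```

The point is not any single element but the stacked asymmetries: a prominent accept button, pre-checked boxes, and a decline option that never plainly says "no."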
On Tuesday, Warner launched into a series of tweets showing how dark patterns are commonly found across the Internet.
But dark patterns, and the logic behind them, are hardly a new idea. More than a decade ago, University of Chicago economist Richard Thaler and Harvard University law professor Cass Sunstein helped shed light on the psychological aspects of decision-making with their 2008 book “Nudge.”
The book explored how “choice architecture,” or the way in which choices are presented to consumers, can powerfully shape their subsequent behavior. Examples included how, by automatically enrolling their employees in a 401(k), companies could help increase Americans’ retirement savings.
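A toy sketch of why the default matters (the numbers are illustrative, not drawn from Thaler and Sunstein's data): if most people never touch a setting, whatever the designer pre-selects becomes the outcome for that majority.

```typescript
// Toy model of choice architecture: the share of users who keep the
// default dominates the aggregate outcome. Numbers are illustrative only.

interface Plan {
  enrolledByDefault: boolean; // what the designer pre-selects
  changeRate: number;         // fraction of people who actively change the default
}

function enrollmentRate(plan: Plan): number {
  // People who change the default flip it; everyone else keeps it.
  return plan.enrolledByDefault ? 1 - plan.changeRate : plan.changeRate;
}

// Same population, same preferences, different default:
console.log(enrollmentRate({ enrolledByDefault: false, changeRate: 0.3 })); // 0.3
console.log(enrollmentRate({ enrolledByDefault: true,  changeRate: 0.3 })); // 0.7
```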
How companies ask consumers to make choices online is becoming increasingly important as more firms turn to personal data as a business model, analysts say. Nowhere is that more evident than in the tech industry, where giants such as Facebook and Google have built multibillion-dollar products out of the data that’s generated when users click on ads and enter search terms.
Without naming those businesses in particular, Tuesday’s bill appears to focus on the largest tech companies, aiming to make it illegal for firms with more than 100 million users to create user interfaces “with the purpose or substantial effect of obscuring, subverting, or impairing user autonomy, decision-making, or choice to obtain consent or user data.”
Under the proposal, tech companies would also be required to set up independent review boards akin to those on college campuses that oversee human research studies, in order to perform testing on user engagement.
“Our choice architectures are just completely muddled and clouded by the little tricks companies play to get you to consent, even though you may not want to,” said Paul Ohm, a law professor at Georgetown University, at a Washington conference on digital privacy Tuesday hosted by the Federal Trade Commission.
The Internet Association, a trade group that represents Silicon Valley’s biggest firms in Washington, declined to comment.
