A Blog by Jonathan Low

 

Aug 19, 2018

Smartphones and the Attention Economy

It might better be called the not-paying-attention economy. And there are operational and financial consequences we are only beginning to understand. JL


Casey Schwartz reports in the New York Times:

“We don’t understand how modern technology and changes in our culture impact our ability to sustain our attention on our goals.” When Facebook or Google points their “supercomputers” toward our minds, he said, “it’s checkmate.” Devices that come with us everywhere we go introduce a brand new dynamic: Rather than compete with their siblings for their parents’ attention, children are up against iPhones and iPads, Siri and Alexa, Apple watches and computer screens.
Earlier this month, Facebook and Instagram announced new tools for users to set time limits on their platforms, and a dashboard to monitor one’s daily use, following Google’s introduction of Digital Wellbeing features. It was the big tech equivalent of “drink responsibly” or the gambling industry’s “safer play”; the latest milestone in Silicon Valley’s year of apology.
In doing so, the companies seemed to suggest that spending time on the internet is not a desirable, healthy habit but a pleasurable vice: one that, if left uncontrolled, may slip into unappealing addiction.
Having secured our attention more completely than ever dreamed, they now are carefully admitting it’s time to give some of it back, so we can meet our children’s eyes unfiltered by Clarendon or Lark; go see a movie in a theater; or contra Apple’s ad for its watch, even go surfing without — heaven forfend — “checking in.”
“The liberation of human attention may be the defining moral and political struggle of our time,” writes James Williams, a technologist turned philosopher and the author of a new book, “Stand Out of Our Light.”

Mr. Williams, 36, should know. During a decade-long tenure at Google, he worked on search advertising, helping perfect a powerful, data-driven advertising model. Gradually, he began to feel that his life story as he knew it was coming unglued, “as though the floor was crumbling under my feet,” he writes.
Increasingly public incidents of attention failure, like Pablo Sandoval, then the Red Sox third baseman, getting busted (and suspended) for checking Instagram in the middle of a game, or Patti LuPone taking away an audience member’s phone, both in 2015, would likely come as no surprise to him.
Mr. Williams compares the current design of our technology to “an entire army of jets and tanks” aimed at capturing and keeping our attention. And the army is winning. We spend the day transfixed by our screens, thumb twitching in the subways and elevators, glancing at traffic lights.
We flaunt and then regret the habit of so-called second screening, when just one at a time isn’t enough, scrolling through our phones’ latest dispatches while watching TV, say.
One study, commissioned by Nokia, found that, as of 2013, we were checking our phones on average 150 times a day. But we touch our phones about 2,617 times a day, according to a separate 2016 study conducted by Dscout, a research firm.
Apple has confirmed that users unlock their iPhones an average of 80 times per day. Screens have been inserted where no screens ever were before: over individual tables at McDonald’s; in dressing rooms when one is most exposed; on the backs of taxi seats. For only $12.99, one can purchase an iPhone holster for one’s baby stroller … or (shudder) two.
This is us: eyes glazed, mouth open, neck crooked, trapped in dopamine loops and filter bubbles. Our attention is sold to advertisers, along with our data, and handed back to us tattered and piecemeal.
Image credit: Lucy Jones

You’ve Got Chaos

Mr. Williams was speaking on Skype from his home in Moscow, where his wife, who works for the United Nations, has been posted for the year.
Originally from Abilene, Tex., he had arrived to work at Google in what could still be called the early days, when the company, in its idealism, was resistant to the age-old advertising model. He left Google in 2013 to conduct doctoral research at Oxford on the philosophy and ethics of attention persuasion in design.
Mr. Williams is now concerned with overwired individuals losing their life purpose.
“In the same way that you pull out a phone to do something and you get distracted, and 30 minutes later you find that you’ve done 10 other things except the thing that you pulled out the phone to do — there’s fragmentation and distraction at that level,” he said. “But I felt like there’s something on a longer-term level that’s harder to keep in view: that longitudinal sense of what you’re about.”

He knew that among his colleagues, he wasn’t the only one feeling this way. Speaking at a technology conference in Amsterdam last year, Mr. Williams asked the designers in the room, some 250 of them, “How many of you guys want to live in the world that you’re creating? In a world where technology is competing for our attention?”
“Not a single hand went up,” he said.
Mr. Williams is also far from the only example of a former soldier of big tech (to continue the army metaphor) now working to expose its cultural dangers.
In late June, Tristan Harris, a former design ethicist for Google, took the stage at the Aspen Ideas Festival to warn the crowd that what we are facing is no less than an “existential threat” from our very own gadgets.
Red-haired and slight, Mr. Harris, 34, has been playing the role of whistle-blower since he quit Google five years ago. He started the Center for Humane Technology in San Francisco and travels the country, appearing on influential shows and podcasts like “60 Minutes” and “Waking Up,” as well as at glamorous conferences like Aspen, to describe how technology is designed to be irresistible.
He likes a chess analogy. When Facebook or Google points their “supercomputers” toward our minds, he said, “it’s checkmate.”

Back in the more innocent days of 2013, when Mr. Williams and Mr. Harris both still worked at Google, they’d meet in conference rooms and sketch out their thoughts on whiteboards: a concerned club of two at the epicenter of the attention economy.
Since then, both men’s messages have grown in scope and urgency. The constant pull on our attention from technology is no longer just about losing too many hours of our so-called real lives to the diversions of the web. Now, they are telling us, we are at risk of fundamentally losing our moral purpose.
“It’s changing our ability to make sense of what’s true, so we have less and less idea of a shared fabric of truth, of a shared narrative that we all subscribe to,” Mr. Harris said, the day after his Aspen talk. “Without shared truth or shared facts, you get chaos — and people can take control.”
They can also profit, of course, in ways large and small. Indeed, a whole industry has sprung up to combat tech creep. Once-free pleasures like napping are now being monetized by the hour. Those who used to relax with monthly magazines now download guided-meditation apps like Headspace ($399.99 for a lifetime subscription).
HabitLab, developed at Stanford, stages aggressive interventions whenever you enter one of your self-declared danger zones of internet consumption. Having a problem with Reddit sucking away your afternoons? Choose between the “one-minute assassin,” which puts you on a strict 60-second egg timer, and the “scroll freezer,” which creates a bottom in your bottomless scroll — and logs you out once you’ve hit it.
Like Moment, an app that monitors screen time and sends you or loved ones embarrassing notifications detailing exactly how much time has been frittered away on Instagram today, HabitLab gets to know your patterns uncomfortably well in order to do its job. The only catch: we now need our phones to save us from our phones.
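For readers curious what an intervention like HabitLab’s “scroll freezer” might look like under the hood, here is a minimal sketch of the general idea, written as a hypothetical browser content script in TypeScript. It is not HabitLab’s actual code; the scroll budget and the redirect target are invented for illustration.

// Hypothetical sketch of a "scroll freezer"-style intervention.
// Not HabitLab's real implementation; the values below are invented.

const MAX_SCROLL_PIXELS = 20_000; // illustrative budget: a few screens of feed

let scrolledSoFar = 0;
let lastY = window.scrollY;

window.addEventListener("scroll", () => {
  // Accumulate how far the user has scrolled, in either direction.
  scrolledSoFar += Math.abs(window.scrollY - lastY);
  lastY = window.scrollY;

  if (scrolledSoFar > MAX_SCROLL_PIXELS) {
    // "Create a bottom in the bottomless scroll": freeze the page...
    document.body.style.overflow = "hidden";
    // ...and send the user somewhere less sticky (a stand-in for logging out).
    window.location.href = "about:blank";
  }
});

The “one-minute assassin” would presumably take the same shape, only keyed to a 60-second timer rather than to scroll distance.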

Researchers have known for years that there’s a difference between “top-down” attention (the voluntary, effortful decisions we make to pay attention to something of our choice) and “bottom-up” attention, which is when our attention is involuntarily captured by whatever is going on around us: a thunderclap, gunshot or merely the inviting bleep that announces another Twitter notification.
But many of the biggest questions remain unanswered. At the top of that list is no smaller a mystery than “the relationship between attention and our conscious experience of the world,” said Jesse Rissman, a neuroscientist whose lab at U.C.L.A. studies attention and memory.
Also unclear: the consequence of all that screen time on our bedraggled neurons. “We don’t understand how modern technology and changes in our culture impact our ability to sustain our attention on our goals,” Dr. Rissman said.
Britt Anderson, a neuroscientist at the University of Waterloo in Canada, went so far as to write a 2011 paper titled “There Is No Such Thing as Attention.”
Dr. Anderson argued that researchers have used the word to apply to so many different behaviors — attention span, attention deficit, selective attention and spatial attention, to name a few — that it has become essentially meaningless, even at the very moment when it’s more relevant than ever.

Are the Kids … All Right?

Despite attention’s possible lack of existence, though, many among us mourn its passing — Ms. LuPone, for example, and others who command an audience, like college professors.

Katherine Hayles, an English professor at U.C.L.A., has written about the change she sees in students as one from “deep attention,” a state of single-minded absorption that can last for hours, to one of “hyper attention,” which jumps from target to target, preferring to skim the surface of lots of different things rather than to probe the depths of just one.
At Columbia University, where every student is required to pass a core curriculum with an average of 200 to 300 pages of reading each week, professors have been discussing how to deal with the conspicuous change in students’ ability to get through their assignments. The curriculum has more or less stayed in place, but “we’re constantly thinking about how we’re teaching when attention spans have changed since 50 years ago,” said Lisa Hollibaugh, a dean of academic planning at Columbia.
In the 1990s, 3 to 5 percent of American school-aged children were thought to have what is now called attention deficit hyperactivity disorder. By 2013, that number was 11 percent, and rising, according to data from the National Survey of Children’s Health.
At Tufts University, Nick Seaver, an anthropology professor, just finished his second year of teaching a class he designed called How to Pay Attention. But rather than offering tips for focusing, as one might expect, he set out to train his students to look at attention as a cultural phenomenon — “the way people talk about attention,” Dr. Seaver said, with topics like the “attention economy” or “attention and politics.”
As part of their homework for the “economy” week, Dr. Seaver told his students to analyze how an app or website “captures” their attention and then profits from it.
Morgan Griffiths, 22, chose YouTube. “A lot of the media I consume has to do with ‘RuPaul’s Drag Race,’” Mr. Griffiths said. “And when a lot of those videos end, RuPaul himself pops up at the very end and says, ‘Hey friends, when one video ends, just open the next one, it’s called binge viewing, go ahead, I encourage you.’”

A classmate, Jake Rochford, who chose Tinder, noted the extreme stickiness of a new “super-like” button. “Once the super-like button came into play, I noticed all of the functions as strategies for keeping the app open, instead of strategies for helping me find love,” Mr. Rochford, 21, said. After completing that week’s assignment, he disabled his account.
But Dr. Seaver, 32, is no Luddite.
“Information overload is something that always feels very new but is actually very old,” he said. “Like: ‘It is the 16th century, and there are so many books.’ Or: ‘It is late antiquity and there is so much writing.’
“It can’t be that there are too many things to pay attention to: That doesn’t follow,” he said. “But it could be that there are more things that are trying to actively demand your attention.”
And there is not only the attention we pay to consider, but also the attention we receive.
Sherry Turkle, the M.I.T. sociologist and psychologist, has been writing about our relationship with our technology for decades. Devices that come with us everywhere we go, she argues, introduce a brand new dynamic: Rather than compete with their siblings for their parents’ attention, children are up against iPhones and iPads, Siri and Alexa, Apple watches and computer screens.
Every moment they spend with their parents, they are also spending with their parents’ need to be constantly connected. It is the first generation to be so affected — now 14 to 21 years old — that Dr. Turkle describes in detail in her most recent book, “Reclaiming Conversation.”
“A generation has grown up that has lived a very unsatisfying youth and really does not associate their phones with any kind of glamour, but rather with a sense of deprivation,” she said.
And yet Dr. Turkle is cautiously optimistic. “We’re starting to see people inching their way toward ‘time well spent,’ Apple inching its way toward a mea culpa,” she said. “And the culture itself turning toward a recognition that this can’t go on.”
