A Blog by Jonathan Low

 

Jul 19, 2019

Let's Face It - People Want To Be Seen

Is technology adapting more to humanity, or is humanity adapting more to technology? JL


Colin Horgan reports in Medium:

The tech giants who own and control the platforms and tools that mediate our lives didn’t just monetize human behavior — they helped shape it. In adopting the new tech platforms, accepting and incorporating their underlying algorithms into our everyday lives, we have simultaneously been acclimatized to not just their operations, but the overlying expectations these tools carry. Primary among those expectations is not only that we should be seen, but that we should want to be seen.
Two weeks ago, the Economist’s Asian technology correspondent, Hal Hodson, started a Twitter thread. He was worried. “Something really massive is happening, and I feel like society is barely grasping the tendrils of the implications,” he wrote. “Technology is eroding one of the great levees of human society — the ability to move around the physical world anonymously.”
Hodson, like the rest of us, has reason to be concerned. Technology companies are deep into an ongoing shift from collecting endless data about what we do online to collecting endless data about what we’re doing offline too. The so-called “internet of things” is expanding, and the scope of what’s knowable about users — what we do, where we go, who we know, or what we look like — is widening quickly. That expansion is creating some unnerving scenarios.
There are products that scan students’ electronic communications, and others that track their movements, doorbell cameras that watch the streets, facial recognition replacing tickets at airports, programs to track employees, cameras that capture you at the mall and others at concerts, and apps that are surreptitiously recording your every location. The list goes on.
We can easily imagine the grander implications of it all, an end-state scenario. It’s so easily imaginable that living within an all-seeing, all-knowing infrastructure is the subject of some of our most celebrated fiction. And, if we still can’t quite wrap our heads around it, China offers a current, real-world example. In other words, we can see clearly into the future, and yet despite some efforts to retain our invisibility, by and large, we continue barreling onward.
Why?
It’s important to remember that the tech giants who own and control the platforms and tools that we depend on to mediate our lives didn’t just monetize human behavior — they helped shape it. They did so in myriad ways: by adding a layer of unpredictability to our everyday lives via systems of communication designed to confuse us; by introducing a new perspective on the commodification of our homes, cars, and other possessions; and even by changing the way we think about finding love.
But the overarching shift has been about values, about what inherently matters most to people. In adopting the new tech platforms, accepting and incorporating their underlying algorithms into our everyday lives, we have simultaneously been acclimatized to not just their operations, but the overlying expectations these tools carry. Primary among those expectations is not only that we should be seen, but that we should want to be seen.
As Taina Bucher explores in her book, If… Then: Algorithmic Power and Politics, we are generally in the dark when it comes to explaining the specific mechanics of the algorithms at the heart of our favorite apps, but what we do regularly guess at is how to make them notice us. As Bucher discovered in conversation with social media users, people tweak the content of their posts on Facebook, or even the time of day they post, in an attempt to catch the algorithm’s eye and ride the wave of its amplification. This is familiar to anyone who’s used social media, and it’s why news organizations repost old content when a topic is trending on Twitter, or why you see a million hashtags at the end of Instagram posts. Everyone is just trying to get noticed.
Because being noticed is vital. Platforms reward and validate participation in purposefully confusing ways to keep us guessing and, like a slot machine, returning for more. And the tech companies design them so that the experience can actually improve the more we use them. So we strive to be noticed. If we’re noticed, we might get more reactions on our Facebook posts, more requests on Airbnb, more swipes on Tinder, or simply more money on Depop. With more attention comes, potentially, a better life. And the opposite is also true: if you’re not seen, maybe you speak to fewer people, maybe you make less money, and maybe you never find your soulmate. With no attention, life can really suck, and that’s especially true now.
In a society mediated by online platforms, visibility is the only thing that matters. In fact, while we commonly think of the attention economy as hinging on the limited time we have to give on any one platform, it’s likely that the attention those platforms can give us is as valuable a commodity.
The thirst for visibility might not fully explain our complacency about the worrisome growth of surveillance technology — we might just be lazy — but it’s not surprising that so many people have come to accept ever closer examination of their lives: the tech giants have been training us for more than a decade.
For most people, deeper surveillance sounds less like a threat than like the route to a more personalized user experience. Which it is! But it will also mean something else: a change in how we operate within our society. Where we go and who we meet, what we say and do when we get there — or, in some cases, whether we get there at all — might all one day soon be monitored. In other words, what will actually change is our human experience. We can already see what that might mean. For one, society’s marginalized will face (and already do face) even greater targeting and profiling. And for everyone, but especially for those who already can’t afford it, the privilege of privacy will come at a steeper and steeper price — a kind of final bill we’ll all be asked to pay for years of free services that gave us seemingly unlimited VIP treatment on the house.
Still, while we may have found ourselves in a bad place, Hodson’s lament also points to an opposing trend: the growing resistance to surveillance tech. For instance, a recent test of movement tracking tech in New York schools ended after public outcry. Similarly, San Francisco banned police use of facial recognition technology, and there is a lively debate over whether Detroit should do the same. At a more personal level, as Wired reports, teenagers are, ironically, using TikTok, the newest popular social media app with its own security concerns, to vent their frustrations about Life360, a surveillance tool marketed to parents who want to keep tabs on their children’s whereabouts.
A full-scale rejection of surveillance tech seems unlikely — too much of it is already out there. And we should remain concerned about its increasing prevalence in our lives, lest we realize the worst of what we’ve imagined. But just as social values shifted toward emphasizing our personal visibility, they can, and probably will, shift back. Just as we accepted that social equity was measured in visibility, we can make obscurity valuable again. In other words, anonymity isn’t lost forever; it’s just invisible for now.
