A Blog by Jonathan Low


Jul 17, 2016

Why You Should Be Nice To Your Robots

Do unto others as you would have them do unto you: especially when they are designed to learn from the behaviors they observe and adapt accordingly. JL

Oliver Burkeman comments in the Guardian:

I became highly confused the first time I used the Amazon Echo, a voice-activated “smart home assistant” that sits in the corner and responds to the name Alexa – as in “Alexa, play some music!” or “Alexa, how many ounces in a kilogram?” Partly, this was because the only person I know who owns an Echo is herself called Alexa, and she was home at the time. But that aside, it’s hard to bark orders at a machine without feeling like the kind of obnoxious person who barks orders at waiters. That is, unless you start young. “We love our Amazon Echo… but I fear it’s also turning our daughter into a raging asshole,” the Silicon Valley investor Hunter Walk fretted recently. Alexa doesn’t need you to say please or thank you; indeed, she responds better to brusque commands. “Cognitively, I’m not sure a kid gets why you can boss Alexa around, but not a person,” Walk wrote. How’s a four-year-old supposed to learn that other household members aren’t simply there to do her bidding, when one (electronic) household member was designed to do exactly that?
Such worries will grow more urgent as we interact with more convincingly humanesque devices. As the tech writer John Markoff puts it: “What does it do to the human if we have a class of slaves which are not human, but that we treat as human?” Most of us would agree with Immanuel Kant that it’s unethical to treat others as mere means to our own ends, instead of ends in themselves. That’s why slavery damages the slaveholder as well as the slave: to use a person as if they were an object erodes your own humanity. Yet Alexa (like Google Home, Siri and the rest) trains us to think of her as human, yet solely there to serve. Might we start thinking of real humans that way more frequently, too?
Or maybe I’m just a curmudgeon, convinced – like every generation in history – that society’s getting coarser, when really it isn’t. At first glance, that’s what you might conclude from a new study, highlighted on the Research Digest blog, about how we interpret everyday rudeness. People probably aren’t getting ruder, the researchers argue. Rather, it’s that we interpret the behaviour of strangers as ruder than our own, or our friends’. (If I snap at a shop assistant, it’s because I’m having a bad day; if someone insults me on Twitter, it’s because they’re a terrible person.) And thanks to urbanisation, globalisation and the internet, we encounter more strangers than ever. So we wrongly conclude rudeness is on the rise.
Then again, maybe this just proves another point: empathy is hard, and it’s only going to get harder. It’s one thing to act with compassion when you spend your days with family and neighbours, as people once did. It’s quite another when we interact so much with strangers – against whom we’re biased to begin with – and with various human-like devices, which invite us to order them around. It demands real effort to remember that humans (even the awful ones) are humans. And that computers (even the subserviently friendly ones) aren’t.
