Because, research informs us, we bring human norms, biases and expectations to our interactions with almost everything around us, especially devices designed to communicate with us. Which makes us both endearing and rather vulnerable.
Or, if we are being a bit harsher, it makes us look like suckers. But if it facilitates commercial and personal interfaces, reduces friction, enhances effectiveness and efficiency, or otherwise improves the quality of the experience while heightening convenience, it's probably worth investing in. As the following article explains, we are governed by the cultures in which we are raised. They provide us with markers and frameworks that make life safer and easier. Once upon a time those were survival mechanisms. Now they perform some of those same functions, but with less apocalyptic consequences when failures occur. The reality is that such guidance helps with adaptability and reinforces positive experiences. All of which makes your iPhone saying 'sorry' both a commercial and an interpersonal advantage. JL
Alex Mayyasi reports in Priceonomics:
For over a decade, computer scientists, psychologists, and designers have studied the many ways in which people consistently treat their computers like human beings. As early as 2000, they confidently concluded that “individuals mindlessly apply social rules and expectations to computers.”
During the dial-up years, AOL greeted us with a cheery, “Welcome!” Today, Siri apologizes when she doesn’t understand a query. And this author’s old cell phone never failed to say “hello” and “goodbye” when it turned on and off. Who decided that we need our soulless machines to act like eager-to-please friends? It’s not like we think they’re human.
Except that we kind of do. Even though we are fully aware that electronics are not human, they act human enough (by being interactive, responding with words at times, and performing roles often filled by humans, like answering questions and giving directions) that we draw on expectations and scripts from social interactions. Let’s look at a few examples given by Professors Clifford Nass and Youngme Moon of Stanford and Harvard, respectively.

One way that we treat computers like humans is by applying human stereotypes. In one experiment, a computer with either a male or female voice tutored research participants, who then rated their virtual tutor. Conforming to gender stereotypes and biases, they rated the male computer more knowledgeable about technology and the female computer more knowledgeable about love and relationships.

In another experiment, Korean participants were presented with a hypothetical situation and then advised on the best course of action by a computer that had either a Korean voice and a Korean face on screen, or a Caucasian voice and a Caucasian face on screen. Although fully aware that the voice and picture did not represent the person who wrote the computer’s script, the research subjects were more swayed by the Korean voice and rated the advice from the “Korean computer” as more intelligent and trustworthy. The effect was as strong as when the same experiment was run with video chats with actual people of that ethnicity.

People also bring norms of politeness and reciprocity to human-computer interactions. Just as a mediocre teacher who asks his or her students for feedback will get sugarcoated responses, Nass and Moon found that people were too polite to give honest feedback in an on-screen evaluation of a mediocrely helpful computer. But when they evaluated that computer’s helpfulness on another computer, people proved as forthcoming as students privately complaining about their terrible teacher. And just as we will generally go to greater lengths to help people who have helped us, Nass and Moon found that participants asked to “help” a computer spent much more time doing so with computers that had provided helpful search results than with computers that had returned bad ones.

We don’t treat computers exactly like humans. We don’t cry when they die (usually), and we don’t excuse ourselves when we suddenly leave our computers to grab a coffee. But to a surprising extent, we do apply rules and expectations from the social world to our interactions with computers. If you’ve ever wondered why designers at times seem to design our electronics as if we were simple children who don’t understand that they’re not human, now you have your answer. It’s because we don’t.