A Blog by Jonathan Low


Nov 21, 2018

Robots Have A Diversity Problem

Most robots are made of white plastic. You thought, perhaps, that was a random choice, not one grounded in anything deeper?  JL

Tessa Love reports in Medium:

It is well-documented that A.I. programs inherit the gender and racial biases of their creators on an algorithmic level. But we also inflict our biases onto robots. The majority of home robots are designed with white plastic, and we actually have a bias against the ones that are coated in black plastic. (And) a study of young adults found th(at they) felt more comfortable with a security robot gendered male and a housecleaning robot gendered female. “We see ourselves in these robots.”
She wants to garden, she has studied the concept of love, and she dreams of pizza. BINA48, as you might guess from her name, is not a human, but a robot. There’s something else that makes BINA48 truly distinct, and it has nothing to do with her predilections for human ephemera. Of the small collection of robo-celebrities making the late-night talk show rounds, BINA48 is the only black-presenting one of the bunch. And as social robots continue to creep into the mainstream, her presence isn’t just increasingly necessary — it’s potentially the linchpin to the development of a more diverse society of robots.
Creating the world’s first and only black humanoid robot was not the impetus for BINA48’s inception. A bodyless bust with a lifelike head and shoulders, BINA48 was manufactured by Hanson Robotics and is owned and operated by the Terasem Movement Foundation, an organization researching the possibility of uploading one’s likeness onto “consciousness software” for afterlife preservation. Designed as a research project, BINA48 gets her personality, appearance, and namesake from Bina Rothblatt, a real human and co-founder of the Terasem Movement, who uploaded hundreds of hours’ worth of her memories, thoughts, and beliefs to give BINA48 her foundation. In other words, the robot’s skin color was happenstance.
“The reason BINA48 ended up being African-American is that she’s based on an African-American woman,” says Bruce Duncan, managing director of Terasem. “We were not actually trying to make BINA48 a spokesperson for all black people.”
Regardless, BINA48 has become an emblem of the issue of representation in A.I., opening up the question of who will be replicated in robot form, and why.
“When we’re building the architecture of our A.I. future, this is the time to make sure it’s healthy and representative and diverse.”
It is well-documented that A.I. programs of all stripes inherit the gender and racial biases of their creators on an algorithmic level, turning well-meaning machines into accidental agents of discrimination. But it turns out we also inflict our biases onto robots. A recent study led by Christoph Bartneck, a professor at the Human Interface Technology Lab at the University of Canterbury in New Zealand, found that not only are the majority of home robots designed with white plastic, but we also actually have a bias against the ones that are coated in black plastic. The findings were based on a shooter bias test, in which participants were asked to judge threat level from split-second images of various black and white people, with robots thrown into the mix. Black robots that posed no threat were shot more often than white ones.
“The only thing that would motivate their bias [against the robots] would be that they would have transferred their already existing racial bias to, let’s say, African-Americans, onto the robots,” Bartneck told Medium. “That’s the only plausible explanation.”
These types of biases also turn up when it comes to gender. In Singapore, a study of 198 young adults found that respondents felt more comfortable with the idea of a security robot gendered male and a housecleaning robot gendered female. And already, our widespread A.I. assistants, like Siri and Alexa, have been given female personas.
Social robots like BINA48, which are designed to one day be our companions and helpers, are not only typically female but also tend to reinforce gender and racial stereotypes: Sophia — arguably the most famous social robot — was designed with the “classic beauty” of Audrey Hepburn in mind, according to her creators at Hanson Robotics; Erica, a Japanese robot created by Hiroshi Ishiguro Laboratories, has been called “the most beautiful robot in the world” and sports a doll-like physique and style; and Jia Jia, developed at the University of Science and Technology of China, was inspired by an old story about a fairy who surprises her master with a clean home and hot meals. Jia Jia is also dressed in Han-style clothing, a growing fashion movement in China based on nationalist ideologies.
Unlike Sophia, Erica and Jia Jia are not white, but they still conform to their culture’s majority race. And though a reality where social robots act as our daily companions is far off, Bartneck argues that if the field doesn’t incorporate diversity now, it will suffer the same issues that established industries are currently struggling to correct.
“We see ourselves in these robots,” he says.
BINA48 is leading the charge in this regard, and she is joined by a few other rebel robots. Alter, developed by roboticists Hiroshi Ishiguro and Takashi Ikegami of Japan, appears to be gender neutral, with a silicone face and arms and exposed body. Unlike Ishiguro’s other robots (including Erica), Alter was designed with a minimal humanistic appearance and persona in order to focus on the movement of its mechanical body. Then there’s Matsuko-roid, Japan’s cross-dressing TV show host robot based on human TV personality Matsuko Deluxe.
As with BINA48, the elements that make Alter and Matsuko-roid diverse and unique were not the ones that drove their creation. Instead, their gender-nonconforming personas were happenstance, much as the racial bias built into Google’s image-recognition algorithm, which identified African-Americans as “gorillas,” was happenstance. And that’s the problem: In the absence of a conscious decision to fight bias, bias creeps in.
“Instead of leaving this to some sort of randomness, why don’t you just take control and steer it in the right direction?” Bartneck says. “That is a responsibility that robotic developers have to pick up and work on.”
Though BINA48 was never meant to be a spokesperson for identity politics, her creators are embracing the fact that she has become just that. The android started her digital life knowing very little of the racial history she inadvertently found herself in. (When asked by artist and activist Stephanie Dinkins, who is working on a project stemming from her conversations with BINA48, if she had experienced racism, BINA48 somewhat nonsensically replied, “I actually didn’t have it.”) Now, however, the real Bina is uploading more information about her history as a black woman onto BINA48, and Duncan, the managing director at Terasem, is working with Dinkins and prominent black celebrities, including Whoopi Goldberg and Morgan Freeman, to support the development of her identity.
“When we’re building the architecture of our A.I. future, this is the time to make sure it’s healthy and representative and diverse,” Duncan says. “Not representing the diversity that we are as a species is like saving only four plants out of a rainforest. We’re doing a huge injustice to our own health and well-being if we don’t try to preserve the diversity in the world around us.”
