New Zealand scientists have found an intriguing problem with designing humanoid robots that may walk and work among us in the future: racism.
New University of Canterbury-led research, just presented to a Chicago conference, has found we hold similar automatic biases toward darker-coloured robots as we do toward people with darker skin colour.
Most robots already being sold or developed are either stylised with white material or have a metallic appearance - and the researchers warn that if human biases carry over into the robot world, this lack of diversity could persist in the future.
In the research, a collaboration between the university's Human Interface Technology Lab (HIT Lab NZ) and psychology department, the study team investigated whether we automatically ascribe a race to robots as we do with people.
To do that, they ran tests based on psychology's "shooter bias" experiment, which had demonstrated that people from many backgrounds were quicker to shoot at armed black people than at armed white people, and quicker to refrain from shooting unarmed white people than unarmed black people.
They observed the same effect in tests using robots as well as humans - and participants proved willing to ascribe a race to robots based on their colour, even when given a "does not apply" option.
"Our research shows that people show automatic biases towards darker-coloured robots just as they do toward people with darker melanation," HIT Lab's Associate Professor Christoph Bartneck said.
"This result should be troubling for people working in social robotics given the profound lack of diversity in the robots available and under development today."
A Google image search result for "humanoid robots" showed predominantly robots with gleaming white surfaces, or which had a metallic appearance.
There were currently very few humanoid robots that might plausibly be identified as anything other than white or, occasionally, Asian, Bartneck said.
Most of the world's leading research platforms for social robotics were also stylised with white materials and would presumably be perceived as white.
There were some exceptions: renowned roboticist Hiroshi Ishiguro's creations were modelled on Japanese faces, while Hanson Robotics' famed android BINA48 was racialised as black.
But the general theme of robot colour was troubling.
"This lack of racial diversity amongst social robots may be anticipated to produce all of the problematic outcomes associated with a lack of racial diversity in other fields," Canterbury social psychologist Dr Kumar Yogeeswaran said.
"An even larger concern is that this work suggests that people respond to robots according to societal stereotypes that are associated with people possessing the same skin colour.
The researchers believe their findings suggest that people carry over negative stereotypes from humans to robots - with implications for how people might react to robots of different colours when they operate as teachers, carers or police, or work alongside people in a factory.
"The development of an Arabic-looking robot as well as the significant tradition of designing Asian robots in Japan are encouraging steps in this direction," Bartneck said.
"Especially since these robots were not intentionally designed to increase diversity, but the result of a natural design process.
"We hope that our paper might inspire reflection on the social and historical forces that have brought what is now quite a racially diverse community of engineers to – seemingly without recognising it – design and manufacture robots that are easily identified by those outside this community as being almost entirely 'white'."