Zoom and Twitter are among the services where Black people have struck facial recognition problems. Photo / 123RF
COMMENT:
I don't want to become the Herald's social media correspondent, but over the weekend educator and PhD student Colin Madland sent some fascinating tweets about how Zoom and Twitter's facial recognition works.
Or rather, how poorly it works if you're Black. Madland noticed that his Black colleagues lost their heads when Zoom virtual backgrounds were enabled, and there was no way to fix it.
The problem doesn't happen with people who have lighter skin. Next, Madland spotted that Twitter would crop the previews of photos containing a Black and a white person so that the former did not show up.
Other Twitter users tried it and sure enough: the Black person was "disappeared". One harrowing example used former United States President Barack Obama and US senator Mitch McConnell, and the same thing happened: no Obama in the preview, only "Moscow Mitch", as some Americans call him.
Even stock photos got the same treatment, with white models showing up in the preview but not Black ones.
Twitter owned up to the gaffe, to its credit, but this is nothing new, and it's a problem that keeps, err, cropping up no matter how many times it's raised.
In 2009, HP got into hot water after it was discovered that its webcams were only able to track white people, and not Black people.
HP blamed contrast issues in certain lighting conditions, an explanation few people believed, funnily enough.
The reality is that the facial recognition algorithms used in software, and these days in artificial intelligence and machine learning systems, are still biased against people with dark skin.
This is despite a vast increase in computing power, refinement of algorithms, and years of learning from facial recognition in real-world use. Last year, the US National Institute of Standards and Technology tested facial recognition code from more than 50 companies.
In one test with photos of white women and of Black women, NIST found that the latter were misidentified at a rate 10 times as high as the former.
The software was from French biometric tech company IDEMIA which builds, among other things, those cool passport kiosks you might bump into at immigration at airports.
Biased software is a serious problem because facial recognition is being rolled out everywhere at a rate of knots. Police want to use it to go through mugshot databases, which, given those misidentification rates, could have absolutely horrific consequences.
Getting it wrong isn't hypothetical: facial recognition systems have already put innocent people behind bars, and missed the guilty ones.
Facial recognition tech that erases or crops out people of certain ethnicities could also lead to innocent people not having an alibi from surveillance cameras.
That the problem still exists after all these years shows developers aren't prioritising a fix for it. Startups train their tech on big data sets, many of which are freely available. How many startups check to ensure that the data sets are sufficiently diverse?
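For what it's worth, a basic check isn't hard. Here's a minimal sketch in Python, assuming a hypothetical metadata file (faces_metadata.csv) with a self-reported skin_tone column, neither of which comes from any real dataset; it simply tallies how a training set's images are spread across labels, and a lopsided tally is the first warning sign that a model trained on that data will do worse on under-represented groups.

import csv
from collections import Counter

def label_distribution(metadata_path, label_column):
    # Count how many rows carry each label, then convert counts to shares of the whole set.
    counts = Counter()
    with open(metadata_path, newline="") as f:
        for row in csv.DictReader(f):
            counts[row.get(label_column) or "unknown"] += 1
    total = sum(counts.values())
    return {label: n / total for label, n in counts.items()} if total else {}

if __name__ == "__main__":
    # "faces_metadata.csv" and "skin_tone" are made-up names for illustration only.
    shares = label_distribution("faces_metadata.csv", "skin_tone")
    for label, share in sorted(shares.items(), key=lambda kv: -kv[1]):
        print(f"{label}: {share:.1%}")

It's the sort of five-minute audit that would flag a training set dominated by light-skinned faces long before a model built on it ever ships.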
The last few years have seen ethics commissions and groups set up to work on how to avoid biases, discrimination and outright errors in algorithms and tech. They tend not to be very diverse, and are manned (literally) by people from wealthy European and North American countries.
Not having the experience and voices of people from a wide variety of ethnicities and backgrounds suggests the software that will control important parts of our lives will, unfortunately, continue to be biased.
Facial recognition tech, and the AI that learns from it, won't go away. It'll increasingly be used to verify people's identities as part of biometrics for transactions, travel, security, health services and policing. It's even used to determine "liveness", to ensure the person in front of the camera is an actual, live human.
Ignoring the issue is likely to create a massive problem globally, with millions of people potentially being locked out of services and markets because the technology doesn't recognise their faces. And that's quite frankly a ridiculous thought.
This is of course a business opportunity. There are companies in Africa that have recognised the gap in the market created by the bias; here's hoping they'll be able to move quickly and create better solutions than their Western and Asian counterparts who rely on people being "contrasty" enough.