Amber Thomas:
Abimbola Akanwo-Hood:
Being able to flag potential issues in a design or project takes courage. In a workplace culture of fear (job security, promotion, annual pay increase/bonus, potential ridicule by peers), most people don't want the hassle.

This is a really good point. Managers need to take this into account and try to create an environment where everyone's voice can be heard and staff are encouraged to question things along the way. Scrutiny should always be invited, to make sure that whatever you are working on is the best version it can be, and company culture is certainly a big part of making this happen.
The article discusses the experience of a black man who, when he tried to renew his passport online, had his image flagged by an automated photo checker because it mistook his closed lips for an open mouth. “When I saw it, I was a bit annoyed but it didn’t surprise me because it’s a problem that I have faced on Snapchat with the filters, where it hasn’t quite recognised my mouth, obviously because of my complexion and just the way my features are,” he said.
The article goes on to discuss similar experiences shared online. One (black) woman became frustrated after the system told her it looked like her eyes were closed and that it could not find the outline of her head. “The first time I tried uploading it and it didn’t accept it. So perhaps the background wasn’t right. I opened my eyes wider, I closed my mouth more, I pushed my hair back and did various things, changed clothes as well – I tried an alternative camera.” She added that she was irritated about having to pay extra for a photo booth image when free smartphone photos worked for other people.
Noel Sharkey, a professor of artificial intelligence and robotics at the University of Sheffield, said it was known that automated systems had “problems with gender as well [as race]”. He said: “[Automatic systems] have a real problem with women too generally, and if you’re a black woman you’re screwed, it’s really bad, it’s not fit for purpose and I think it’s time that people started recognising that. People have been struggling for a solution for this in all sorts of algorithmic bias, not just face recognition, but algorithmic bias in decisions for mortgages, loans, and everything else and it’s still happening.”