
When Bias in Product Design Means Life or Death

I've just read this fantastic post on the importance of considering diversity in product design and wanted to share it here:

https://www.linkedin.com/pulse/when-bias-product-design-means-life-death-carol-reiley


I won't copy everything over, but here are a few of the points made that I found particularly concerning:


"In the 1960s, the vehicular test crash protocol called for testing with dummies modeled after the average male with its height, weight, and stature falling in the 50th percentile. This meant seatbelts were designed to be safe for men and, for years, we sold cars that were largely unsafe for women, especially pregnant women. Consequently, female drivers are 47% more likely to be seriously injured in a car crash."


"Microsoft’s vision system was reported to fail to recognize darker skinned people. Today, one of the most prominent applications of computer vision is self-driving cars, which rely on these systems to recognize and make sense of the world around them. If these systems don’t recognize people of every race as human, there will be serious safety implications."


"White men viewing a crowd with 17% women perceived it to be 50–50, and when it was 33% women, they perceived it to be majority women. A simple overestimation like this illustrates how difficult it can be to see the world from another’s perspective."
  • I came across this article today which discusses racial bias in automated systems and this thread (although rather old now) seemed like as good a place as any to share it: https://www.independent.co.uk/news/uk/home-news/black-man-lips-passport-photo-home-office-joshua-bada-a9111711.html

    The article discusses the experience of a black man who, when he tried to renew his passport online, had his image flagged by an automated photo checker because it mistook his closed lips for an open mouth. “When I saw it, I was a bit annoyed but it didn’t surprise me because it’s a problem that I have faced on Snapchat with the filters, where it hasn’t quite recognised my mouth, obviously because of my complexion and just the way my features are,” he said.


    The article goes on to discuss similar experiences shared online. One (black) woman became frustrated after the system told her it looked like her eyes were closed and that it could not find the outline of her head. “The first time I tried uploading it and it didn’t accept it. So perhaps the background wasn’t right. I opened my eyes wider, I closed my mouth more, I pushed my hair back and did various things, changed clothes as well – I tried an alternative camera.” She added that she was irritated about having to pay extra for a photo booth image when free smartphone photos worked for other people.


    Noel Sharkey, a professor of artificial intelligence and robotics at the University of Sheffield, said it was known that automated systems had “problems with gender as well [as race]”. He said: “[Automatic systems] have a real problem with women too generally, and if you’re a black woman you’re screwed, it’s really bad, it’s not fit for purpose and I think it’s time that people started recognising that. People have been struggling for a solution for this in all sorts of algorithmic bias, not just face recognition, but algorithmic bias in decisions for mortgages, loans, and everything else and it’s still happening.”

