
When Bias in Product Design Means Life or Death

I've just read this fantastic post on the importance of considering diversity in product design and wanted to share it here:

https://www.linkedin.com/pulse/when-bias-product-design-means-life-death-carol-reiley


I won't copy everything over, but here are a few of the points made that I found particularly concerning:


"In the 1960s, the vehicular test crash protocol called for testing with dummies modeled after the average male with its height, weight, and stature falling in the 50th percentile. This meant seatbelts were designed to be safe for men and, for years, we sold cars that were largely unsafe for women, especially pregnant women. Consequently, female drivers are 47% more likely to be seriously injured in a car crash."


"Microsoft’s vision system was reported to fail to recognize darker skinned people. Today, one of the most prominent applications of computer vision is self-driving cars, which rely on these systems to recognize and make sense of the world around them. If these systems don’t recognize people of every race as human, there will be serious safety implications."


"White men viewing a crowd with 17% women perceived it to be 50–50, and when it was 33% women, they perceived it to be majority women. A simple overestimation like this illustrates how difficult it can be to see the world from another’s perspective."
  • Former Community Member

    Abimbola Akanwo-Hood:

    Being able to flag potential issues in a design or project takes courage - in a workplace culture of fear (job, promotion, annual pay increase/bonus, potential ridicule by peers), most people don't need/want the hassle.




    This is a really good point. Managers need to take this into account and work to create an environment where everyone's voice is heard and staff feel able to question things along the way. Scrutiny should always be invited, to make sure that whatever you are working on is the best version it can be, and company culture is certainly a big part of making that happen.



    Of course, the other issue is the lack of diversity within the project teams! With an appropriately diverse workforce, many of these issues would be designed for almost subconsciously, rather than falling victim to [unconscious] bias...

  • There is a talk titled "Engineering a fair future: Why we need to train unbiased AI" as part of the IET Turing EngTalk and BCS Lecture Series in February 2019. The events are free to attend as follows:

    18 February 2019 IET London: Savoy Place

    20 February 2019 The University of Manchester

    21 February 2019 Assembly Building, Belfast


    Hopefully the talk will be filmed, and if so I will try to find the link after the event. Read more at: https://events.theiet.org/turing/index.cfm

  • I came across this article today, which discusses racial bias in automated systems, and this thread (although rather old now) seemed like as good a place as any to share it: https://www.independent.co.uk/news/uk/home-news/black-man-lips-passport-photo-home-office-joshua-bada-a9111711.html

    The article describes the experience of a black man who, when he tried to renew his passport online, had his image flagged by an automated photo checker because it mistook his closed lips for an open mouth. “When I saw it, I was a bit annoyed but it didn’t surprise me because it’s a problem that I have faced on Snapchat with the filters, where it hasn’t quite recognised my mouth, obviously because of my complexion and just the way my features are,” he said.


    The article goes on to discuss similar experiences shared online. One (black) woman became frustrated after the system told her it looked like her eyes were closed and that it could not find the outline of her head. “The first time I tried uploading it and it didn’t accept it. So perhaps the background wasn’t right. I opened my eyes wider, I closed my mouth more, I pushed my hair back and did various things, changed clothes as well – I tried an alternative camera.” She added that she was irritated about having to pay extra for a photo booth image when free smartphone photos worked for other people.


    Noel Sharkey, a professor of artificial intelligence and robotics at the University of Sheffield, said it was known that automated systems had "problems with gender as well [as race]". He said: “[Automatic systems] have a real problem with women too generally, and if you’re a black woman you’re screwed, it’s really bad, it’s not fit for purpose and I think it’s time that people started recognising that. People have been struggling for a solution for this in all sorts of algorithmic bias, not just face recognition, but algorithmic bias in decisions for mortgages, loans, and everything else and it’s still happening.”
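
    To make the "not fit for purpose" point a bit more concrete: one simple way to audit a system like the automated photo checker is to compare how often it wrongly rejects valid photos from different demographic groups. Below is a minimal Python sketch of that kind of check; the group labels and the pass/fail data are entirely made up for illustration, and nothing here comes from the article itself.

        from collections import defaultdict

        # Hypothetical audit records: (demographic_group, was_valid_photo_rejected)
        results = [
            ("group_a", False), ("group_a", False), ("group_a", True),
            ("group_b", True), ("group_b", True), ("group_b", False),
        ]

        # Tally rejections and totals per group
        counts = defaultdict(lambda: [0, 0])  # group -> [rejections, total]
        for group, rejected in results:
            counts[group][0] += int(rejected)
            counts[group][1] += 1

        # False-rejection rate per group
        rates = {g: rej / total for g, (rej, total) in counts.items()}
        for g, r in sorted(rates.items()):
            print(f"{g}: false-rejection rate {r:.0%}")

        # Ratio between the worst- and best-served groups as a simple red flag
        best, worst = min(rates.values()), max(rates.values())
        if best > 0:
            print(f"disparity ratio (worst/best group): {worst / best:.1f}x")

    A large gap between the best- and worst-served groups is exactly the kind of disparity Sharkey is describing, and the same per-group comparison applies to decisions about mortgages and loans as much as to face recognition.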