
Uchi Uchibeke

AI is very biased and we should stop it

As a black male living in North America, it is sometimes hard to admit the bias that is inherently present in me. This is especially difficult because most things I see on social media and in the news are stories of people like me experiencing biased treatment: stories of unfair treatment because of skin colour. It is also hard to admit that I am biased because I have experienced bias myself, and that admission is clouded by the few negative experiences I have had with a few bad apples.

At Collision Conference in New Orleans, I had the privilege of listening to the talk "Diversity, coding & bias in AI" by Rebecca Parsons of ThoughtWorks. The talk shed light on some of the biases that exist in our society today.


Rebecca Parsons of ThoughtWorks at Collision 2018

AI is biased

I found the talk very interesting because Rebecca did not focus only on racial and gender bias. She gave a 360-degree view of the biases in our society and demonstrated the importance of fostering a culture of inclusivity to create an equitable tech future. For me, some takeaways from the talk are:

  • The thoughts, intentions, and biases of the makers of an artificial intelligence are transferred to the AI.
  • Accepting that we are biased is an important step toward improving equity in tech.
  • Developers should not build only for people like themselves; they should build for and test with people of different backgrounds and orientations.
  • The actions and decisions of an AI are affected by what its developers thought was important while they were building it.

A No-Bias Resolution

To improve or succeed in any endeavor, we must first acknowledge the problem, then make plans and set goals to solve it. Beyond planning and goal-setting, a consistent, conscious effort is required to reach a long-term solution.
Going forward, I have resolved to do the following three things to promote inclusivity for an equitable tech future. I encourage you to tweak and adapt them for your unique situation:

1. Acknowledge and reaffirm my own bias:

I believe that acknowledging and reaffirming my biases is the first step toward getting rid of them. Acknowledging my biases will enable me to tackle them, and reaffirming them will serve as a constant reminder and check.

2. Promote inclusion in user testing:

I will advocate that people of various demographics be brought in for user testing. This will help the products I work on serve a wider range of people and prevent situations like the personal assistant designed for doctors that worked well for male doctors' voices but not for female doctors' voices.


Photo by Victoriano Izquierdo on Unsplash

3. Speak up when someone makes an inappropriate comment:

Speaking up against inappropriate comments is something that's close to my heart, and I want to keep working on it, especially as it relates to women. I care about this because I have three sisters, and there are other strong, powerful women in my life whom I care about and work with. Sadly, a few years ago, I did not stand up against an inappropriate comment. While having lunch with some guys, one of them made a comment like "Why won't she get it? She's a pretty girl in skirts." I was shocked, and my mouth dropped in disbelief, but despite my shock, I didn't confront him. From that situation, I learned not to stay silent when I hear comments like that. I have been speaking up against them ever since, and I am now more determined than ever to continue.

What we should all do

I am biased. You are biased. AI is biased. We are all biased.


Photo by Leighann Renee on Unsplash

As we continue to advance technology and develop AI that can almost pass for a person, may we collectively and individually acknowledge and check our biases, include different demographics in user research and development, and speak up against inappropriate comments made about people who are not like us.


Top comments (4)

Sergey Kislyakov

"test with people of different backgrounds and orientations."

How are they different in biological background from any other human? I'll spend my time testing my product's a11y (because there are a lot of people who are blind, deaf, visually impaired, or hearing impaired), but I don't care about your orientation. You can be an Apache Helicopter if you feel like it, but I don't see how it affects my product in any way (unless I'm asking for the user's gender so I can use "he/she/they").

Uchi Uchibeke

I see your point. However, just like in the example I gave, where female doctors were unable to use a personal assistant because it was tested only with male doctors or masculine voices, there can be inherent differences between people of different orientations, depending on the product you're building.

But again, I see your point: we cannot test with people of all orientations. The key, I believe, is making an effort to cover at least 90% of the target audience.

Andrew Lucker

Math is not social science, but maybe useful nonetheless:

"Bias is the difference between what you know and what you don't; prejudice is what you do about it."

Uchi Uchibeke

Great quote!