hildri


Let's talk about Coded Bias

๐Ÿ‘‹๐Ÿพ Hi there. I re-watched #CodedBias yesterday and...

💯 I find it better than The Social Dilemma. At the very least, more useful.

✅ More and more, #algorithms will decide who gets hired, who gets healthcare, who gets police scrutiny.

✅ Meanwhile, those same systems (that we now trust) have not been vetted for racial bias, #genderbias, or #discrimination.

✅ We are increasingly dependent on #AI technologies to make decisions for us. There is simply too much data to consider: thousands of job or college applications, countless products on Amazon, endless content on Netflix.

๐Ÿ‘‰๐Ÿฟ What's next? I truly think, just from a citizen's perspective, we need to able to talk about:

โฉ Data and who owns it (for real). Should data rights be considered civil rights or even human rights?
What can we all do to advance diversity in tech, as one of the several ways to mitigate the risk of bias in AI.

โฉ One of the many tools to mitigate the risk of bias is by having diverse teams. If the world is 51% women, your team should be 51% women. The same for different demographics profiles.

๐Ÿ‘‰๐Ÿฟ What do you think? But more importantโ€ฆ What do you have to say?

Top comments (3)

Matt Ellen

I watched it for the first time last night.

I had an inkling of the problems with machine learning, and the other algorithms in play, but I had no idea of the scope of their implementation. I've been slowly trying (and failing) to wean myself off social media for a while now, because I dislike all the predictions they're making about me, but the breadth of use by governmental institutions and private enterprise surprised me.

As a Brit, I was truly appalled by how the Met are using face recognition in London. They know they have a racism problem in their regular police force, so it seems stupidly short-sighted to automate their biases.

For all the terrible things the pandemic has wrought, at least we've got a good excuse to walk around London in masks.

The fact that they still try to justify intimidating minorities and the working class as "keeping people safe" is an utter joke.

hildri • Edited

I agree; seeing those examples in London was a surprise for me as well. Then I started to investigate more and found that here in Spain, the police are also using AI-based software. And bias was already such a complicated human trait... even without tech. I'm truly worried about how we are going to live in a few years. I don't think I will live to see a world ruled by machines; I just want to be able to know, in a hypothetical scenario, why an algorithm rejected my loan application or why my CV got rejected by an automated HR system.

Matt Ellen

The section about the teachers being fired by algorithm was mind-blowing. Calling it a "value add system" is Orwellian doublespeak if ever I heard it!

It reminds me of the algorithm they recently used in the UK to determine who got into university, because students couldn't take exams due to Covid. Funnily enough, rich kids got in and poor kids did not. Shocker.