
How Can We Change Prejudice in YouTube's Algorithm?

Isabella Floriano
・2 min read

Much has been discussed about bias when it comes to Artificial Intelligence. Bias is inevitable: it is the result of our experiences, our personality, and the place we occupy in society. But when a bias is reflected in an algorithm with the potential to impact thousands, even millions, of lives, it can become very dangerous. We know that an algorithm is not prejudiced in its essence; it learns from our view of the world. We have seen examples of this in several studies, from whether Google's SERP can be sexist to how artificial intelligence is deciding lives in the prison system.

This in itself is a problem, but what happens when a company is aware of its bias and still maintains its algorithm? Can we hold this type of behavior accountable?

I was really impacted by a video I saw today from Vice. It shows a group of YouTubers suing the platform for prejudice against their content aimed at the LGBTQ community. The group claims that YouTube restricts, blocks, and unfairly demonetizes content with the words "gay", "bisexual", or "transsexual" in the titles.

The power of a company like Google to educate, form opinions, and allow many people around the world to make a living from the content they produce is impressive. And seeing algorithms used to target or directly harm a person or a group is simply astounding. As a student in the field, seeing that kind of behavior from a company that claims to support diversity is extremely disheartening and scary. As people in this area, talking about and studying technology, what can we do to change this reality?

Photo by Szabo Viktor on Unsplash
