What happens when we don’t include trans and non-binary people in our products? How do our products cause harm? Why is education so important and change so hard?
The Markup’s discovery that Google allowed advertisers to exclude non-binary people from job and housing ads, while blocking them from excluding men or women for those ads, is a great example of what happens when we don’t include or prioritize minoritized groups in our products.
How Google allowed this discrimination to happen.
TL;DR: Careless design leads to broken features that open the door to discrimination against trans and non-binary people.
When signing up for a Google account, users are given four gender options:
- Female
- Male
- Rather not say
- Custom
When picking the “custom” option, users can write their gender in a text box.
But advertisers only get three checkboxes to pick from when choosing an audience for their ads:
- Female
- Male
- Unknown
The Markup found that everyone who picked the “rather not say” or “custom” option in their settings gets grouped within the “unknown” category for advertisers.
Image from The Markup.
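The Markup's finding amounts to a lossy mapping from four sign-up options to three advertiser-facing categories. A minimal sketch of that grouping (the option names here are paraphrased for illustration, not Google's internal identifiers):

```python
# Sketch of the grouping The Markup describes: four sign-up options
# collapse into three advertiser-facing categories.

ADVERTISER_CATEGORY = {
    "female": "Female",
    "male": "Male",
    "rather_not_say": "Unknown",  # privacy-conscious users...
    "custom": "Unknown",          # ...and everyone outside the binary
}

def audience_bucket(profile_gender: str) -> str:
    """Map a user's profile setting to the category advertisers see."""
    return ADVERTISER_CATEGORY[profile_gender]

# Excluding "Unknown" from an ad's audience therefore excludes both
# non-binary users and users who chose not to disclose their gender:
audience = [g for g in ["female", "male", "custom", "rather_not_say"]
            if audience_bucket(g) != "Unknown"]
```

The mapping makes the design flaw visible: two very different groups, people who opted out of disclosing and people whose gender simply wasn't offered as an option, become indistinguishable and excludable as one bucket.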
While Google doesn’t allow ads to exclude men or women from jobs, housing, or financial products, it did allow advertisers to exclude the “unknown” category, leaving those outside the gender binary excluded as well. All of this at a time when housing, jobs, and financial aid are crucial for everyone, but even more so for those who are already disproportionately exposed to discrimination and abuse.
Alienated and erased by design.
Just the way this data gets labeled already shows how much of an afterthought gender diverse people were.
As non-binary people, we first have to indicate that we’re “other” or “custom”, and then that information gets disregarded and Google categorizes us as “unknown”. First we’re alienated, then we’re forgotten about.
We should include and prioritize the needs and safety of minoritized groups, as early in the process as possible.
The data that comes out of these gender selection boxes isn’t just sitting in a database somewhere, it’s actively being used. In this case, it decides who gets to see certain ads, and Google’s lack of care for trans and non-binary people led to vulnerable groups being excluded from housing and job ads.
But data on gender is frequently collected and used, from targeted ads and demographic research to dating apps and medicine.
If we don’t include and prioritize people from minoritized groups in our design and tech practices, we risk not only collecting incomplete or incorrect data, but also processing and using it in biased and harmful ways.
Preventing this goes beyond just hiring more trans people or doing user tests with a diverse audience, though.
While trans and non-binary people would probably have flagged the potential harm in these features, they also need to be given the trust, safety, and support to do so.
Just recently, Google fired Timnit Gebru and Meg Mitchell from their ethical AI team because of a research paper critical of AI systems that process language. And meanwhile Amazon is paying its employees to quit as a way to block unionization efforts.
Lack of data, visibility and accountability.
As The Markup’s story also pointed out, it’s difficult to know which ads you’re missing out on because of your gender. While we can often get information about why we’re seeing a certain ad, there’s no way to ask “why am I not seeing this ad?”, making it hard to hold companies accountable.
Similarly, if those companies analyze our data to understand what issues we’re facing on their platform but don’t have accurate data on non-binary people, they’ll easily ignore our needs.
I’ve participated in countless employee satisfaction surveys where non-binary wasn’t an option at all, meaning the answers of non-binary folks are miscategorized. When companies then use that data to analyze what they need to improve on, the issues that are specific to trans and non-binary people are lost because there simply is no data for them.
This is why data isn’t neutral or objective, but influenced by those who collect it, and later possibly further compromised by the biases of those that use it.
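A toy illustration of how a binary-only survey form erases the signal entirely. The responses below are invented purely for the example:

```python
# Toy illustration (invented responses): a survey whose gender field
# only offers "man"/"woman" cannot surface non-binary-specific issues.

responses_binary_form = [
    {"gender": "woman", "issue": "pay equity"},
    {"gender": "man",   "issue": "pay equity"},
    # A non-binary respondent forced to pick one of two boxes:
    {"gender": "woman", "issue": "misgendering in internal tools"},
]

issues_by_gender = {}
for r in responses_binary_form:
    issues_by_gender.setdefault(r["gender"], []).append(r["issue"])

# The aggregate contains no "non-binary" key at all, so any analysis
# grouped by gender has literally no row for non-binary employees.
assert "non-binary" not in issues_by_gender
```

The bias enters before any analysis happens: the form itself decided which groups could exist in the data.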
It’s easy to label these inequalities as “accidental” or “unintended side effects”. But how accidental or innocent is this really? This isn’t the first example of Google (and other tech companies) causing harm to minoritized people, and it won’t be the last one either.
Even Google’s gender selection form at sign-up has received criticism for a long time (I wrote about it as well), and neither ethical design nor trans people are new concepts.
Most trans and non-binary people are anything but surprised that something like this could happen, given we constantly experience the consequences of trans-exclusionary design.
As I touched upon earlier, to me this shows that either no trans or non-binary folks were involved or consulted on this (which is a lack of user research as well, aka bad design), or their concerns just weren’t listened to.
We must, collectively, be better at including and protecting minoritized groups in our designs. After all, features and data don’t exist in a vacuum. Technology is so embedded into our society that even seemingly small features can cause real-world harm.
Top comments (14)
Kudos to bringing light to something we should all be mindful of.
But in what cases would advertising to non-binary genders help? Take, for example, tampons or pads. A business selling these would like to know if you have a menstrual cycle or not, right? It would not be possible to know that from the custom gender you entered.. just thinking out loud.
This is about letting advertisers exclude non-binary people from housing and job ads. As explained in the article, Google doesn't allow excluding people from job, housing, or credit ads based on gender, but does let advertisers filter out non-binary people as a result of poorly designed forms and data structures.
And your question is actually a really good example of why advertising based on gender, especially when it's strictly binary, doesn't work well. Trans men and non-binary people can need menstrual products, and trans women may not.
It’s really important to note that the article is saying Google made a bad design decision that then excludes people based on gender. Maybe or maybe not on purpose, but it appears that it does.
While you went right to tampons, you might consider that there is an opportunity for companies to sell products and services to specific audiences across all markets and not just hygiene products. There are thousands of brands, including banks, clothing designers, the travel industry, and anything in between looking to service a typically underserved market.
As developers, we always have that need to solve for custom fields in our products. Since we don’t know exactly what someone might put there, we let them type whatever they want. After a period of use, you can see patterns and then ponder adjusting your app to support what people hand-type most frequently. For example, Google could probably run a super simple algorithm to learn that NB, N-B, non binary, and Non-binary are all the same thing and simply offer non-binary as an option. Further, anyone in the gender inclusion space can spend about an hour typing up a list for Google to start with.
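A minimal sketch of the normalization idea this comment describes. The alias table here is illustrative only; in practice it would be curated by people with expertise in gender-inclusive language, as the comment suggests:

```python
# Sketch of free-text normalization: map hand-typed variants of the
# same gender to one canonical label. The alias table is illustrative,
# not exhaustive or authoritative.

def canonicalize(entry: str) -> str:
    """Map a hand-typed gender entry to a canonical label, or flag it."""
    # Collapse case, punctuation, and spacing:
    # "N-B", "nb", "Non binary" all reduce to the same key.
    key = "".join(ch for ch in entry.lower() if ch.isalnum())
    aliases = {
        "nb": "non-binary",
        "nonbinary": "non-binary",
        "enby": "non-binary",
        "genderqueer": "genderqueer",
        "agender": "agender",
    }
    # Unrecognized strings are flagged for human review instead of
    # being silently lumped into an "unknown" bucket.
    return aliases.get(key, "unreviewed")
```

Flagging unmatched entries for review, rather than defaulting them to “unknown”, is exactly the design choice that would have avoided the problem the article describes.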
If that’s not how you identify, don’t select it. If it is, you feel more welcome. Advertisers are eager to reach new customers, so that helps them target typically underserved markets and gain trusted relationships. Also, Google gets itself away from adding custom, hand-typed data to reduce engineering problems and increase profits.
As for Google allowing housing and other ads to exclude people in ways that, at least in the US, could be seen as discriminatory: this is very concerning. I have not investigated it myself, so I only have this article to go by, but if that is the case, then it is worth getting Google to make a change, either from within or through the support of your elected officials.
Do you realize you're accusing a huge bunch of people of lying about something you have absolutely no way of confirming?
Might as well claim that headaches don't exist. You just have no way whatsoever to confirm this.
Thanks for this informative post! So many things we don't think about when we build things. Good argument for ensuring we create diverse teams.
Oh, shut up and get a grip.
I think treating this as a gender issue is missing the point. Google doesn't care about non-binary people, nor does it care about binary people. Google cares about advertisers.
If you enter a custom gender on your profile, you're adding lots of complexity that Google just has no business incentive to untangle. How would you normalize gender data? What would the UI look like? What about languages other than English?
Honestly, being able to select gender at all is a footgun. Google knows better than you who your ad should be shown to. It's probably in their best interest to show ads to everyone they think might click them, so the fact that they offer an override at all suggests there's demand for it among advertisers.
In conclusion: Google doesn't care about any of us. We're all just data points that can be discarded when they can't be processed.
Google may not care about any of us but it is indeed a gender issue..?
Whether or not there are more than two genders is a question of how you define the word. But that's not what you're saying.
You're saying people can't identify as anything other than man or woman. What people identify as is completely within their heads. You just can't claim people don't identify as anything other than male or female without actually looking into their minds.
Okay, so I will assume you define gender as a purely biological thing, most likely decided by people's genitalia, am I correct? In that case, assuming you still use gender-specific pronouns, why are you so obsessed with what's between people's legs that you have to structure your sentences around it? Creepy!
Jokes aside, defining gender based only on biological sex is mostly useless to society.
And I think you should take your BS out of this comment section.