Discussion on: The Future (of AI) is Female: How Hiring Bias Mitigation in NLP Can Be Great for Women Now and in the Future

 
Nico S___

Look at the example DHH (creator of Ruby on Rails and founder of Basecamp) brought to Twitter: he and his wife applied for the new Apple Card, but she was offered a much lower credit limit than he was. This happened even though they manage all their finances together and she has a better credit score than he does.
As it turned out, the Apple Card is backed by Goldman Sachs and uses an algorithm to make those credit decisions.
Many other people started sharing similar experiences, including Steve Wozniak, whose wife was also approved for a lower credit limit even though she too has a better credit score than he does.
When customers called Apple support, the representatives all seemed to say that the algorithm made the decision and that no human had a way to rectify it (this is the scary part).
As you suggested, regulation and law will be required to ensure we don't hand decision making over to AIs trained on biased data (which is essentially all the data we have).
Or, we could look for a way to remove the bias from the data itself!
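
For anyone curious what that can look like in practice, here is a minimal sketch of one well-known pre-processing technique, reweighing (Kamiran & Calders). The toy dataset, the column names, and the choice of `sex` as the sensitive attribute are purely illustrative, not real credit data:

```python
import pandas as pd

# Toy applicant data: 'sex' is the sensitive attribute,
# 'approved' is the historical (possibly biased) outcome.
df = pd.DataFrame({
    "sex":      ["F", "F", "F", "F", "M", "M", "M", "M"],
    "approved": [0,   0,   1,   0,   1,   1,   0,   1],
})

# Reweighing: give each (group, outcome) cell the weight
# P(group) * P(outcome) / P(group, outcome), so the sensitive
# attribute and the label look statistically independent
# to whatever model is trained on the weighted data.
p_group   = df["sex"].value_counts(normalize=True)
p_outcome = df["approved"].value_counts(normalize=True)
p_joint   = df.groupby(["sex", "approved"]).size() / len(df)

def weight(row):
    expected = p_group[row["sex"]] * p_outcome[row["approved"]]
    observed = p_joint[(row["sex"], row["approved"])]
    return expected / observed

df["weight"] = df.apply(weight, axis=1)
print(df)
# Over-represented (group, outcome) combinations get weights below 1,
# under-represented ones get weights above 1. Passing df["weight"] as
# sample_weight to a classifier counteracts the historical imbalance.
```

This only corrects the correlation between the sensitive attribute and the label; it doesn't touch proxy features (like zip code) that can encode the same bias, which is part of why regulation still matters.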