
Craig Nicol (he/him)

Originally published at craignicol.wordpress.com

Ethics in technology

This is an extension of a Twitter thread I wrote in response to this tweet, and the wider thread, about the Cambridge Analytica revelations.

One of the key modern problems is how easy it is to access these tools. You don’t need professional training to string them together.

It’s as dangerous as if someone invented a weapon that could kill tens or hundreds of people, light enough to carry anywhere and available in any store without training, and then expected owners to police themselves.

People are terrified of AI. We know we don’t need AI to disable hospitals. We don’t need AI to intercept Facebook logins (although Firesheep and the WiFi Pineapple are less effective now). We don’t need AI to send a drone into a crowded market.

Make a website the only place for government applications, such as Medicare or millennial railcards, and it’s easy to remove access for all citizens.

But combine all that with data and you can fuck up someone’s life without trying. You can give two people the same National Insurance number or other ID. You can flag them on the no-fly list.

You can encode prejudice into the algorithm and incarcerate someone because they grew up in a black neighborhood.
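
To make that mechanism concrete, here is a minimal, entirely hypothetical sketch (not any real scoring system, and the numbers are made up): race is never an input, but a neighbourhood-level feature derived from historical arrest data acts as a proxy for it, so the prejudice in that history flows straight through to the score.

```python
# Hypothetical illustration only: how a "neutral" feature can encode prejudice.
# The postcodes, rates, and weights below are invented for the example.

# Historical arrest rate per neighbourhood, reflecting years of uneven policing
ARREST_RATE_BY_POSTCODE = {
    "G1 1AA": 0.31,   # heavily policed neighbourhood
    "G12 8QQ": 0.04,  # lightly policed neighbourhood
}

def risk_score(age: int, prior_convictions: int, postcode: str) -> float:
    """Toy 'recidivism' score. Race never appears as a feature, but the
    postcode term is a proxy for it, so the bias in the data survives."""
    neighbourhood_factor = ARREST_RATE_BY_POSTCODE.get(postcode, 0.10)
    return 0.2 * prior_convictions + 0.5 * neighbourhood_factor + 0.01 * max(0, 30 - age)

# Two people with identical records get very different scores
print(risk_score(age=25, prior_convictions=0, postcode="G1 1AA"))   # 0.205
print(risk_score(age=25, prior_convictions=0, postcode="G12 8QQ"))  # 0.07
```

The point isn’t the specific formula; it’s that removing the sensitive attribute doesn’t remove the prejudice when a correlated feature carries it in anyway.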

The algorithm is God. The algorithm is infallible. Trust the algorithm.

Even when it tells you someone is more capable than the humans say she is, and punishes her anyway.

(unless you’re under GDPR, where you have the right to question the algorithm)

But tell anyone that people will use data for purposes they hadn’t considered (like using RIPA anti-terror legislation to check whether someone lives in a school catchment area) and you’re paranoid.

Be paranoid. People will always stick crowbars in the seams. Whatever your worst case scenario for your code is, you’re probably not even close.


You can see my original tweet, and the replies, here:

The Guardian has a great interview on AI, existential threats and ethics on their podcast here.
