We become developers with the understanding that the software we create should meet the needs of its users. But what happens when the software we create is used to change or coerce the behaviour of users, rather than help them work, play and behave the way they want to? What are the ethical considerations of developing software that tries to manipulate users? Where does the profit-driven motivation go too far?
> **Coercive:** using force to persuade people to do things that they are unwilling to do. (Cambridge Dictionary)
I recently started thinking about this topic when Facebook pushed out changes that cut off my last method of checking my messages on my phone. I refuse to install the Messenger app; instead I primarily use Facebook in the Chrome/Brave mobile browser on my phone. Facebook began as a website, and I have always used it this way.
A couple of years ago they released an update that told me to download the Messenger app to view my messages. Since then, I could turn on 'Desktop Mode' in my browser and still access my messages. The layout was broken, with a horizontal scrollbar, but it was at least usable. Today, even Desktop Mode doesn't work. I get this:
In theory I could find a mobile browser that lets me spoof the user agent and try that... but using a different browser would disrupt all my other activities on my phone, and I can't even be sure it would work; they were closing that loophole a few years ago. I've decided to stop trying to force Facebook to let me use it, and to just stop using Facebook so much.
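For what it's worth, the user-agent loophole works (when it works) because the server decides which layout to serve based on the `User-Agent` request header. Here's a minimal sketch using Python's standard `urllib`, with an illustrative desktop UA string; whether any given site honours the spoofed header is entirely up to its server-side checks:

```python
import urllib.request

# A typical desktop Chrome User-Agent string (illustrative, not current).
DESKTOP_UA = (
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
    "AppleWebKit/537.36 (KHTML, like Gecko) "
    "Chrome/120.0.0.0 Safari/537.36"
)

# Build a request that claims to come from a desktop browser.
# We only construct the request here; actually fetching it needs
# network access, and the server may still detect mobile clients
# via cookies, JavaScript feature tests, or screen size.
req = urllib.request.Request(
    "https://www.facebook.com/messages",
    headers={"User-Agent": DESKTOP_UA},
)

# urllib normalises header names to 'Xxxx-xxxx' capitalisation.
print(req.get_header("User-agent"))
```

This is exactly the fragility described above: the spoof only works for as long as the site keeps trusting the header, and nothing stops them from closing the loophole server-side.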
Let's talk about this for a moment.
Facebook spent considerable time drawing all of us in, making itself the de facto method of connecting with our friends and family. Few people in my circles call, text or email each other anymore; they use FB messages. Facebook became a necessary default in our lives, and now it is coercing me into installing an app riddled with privacy issues onto my older phone, which has limited space for more apps.
Facebook isn't even trying to meet my needs as a user. If I need to check the messages for my business page, I can still do so on my phone. So the code for the mobile messenger website is there.
This type of coercive behaviour is everywhere now.
If I visit many websites from my phone (Twitter, Reddit, Facebook, etc.), I get a popup offering their mobile app. I can usually choose "Get The App" or "Not Now". There is never, ever, a "Don't ask me again" option. I'm nagged by these popups everywhere.
One of many reasons I refuse to install social apps is the bombardment of notifications and 'unread' indicators all over my digital experience these days: those tiny red circles with numbers indicating messages I've yet to check. I disable these whenever I can, but they are still everywhere, intentionally designed to press your buttons until you clear them. I don't have email push to my phone for this reason.
I want to control when I'm notified about unread emails. I want to decide when I'm willing to work or be reachable, and when I'm not.
I've seen other examples of software and websites trying to force my use of them in one way or another, often in ways that are detrimental to my user experience. For example:
- Opting me into the most invasive privacy or marketing settings by default.
- Burying important options down several layers deep in the settings.
- Intentionally hiding phone numbers or email addresses from a company's support page, even though these exist.
- Delivering notifications in a staggered, asynchronous way, so that even after you've 'cleared' them all, more keep appearing, including ones generated hours before you last cleared them.
- Intentionally making features unavailable on the web version and limiting them to app users.
- Unethical billing practices, such as opting users into auto-renewing subscriptions without their consent, or ignoring cancellation requests.
- Disguising buttons to encourage accidental clicks. The Amazon Prime signup button is a good example of this.
This is a topic I've only just begun to think about, so I don't yet have a lot to say about what we as developers should be doing. Do we just take the approach that "well, Facebook is free, so you can't complain"? Does that even apply if I don't have the choice to pay and use it how I want?
Do we as developers have the power, or the desire, to push back against these practices?
I know that I have decided to step back from my use of Facebook over this issue, but it is a steep price to pay, since I'll lose out on a lot of connections. And even if I choose to take this stance, none of my friends or family care enough about privacy to make the same decision.
What do you think about it, fellow developers? And what other examples of this behaviour have you seen?
Oldest comments (2)
We absolutely have the power. If you work for a company that wants to do this, you can push back against it. Or you can quit. It might be a weird hill to die on, but remember that you are part of the problem the company you work for creates. You are not absolved because you are "just an employee".
As a software developer you also have to consider ethics. The ACM and IEEE-CS publish a joint Software Engineering Code of Ethics. Its first three principles are:

1. PUBLIC: Software engineers shall act consistently with the public interest.
2. CLIENT AND EMPLOYER: Software engineers shall act in a manner that is in the best interests of their client and employer, consistent with the public interest.
3. PRODUCT: Software engineers shall ensure that their products and related modifications meet the highest professional standards possible.
Coercive patterns (as described in this article) and dark patterns are a violation of those ethics.
This is a complaint I've had about Facebook for as long as they've been pushing their mobile apps on users. (It's not enough that Facebook has one mobile app; they obviously need two, right?)
I use a "lite" version of the Facebook Messenger app, but I dislike it. Fortunately there are only a couple of people I communicate with that way; it might be time to just delete it.
I've been tempted more and more to remove my Facebook account entirely. It just gets a bit harder now that I have 11 or 12 years worth of photos, comments, memories, etc. on there. :(