Should I Build It?

Brian Rinaldi · Originally published at remotesynthesis.com · 3 min read

Those of us who began our coding careers in the late '90s grew comfortable asking ourselves "Can I build it?" Programming, especially for the web, was defined by its limitations back then. What we wanted to build and what we could feasibly create were often light years apart.

Today, there are few limits to what we can build, but also a growing consensus that "the internet is broken" (the internet, of course, being the backbone of so many of these technologies). This presents an entirely different dilemma. It's no longer a matter of whether it can be built but whether it should be built.

Our Tools Can Be Misused

We know that the things we build can be misused, largely because so many of the things we've built arguably already have been. But is that experience reflected in the decision-making of most developers? A recent study by NC State suggests not. It seems to show that awareness of the Association for Computing Machinery's Code of Ethics and Professional Conduct had no impact on the way participants responded to certain real-world ethical situations faced by developers. And, the fact of the matter is, very few of us (including me) have ever heard of that or any other code of ethics for our industry.

Technological professionals are the first, and last, lines of defense against the misuse of technology.

The above quote is from Cherri M. Pancake, discussing why the Association for Computing Machinery (of which she is currently serving a two-year term as president) decided to update its code of ethics. I think she has it right: developers today need to think not only about the potential impact of what they are creating but also about how the things they have already created are actually being used - i.e., it's both forward- and backward-looking.

Let's think about a simple example. 10 years ago, developers were building simple face tracking tools for the web using Flash. I remember seeing an incredible demo at a conference that had all kinds of fun with this technology. Should we have foreseen that this same technology would eventually enable the modern-day surveillance state? And what is our obligation now that we know the tools we helped create are being used in morally questionable ways?

No Easy Answers

I'm not writing this post because I have any answers to those questions. In many ways, I am insulated in my role from some of the more difficult moral dilemmas...or, at least, I like to believe I am. I do think about them a lot. This is especially true given calls to break up big tech, which, regardless of how effective you think it would be, only addresses the problem in hindsight. It is, in part, an attempt to repair the fact that the things we helped build were misused, but it doesn't address the things we are currently building (which is what the code of ethics is hoping to do).

At the risk of dragging you down the rabbit hole with me, what are your thoughts on the ethical considerations that developers face? Have you faced an ethical dilemma? How did you handle it?

Brian Rinaldi is a Developer Advocate at StepZen. Brian has worked for over 9 years focused on developer community and developer relations at companies like Progress Software and Adobe.



I think it is hard to teach ethics, especially when you're from different countries with different cultural contexts.

Peter F. Drucker tried to define it as well, but in the end focused more on a philosophical framework than on general ethics.


When working with people, I'm surprised that some don't think about ethics when developing. When you're using APIs, you're handling people's data. You have to be careful about which of their data you think should be shown, and have discussions about it.
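One practical way to act on that caution is to expose only an explicit allow-list of fields from a user record, rather than passing the whole API response to the UI. This is a minimal sketch under assumed names — the `PUBLIC_FIELDS` set, the `redact` helper, and the sample record are all hypothetical, not from any particular API:

```python
# Hypothetical data-minimization sketch: only fields explicitly
# approved for display ever leave this function.

PUBLIC_FIELDS = {"username", "avatar_url"}  # assumed safe to show

def redact(user: dict) -> dict:
    """Return only the fields on the display allow-list."""
    return {k: v for k, v in user.items() if k in PUBLIC_FIELDS}

user = {
    "username": "brian",
    "avatar_url": "https://example.com/a.png",
    "email": "brian@example.com",    # personal data: never shown by default
    "location": "somewhere",         # personal data: never shown by default
}

print(redact(user))
```

The design choice here is that an allow-list fails closed: a new field added to the API response stays hidden until someone deliberately decides it should be shown, which forces exactly the discussion the comment above calls for.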


Should we have foreseen that this same technology would eventually enable the modern-day surveillance state?

We did; we foresaw all of this, bar the specific technologies.


I think we should acknowledge the fact that all tools may be misused. The only thing we can do is reduce the opportunities for misuse.