✨ What is this post about: As part of my professional growth, I make time to watch conference talks on Ruby, Rails, JS, React, tech writing, and tech trivia. Previously I'd just watch them, but now I will take and publish notes for future reference. This talk was part of RailsConf 2021, which I'm participating in at the time of writing.
✨ Talk: 'The Rising Storm of Ethics in Open Source' by Coraline Ada Ehmke
✨ If you can't watch the talk, read Coraline's article: A Six-Month Retrospective on Ethical Open Source
✨ One-paragraph summary: The increased debate around ethical source threatens to divide the OSS community. In his book "The Structure of Scientific Revolutions", philosopher Thomas Kuhn posits that there are three possible solutions to a crisis like the one we're facing: procrastination, assimilation, or revolution. Which will we choose as we prepare for the hard work of reconciling ethics and open source?
✨ Impression: YES. This is the talk we need to hear over and over and over again, and which should become a staple in developer education. Coraline is an AMAZING speaker (knowledgeable, concise, direct, charismatic) and her expertise is mind-blowing. Please do watch the talk.
- Coraline Ada Ehmke is a gem:
- Christine Peterson coined the term 'Open Source' in 1998 and Bruce Perens wrote its definition
- In 1998, the Open Source Initiative was founded
Open source is nowadays used to violate human rights, and that is a feature, not a bug:
Can I stop "evil people" from using my program?
No. The Open Source Definition specifies that Open Source licenses may not discriminate against persons or groups. Giving everyone freedom means giving evil people freedom, too.
"But under what other circumstances, in human society, do we grant complete freedom to evil people? Why is it different with software?"
Fundamental question: "Are we responsible for how the technologies we develop are used?"
Open source is used to suppress protests, to increase surveillance, etc.
Ethical questions in tech are not new: Edmund Berkeley (one of the pioneers of computer engineering) co-founded the Association for Computing Machinery (ACM) with the mission of "...serving public interests by fostering the open interchange of information and by promoting the highest professional and ethical standards"
In 1958, Berkeley sat on the Committee on the Social Responsibility of Computer Scientists, which published a foundational report on the ethical responsibilities of technologists. Its findings boiled down to four statements:
- "They cannot rightly ignore their social responsibilities"
- "They cannot rightly delegate their social responsibilities"
- "They cannot rightly neglect to think about how their special role can benefit or harm society" (or, we have to consider how our special capacities can help to advance socially desirable applications and prevent undesirable outcomes)
- "They cannot avoid deciding between conflicting responsibilities" (we must think how to choose)
The scientists concluded: "The scientists' credo 'knowledge for knowledge's sake' easily comes into conflict with our ethical responsibilities" (even if there's a large middle ground between what's socially desirable and undesirable, the undesirable part should not be enabled through our tech)
In 1972, while the Vietnam War was raging, Berkeley delivered a talk at the 25th-anniversary dinner of the ACM:
- he said that anyone who works on technology for unethical goals should quit their job (he even called out audience members by name, to great public upset; Admiral Grace Hopper was among those who left the room);
- he concluded the talk by saying it was "gross neglect of responsibility that computer scientists were not considering their impact in terms of societal benefit or societal harm"
"How would we feel about IBM's complicity in the Holocaust if their punch card system had been released under the MIT license?"
Melvin Kranzberg (tech historian): "Technology is neither good nor bad; nor is it neutral"
Prof. Lelia Green: "When technology is implicated in social processes, there is nothing neutral about technology" (from her book "Framing Technology: Society, Choice and Change")
"MINASWAN ('Matz is nice and so we are nice') isn't an ethical framework"
It's time for us to go beyond "nice". Frankly, I'm sick of "nice". Nice is meaningless if we're not just. Nice is meaningless if we are not equitable. We can't keep using "nice" as a shield we hide behind while ignoring our impact.
- No chemical manufacturer in the US will produce the drugs used for executions by lethal injection.
- IBM was complicit in the Holocaust
- In 1998, the greatest conceivable evil that could be born of software was market domination by Microsoft