DEV Community

edA‑qa mort‑ora‑y

Originally published at mortoray.com

Would you program a human?

With tales of CRISPR babies in the news, I've been pondering the implications of programming human beings. This may or may not have been spurred by a crime novel I recently read, which hinges on this topic. Were I offered a job to edit the genome of humans, would I take it?

I understand the question was easier in China, where the morality issue is less problematic. The overt goal doesn't sound terrible either: guaranteeing that the descendants of over a billion people will not be born with genetically inherited diseases. Who wouldn't want that? Our ethics boards, though, are unlikely to approve experimentation on humans -- at least not until we see China racing ahead.

For the sake of the thought experiment, let's assume the ethics boards have granted permission for this work to go forward, and I've been hired as a programmer. How does it start?

As a good programmer, I start with a user story, something that describes the people involved and what they'd like to accomplish with the software. When I get back from a consulting meeting in Beijing, I draw up this user story:

Xi, the wholly elected leader of a prosperous, populous nation, is facing challenging stability decisions. He has great concern for the citizens and does not sleep easy knowing they are suffering. He's worried about people succumbing to addiction -- drugs, alcohol, individualism. He's successfully launched monitoring programs to identify these people and alert neighbours to their plight. Now he wishes to go further. He's looking for a genetic solution that would eradicate addiction.

Great, that sounds like a noble cause. With this insight, I can now recommend a variety of possible solutions. Naturally, I start with the latest version of the CRISPR gene editing technology.

I install the toolkit.

Immediately, I'm not impressed. The documentation is a mess. It's outdated, and most of the API is just blank. StackOverflow is filled with questions about programs randomly crashing, and smug answers belittling the poster. I'll have to poke around blindly looking for something that works.

You laugh, but this is the true state of affairs for the software that runs your phones, cars, medical devices, and military hardware. Do we really expect that we'd approach human programming more rigorously? We can't stop the development of technology just because we haven't figured out programming yet. We also can't use the argument, "but these are people!" That holds little weight in the decisions made by giants like Facebook and Google -- who essentially already control our lives through the software they write.

Alright, I have some code.
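For illustration only, here is the shape that code might take. Everything in this sketch is invented: no real CRISPR toolkit exposes a `GenomeEditor` class, a `find_locus`, or a `splice` call. It's simply what a naive programmer, poking around blindly, might expect the API to look like.

```python
# Purely imaginary sketch -- all names here are made up for the thought
# experiment, not taken from any real gene-editing library.

class GenomeEditor:
    """Toy stand-in for the hypothetical gene-editing toolkit."""

    def __init__(self, genome: str):
        self.genome = genome

    def find_locus(self, marker: str) -> int:
        # Return the position of a marker sequence, or -1 if absent --
        # the (blank) docs don't say which, so we guess.
        return self.genome.find(marker)

    def splice(self, position: int, old: str, new: str) -> None:
        # Replace `old` at `position` with `new`. No off-target checks:
        # that part of the API is, of course, undocumented.
        assert self.genome[position:position + len(old)] == old
        self.genome = (self.genome[:position] + new
                       + self.genome[position + len(old):])


editor = GenomeEditor("AACCGGTTACGT")
pos = editor.find_locus("GGTT")
if pos >= 0:
    editor.splice(pos, "GGTT", "GCTT")  # "eradicate addiction", allegedly
print(editor.genome)  # AACCGCTTACGT
```

Whether that edit does what the user story asked for is, naturally, a question for the testers.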

I can't just deploy this. I need to test it. I wonder how this will even work. Do we have some emulator? Maybe, but it looks buggy. I see there's an offer from India to outsource my testing. They've got a web form and the ability to upload code. I'm best off not thinking about what happens on the backend. As long as I get my results, it's their concern, not mine.

I pay the Indian contractors for the testing. I've given them direct access to my issue tracker so they can file any defects they find.

Issue #18: Results in random clucking like a chicken
> CLOSED: not reproducible
> COMMENT: Second case of clucking
> REOPENED: Confirmed
> SEVERITY: Cosmetic, PRIORITY: Low

Given the randomness of the API, I'd expect mostly wacky results, but some might be promising.

Issue #37: Side-effect: blocks cerebral palsy progression
> ASSIGNED: George
> COMMENT: Identified the issue, working on a patch
> FIXED: Removed unintended side-effects

Perhaps such valuable side-effects don't come up too often in programming. But all those medical researchers out there might wish to share their opinions on the amount of research that is buried, destroyed, and/or locked behind paywalls. If I'm on a deadline, or trying to save face, I'm likely to keep my head down and focus on getting the job at hand done.

This raises the question: what level of completeness is okay? It's currently impossible in software to commit to both a deadline and a fixed feature set. This isn't a lack of planning ability; it's a fundamental uncertainty in how the profession works. There's no reason to assume human programming will be any different.

What defects are we willing to accept when it comes to gene editing? If the program is to cure cerebral palsy, I imagine random clucking would be acceptable. But would moral judgements even allow the discussion of side-effects? There are some correlations between high IQ and other neurological disorders. Is it acceptable to edit genes that trade higher IQ for the risk of other genetic disorders?

The self-driving car discussions show that we fundamentally lack the knowledge to deal with morality and software. And the repercussions will play out at a high political level; it might never even involve the programmer -- despite them being essential to the answer.

Nonetheless, while the debate plays out, I continue coding. I won't hold off on releasing the code. Until I'm explicitly told not to, I continue with my job. Plus, I feel safe. It's not like I'll ever be held personally responsible for the defects; nobody is held liable for defects now. Consider mass data breaches: how severely are those companies punished? Even in the medical world, there's a litany of drugs with questionable side effects, yet those pharma companies are still around.

I may be painting a bleak picture of gene editing. There are all sorts of positive uses for it, including the ability to eradicate genetically inherited diseases. But, realistically, how do we answer the questions about testing and defects? Should we argue whether this is even programming? It'd be hard not to call it that since we legitimately have a paradigm called "generative programming" which has been used for over half a century.
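"Generative programming" here just means programs that produce other programs. As a toy Python sketch (all names invented for illustration, nothing to do with any real genomics tooling):

```python
# Minimal illustration of generative programming: a program that
# generates the source of another program, then runs it.

def make_accessor(field: str) -> str:
    """Generate the source code of a getter function for `field`."""
    return f"def get_{field}(record):\n    return record['{field}']\n"

# Generate getters for a few (made-up) record fields.
source = "".join(make_accessor(f) for f in ("name", "dose", "target_gene"))

namespace = {}
exec(source, namespace)  # compile the generated code into real functions

record = {"name": "edit-37", "dose": 2, "target_gene": "HTT"}
print(namespace["get_name"](record))  # edit-37
```

Trivial, but it is code writing code -- the same paradigm, whether the output program targets a CPU or, someday, a genome.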

Nothing but questions.

How about you? Would you accept a position in human programming?


If you're like me and enjoy viral outbreak stories, I recommend the Year of the Rabid Dragon. If I've scared you into wanting to know more about programming, I have my own book coming out soon.

Top comments (12)

Kasey Speakman

It's a tough call. I am a huge fan of the movie Gattaca precisely for the exploration of some of these questions. I could be on board with it for treating harsh diseases, but not for making designer children or "enhanced" humans. Such goals seem to align with very evil figures from history. And that's before we even get into the downsides. There's always a trade-off.

We've already seen it happen to food... not through direct editing but through generations of selective breeding. At this point, many of our commercially grown foods have a severe lack of genetic diversity. For example, wheat rust became a big problem in recent history. Growers were understandably focused on bigger yields. Selective breeding for that goal had inadvertently filtered out strains of wheat that had genetic resistance to the fungus.

edA‑qa mort‑ora‑y

Yes, we don't need to edit genomes to create problems; editing can, however, accelerate the timeline compared to selective breeding.

There are a lot of questions here, but it's inevitable that we will edit our genome. I also think that designer babies are inevitable -- prohibition rarely works. :/

Kasey Speakman • Edited

Oh, for sure gene editing is already happening in humans (see below). But I was speaking to your specific question of whether I would accept a position in "human programming". For disease treatment I probably would but beyond that, no way.

Check out this annual biotech guidebook, which highlights various advancements in biotech for the year. One of the articles (page 14) highlights human gene editing to combat Hunter syndrome. Much like the feeling I get after attending a computer security conference, upon reading this guide I feel both more knowledgeable and terrified at the future.

Edit: Also relevant, and I think generally a good foundation for ethics in biotech. UNESCO - Universal Declaration on the Human Genome and Human Rights

edA‑qa mort‑ora‑y

Okay, so you'd accept a position in disease treatment... here comes that annoying part where I present uncomfortable options.

What if your company is treating diseases, but you're working on something more fundamental, a framework or process that enables it? You know full well it'll also enable undesirable changes to the genome. Are you okay with open-sourcing that foundation?

Kasey Speakman • Edited

But by the same token, what if I do something as benign as discovering some breakthrough in app dev, so that solid apps can be made rapidly and cheaply? Maybe that also means resilient malware/ransomware can be made faster and more cheaply. Would it be unethical to do it then?

Technology is neither good nor bad. It's a tool that is used for good or bad, depending on how the person wields it. So the answer is that I don't know. It would depend on the situation, including the people involved and the specific limitations of the tech.

edA‑qa mort‑ora‑y

I'm going to have to side with "technology is neither good nor bad". Thus it'd be ethical to develop tools that make malware easy, or make horrific genome edits easy.

I don't trust everybody, but the people I trust the least are those that do things in secret. Thus given the option between trying to keep a lid on technology, or making it open, I'm choosing open every time.

Darkø Tasevski • Edited

I really doubt that software engineers would do this job. Yeah, we can help build programs/simulations that make modifying DNA a bit easier for the bioengineers (I guess), but that would be it.

On the other hand, this position would carry so much responsibility that many people (me included) would just move past and do something else, because, you know, a human being's life would depend on the changes you made to their DNA.

Btw, I don't really see what this (ridiculous) user story and Indian QA companies are doing in the same post that mentions CRISPR at the beginning. I guess I don't have much of a sense of humour, but I found this a bit distasteful...

edA‑qa mort‑ora‑y

The reason I allude to programmers is that essentially that's what would be happening. You're quite right, the current generation of programmers would not likely fit this role. However, if we look at a lot of near-future sci-fi, the idea of hackers programming human implants and genes is not uncommon.

Medical researchers will likely take on this role, which could be as scary, if not scarier: they don't exactly have open and well-defined processes.

To your second note, I also fear that reasonable people will pass on the position. That would leave a second class of people willing to take it. But we really need the reasonable people working in this area.

Black humour is my way of dealing with harsh subjects.

Mihail Malo

Acid

Guillaume Martigny

The first thing that came to my mind was "open source".
I, personally, would not take a job messing with DNA. On the other hand, I would play with this hypothetical API in open source.

In my opinion, having a plethora of solutions freely available online would reduce the chance of one bad solution doing damage. Much like it's better to have a diversified gene pool.

edA‑qa mort‑ora‑y

It's a debate we could probably have, but not the intent of this blog post. I've replaced the affiliate links with direct links.

Mihail Malo

Why did you? What compelled you to do so?