re: You are a wizard because this is all magic


Hi Evan, I really like where you're going with this article.

Couldn't your argument be used against the very abstractions that keep us sane and functioning every day?

For example: I have a fairly vague idea of how a device driver works, let alone how to build one. If I had to do it, I'd have to study quite a bit of systems programming. When I create software I don't think about things like device drivers, because someone built abstractions on top of them. The same argument applies to a lot of other things, I reckon.
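To make that concrete with a toy sketch (just my own illustration, nothing from the article): even a one-liner like this sits on top of the filesystem, the syscall layer, and the block device driver, and I never have to think about any of them.

```python
# Toy sketch: one line of "ordinary" code leans on a deep stack of abstractions
# (stdlib file object -> buffered I/O -> write() syscall -> filesystem -> block device driver).
with open("hello.txt", "w") as f:
    f.write("hello from the top of the stack\n")
```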

So how deep should this wizardry go? Should everyone be Gandalf the White in the end?

You say:

Someone has to know how these things work

but in truth, someone does. It might not be the person being asked the question at that very moment, but somewhere in the world someone knows (well, maybe), even if they're not willing to share that knowledge.

What really scares me (and I think it scares you too) is how fast we're incorporating ML/AI algorithms into our apps without fully knowing how they do what they do.

Aside from the huge and paramount concerns of privacy and the understandable parallels to the best (or worst) dystopian novels, isn't AI supposed to become smarter than humans? That would be the very definition of falling for the magic ;-)

 

So how deep should this wizardry go? Should everyone be Gandalf the White in the end?

Probably pretty deep. I'm not arguing that everyone become Gandalf for every subject, but that being a Hermione or Harry is a good ideal to strive towards. It's really easy to accept abstractions without understanding how they work.

I'm not arguing against abstractions nor the practicality of using them (they're definitely important), but that developers should strive to dig deeper.

There's a point somewhere between a more junior and a more senior engineer where the first instinct for solving a problem isn't just to Google the error message, but to read the code of the libraries that caused it. What often happens instead is that devs go N levels down the stack, and if they can't find their problem there, they get stuck.
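As a rough illustration (a hypothetical Python snippet, not something from the article), you can often jump straight from the error to the library's own source instead of stopping at the search results:

```python
# Minimal sketch of "going deeper": locate and read the library code
# behind the error instead of only Googling the message.
import inspect
import json  # stand-in for whatever library is actually involved

print(inspect.getsourcefile(json.JSONDecoder))     # where the code lives on disk
print(inspect.getsource(json.JSONDecoder.decode))  # the actual implementation
```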

But like 60% of the time when a developer gets stuck, it's not a lack of knowledge, but a fear of going deeper.

Computers are these machines built by wrapping complexity in a neat little toaster box, then wrapping that box in a TV box, and that box in a refrigerator box. But if you're trying to figure out what's inside by picking it up and rattling it around, you won't be any better off than the layperson who thinks their phone is spying on them.

You gotta open the box.

how fast we're incorporating ML/AI algorithms into our apps without fully knowing how they do what they do.

I worry a bit about this framing. I think we say "ah, we don't know how neural nets work," and a layperson interprets that to mean they've been summoned in some dark, pentagram-filled ritual in an SF basement.

What we really mean is "this system is extremely complex, and you can't easily know how it works without knowing the entirety of it." But as devs we have a solid understanding that these systems were built with intent, can be changed with intent, and are "eventually" knowable (even if fully knowing them is practically impossible).
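As a toy sketch (my own, not from the article): a single neural-net layer is just multiply, add, clamp. Every intermediate value is an ordinary number you can print and trace; the "magic" is only scale.

```python
# Toy sketch: one ReLU layer is plain arithmetic over inspectable numbers.
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=3)         # input vector
W = rng.normal(size=(2, 3))    # weights: ordinary numbers you can look at
b = np.zeros(2)                # biases

hidden = np.maximum(0.0, W @ x + b)  # multiply, add, clamp
print(hidden)                        # every value is traceable with intent
```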

isn't AI supposed to become smarter than humans? That would be the very definition of falling for the magic

When that happens, sign me up for either the perfect Utopia or the Apocalypse. 🤷‍♀️

 

But like 60% of the time when a developer gets stuck, it's not a lack of knowledge, but a fear of going deeper.

It's also a lack of interest, and time constraints. I don't know the exact percentages.

What we really mean is "this system is extremely complex and you can't easily know how the system works without knowing the entirety of the system."

True that.

Thanks for the reply, I agree in its totality and don't have anything else to add :)

Programming is a curiosity-driven journey.
