Programming changes the way you think.
Let me qualify that: programming has changed the way I think - and might have changed yours.
This isn't to say I know all the ways it's changed my thinking, though. I've had a hunch for a few weeks now that the brain action (that's a real thing, right?) of writing code has reshaped the way I think far beyond the keyboard, and this is that: a hunch I think is worth exploring out loud.
Every Detail Matters
Have you ever considered how you read? Do you skim? Do you read a single sentence, or devour lines at a time? Do you give pause at words you don't understand, or do you pick up meaning from context?
Aoccdrnig to a rscheearch at Cmabrigde Uinervtisy, it deosn't mttaer in waht oredr the ltteers in a wrod are, the olny iprmoetnt tihng is taht the frist and lsat ltteer be at the rghit pclae. The rset can be a toatl mses and you can sitll raed it wouthit porbelm. Tihs is bcuseae the huamn mnid deos not raed ervey lteter by istlef, but the wrod as a wlohe.
This won't work in programming. Even though your brain can tell what's meant by `cnost`, it isn't `const`. One works - and the other doesn't.
Over time, I think this forces you to develop a different kind of thinking when reading: that of reading characters over words and sentences.
The same kind of required-detail constraint applies to higher-level thinking as well. When you're designing an application interface, you now need to consider a host of details that must work in specific and contextually correct ways - everything from the address of your calls to a server to what happens after it completes or fails.
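As a rough sketch of what "what happens after it completes or fails" means in practice - the helper name and the status-code policy here are my own assumptions, not anything from a particular framework:

```typescript
// Every server call needs an explicit answer to "and what if it fails?"
// classifyResponse is a hypothetical helper; the retry policy is illustrative.
type CallOutcome = "success" | "retryable" | "fatal";

function classifyResponse(status: number): CallOutcome {
  if (status >= 200 && status < 300) return "success";
  if (status === 429 || status >= 500) return "retryable"; // rate limit or server hiccup
  return "fatal"; // other 4xx: the request itself is wrong, so retrying won't help
}

// The caller can no longer hand-wave: each outcome forces a decision.
console.log(classifyResponse(200)); // "success"
console.log(classifyResponse(503)); // "retryable"
console.log(classifyResponse(404)); // "fatal"
```

The point isn't this particular policy - it's that the code won't let you leave the failure case unconsidered.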
Personally, since I've started learning to code, I've stopped losing my keys and wallet.
It Isn't Working? Good News!
Another candidate for this thought's title was big red bold THIS IS BAD text is actually my friend.
Seriously, before programming, when were errors ever a good thing? Sure, there's "you either win or you learn" and "every failure is an opportunity" and other thoughts like that I could write entire blog posts about - but when has feedback ever been so thorough and downright useful?
This is about more than reading errors, though. The act of programming is going from failure to failure until something works - and then breaking it again later and rewriting until it works a different way. Failure and mistakes become your actual path to the end product.
At some point along the way, I've (mostly) stopped kicking myself for mistakes in other parts of my life and started - forgive me for this - debugging them instead.
It's (Almost) Always My Fault
Have you ever caught yourself thinking why isn't this working? (Don't you hate it when blog authors ask you rhetorical questions?) If you're like me, you've probably followed that with this (insert test/technology/external library here) isn't working.
Maybe this is just me, but no matter how often the thing that isn't working turns out to be code I wrote, I always blame something else first. Clearly Rails is doing something weird somewhere deep in the class inheritance tree, or this unit test has personally declared war on me and everything I hold dear.
You already know the punchline: it was (99% of the time) actually something I wrote that caused the bug.
I've gotten a bit better at short-circuiting this initial (freight) train of thought, but I find this one especially interesting because no matter how many times I've realized that I was actually the man behind the curtain, I'm still bent in this direction when something doesn't work.
This Is Not The End
^ What he said. The best part of exploring this is what's left to explore. I don't know what I'm going to learn next (just kidding, I do know: it's TypeScript), and not knowing is exciting.
Is there any way that programming has changed the way you think? Are squirrels secretly plotting their hostile takeover of society? Leave me a comment.