Why Programming Languages Are Hard

Marek Zaluski on March 04, 2019

Years ago when I first tried learning Python, within the first five minutes of playing around with it I encountered one of these error messages: ...

Everyone starts out as a beginner.

Yes, we do--and many of us have managed to progress as programmers despite the challenges of learning a new skill in an unfamiliar domain. If someone quits programming because they see an error message they don't understand, it may be that programming isn't their bag. I generate errors I don't understand every day and I've been doing this for years. Of course, I have the mental tools now to deal with difficult errors better than when I was starting, but I didn't give up.

And you didn't give up either, and neither do a lot of people.

I think my biggest issue with this post is that it doesn't recognize how difficult a problem programming languages are trying to solve: bridging from transistors and logic gates to a framework for describing data and processes of arbitrary complexity. Programming languages aren't hard because programmers are elitist (a rather cowardly accusation); they are hard because they need enough firepower to solve arbitrarily hard problems in a way that reduces them to machine instructions. It's a wonder, the kinds of things computers are able to do today.

If you think it can be done better, prove it. Build us a language that is at once friendly and suitable for dealing with really hard problems. I don't think it's impossible, I just think it's easier to take potshots in a blog post.

“The housecat may mock the tiger,” said the master, “but doing so will not make his purr into a roar.”


If someone quits programming because they see an error message they don't understand, it may be that programming isn't their bag.

The existence of unhelpful error messages isn't a meaningful way to find out if programming is right for you. It's just an arbitrary barrier to entry.

Programming languages aren't hard because programmers are elitist

The elitism isn't in making programming languages hard. The elitism is in the act of saying it's fine that they're hard and that beginners should just deal with it or get out. I don't agree with that.


So I agree with what you're saying now: unhelpful error messages are bad. I think Python's syntax errors don't qualify, however. In addition to printing the text you gave, SyntaxError: invalid syntax, they also display the line and indicate the token at which the parser detected the error. I don't know what more you could ask for in the case of a syntax error. Of course, linters can sometimes use more robust static analysis to provide more specific error messages, like "unmatched parenthesis", but that kind of analysis is non-trivial to implement, if I understand correctly--I've implemented some simple parsers, but never a complete static analysis tool.

The only so-called problem I can see is that the word "syntax" is sort of a jargon term--but even then, that's what it's called. Programmers have to learn new terms and concepts and if the instructional material doesn't prepare them to encounter syntax error messages, that's a failure of instruction, not language tooling.

But I agree, in principle, that error messages should be as helpful as possible within the constraints of what the compiler/runtime can diagnose and accurately report. The programmer should be given the best information available, in the form of a precise, readable message.


I think beginners should start with C. It's strict, doesn't allow sloppy code, forces the importance of Type down your throat, the use of pointers is the most straightforward way to understand memory, and it has some of the best debugging tools in the industry. Starting with Python is just going to lead to greater frustrations down the road.


C teaches a lot of really valuable lessons and I agree that it's a great way to learn how memory works. I 100% recommend learning some C as a way to become a better programmer.

But do you need to know those things when you're learning how to write your first program? Definitely not.


I love C, but C is none of those things.

It's strict

C has macros

doesn't allow sloppy code

int array[3];
// What if I use the wrong size? Or <=? C has no protection against accessing
// out-of-bounds memory, and by that I don't mean it errors out, it just accesses
// it, which is a leading cause of security bugs in software today
for (int index = 0; index < 3; ++index) {
    index[array] = index + 1;
    // yes, the above is valid and works just like array[index], which is
    // understandable if you know what it's doing and think about it
    // ( *(array + index) is commutative), but a beginner won't
    printf("Value at index %d is %d\n", index, array[index]);
}

forces the importance of Type down your throat

char c = 30; // this won't even throw a warning

Plus you have void*, plus you can cast anything to anything else, and it'll just reinterpret the bytes whether it's valid or not. There's a reason C++ introduced static_cast, dynamic_cast, reinterpret_cast, and const_cast: so you can avoid doing C-style casts.

the use of pointers is the most straightforward way to understand memory

Learning to use pointers correctly will help you understand memory, but you don't have to use them correctly to get working code. Memory leaks are just another kind of sloppy code C allows.

has some of the best debugging tools in the industry

Just about every bytecode language has much better debugging tools. Editing code in a debugger to see the effect, then moving the program counter, will corrupt the stack in C debugging tools more often than it will work.

Pascal / Delphi is a much better language for learning programming. The syntax actually forces you to think about many of these things, and then you can take those habits with you to C. Unfortunately the language fell out of favor even in the teaching context it was originally created for, and most universities get people started with Java or Python instead.


I agree: when you get started with C you will grasp the concepts of references (pointers), static and dynamic memory allocation, memory leaks and manual memory deallocation. It is a simple language, yet so powerful at the back end. I built my first spellchecker with it.

I actually started with Fortran, and I failed badly during my physics degree because I could not grasp the fundamental programming concepts - I was a rookie. I believed for many years I was not good at programming, which probably applies to most of us biological beings, as we are not accustomed to working like a computer. Then I did some C# and ASP.NET, and honestly it was a painful experience for me because quite a lot of low-level concepts were abstracted away - which is fine, but you somehow don't learn the fundamentals.

Now I am doing Python and still have to go a few levels down and learn/re-learn the fundamentals, but I am getting more comfortable, and I now understand the limitations of dynamic typing. I would like to pick up new languages, but for now I would like to master Python. I know EcmaScript fairly well, and I have also played just a bit with Racket/Scala/Haskell/C++/Java - I love Racket.


Nope. C has too many pitfalls. There are libraries full of books about C pitfalls! (and let's not mention C++).


Everything has a certain level of difficulty, though I think "learning curve" is the better term for it.
You can't just abstract away a programming language and still harness its full power.
Beginners need to understand that. If you want to get into programming, I suggest watching a computer science course on YouTube from Harvard. Why? Because learning the essential principles of memory efficiency and of how computers generally work is more important as a beginner.
Programming itself is not writing in your language of preference; it's actually solving any given problem.
Wrapping your head around that will help you understand that a language is just another tool, and tools are expendable, but knowing the science or general concepts behind how computing machines work is substantial.
Learning to "program" with tools made to oversimplify programming gives you the wrong concept, and it's hard to stop thinking that way later on. When you get errors like those you feel extremely incompetent, but you're just overreacting, because you're not looking at the big picture: those tools made you think that everything must make sense to YOU, in your perfect world of drag and drop. In reality, it has to make sense to the compiler.
A language like Java, after learning C, is the best point to start programming. In the end, problem solving is what counts.


Learning to "program" with tools made to oversimplify programming gives you the wrong concept, and it's hard to stop thinking that way later on

Why do you think it's hard to stop thinking that way later on?

What stops you from picking up the missing principles at a later time?


For instance, take people who have a very strong addiction, perhaps smoking or consuming addictive substances (nothing personal). Even if they've only done it for a couple of months, it's generally going to be hard to quit, and in several cases they're going to have to make radical decisions if they want to.
In "programming", if you get used to the sweet and delicious simplicity of dragging and dropping, you're going to think that programming is very easy, incredibly easy.
You don't have to worry about actual problem solving, or about language and computing concepts like "cryptic" errors, efficiency, architecture, frameworks, or libraries - concepts which may vary drastically depending on the tools you use, but which are essential to writing testable, maintainable, functional, meaningful and just plain good code, for you and for the computer. NOTE that I'm not saying you should worry about all that as a beginner, but it starts you off on the wrong foot if you never get exposed to any of it. And once you do get into real programming (not necessarily professional programming), you're going to be like the person who tried to quit smoking or drinking and just couldn't, because it was too big a challenge, because of how long they'd been feeding that addiction (or how long they'd spent "coding" with visual languages, cough Scratch). You're not going to like it; you're going to back off and become disappointed. It happens to a lot of us: we had completely the wrong mindset, we weren't looking at the big picture.
The big picture I'm talking about is having that knowledge, the right mindset. It allows you to think the right way: a programming language is just a tool (if Java dies, what's the big deal? (says someone who programs in Java)); there are other languages. You shouldn't focus on just learning a language, because languages should be quite straightforward; focus on problem solving. That's what we do as programmers. We don't spend our lives learning a language - that's absurd, that's like studying English your whole life and never using it to start a conversation. You need to start off on the right foot. I'm not saying you can't quit smoking, but it's better if you never start. In other words, and less metaphorically, I'm not saying you can't change your mindset, but it's better if you start with the right one. In the end, if you want to smoke, go ahead; if you want to start "programming" that way, go ahead; but if you ever want to quit, it's going to be harder, and if you ever want to program for real, it's going to be tougher.
Real programming makes you better; it doesn't matter what language you do it in, you can always translate your knowledge to other languages, just like in real life. If you know how to say something in English, then there is a way to say it in Spanish, even if you have to use context and other things to add substance to the meaning of a phrase or word. If you can't, that just means you don't know enough about the other language (if you want to learn multiple anyway). But at least you have the idea, the mindset.

If you're a kid, or have a kid, then drag and drop is just fine.


I want to clarify something: I don't disagree with most of the stuff you posted. Which means I'm not a negative, more of a neutral plus-ish on what you claimed.


For a beginner anything is "hard", for example riding a bike is hard, riding a bike off road is even harder and riding a bike down a world cup downhill course is hardest.

Is programming really that hard, or are there just bad/lazy developers in the world? I would say the latter. It's a skill where few are great, some are very good, most are good and a lot are bad.

As a developer of 15+ years, I've mostly worked with C, C++ and Python. When I look at Python code others have written, it's not too often that I think "wow, that was a neat solution". Most of the time I read code I think "wow, that is stupid". Why? Because yes, programming is hard, and most programmers I come across hate listening to critique of any code they have written. It comes across as a personal insult, and the first human reaction to an insult is to defend. If you defend your bad code and don't learn, how do you get better? The answer is you don't, and you spill into the pool of "good/bad" programmers and never allow yourself to become "great".

Programming is hard; why should it be easy? Going back to the principle of riding a bike, you can make it easy by adding stabilisers, but you're not going far off road with stabilisers and you're certainly not going anywhere near a world cup downhill course! I'm all for newbies becoming great, but they need to ensure they are completely open to critique so they can become great.


I'd like to thank Marek for this article, which makes many valid points and helps us stay grounded, remembering that not everyone knows as much as we do. Just a couple of points:

-- "programming languages which are designed to be easy to learn tend to be useless"

Hmmm, that doesn't always have to be the case. If they were designed just to be easy, with no further aim in mind, then it might be true. But there are a lot of easy-to-learn languages that are far from useless. SQL is the obvious leading light. The key point is that a language should be good at what it's designed for, and that's not always incompatible with ease of use.

-- "HyperCard"

Wow, what an inspiration that was. Glorious times that should never be forgotten. Did you know it actually lives on, having inspired both Revolution and LiveCode?

-- "If you want to show people why programming is worthwhile and help them understand how they can use it, you need to show them how to solve a real-world problem."

And that's why a good choice of domain-specific, easy-to-learn languages/tools is valuable. We don't all have the time or patience to learn programming "properly" from the ground up when our target is to animate part of a web page or set up a chain of sound filters. The user of your solution doesn't care HOW you did it as long as you did it. Remember, "if you can't tell the difference, there IS no difference".


I highly recommend LiveCode, drag’n’drop coding for beginners.


I learned to program with HyperCard when I was ten. I probably played in BASIC with my dad before that, but HyperCard really made a difference. It’s nice to see it mentioned here. It’s worth noting as well that the first version of Myst was made in HyperCard (using a custom extension to enable colour, if my memory serves me right).


It's remarkable that HyperCard led to the creation of Myst, that's a great story.


This problem is already solved in Ruby; take a look at the "did_you_mean" gem: github.com/yuki24/did_you_mean

It was created as a separate gem, but it's now part of Ruby core. Ruby errors like a misspelled method or variable name now come with a nice "did you mean..." and useful suggestions.


Good article. I agree with your assessment of how we got into the current situation. However, I believe the onus is entirely on the developer community to create better paradigms, languages, and tools to assist less technical people in getting stuff done. Computers are crazy powerful, but much of that is locked away for the average person.

On the topic of error messages, have a look at Elm. Great care was paid by the developers to make compiler errors easy to understand for regular humans. It will even show examples and guess at what might fix the problem.

However I think we may be stuck in an old paradigm - textual source files - that's also holding us back. Bret Victor gave an interesting talk on forgotten lessons from decades past at worrydream.com/dbx/. He has also been working on re-imagining a computer as a public space for experimentation that exists in the physical world at dynamicland.org

I am really excited when people are able to think past our current ways of doing things, designed to meet constraints that largely no longer exist. I hope that programming will one day be unrecognizable to us now, and become just another thing you do in your day to day life.


On the topic of error messages, have a look at Elm.

Yes!! Best error messages ever. I have never felt happier as a developer.



Great article! I've been working with beginners for a bit, and I totally agree with your sentiment. There are definitely artificial barriers to entry in this field, and we should do a better job of tearing them down.


It seems weird to comment at myself, but I was just thinking that improving the user experience of programming languages would be good for everyone, not just beginners.

I wrote a grading tool for myself last semester, and it worked great at the time. Since then, I’ve expanded it and added more automation. Unfortunately, I’ve sort of forgotten some of the test cases, so when a test fails I don’t always know exactly what went wrong.

Today, I went back and cleared up the error messages, so I’ll know exactly what goes wrong in the future. Naturally, I thought of this awesome article. Thanks, again!


This post resonated with me since I started my career as a financial analyst and have been using Excel for 10+ years. I never thought of Excel as "programming," but as I dug deeper into data structures, automating tasks with VBA, and sanitizing data, I got more interested in SQL, data pipelines, and data manipulation using scripts.

Going from Excel to programming was a difficult leap but showed me that there is so much more you can do beyond Excel. I'm not a coder by any stretch of the imagination, but have found a sweet spot with no-code tools. I consider Excel to be in this "no-code" camp. Wrote a post about it here: dev.to/coda/the-overlooked-benefit...


Ironically, I'm trying to solve this puzzle these days: making programming more beginner friendly.

One gimmick that I use to address "invalid syntax" issue is...

I show a text with numerous grammatical 3rrors in it. And peple still able 2 read it.

Then I show working production code with 10k+ lines. Then I misspell one 'word' in it, and show how the entire thing breaks.

Then I show an excel sheet with 1000 numbers and show how fast excel can add all of them.

And when I ask humans to do the same, they begin to get the difference ;)

IMHO, the approach to teaching was created by... electrical engineers (not programmers) and is long overdue for a complete re-write.

I'm helping kids with their first and second course... what do they start with?

  • console applications

Now consider that an 18-to-20-year-old was born after consumer console applications were succeeded by GUI apps, and has never seen or used a console application in their life...
Then what are we doing here?
Are we teaching someone how to drive using a car with a manual starter, manual transmission, carburetor and no power steering?

In the end ... programming is simple. So there should be a way of teaching it so that the student goes "yeah, it is simple".


Sounds like my car. I like it and learned to drive with it.


programming languages which are designed to be easy to learn tend to be useless

Ok, well you left yourself an "out" by the word "tend" - but I'm going to relate my experience and understanding:

I first started my software engineering path, though I didn't know it at the time, with the first computer I "owned" (ok, my parents bought it) - a TRS-80 Color Computer 2 with 16K of RAM, a TV, and cassette tape "drive" for storage! Woohoo!

Ok - this was in 1984, and truth be told, it wasn't my first computer nor first "programming experience" (prior to that, I had a Milton Bradley Big Trak, which was "programmed" in a very primitive kind of "LOGO" using a simple keypad, and no display).

Anyhow - that first experience was with the built-in language of the computer, Microsoft's "Extended Color BASIC"! And my first error, IIRC - was a syntax error.

Or - more precisely for BASIC:


Honestly I don't recall the real line number, but it was likely "10" because for some reason all BASIC programs started with line 10, and each line increased in multiples of 10. It would be a few years before I learned about structuring a program and setting aside blocks of numbers so I could build things better - but hey, I was only 10 years old...

So - how did that error occur? Well - because I had typed in the code as the book for the computer showed. The thing was, it never said anything about hitting the < ENTER > key at the end of the line (what is the < ENTER > key? Well - it was another name for what we call the RETURN key; strangely, though, some keyboards still have it labeled with the word "Enter" - such as the Model M keyboard I am typing on right now).

I was (a little) familiar with a typewriter - so I hit the spacebar until the "end of the line" on the screen, at which point the cursor jumped to the next line, right under my first line - perfect! From there I continued to type in the code (fortunately, on the Color Computer 2, the initial screen is only 32 characters wide - so "spacing over" wasn't too much of an effort).

Some more "spacing over" - then I typed in RUN, which strangely in the manual did tell you to press the < ENTER > key! And...


But I didn't feel bad about it - it just made me want to find out what I did wrong. With some careful reading (which probably should've been done first - I learned my lesson there!), there was mention of pressing the < ENTER > key at the end of the lines put in. I cleared things out, re-entered the code, typed RUN and < ENTER >...

My world changed. Today, I'm a well paid software engineer who enjoys going into work every day, then coming home and...browsing reddit!

(ok - occasionally I get a bug up my butt and do more coding, but this isn't as often as it was in my teens and 20s; sometimes you need to veg)

That's my tale of my beginnings programming...in BASIC.

Now - BASIC stands for "Beginner's All-purpose Symbolic Instruction Code" - it was a language developed at Dartmouth in the 1960s specifically to be easy for beginners to learn how to program with.

But to say, just because it was (fairly) easy to learn, that it would "tend to be useless" is to do a huge disservice to the language itself.

Was it perfect? No. Did it have rough edges? You bet! But was it useless?

In a business sense?

Hell no!

In fact, there was a whole subset of BASICs created in the 1970s and 80s specifically geared toward business, strangely enough they were termed "Business BASIC" languages. Probably one of the best known of them was "PICK Basic":


But that isn't to say other BASICs weren't popular for business use purposes; indeed, Microsoft partially built their empire around BASIC. For business, their BASIC compilers of QuickBasic 4.5, PDS 7.1, and Visual Basic 1.0 were the "final DOS BASIC languages" of the early 1990s, and a lot of businesses used and were built on these languages.

Then came VB 3.0 for Windows, and that really changed the business world. Today, there is a ton of software developed in VB 6.0 still running businesses, so much so that Microsoft has been virtually forced to keep the runtime available, even now in Windows 10.

But back in the earlier days of personal computing, BASIC was about all you had, because the cost to purchase a compiler for almost any other language (if your system could even use it) could be prohibitive. But usually you could get a simple assembler (or on some machines, like the Apple IIe, you could drop into the ROM monitor and type in the hex opcodes) to help along your BASIC code. I am certain there were businesses out there that used a ton of BASIC on various machines of the era, whether that was an Apple, IBM, TRS-80, or anything else. It was done because it was approachable, a lot of people knew it (built in base of developers), and it was low cost.

I've since seen BASIC pop up in other use cases - for instance, at one point one of the major manufacturers of industrial robot arms could program their arms using a form of BASIC (it was called something like "ROBO-BASIC" or "ARM-BASIC" or something like that; all I ever found was the manual, I never actually used it).

And BASIC today? Well - maybe not used as much for business, but VB still survives (somewhat) in the form of VB.NET. There are also quite a few open source BASICs available - many quite advanced:




...plus several others out there as well. Oh - and who can forget BASIC for embedded microcontrollers:



Heck - with that last one - it is arguable that Parallax brought microcontrollers to a whole new audience by naming it the "BASIC Stamp" and integrating PBASIC into the system, over a decade before the Arduino!

Toy language? Useless? Hah!


Very insightful. I teach logic formulation and have taught several programming languages (Assembly, BASIC, Pascal, C & C++). To my mind, syntax is easier to hurdle than algorithms. Coders should also acquire fundamental concepts of control and data structures.


Very useful article for frustrated programmers. I would recommend that instead of just "syntax error", it should display a list of solutions (let's say I mistyped a certain function; the output should be "Syntax error", plus some suggestions to correct the code). I know it's hard, but once done it would go viral and be highly recommended not just for beginners, but for experts too.


I think the premise of this post is too broad. Having friendly & helpful error messages doesn't make languages automatically easier to learn. Programming at a professional level is just damn hard, because it's not just about learning syntax of a language and how to read error messages.

Programming is mostly about software architecture, reducing a complex domain problem to instructions, and applying solid design patterns (pun intended).

The current trend in software development education of telling people that "programming is easy" or that "anyone can pick it up in 6 months in a bootcamp" may be a big disservice, setting them up for failure. Some can and some cannot, but not everyone. I had several friends who tried to go into software testing because of all the hype a few years ago, but realized it was not for them only after wasting a considerable amount of time and money.

Let's use an analogy: many people can learn the basics of flying and controlling a small airplane in a "6-month bootcamp" or even in a simulator, but would you entrust them with a big passenger or cargo jet, especially if they find it hard to read one of the controls? How many people actually go through the years of training it takes to become a real professional pilot - someone who can fly anything bigger than the equivalent of a sky scooter, who is capable of making difficult landings in difficult conditions, and who really understands the relevant topics in aeronautics, mechanical engineering, customer service, emergency management, etc.?

I'm not trying to say that programmers are like pilots, all I'm saying that any profession takes time & dedication, and may not be for everyone. And in some cases having a bit of a technical barrier is inherent to the profession.

I think you are also arguing for "better tools". But better tools don't make it accessible to everyone, nor should they. Planes flying now are ages more advanced than 60 years ago. They can practically fly themselves, but does that mean you want everyone to have a pilot license?

Having said that, readable error messages, while helpful, aren't a panacea. For example, Elm, Elixir and Crystal all have much better error messages than what "older" languages used to have. But many people still can't easily grok them, especially the functional aspects of Elm & Haskell - even people who've been doing programming for a while. (This is just anecdotal, based on my experiences with other programmers.)


I agree, it's not accurate to tell people "programming is easy". Unlike a pilot's license though, there isn't any harm or danger coming from a higher number of people learning an introductory level of programming.

I'd love to hear more about the experiences of people who have invested real time and effort into learning (a bootcamp or something equivalent) and then decided it's not for them. What were the factors that made them reach that conclusion? That would help us understand what we should be saying instead of "programming is easy / anyone can do it".


I remember trying to execute my first "hello world" script in Java a few years back in high school. Couldn't figure out why system.out.printIn (yes, with an I, a capital i) did not work.


Hi Marek, I agree in theory with your thesis, though I don't have a solution for that.

I also think we perpetuate the hardship by teaching those languages the same way we were taught, could it be partly this?

Looking at the landscape of languages, I don't really see this aspect getting better anytime soon.

I've heard about Dark on Twitter but nobody has seen it publicly so we'll have to wait:


I mostly disagree. Natural languages are hard, and that is the problem. The syntax and grammar feel arbitrary. There are no proper tools available to verify if the sentence you constructed produces the correct result.
This is all much easier with programming languages. But when something wrong is detected, it needs to be communicated back to the user by means of natural language, and this is where the issue surfaces.


So, I think HyperCard is dead because Apple decided they had something cooler. Also, they had trouble with color. But a company made SuperCard, which had great color. I used that for a while for multimedia educational tools. In fact, you could build stuff with such ease, you have to wonder why people go through so much trouble on the web.

But, SuperCard did not last. And, there was a guy who wrote Revolution with RevTalk. RevTalk is now LiveScript (I think). That is a write once run everywhere system. A Scottish company took it over.

I think that besides Qt, LiveScript is one of the best cross-platform solutions out there. But Qt requires more work all around, yet gives you C++. LiveScript is about the fastest turnaround in development I have ever encountered. It needs more love for sure.

As for just rummaging through every computer language known to man (and several natural ones), I have to say that I have never been fully satisfied. With complex languages, what you get is a crowd of people who are really snotty about little useless features, and you don't get simple development paths. You get lots of hype. For instance, there is a common belief that Python will prevent bad programming by all programmers. That is not true; I have seen horrible programs written in Python. But that does not make Python bad - it's the hype that's problematic.

I used to think that SETL would be a language in which Set Theory expressions could be used. But, I found out that some guys just wanted to extend the 'set' command line statement found in bash. (Correct me if I am wrong.)

I started a language definition of my own. But, of course, who would fund its development? So, it's sort of a joke. github.com/rleddy/acai. I tried to make a set-theory language parsable. And, I even used the name of a food you might have at breakfast with coffee.

But, if you want to keep me from starving and would like a language that works for you, you could request that I do that and help out the ghost (who is starving) patreon.com/coffeeshopghost. - got to get groceries somehow.


I'm currently learning Python. They say this language is quite "friendly".
But then I read this post, and you said that "friendly coding languages don't solve any problems and are quite useless" - so does that mean that learning Python is useless?
I mean, I aim for Machine Learning, but this seems so... lost to me at the moment.
I, too, am a beginner in programming.


I find Python error messages to be one of the strongest sides of the language.


What are some examples of languages that are designed to be easy to learn and are useless?
