Marek Zaluski

Why Programming Languages Are Hard

Years ago, when I first tried learning Python, I ran into this error message within the first five minutes of playing around with it:

SyntaxError: invalid syntax

It sounded like Python was telling me: you did something bad, and you should feel bad.

I did feel bad, in part because I had no idea what a SyntaxError meant, and the phrase "invalid syntax" didn't help clarify anything at all.
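
To make that concrete, here's the kind of trivial slip that can trigger it. This is a hypothetical snippet, and the exact wording of the traceback varies between Python versions (recent releases have improved these messages considerably):

count = 3
if count = 3:          # should be '==' for comparison, not '='
    print("count is three")

Running it produces something like:

  File "example.py", line 2
    if count = 3:
             ^
SyntaxError: invalid syntax

The caret points at the offending spot, but the message itself gives a beginner nothing to go on.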

Today, I know a lot more about programming languages and about how parsers work. But the error message is still bad. It's bad UX, or more precisely: bad DX (developer experience).

If you're learning programming for the first time, then the error is a bad experience on two levels:

  1. First, it doesn't tell you what just happened. Did you do something wrong? Did your program do something wrong? Or did Python do something wrong?
  2. Second, it offers no indication of what the reason for the error might be, or what you could try to fix it.

Now, the reality is that syntax errors happen all the time (even after years of experience) and that most of a developer's life is spent fixing bugs and troubleshooting all kinds of coding problems.

But as a beginner, you don't know that.

When you're a beginner, this error actually sounds like it's a big deal, and more critically, it sounds like you did something wrong.

The problem: programming languages are created by programmers

Programming languages are created by programmers, for programmers.

That's a problem when you want to make programming more accessible and easier to learn. It's a target audience mismatch.

On top of that, there are limitations that come from how programming languages are implemented. We end up with messages like "invalid syntax" not just because someone wrote the error message that way, but because of the way the language grammar and parser work. The parser may not even have access to the exact grammar rule that failed, simply because of how it was coded, so for technical reasons the error message can't be any more helpful.
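
As a toy illustration of that limitation (this is not how CPython's parser actually works, and the names here are made up), a hand-rolled parser often only knows that the next token didn't match what it expected. The context needed for a friendlier message simply isn't kept around:

# Toy sketch only -- not CPython's parser.
def parse_comparison(tokens):
    # Expected shape: NAME '==' NUMBER
    left = tokens.pop(0)
    op = tokens.pop(0)
    if op != "==":
        # All this parser knows is that the token didn't match. It keeps no
        # record of which grammar rule it was in, or of what the user probably
        # meant, so the best it can do is a generic complaint.
        raise SyntaxError("invalid syntax")
    right = tokens.pop(0)
    return (left, op, right)

parse_comparison(["count", "=", "3"])   # raises SyntaxError: invalid syntax

Producing a message like "did you mean '=='?" requires the parser to carry extra context about what it was expecting, and that's additional design and implementation work.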

It's very likely that this just wasn't one of the priorities for Guido van Rossum when he was writing the first version of the Python parser. He was writing it for himself and for other programmers.

After all, when Python was first released in the early 1990s, it was already much simpler and easier to learn than most other languages at the time. (I think that might still be true today.)

A secondary problem: programming languages which are designed to be easy to learn tend to be useless

There are lots of programming languages that are designed to be easier to learn and to be accessible to beginners. Very few of them are successful or popular, and the problem is that you can't do any serious work with them.

They may achieve their goal of making it easier for people to get started with programming. But a lot of the time they fail to illustrate what problems programming can solve, and how it fits into a bigger picture.

So in a way, they actually sabotage themselves before they can fulfill their goal.

If you want to show people why programming is worthwhile and help them understand how they can use it, you need to show them how to solve a real-world problem. Otherwise, most of the time, it ends up looking like a toy.

HyperCard and Excel

HyperCard was one of those applications that emerged in the late 1980s and influenced a huge number of computer users. I would bet that it inspired a lot of people to become programmers.

The programming language that was available inside of HyperCard, called HyperTalk, wasn't particularly great. But it was aimed at non-programmers while at the same time being capable enough to solve real problems.

Many people who thought they would never be able to program a computer started using HyperCard for many automation and prototyping tasks, a surprise even to its creator. (Wikipedia)

Excel is the other application that was huge in bringing programming to non-programmers.

Based just on formulas and cells, Excel's model managed to be half-code and half-graphical. It's generic enough to solve a whole range of problems, but it still sticks to a sweet spot: it's specific to problems that can be expressed as a spreadsheet. That's what keeps Excel so powerful: it doesn't try to do everything. It's a spreadsheet.
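
To make the "formulas and cells" idea concrete, here's a minimal sketch of that model in Python: each cell holds either a value or a formula over other cells, and the sheet recomputes on demand. This is an illustration of the concept only, not how Excel is implemented.

# Minimal sketch of the cells-and-formulas model. Illustrative only.
def evaluate(sheet, name):
    cell = sheet[name]
    # A formula cell is recomputed from the cells it references;
    # a plain value cell is returned as-is.
    return cell(sheet) if callable(cell) else cell

cells = {
    "A1": 10,
    "A2": 32,
    # A formula is just a function of the sheet; here B1 = A1 + A2.
    "B1": lambda sheet: evaluate(sheet, "A1") + evaluate(sheet, "A2"),
}

print(evaluate(cells, "B1"))   # 42

The user never sees the machinery; they just type a formula into a cell and the dependencies take care of themselves.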

Making a tool customizable until it becomes a poorly designed programming language

In software, we're always faced with a trade-off between writing code that solves a specific problem, and writing code that gives the user enough customizability to solve a wide variety of problems.

It's the tool specificity problem.

Excel and HyperCard are tools, not programming languages. But both of them were pulled away from specificity, toward trying to solve all problems for everyone.

What's the most customizable form of a piece of software? It's a programming language itself. So if you take those tools to an extreme point of customizability, they become poorly designed programming languages.

Why is HyperCard dead today? I think it's because it was too generic and wasn't a perfect fit for any specific problem.

While it did put a lot of power into the hands of non-programmers, eventually it turned out that the economics of software is such that it's usually more worthwhile to have specific tools built for specific problems.

Programmers solve their own problems

It's normal to expect that if you leave programmers to their own devices, they're going to make tools to make their own work easier.

Sometimes programmers are guilty of a certain kind of elitism, where they put up barriers to entry instead of making things easier for newcomers.

Once you get past the initial hurdles and you learn how to write code, you start to forget what it felt like when you were just getting started.

It's remarkably easy to forget how helpless you felt when you were trying to learn the basic syntax of your first programming language and everything seemed stacked against you.

Everyone starts out as a beginner.

So, what can we do to make things better?

Rants aside, here are some actionable suggestions.

  • If you're an experienced coder, share stories about your first encounters with programming to show that we all start out clueless and intimidated before we get the hang of it.
  • If you're a beginner, try to forgive the creators of programming languages for the cryptic and unhelpful error messages. Creating a programming language is a lot of work, and it's not always obvious how to make it easy for beginners while also making it useful for solving real problems.
  • Also, if you're a beginner, you can help the programming language community by sharing your experiences and your feedback. A lot of the time, experienced programmers need a reminder that beginners don't see things the same way they do; sometimes they need to snap back to reality.
  • Forgive the beginner-friendly programming languages if you can't make anything useful with them. They can still be worthwhile learning tools, and if they can get people excited to learn more about programming in general, then that's the best we can hope for.

Let me close with one last thing.

I was too harsh above when I called those languages useless, but I meant it in relation to real-world, so-called "business" problems.

If you can use a tool to make cool or creative things and then show those things off, then that tool is 100% useful and valuable to humanity. Let's make more of those.

Top comments (34)

Aaron Christianson

Everyone starts out as a beginner.

Yes, we do--and many of us have managed to progress as programmers despite the challenges of learning a new skill in an unfamiliar domain. If someone quits programming because they see an error message they don't understand, it may be that programming isn't their bag. I generate errors I don't understand every day and I've been doing this for years. Of course, I have the mental tools now to deal with difficult errors better than when I was starting, but I didn't give up.

And you didn't give up either, and neither do a lot of people.

I think my biggest issue with this post is that it doesn't recognize how difficult the problem programming languages are trying to solve is: moving from transistors and logic gates to provide a framework for describing data and processes of arbitrary complexity. Programming languages aren't hard because programmers are elitist (a rather cowardly accusation); they are hard because they need enough firepower to solve arbitrarily hard problems in a way that reduces them to machine instructions. It's a wonder, the kinds of things computers are able to do today.

If you think it can be done better, prove it. Build us a language that is at once friendly and suitable for dealing with really hard problems. I don't think it's impossible, I just think it's easier to take potshots in a blog post.

“The housecat may mock the tiger,” said the master, “but doing so will not make his purr into a roar.”

Marek Zaluski

If someone quits programming because they see an error message they don't understand, it may be that programming isn't their bag.

The existence of unhelpful error messages isn't a meaningful way to find out if programming is right for you. It's just an arbitrary barrier to entry.

Programming languages aren't hard because programmers are elitist

The elitism isn't in making programming languages hard. The elitism is in the act of saying it's fine that they're hard and that beginners should just deal with it or get out. I don't agree with that.

Aaron Christianson

So I agree with what you're saying now: unhelpful error messages are bad. I think Python's syntax errors don't qualify, however. In addition to printing the text you gave, SyntaxError: invalid syntax, they also display the line and indicate the token at which the parser detected the error. I don't know what more you could ask for in the case of a syntax error. Of course, linters can sometimes use more robust static analysis to provide more specific error messages, like "unmatched parenthesis", but that kind of analysis is non-trivial to implement, if I understand correctly--I've implemented some simple parsers, but never a complete static analysis tool.

The only so-called problem I can see is that the word "syntax" is sort of a jargon term--but even then, that's what it's called. Programmers have to learn new terms and concepts and if the instructional material doesn't prepare them to encounter syntax error messages, that's a failure of instruction, not language tooling.

But I agree, in principle, that error messages should be as helpful as possible given the constraints of the compiler/runtime to diagnose and accurately report the problem. The programmer should be given the best information available, in the form of a precise, readable message.

Daniel Green

I think beginners should start with C. It's strict, doesn't allow sloppy code, forces the importance of Type down your throat, the use of pointers is the most straightforward way to understand memory, and it has some of the best debugging tools in the industry. Starting with Python is just going to lead to greater frustrations down the road.

Marek Zaluski

C teaches a lot of really valuable lessons and I agree that it's a great way to learn how memory works. I 100% recommend learning some C as a way to become a better programmer.

But do you need to know those things when you're learning how to write your first program? Definitely not.

TrekkieGod

I love C, but C is none of those things.

It's strict

C has macros

doesn't allow sloppy code

#include <stdio.h>

int main(void)
{
    int array[3];
    // What if I use the wrong size? Or <=? C has no protections against accessing
    // out-of-bounds memory, and by that I don't mean it errors out, it just accesses
    // it, which is the leading cause of security bugs in software today.
    for (int index = 0; index < 3; ++index)
    {
        index[array] = index + 1;
        // Yes, the above is valid and works just like array[index], which is
        // understandable if you know what it's doing and think about it
        // (*(array + index) is commutative), but a beginner won't.
        printf("Value at index %d is %d\n", index, array[index]);
    }
    return 0;
}

forces the importance of Type down your throat

char c = 30; // this won't even throw a warning

Plus you have void*, plus you can cast anything to anything else, and it'll just reinterpret the bytes whether it's valid or not. There's a reason C++ introduced static_cast, dynamic_cast, reinterpret_cast, const_cast so you can avoid doing c-style casts.

the use of pointers is the most straightforward way to understand memory

Learning to use pointers correctly will help you understand memory, but you don't have to learn to use them correctly to get working code. Memory leaks are just another type of sloppy code C allows.

has some of the best debugging tools in the industry

Just about every bytecode language will have much better debugging tools. Editing code in a debugger to see the effect, then moving the program counter, will corrupt the stack in C debugging tools more often than it will work.

Pascal/Delphi is a much better language to learn programming with. The syntax actually forces you to think about many of these things, and then you can take those habits with you to C. Unfortunately, the language fell out of favor even in the teaching context it was originally created for, and most universities get people started with Java or Python instead.

kamilliano

I agree. When you get started with C, you grasp the concepts of references (pointers), static and dynamic memory allocation, memory leaks, and manual memory deallocation. It is a simple language, yet so powerful at the backend. I built my first spellchecker with it. I actually started with Fortran and I failed badly during my physics degree because I could not grasp the fundamental programming concepts - I was a rookie. I believed for many years that I was not good at programming - which probably accounts for most of us biological beings, as we are not accustomed to working like a computer. Then I did some C# and ASP.NET, and honestly it was a painful experience for me because quite a lot of low-level concepts were abstracted away, which is fine, but you somehow don't learn the fundamentals. Now I am doing Python and still have to go a few levels down and learn/re-learn the fundamentals, but I am getting more comfortable, and now I understand the limitations of strict dynamic typing. I would like to pick up new languages, but for now I would like to master Python. I know EcmaScript fairly well, but I have also played just a bit with Racket/Scala/Haskell/C++/Java - I love Racket.

Pascal Bourguignon

Nope. C has too many pitfalls. There are libraries full of books about C pitfalls! (and let's not mention C++).

Matthew Conway

For a beginner, anything is "hard": for example, riding a bike is hard, riding a bike off road is even harder, and riding a bike down a world cup downhill course is hardest.

Is programming really that hard, or are there just bad/lazy developers in the world? I would say the latter. It's a skill where few are great, some are very good, most are good, and a lot are bad.

As a developer of 15+ years, I've mostly worked with C, C++ and Python. When I look at Python code others have written, it's not too often that I think "wow, that was a neat solution". Most of the time I read code I think "wow, that is stupid". Why? Because yes, programming is hard, and most programmers I come across hate listening to critique of any code they have written. It comes across as a personal insult, and the first human reaction to insult is to defend. If you defend your bad code and don't learn, how do you get better? The answer is you don't, and you spill into the pool of "good/bad" programmers and never allow yourself to become "great".

Programming is hard; why should it be easy? Going back to the principle of riding a bike, you can make it easy by adding stabilisers, but you're not going far off road with stabilisers and you're certainly not going anywhere near a world cup downhill course! I'm all for newbies becoming great, but they need to ensure they are completely open to critique so they can become great.

Graham Trott

I'd like to thank Marek for this article, which makes many valid points and helps us stay grounded, remembering that not everyone knows as much as we do. Just a couple of points:

-- "programming languages which are designed to be easy to learn tend to be useless"

Hmmm, that doesn't always have to be the case. If they were designed just to be easy, with no further aim in mind, then it might be true. But there are a lot of easy-to-learn languages that are far from useless. SQL is the obvious leading light. The key point is that a language should be good at what it's designed for, and that's not always incompatible with ease of use.

-- "HyperCard"

Wow, what an inspiration that was. Glorious times that should never be forgotten. Did you know it actually lives on, having inspired both Revolution and Live Code?

-- "If you want to show people why programming is worthwhile and help them understand how they can use it, you need to show them how to solve a real-world problem."

And that's why a good choice of domain-specific, easy-to-learn languages/tools is valuable. We don't all have the time or patience to learn programming "properly" from the ground up when our target is to animate part of a web page or set up a chain of sound filters. The user of your solution doesn't care HOW you did it as long as you did it. Remember, "if you can't tell the difference, there IS no difference".

Belma

I highly recommend LiveCode, drag’n’drop coding for beginners.

Fernando Calatayud

This problem is already solved in Ruby; take a look at the "did_you_mean" gem: github.com/yuki24/did_you_mean

It was created as a separate gem, but it's now part of Ruby core. Any Ruby syntax error now comes with a nice "did you mean..." and useful suggestions.

Olivier “Ölbaum” Scherler

I learned to program with HyperCard when I was ten. I probably played in BASIC with my dad before that, but HyperCard really made a difference. It’s nice to see it mentioned here. It’s worth noting as well that the first version of Myst was made in HyperCard (using a custom extension to enable colour, if my memory serves me right).

Marek Zaluski

It's remarkable that HyperCard led to the creation of Myst, that's a great story.

Al Chen

This post resonated with me since I started my career as a financial analyst and have been using Excel for 10+ years. I never thought of Excel as "programming," but as I dug deeper into data structures, automating tasks with VBA, and sanitizing data, I got more interested in SQL, data pipelines, and data manipulation using scripts.

Going from Excel to programming was a difficult leap but showed me that there is so much more you can do beyond Excel. I'm not a coder by any stretch of the imagination, but have found a sweet spot with no-code tools. I consider Excel to be in this "no-code" camp. Wrote a post about it here: dev.to/coda/the-overlooked-benefit...

Eliot Lash

Good article. I agree with your assessment of how we got into the current situation. However, I believe the onus is entirely on the developer community to create better paradigms, languages, and tools to assist less technical people in getting stuff done. Computers are crazy powerful, but much of that is locked away for the average person.

On the topic of error messages, have a look at Elm. Great care was paid by the developers to make compiler errors easy to understand for regular humans. It will even show examples and guess at what might fix the problem.

However I think we may be stuck in an old paradigm - textual source files - that's also holding us back. Bret Victor gave an interesting talk on forgotten lessons from decades past at worrydream.com/dbx/. He has also been working on re-imagining a computer as a public space for experimentation that exists in the physical world at dynamicland.org

I am really excited when people are able to think past our current ways of doing things, designed to meet constraints that largely no longer exist. I hope that programming will one day be unrecognizable to us now, and become just another thing you do in your day to day life.

nqthqn

On the topic of error messages, have a look at Elm.

Yes!! Best error messages ever. I have never felt happier as a developer.

ellie-app.com/4XjyF9wJB8na1

Jeremy Grifski

Great article! I've been working with beginners for a bit, and I totally agree with your sentiment. There are definitely artificial barriers to entry in this field, and we should do a better job of tearing them down.

Jeremy Grifski

It seems weird to comment at myself, but I was just thinking that improving the user experience of programming languages would be good for everyone, not just beginners.

I wrote a grading tool for myself last semester, and it worked great at the time. Since then, I’ve expanded it and added more automation. Unfortunately, I’ve sort of forgotten some of the test cases, so when a test fails I don’t always know exactly what went wrong.

Today, I went back and cleared up the error messages, so I’ll know exactly what goes wrong in the future. Naturally, I thought of this awesome article. Thanks, again!

Marek Zaluski

Learning to "program" with tools made to oversimplify programming gives you the wrong concept, and it's hard to stop thinking that way later on

Why do you think it's hard to stop thinking that way later on?

What stops you from picking up the missing principles at a later time?

Richard Leddy

So, I think HyperCard is dead because Apple decided they had something cooler. Also, they had trouble with color. But a company made SuperCard, which had great color. I used that for a while for multimedia educational tools. In fact, you could build stuff with such ease, you have to wonder why people go through so much trouble on the web.

But, SuperCard did not last. And, there was a guy who wrote Revolution with RevTalk. RevTalk is now LiveScript (I think). That is a write once run everywhere system. A Scottish company took it over.

I think that besides QT, LiveScript is one of the best cross-platform solutions out there. But, QT requires more work all around, yet gives you C++. But, LiveScript is about the fastest turn around in development I ever encountered. It needs more love for sure.

As for just rummaging through every computer language known to man and several natural ones, I have to say that I have never been fully satisfied. For complex languages, what you get is a crowd of people who are really snotty about little useless features, and you don't get simple development paths. You get lots of hype. For instance, there is a common belief that Python will prevent bad programming from all programmers. That is not true; I have seen horrible programs written in Python. But that does not make Python bad - it's the hype that's problematic.

I used to think that SETL would be a language in which Set Theory expressions could be used. But, I found out that some guys just wanted to extend the 'set' command line statement found in bash. (Correct me if I am wrong.)

I started a language def of my own. But, of course, who would fund its development? So, it's sort of a joke. github.com/rleddy/acai. I tried to make set theory language parsable. And, I even used the name of a food you might have at breakfast with coffee.

But, if you want to keep me from starving and would like a language that works for you, you could request that I do that and help out the ghost (who is starving) patreon.com/coffeeshopghost. - got to get groceries somehow.

Da2@ciri3dg

Very insightful. I teach logic formulation and have taught several programming languages (Assembly, BASIC, Pascal, C & C++). To my mind, syntax is easier to hurdle than algorithms. Coders should also acquire fundamental concepts of control and data structures.

rhymes

Hi Marek, I agree in theory with your thesis, though I don't have a solution for that.

I also think we perpetuate the hardship by teaching those languages the same way we were taught. Could it be partly this?

Looking at the landscape of languages, I don't really see this aspect getting better anytime soon.

I've heard about Dark on Twitter, but nobody has seen it publicly, so we'll have to wait.

hussein cheayto

Very useful article for frustrated programmers. I would recommend that instead of just "syntax error", it should display a list of possible solutions (let's say I mistyped a certain function; the output should be "Syntax error" plus some suggestions to correct the code). I know it's hard, but once done, it would go viral and be highly recommended not just for beginners, but for experts too.
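
For what it's worth, a basic version of that suggestion mechanism isn't magic. Here's a rough sketch of the idea using Python's difflib; the list of known names is made up for illustration, and real implementations (like Ruby's did_you_mean, mentioned above) hook into the parser or runtime and are more sophisticated:

import difflib

# Hypothetical list of names the runtime knows about.
known_names = ["print", "len", "range", "input"]

def suggest(mistyped):
    # Simple fuzzy matching; enough to demonstrate the idea.
    matches = difflib.get_close_matches(mistyped, known_names, n=3, cutoff=0.6)
    if matches:
        return "'%s' is not defined. Did you mean: %s?" % (mistyped, ", ".join(matches))
    return "'%s' is not defined." % mistyped

print(suggest("pritn"))   # 'pritn' is not defined. Did you mean: print?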

Michiel Hendriks

I mostly disagree. Natural languages are hard, and that is the problem. The syntax and grammar feel arbitrary. There are no proper tools available to verify whether the sentence you constructed produces the correct result.
This is all much easier with programming languages. But when something wrong is detected, it needs to be communicated back to the user by means of natural language. This is where the issue surfaces.