
Arnaud Dagnelies

How was coding in the last century?

I started coding in the previous century, at an early age. Back then, 8 megabytes of RAM was a lot, and the internet went over your phone line, using weird noises to communicate. It was so slow that you could literally "see" the progress as an image slowly loaded. Today, even cheap smartwatches are more powerful.

It was a time when Google was unheard of, Stack Overflow did not exist, and you relied on books and offline help files. This was very different from today, where you google everything, but it had its advantages too. Books were mostly well written and gave you broad coverage, and looking up the usage of a method was a matter of pressing F1, which is faster than switching to the browser and googling it.

The IDE I had as a kid to get started was "Visual Basic 5". That's what it looked like:

[Screenshot of the Visual Basic 5 IDE]

And it's not a scaled-down display either: the vast majority of monitors were 800x600 at that time, with far fewer colors than today.

But that's not what this is about. Last century, this IDE already had:

  • autocompletion
  • inline documentation
  • run/debug with a single click
  • breakpoints and watchers
  • a visual editor for "forms"

It was super easy to use out of the box, even as a kid.
I could write my first little game in a matter of days, and I think after a couple of weeks or months I wrote my first Tetris. Yay!

And now I'm an adult, a "senior developer" with many years of experience under my belt. Yet, strangely, I often struggle with the simplest tasks, like getting the TypeScript debugger to work properly or wondering why autocompletion is sometimes broken. From my point of view, it is quite ironic that, several decades later, the dev experience has in some ways become more difficult than it was back then.

Likewise, it's also funny to see the success of VS Code, which, to me, looks like a glorified text editor with fewer features than we had last century... at least out of the box. There are certainly plugins for all sorts of things, but you have to search for them, find them, try one out, then the next one, and sometimes they still don't really work that well.

Despite all the negativity expressed here, there are also bright sides. You can write a simple JavaScript function and deploy it worldwide with a single command line (see the sketch below). You have a wide variety of languages, libraries, tools and plugins for whatever you can imagine. And now you even see AI poking its nose in to assist you. Development has certainly grown in diversity and become more heterogeneous.
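As an illustration, here is a minimal sketch of that "one function, one command, deployed worldwide" workflow. It assumes Cloudflare Workers as the platform; the file name and the query parameter handling are just examples, and any serverless provider with a CLI works along the same lines.

```ts
// worker.ts: a tiny edge function (illustrative sketch, assuming the Cloudflare Workers runtime)
export default {
  async fetch(request: Request): Promise<Response> {
    // Read an optional ?name=... query parameter and greet accordingly.
    const name = new URL(request.url).searchParams.get("name") ?? "world";
    return new Response(`Hello, ${name}!`, {
      headers: { "content-type": "text/plain" },
    });
  },
};
```

A single `npx wrangler deploy` then publishes it to data centers around the globe, something unimaginable over a dial-up connection last century.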

The popularity of VS Code also reminds me of the situation with the Eclipse IDE. In the beginning, it was a super lightweight editor. It was snappy, and you wondered if it was really written in Java (which was not heavily optimized at the time). It also had a plugin architecture that made it very modular. Sound familiar? Then, with the years, came more features, more plugins, more stuff, more fluff... until it became the big, bloated, sluggish IDE it is today.

I always have the feeling this is a common disease of software nowadays. Somehow, continuously adding features has become a trend, like something you must do. Sales teams especially will push for it: "We need to add features to have something new to announce and push sales." Some features are certainly great and worthy. But don't add them blindly; they have a price! And I'm not talking about development costs here.

The price is:

  • a more cluttered UI
  • one more thing for the user to learn
  • one more thing that might distract the user
  • slightly slower software
  • slightly more memory/disk/bandwidth/CPU usage
  • one more thing that can make other changes slightly more complex
  • one more thing you must support and maintain for many years

Usually, adding one or two features is not a problem. But the habit of continuously adding features tends to transform the good little tool you liked into a bloated beast you want to avoid.

Please don't get me wrong: I'm not saying that tools should be minimalistic and have as few features as possible. My argument is that there is a golden balance. Too few features make a tool useless, but too many make it bloated. Think carefully about what you add and what you purposely leave out.

I think Microsoft did this very well a decade ago with their office suite. They redesigned the UI and, in the process, simply dropped a lot of rarely used features. Sure, there was some backlash from users who relied on this or that specific feature, but in the grand scheme of things they made their office software better that way. It became less cluttered, easier to use, required less maintenance and support, and facilitated their upcoming move to web/cloud technologies. Dropping features was the key, yet it remains a taboo in most software. Of course, you can also go to the other extreme, like Google creating and deprecating APIs like the seasons.

As with trees, cutting branches allows the tree to grow back better and healthier during the next cycle. Simplicity is a key to good growth, while complexity brings the shadow of stagnation.

That said, I'm curious to see how development will evolve in a decade or two. Here are my predictions:

  • typing will come to JavaScript, or TypeScript support will land in browsers
  • because of that, many toolchains and build pipelines will become obsolete
  • AI assistance will become widespread, as a form of superior autocompletion
  • flat design will give way again to light bevel effects that make buttons/inputs stand out more
  • passwords will be phased out in favor of WebAuthn/passkeys
  • GitHub will remain the central source of OSS
  • JS will remain the most widely used language
  • the diversity of the ecosystem will increase further
  • low code will not replace our jobs, just as drag & drop HTML editors did not; it's just another tool in the box

So, what are your predictions for the next two decades?

Thanks for reading!
