The web has replaced and outlived a lot of closed technologies because of its resilience. Its underlying technologies are simple and sturdy, and one of its main design principles is that, no matter what happens, the user is never punished for developer mistakes.
User needs come before the needs of web page authors, which come before the needs of user agent implementers, which come before the needs of specification writers, which come before theoretical purity.
This is why web technologies are forgiving. Enter some wrong HTML, and the browser will try to make sense of it and automatically close elements for you. Enter some invalid CSS, and the browser skips the offending line and applies the rest.
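To illustrate (a hypothetical snippet, not from the original post): both of the following mistakes are silently repaired or ignored, and the page still renders.

```html
<!-- The parser closes the unclosed <li> elements for you. -->
<ul>
  <li>First item
  <li>Second item
</ul>

<style>
  p {
    color: blue;
    colour: red; /* unknown property: this declaration is dropped, the rest still applies */
    font-weight: bold;
  }
</style>
```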
Endangered species: the indie web publisher
This is excellent for end users, but it always felt wrong to me when it comes to building things for the web. Sure, the web should be a read/write medium, and anyone consuming it should be only a small step away from becoming a publisher on it.
But let’s be honest: most publishing on the web doesn’t happen by writing HTML and CSS, but inside other systems. You can still run your own server, set up your own blog and all that, but the majority of people who put content on the web never touch any code or own any of the infrastructure their content is published on. Whether that's a good thing or not is irrelevant - we lost that battle. And whilst we are always smug when the likes of Twitter or other platforms get into trouble, it still means that a lot of the people adding to the web will go somewhere else: to another product, not to writing HTML or hosting their own blog.
It works, why care?
My problem with a forgiving platform is that it makes it a lot harder to advocate for quality. Why should developers care about clean HTML and optimised CSS when the browser fixes their mistakes? Even worse, why should HTML and CSS ever be respected by people who call themselves “real developers” when almost any code soup results in something consumable? There are hardly any ramifications for coding mistakes, which means that over the years we have focused on developer convenience rather than the quality of the end result.
HTML and CSS are compilation targets
It feels like we never embraced how alien the web is to the rest of our software world. In almost any other environment you write code and then compile an optimised build for a known target. On the web, compilation was never needed, yet we ended up in a place where we do it anyway. And we compile to the unknown, which in itself doesn’t work.
Compiling into the unknown
Tooling tells us what's wrong - but who listens?
I work on the browser’s developer tools, which are excellent compared to what I had to work with when I started. I still often find myself at a loss, though, about what else I could give developers to make it even more obvious that what they are doing hurts end users. Open any web product, take a look at the Issues tool in the developer tools, and you are greeted by a deluge of problems that can be a barrier for end users and - even more annoying - are easy to avoid.
You even see squiggly underlines in the DOM tree when something is wrong - much like Word shows when you are making writing mistakes.
If you use certain extensions in editors, you even get that live while you are writing your code with explanations why what you do is a problem and how to fix it.
And yet, what’s on the web is to a large degree terrible. Which brings me to the main question I am pondering: is the web development stack and environment too lenient? Should developers have a harder time making obvious mistakes instead of getting away with them? I remember when XHTML was an idea, and a single wrong character encoding would have meant our end users couldn’t access a web site. That was a clear violation of the main guiding design principle of the web. But in a world where we convert source code to web code anyway, shouldn’t our bundlers, frameworks and build scripts be stricter and stop obvious issues from getting through to become something the browser has to deal with? We write tests for our code - shouldn’t a system that checks the final product for obvious issues also be part of our delivery pipeline?
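Such a strict step could be as small as a script that fails the build on known problems. A hypothetical, deliberately naive sketch in Node (a regex scan rather than a real HTML parser; `findMissingAlt` is a made-up name):

```javascript
// Sketch of a strict build-pipeline check: fail the build when the
// generated HTML contains images without alt text.
// Illustrative regex scan only - a real check would use an HTML parser.

function findMissingAlt(html) {
  const imgs = html.match(/<img\b[^>]*>/gi) || []
  return imgs.filter((tag) => !/\balt\s*=/.test(tag))
}

const output = '<img src="a.png" alt="Chart"><img src="b.png">'
const offenders = findMissingAlt(output)
if (offenders.length > 0) {
  console.error(`Build failed: ${offenders.length} <img> without alt text`)
  // In a real pipeline: process.exitCode = 1
}
```

The same pattern applies to any obvious, machine-detectable issue: catch it in the pipeline, before the browser ever has to forgive it.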
Should bad code be something we always expect?
Top comments (2)
You are so old-fashioned in a positive way! Every web developer MUST™ read this post. But what would we change in our daily work when proceeding to code tomorrow morning? Well, I am so happy and thankful for the rare moments when I actually write code instead of Googling obscure software problems - React, Webpack, WordPress, WooCommerce, Symfony, some quirky behavior of some outdated Safari browser on a certain iPhone, some quirky behavior of some app on my partner's Android phone etc.
When I am lucky enough to code, I often try to take my time to do it properly. Question the requirements, add tests, study documentation, avoid !important and try to make sense of the suggestions of stylelint, eslint, SonarLint, PhpStan, PhpCS, JetBrains code inspection, and at some point I even tried to use Copilot and ChatGPT.
Anyway, thanks for your post and hope to see you live on stage some day. What about beyond tellerrand conference 2023?
The semicolon debate is one of those things that people tend to have "religious" attitudes about, but diffs like these are what convinced me to drop them in my own projects:
Aside from developer convenience (no fiddly UpArrow, End, Backspace, DownArrow etc. before you can chain another method), the second diff is undeniably more readable. So arguably, "avoid semicolons where possible" is preferable for the same reason many people prefer to include trailing commas in multiline arrays.
The tradeoff is you need to insert them at the beginning of statements that start with open parentheses, open square brackets, untagged template strings, or regex literals. But that's fine once you get used to it.
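That tradeoff can be shown in a few lines (a minimal sketch; the variable names are made up):

```javascript
// Automatic semicolon insertion (ASI) hazard: a statement that starts
// with "[" or "(" is parsed as a continuation of the previous line.

const log = []
const nums = [1, 2, 3]

// Without the leading semicolon, this line would parse as
// nums[4, 5].forEach(...), i.e. subscripting nums with the comma
// expression (4, 5) - which is nums[5], undefined, and a TypeError.
;[4, 5].forEach((n) => log.push(n * 2))

console.log(log) // [ 8, 10 ]
```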