Let's start with containers.
Containers are so ubiquitous they've become a shorthand for "shippable", and pretty much everything you can possibly think of has become, in effect, a rectangular container with fastenings at each corner.
Yes, I really am talking about physical shipping containers, not Docker, Kubernetes, or any of those. No software here; these are the big metal boxes, most often used as large boxes to put other stuff in. But they're not just used as a dumb box.
The uses range from military drones to shipping tanks (as in either large containers of fluids, or tracked vehicles with big guns). There are refrigerated containers. There are containers that contain machine shops. There are sets of containers that contain large engineering workshops. There are deployable Pizza Huts (really!), that ship in containers. There are shippable toilet blocks, and dog kennels. There's even my favourite, the delightfully-named TActical Reconnaissance Deployed Imagery System, or TARDIS - which is a sort of mobile office for RAF Tactical Imagery. This is where the term "containerized" comes from.
All of these work so well because once something is a container, it becomes trivial to ship and deploy. The humble container is what everyone works to, because a bunch of people spent countless hours figuring out exactly what a container should be, in ISO 668 and ISO 1496-1.
Yup, the Intermodal Container, as it's formally known, is an Open Standard. And, as we all know, standards are boring, and not at all innovative. As someone who's worked in Open Standards for a long time, I've had this charge levelled at my work for decades. You'd expect me to reject it, but there's a significant amount of truth here, and it's actually something to be quite proud of.
The container itself is not very exciting. It is, if anything, deliberately made boring. By encouraging everyone to stick to the same dimensions, with the same twistlock fastenings at each corner, nobody can innovate and create, say, slightly bigger containers to squeeze out a few more inches of space.
On the other hand, by providing a stable, standardized platform of shippability, everyone gets to innovate with confidence on top - you know that if you build a hospital constructed of ISO shipping containers, it becomes shippable across the world on ships, lorries (trucks for the Americans), trains, and even some aircraft. The funny blocks on each corner secure your hospital units in transit, and act as lifting points for cranes. You don't have to actually check every ship, or truck, or train.
And in case you wonder, yes, someone has stuffed a hospital into containers. And then put it in a plane.
What's happened is that the International Organization for Standardization, ISO, has taken the container and made it a commodity. There is literally nothing interesting about containers themselves. The innovation has all been done, and then ground away by the standards mill. You can't, by design, get a good or bad container. They just are.
But since you can ship a container - or anything that pretends to be a container - across the entire planet in a few days without worrying, people have innovated like crazy on top of the concept. While containers, and the standardisation of them, is not very innovative in itself, the innovation built on top of them is vast.
It'd be easy to point to the Web and think the same - but the Web is, in many ways, a terrible standard. It's huge, for one thing - a browser these days has to handle realtime video, transactional databases, multiple heavyweight languages for layout, styling, documents and of course the most byzantine programming language ever invented. And the goalposts are constantly moving, because it's an all or nothing thing.
So instead, let's look at email.
When email came on the scene, literally everyone had their own email systems. The venerable Sendmail was written simply to gateway between them all. Rapidly, the IETF devised a standard format for email, so at least once you got a message from one system to another you could understand it.
Eventually, though, transports and routing were all standardised, and basic ASCII messages became Boring - the ultimate goal for all standardisation work.
The innovators didn't stop, though - they devised hacks for encoding binary files into these ASCII messages. They developed arcane ways of sending more than just ASCII. They abused the message entirely to send commands, building entire systems on top of email - mailing lists and file servers for example (the former are still around, the latter have long-since died out).
Gradually, many of these were taken and made Boring, so we ended up with MIME, for example, which gave us more than just ASCII messages, with attachments, and - eventually - annoying HTML emails that an entire generation of marketeers would think were a really good idea. But all of these things have become, well, boring.
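To make that layering concrete, here's a minimal sketch using Python's standard-library email package, which implements the MIME machinery described above. The addresses, filename, and payload bytes are invented for illustration - the point is that arbitrary binary data still travels as plain ASCII on the wire:

```python
# A binary attachment, tunnelled over an ASCII message via MIME.
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "alice@example.org"
msg["To"] = "bob@example.org"
msg["Subject"] = "A binary file, tunnelled over ASCII"

# The plain-text body stays 7-bit safe.
msg.set_content("The attached bytes travel as base64-encoded ASCII.")

# MIME encodes arbitrary binary data (here, a fake PNG header) as ASCII.
binary_payload = b"\x89PNG\r\n\x1a\n" + b"\x00" * 16
msg.add_attachment(binary_payload, maintype="application",
                   subtype="octet-stream", filename="sample.bin")

# Everything serialized for the wire is plain ASCII,
# despite the binary attachment inside.
wire_form = msg.as_string()
assert wire_form.isascii()
```

The `add_attachment` call quietly applies base64 content-transfer-encoding - exactly the sort of once-arcane hack that standardisation ground down into something Boring.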
You can see counter-examples in spades, too.
Twitter, Google, and Facebook are all cases where we've failed to create a commoditized version of the fundamental concept, with the result that while Twitter, Google, and Facebook are themselves innovative, they don't drive much innovation at all.
You can think of it in fractal terms - as the bubble of commoditized, standardized concepts grows, the surface area upon which we innovate grows at a much faster rate - but that innovation alone doesn't provide the same foundation.
All of this has, though, an impact on how we design standards. In order to maximize that innovative surface area - the event horizon of boringness, if you prefer - we have to ensure that standards have sufficient extension points.
TLS, for example, has just undergone a major revision in the form of TLS 1.3, and much of this effort has gone into adding more extension points.
In the XMPP world, we've stuck with XML for nearly 20 years - not because it's quick to design and work with (JSON wins for static structures), but because its built-in support for namespaces means anyone can extend it easily, without clashing with anyone else, and without having to coordinate their work. This maximises the potential for innovation.
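Here's a small sketch of what that clash-free extensibility looks like, using Python's standard-library ElementTree. This is not real XMPP traffic - the two vendor namespaces and their elements are entirely made up - but it shows how two parties can extend the same stanza independently:

```python
# Two independent extensions coexisting in one XML stanza via namespaces.
import xml.etree.ElementTree as ET

ET.register_namespace("", "jabber:client")

message = ET.Element("{jabber:client}message", {"to": "bob@example.org"})
body = ET.SubElement(message, "{jabber:client}body")
body.text = "Hello"

# Vendor A adds a (hypothetical) read-receipt extension in its own namespace...
ET.SubElement(message, "{urn:example:vendor-a:receipts}request")

# ...and Vendor B adds an unrelated (also hypothetical) geolocation extension.
geo = ET.SubElement(message, "{urn:example:vendor-b:geo}lat")
geo.text = "51.5"

# Both extensions live side by side; neither can clash with the other,
# and a receiver that only understands jabber:client simply ignores them.
wire = ET.tostring(message, encoding="unicode")
```

Because each extension is scoped by its namespace URI, neither vendor needed to coordinate with the other - or with the core protocol's authors - before shipping.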
So while Standards might be boring, it's the kind of predictable, comforting boringness that provides the foundation for some astonishing leaps of innovation.
If you're interested in doing something Boring and not at all innovative, the IETF - Internet Engineering Task Force and the XSF - XMPP Standards Foundation are entirely free to participate in, and in both cases your contribution to the vital world of Boring will be hugely appreciated.
Top comments (6)
I used to work in Nokia research; lots of people there were involved with W3C, IETF, etc. including on things like XMPP. It's interesting that you mention XMPP because effectively that's a failed standard that is by and large not used at all, or has been removed over time from essentially all of the dominant products that ought to be implementing it. This has a lot to do with the way companies conduct themselves when writing these standards and their motivations. XMPP is a complex beast and definitely has a design by committee flavor to it.
Standardization works best when the need for interoperability is genuine. With browsers, the turning point was when Google, Apple, Opera, and Mozilla teamed up and took HTML out of the W3C into the WhatWG so that they could compete more effectively with MS by making sure that they worked in a consistent way, which was not possible with the lack of progress on improving HTML in W3C because of all the corporate infighting going on there. MS lost a lot of market share over the subsequent years and eventually figured out that they needed to be part of this if they were to stay relevant as a provider of a web browser. Up until that happened, the w3c was a battle ground where various large companies were actively frustrating progress by insisting on wiggle room for proprietary features or variations of features. This is still ongoing, with the browser manufacturers making public statements this week about the W3Cs attempts to regain control over the DOM spec.
IETF has a long history of standardizing things after the fact. Google pushed out SPDY, people saw that was a good idea and then teamed up to produce HTTP/2, while several web server projects were working on reference implementations and browsers rolled out early support. Similarly, people were using email on various unixes in the seventies and eighties and IETF came up with specs that basically outlined and documented how that stuff worked. By the time the internet took off in the nineties, not supporting that was no longer an option for companies building email products. The few that tried anyway, no longer exist. Hence email is still around despite many good technical reasons to maybe come up with something better at this point.
Sadly, XMPP never really took off. At this point insisting on XML is probably not helping, but I would argue that it failed when companies like Google started pulling the plug on XMPP support across their products. I would argue they never really wanted federated chat. At this point, Google obviously could use that given that they have a hard time bridging the multiple communication tools that they develop. So, in hindsight their decision making maybe wasn't that great.
Thanks for the enormous comment - I feel I owe an equally enormous response.
You've made at least three main points here, and I'll try to answer them separately. Apologies if you feel I've not answered some key point; feel free to nag me.
Firstly, you've suggested that XMPP is a failed standard that nobody uses. Honestly, it never ceases to amaze me how much it is used. Nevertheless, about the only place it's not widely used has been in the one place it was aimed at: general Instant Messaging.
But it is used in some astonishing places, from Finance (trading shares? You're doing it with help from XMPP) to Telcos (Depending on your mobile provider, you might have an XMPP account already) to video games (League of Legends, for example) to the military (have a Google for NATO Joint Tactical Chat sometime).
You're right that the likes of Google are not interested in federated interoperability - just watch what Google is doing to email to make federated email a second-class citizen - and XMPP was getting in the way of that. But while that was a very public slap in the face for standards, it has, perhaps surprisingly, not made as much impact as you'd think.
You're right that XML is hardly a trendy thing these days - I've joked that if we told people XMPP is a React-like syntax on the wire we'd get much more developer interest - but if you show the average XMPP developer some SOAP-style XML they'll react in the same way you would.
Next, the IETF isn't alone in being a standards organisation that works best when taking an experimental, but working specification and polishing it for standardisation. This is a good approach, but the risk is that a large company with a controlling monopoly can usurp that practice - this has been happening, and the bad feeling within the IETF toward Google in particular is very high.
But the IETF can work very well when it's not being coerced into rubber-stamping Google's specifications as standards. An example would be in the early days of the web, when HTTP/1.0 was effectively rejected by the IETF as a standard, despite widespread deployment. Instead, a huge amount of work went into fixing it in the form of HTTP/1.1, which supported the web for years (and still, largely, does).
Finally, the W3C is an odd fish in modern standards work. Most standards organisations, particularly Internet ones, have moved toward opening as much as possible, but the W3C has, if anything, moved in the opposite direction. While this makes the actions of the incumbent organisations in forming WHATWG understandable, the result has been an equally closed clique which now believes it has the only opinion worth considering - as a result, there are now two subtly conflicting URL specifications, for example.
In general, seeing that a solution to a particular problem isn't working is fine, and deciding to replace that with a new model is also fine - but WHATWG failed to learn from existing standards work, and has settled down to repeat a great many mistakes that have otherwise long-since been solved.
WhatWG succeeded in unifying what was a horrible mess of intentionally and unintentionally incompatible implementations of the same standards in W3C and IETF. It fixed that by introducing rigor, tests, and clear language and by involving those actually trying to implement it on a basis of meritocracy while getting rid of the politics that prevented the W3C from doing anything productive on this front for well over a decade.
I've seen the mobile industry from the inside and knew several people in Nokia who were full time active in various standard bodies (including W3C). A lot of this activity was not motivated by wanting to improve standards or building stuff that implemented these standards but to ensure access to, and control over relevant IP and ensuring certain patents that Nokia had stayed relevant or making sure it had new patents covering key stuff being standardized that competitors might want or need. I know of several standards that Nokia was involved with that it never had any serious intention of implementing or supporting.
My understanding is that people from Apple, Google, Opera and Mozilla working on browser technology were very successful collaborating within WhatWG after they figured out that W3C was the wrong place for that. Even while their companies were competing very aggressively. These days it is common for complex web designs to just work across different browsers. WhatWG transformed a broken by design and obsolete standard that was being pulled in all sorts of directions by companies not involved with actually implementing it into something that actually works. That's no small achievement; so I think you are being a bit harsh on WhatWG.
HTTP is an interesting example because HTTP/2 only happened after Google pushed out SPDY and it was adopted quite rapidly. IETF then did what it was supposed to do and rapidly developed HTTP/2 to replace SPDY with something better that met the needs of people in the industry.
Excellent post! The impact of 'boring' really is underestimated.
I've long found that I very much prefer boring code, since (as you say) you don't need to worry about it.
Oh, I don't know. Making something truly boring takes a lot of work.
I was referring to code that is as near to pseudo-code as is possible, which means that it tends to do nothing exciting or unexpected. So nothing special and nothing clever. Though granted, even that is something that seems to require some aptitude and concerted effort :).