This article is about a journey that started in the early 1970s, that has no itinerary and no foreseeable end.
The first microcomputers were powered by primitive microprocessor chips that initially could only be programmed using numbers, and each type used different numbers. So on a Motorola 6800 from 1975, the codes

86 0D

were an instruction to put a carriage-return character into the A accumulator, but the same codes meant something completely different - or nothing at all - to a Zilog Z-80. We also had assembly language, which was the same thing but with the numbers replaced by letter codes for the benefit of human beings, and where the same instruction was

lda a #0x0d
In those days 4 kilobytes was a lot of memory so programs had to be short and efficient. They were usually written in assembler. High-level language compilers need a lot of memory so they ran on larger machines, but these were difficult and expensive for ordinary people to access. Which didn't really matter because few people had microcomputers at home.
As the cost of memory fell it became practical to have more of it, and by the end of the 1970s, 8-bit computers were appearing with a full complement of 64k bytes of memory, enough to even run a high-level language. This really marked the start of the computer revolution as it became possible to write application programs on the computer itself, and here mass-market software was born.
The next decade saw the introduction of the home computer, the IBM PC, the Macintosh and the main kinds of computer software we are now all familiar with: word processors, spreadsheets and operating systems. But the next major port of call on the journey came at the end of the 1980s, with the birth of the World Wide Web. For the first time it was now possible to connect home computers to the Internet, which up until then had been a tool for universities, the military and industry.
For nearly two decades, applications were programs you installed on your computer and the browser was there to give you the means to download them. But as the speed of the Internet increased it became practical for applications to download and run in the browser. The advantage was obvious; nothing to install or update, just run it. Anywhere.
The majority of these applications were still self-contained, though; once loaded they had little to do with the Internet itself. But with continued increasing Internet speed and reliability, a browser application could begin to operate seamlessly across the Net, distributing itself in any way that suited it best. We had arrived at the Web App.
Right now we're seeing the slow decline of the installed application, as more and more is done with web apps. Some personal computers, such as Chromebooks, don't even offer the means to install applications; everything is done with the browser. This process is only mid-way, so we cannot see where it's headed; all we can do is guess. But my guess is that in the end nearly everything will be done with a browser, rendering the operating system redundant or relegating it to being just a core component of the browser.
I don't wish to suggest that frameworks are a bad thing, but in their current form they are overly complex when it comes to expressing the somewhat random logic the customer is interested in, where the system meets the user in an arbitrary and frequently-changing set of business rules. Everything has its place, and where frameworks excel is in building stable, self-contained components that require little maintenance and interact with the rest of the system through well-defined interfaces. To go from there to asserting that the entire system should be managed the same way is a mistake. It may work for Facebook, but few of us work with a thousand other engineers, build sites of that nature or have to maintain them.
In the late 1980s, users of the revolutionary Macintosh computer could run a program called HyperCard, a combination of flat database, graphical system and programming language, the latter being called HyperTalk. Years before the Web entered the scene, HyperCard and HyperTalk excelled at creating flexible user interfaces that were easy to understand and maintain. It was ahead of its time. Users with little or no computer training could build sophisticated "stacks", as they were called - interactive graphical database applications - to meet a host of needs. Just like web apps, but without the connectivity.
HyperTalk was designed to look as much as possible like English, so you could write
put the first character of the third word of line 5 of card field "sometext" into theChar
which requires no computer training at all to understand. The system was so exceptional that it earned the epithet "Insanely great"; praise the likes of which I've not heard applied to anything else in the intervening three decades and more.
Code like this is close to what we write when asked how something should work. It's readable by domain experts as well as programmers and I have yet to come across a web page whose appearance and behavior could not be described using language of this kind. Complex functional blocks like Google Maps can be included as packaged components with their own simple operating keywords, leaving the appearance and behavior of the page understandable by all and the question "why do it any other way?" hanging in the air.
Thank you for staying with me. If you'd like to know what a high-level browser scripting language looks like, head over to EasyCoder.