
Rui Figueiredo

Originally published at blinkingcaret.com

The Road to Modern JavaScript

When I recently decided to learn webpack I realized just how many new things were added to the JavaScript ecosystem in the last couple of years. Things that you need to know if you want to feel comfortable with all the new frameworks and tools like Angular, React, Gulp, Webpack, etc.

The goal for this blog post is to walk you through the major developments in the language that lead to what is considered modern JavaScript development. It also has examples that are illustrative of how modern tools and features work.

Road to modern JavaScript symbolized by a curvy road

JavaScript development has changed immensely in the last two decades. When JavaScript was first introduced in 1995, one of the major goals was that it should be easy for beginners. It had requirements like being embeddable directly in HTML. It was supposed to be the "glue" that tied Java applets and the web page together.

We all know that it evolved in a very different direction. That evolution was driven by JavaScript taking an ever more prominent role in web development, which clashed with some of those earlier goals.

Scope and naming

In the 90s it was common to find this in a .html file:

<input type="button" value="Save" onclick="save();"/>
<script>
  function save() {
    //...
  }
</script>

Script tags with large chunks of code intermingled with HTML, plus inline event handlers. All of this quickly made the code hard to read and maintain.

Another thing that caused problems was that it was really easy to accidentally redefine a function just by giving it the same name as an existing one.

For example, if two .js files each defined a save function, the second one loaded would overwrite the first. This is perfectly valid JavaScript, so there would be no errors or warning messages.
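A minimal sketch of the problem, assuming two hypothetical files a.js and b.js that are both included on the same page:

<script src="a.js"></script> <!-- defines a save() function for, say, a profile form -->
<script src="b.js"></script> <!-- also defines a save() function, for a settings form -->
<script>
  // The definition from b.js has silently replaced the one from a.js;
  // there is no error or warning, the first save() is simply gone.
  save(); // runs the version from b.js
</script>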

The solution for this problem was to try to mimic the namespace functionality that exists in other programming languages. We started doing things like:

var MyNamespace = (function() {
  function save(){
    //...
  }

  return {
    save: save
  };
})()

And then instead of just calling save() we'd call MyNamespace.save().

This takes advantage of the fact that, before ES6, new scopes in JavaScript were only created by functions. The pattern became so popular that IIFE (pronounced "iffy") became a common word among JavaScript developers. It stands for Immediately-Invoked Function Expression. A really simple example is:

(function() { 
    //whatever variables and functions you declare here won't be "visible" outside the function
})()

It was now possible to have more complex applications, and to reuse parts of the code, because function name collisions were no longer an issue.

We also started making our JavaScript "unobtrusive", meaning that we didn't mix it with HTML, and we made it more object-oriented.
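For example, the inline onclick handler from the earlier snippet could instead be wired up entirely from script (a small sketch reusing the hypothetical Save button and the MyNamespace object from above):

<input type="button" id="saveButton" value="Save"/>
<script>
  // No JavaScript in the HTML attributes; the handler is attached from code.
  document.getElementById('saveButton').addEventListener('click', function () {
    MyNamespace.save();
  });
</script>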

Too many files to load

As these new practices made writing more complex JavaScript more manageable we started to get into situations where we had a lot of it. That JavaScript had to be loaded to the browser, and as good practices dictate, it had to be separated over several files with meaningful names.

Well, there's a limit on how many concurrent GET requests a browser will make to the same host (typically around half a dozen at the time), so loading many separate files quickly became slow.

We started using tools to bundle all our JavaScript. Bundling means that all the JavaScript code is concatenated into a single file. My first experience with bundling was with ASP.NET. With ASP.NET it's actually .NET code that bundles the JavaScript files.

This only worked in .NET, so alternatives were required for this technique to be used with other technologies.

At some point in time someone decided that it would be a good idea to have JavaScript run outside of the browser. Node.js was created. Node leverages the open-source V8 JavaScript engine created by Google. What's so appealing about Node is that you can create C++ Addons that can be invoked through JavaScript running in Node, which basically means you don't have any of the limitations that you have running inside a browser (it is possible to access the filesystem, etc).

A lot of tools created with Node started showing up. For bundling specifically, the most popular ones were Grunt and Gulp.

In reality Grunt and Gulp are task runners, meaning that they run tasks, and bundling is just one of those possible tasks. Another task that goes hand in hand with bundling is minification (or "uglification" outside the .NET world, after the popular UglifyJS tool). It's the process of making the JavaScript as small as possible by renaming variables and functions to single letters and removing all whitespace and comments.
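As a rough illustration (not the exact output of any particular minifier), minification turns something like this:

// before minification
function calculateTotal(price, quantity) {
  // multiply the unit price by the quantity ordered
  var total = price * quantity;
  return total;
}

// after minification: comments and whitespace are gone and local
// names are shortened to single letters, e.g. roughly:
// function calculateTotal(n,t){var r=n*t;return r}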

Here's what a gulp configuration file that creates a bundle looks like:

var gulp = require('gulp');
var concat = require('gulp-concat');

gulp.task('default', function(){
  // concatenate player.js and game.js (in that order) into bundle.js and
  // write it to the current folder; returning the stream lets gulp know
  // when the task has finished
  return gulp.src(['player.js', 'game.js'])
      .pipe(concat('bundle.js'))
      .pipe(gulp.dest('.'));
});

When you run this task with gulp it creates a bundle named bundle.js containing player.js and game.js (in that order). If you are interested in learning Gulp I recommend: Automate Your Tasks Easily with Gulp.js.
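A common next step is to add minification to the same pipeline. Here's a sketch that assumes the gulp-uglify plugin has also been installed:

var gulp = require('gulp');
var concat = require('gulp-concat');
var uglify = require('gulp-uglify'); // assumed to be installed alongside gulp-concat

gulp.task('default', function(){
  // concatenate the files, minify the result, and write bundle.js to the current folder
  return gulp.src(['player.js', 'game.js'])
      .pipe(concat('bundle.js'))
      .pipe(uglify())
      .pipe(gulp.dest('.'));
});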

Modules

Even though bundling solves the issue of the limited number of GET requests that browsers can perform simultaneously, it requires that the JavaScript files are added to the bundle in a particular order if they have dependencies on each other. It's also easy to end up in a situation where there is JavaScript code that never gets executed inside the bundle. Over time bundles become hard to manage.

JavaScript modules solve this problem. The idea behind using modules is that it is possible to have dependencies stated explicitly. For example imagine you are creating a JavaScript game and you have a game.js file. That file uses code from another file named player.js. We can explicitly say that game.js depends on player.js.

There are a few different module "formats". The most common ones are CommonJS, which is the one used in Node.js, Asynchronous Module Definition (AMD) (https://github.com/amdjs/amdjs-api/wiki/AMD), and ES6 modules.

Let's imagine a simple scenario with game.js and player.js and describe them with these three module formats. Game has a start method that calls Player's getName method.

In all these module formats each JavaScript file is a module, so in this case we would have two modules, game and player.

CommonJS

With CommonJS the player.js file would look like this:

var privateVar; //if this is not "exported" it won't be available outside player.js

function getName() {
  //...
}

module.exports.getName = getName;

And game.js:

var player = require('./player.js');

function start(){
  var playerName = player.getName();
  //...
}

It's through module.exports that we expose what's inside the module to whoever requires it. In this case the only thing that was "exported" was the getName function.

In CommonJS, to get the exported parts of another module we use the require function. You might have noticed the ./ in the require statement in game.js. In this case it means that both files are in the same folder; however, the way a module's file is found can become complicated. I'd recommend reading the Node.js documentation on how require resolves the exact filename.
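As a quick sketch of the difference, a require that starts with ./ is resolved relative to the requiring file, while a bare name is looked up in node_modules (the lodash package below is just a hypothetical example and would need to be installed):

// relative path: resolved starting from the folder of the file doing the require
var player = require('./player.js');

// bare name: resolved from the nearest node_modules folder
var lodash = require('lodash');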

Asynchronous Module Definition

The AMD syntax is a little different: it consists of calling a define function where a module's dependencies are listed in an array, followed by a function whose arguments receive those dependencies in the order they are listed in the array.

With AMD the player.js would look like this:

define([], function(){
  var privateVar; //not accessible outside the module

  function getName() {
    //...
  }
  return {
    getName: getName
  };
})

And game.js:

define(['./player'], function(player) {
  function start(){
    var playerName = player.getName();
    //...
  }
});

Here's a good resource to learn more about AMD.

ES6 Modules

The ECMAScript 6 standard, which is the new specification for JavaScript (the new version of JavaScript, if you will), introduced modules.

With ES6 modules the player.js file would look like this:

var privateVar;

function getName(){
  //...
}

export { getName };

And game.js would look like this:

import * as player from './player.js'

function start() {
  var playerName = player.getName();
  //...
}
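ES6 modules also allow importing specific exports by name instead of the whole module object; as a small sketch, game.js could equally be written as:

import { getName } from './player.js';

function start() {
  var playerName = getName();
  //...
}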

Module Loaders

If you were to just load game.js or player.js as they are defined in the examples above they wouldn't work (you would get errors stating that require/define/import are not defined).

For them to work they need to be loaded through a module loader. A module loader is a JavaScript library that runs in the browser and is capable of interpreting one (or several) module formats.

There are several popular module loaders. The most popular one is probably SystemJS.

SystemJS supports several module formats. You can specify which one you are using through configuration options.

To load your modules you need to specify which one is the "entry point". You can think of the entry point as the main module; in our example that would be game.

Here's how we could use SystemJS to load the CommonJS example above:

<script src="system.js"></script>
<script>
  SystemJS.config({
    meta: {
      format: "cjs" //use commonjs module format
    }
  });

  SystemJS.import('game.js');
</script>

When you do this, SystemJS will load game.js, inspect it, and realize that it needs to fetch player.js. It will then load the JavaScript from player.js and then game.js in the browser.

You can find a good introduction to JavaScript modules and module loaders in this Pluralsight course: JavaScript Module Fundamentals.

JavaScript build process

Although client-side module loaders enable the use of modules, if there are a lot of them we again run into the issue of browsers only being able to perform a limited number of GET requests simultaneously.

There's no reason not to do the module loader's "work" beforehand as a build step and produce a bundle as a result. An example of a tool that does this is Browserify.

Browserify gets its name from the idea of enabling the use of modules in the browser the same way they are used in Node.js. It's a "browserification" of Node.js modules (which use the CommonJS format).

To create a bundle with Browserify we just need to specify the main module. Browserify will figure out which other modules that module depends on, which modules those depend on, and so on.

In our example we could create a bundle simply by doing this:

$ browserify game.js --outfile bundle.js

We then just need to include our bundle in our web page and we are good to go.
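Including the bundle is then a single script tag, for example:

<!-- only the bundle needs to be referenced; the modules are already inside it -->
<script src="bundle.js"></script>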

Transpilation

One thing JavaScript is known for is being lax with regard to types. In JavaScript you don't need to specify the type of a variable, the return type of a function, or the types of its parameters.

This made creating tools to aid developers difficult. Some IDEs would provide intellisense information (e.g. Visual Studio), but the experience was never perfect.

TypeScript is a language that is a superset of JavaScript and that allows for type information to be added.

To use TypeScript you need to compile it to JavaScript. This process of compiling one language into another is called transpilation.

Here's what a function definition looks like in TypeScript:

function getPlayer(id: number) : IPlayer {
  //...
}

Here we are saying that the getPlayer function expects a parameter named id that is a number and returns an IPlayer. In TypeScript you can define interfaces, for example IPlayer could be:

interface IPlayer {
  id: number;
  name: string;
}

When you compile this TypeScript code, the interface has no effect on the output, but during development you get intellisense whenever you have an instance of IPlayer. You also get an error if you pass, for example, a string as an argument to getPlayer (e.g. getPlayer("abc")), and you get intellisense for the function's parameters and their types, in this case for id of type number.
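A small sketch of the kind of feedback the compiler gives, using the getPlayer and IPlayer definitions from above:

var player = getPlayer(1);  // ok: player is typed as IPlayer
console.log(player.name);   // intellisense knows "name" exists and is a string

getPlayer("abc");           // compile-time error: argument of type 'string' is not
                            // assignable to parameter of type 'number'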

TypeScript was by no means the first language that transpiles to JavaScript. The first to become really popular was CoffeeScript; however, at least from my perception, it seems to be fading away.

Because it enables a better development experience, TypeScript is probably responsible for making ever more complex projects viable in JavaScript. Also, because having build steps for JavaScript is so common now, adding one more for transpilation creates very little friction.

Although TypeScript is probably the most popular language that transpiles to JavaScript, it should be mentioned that just writing ES6 code, the new version of JavaScript, is also very popular. Since not all ES6 features are supported by current browsers, ES6 code is also transpiled to the version of JavaScript that browsers support today. The most popular tool that enables this is Babel.
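As a rough illustration (the exact output depends on how Babel is configured), an ES6 arrow function with a const declaration like this:

// ES6 input
const double = (x) => x * 2;

// roughly what Babel emits so older (ES5) browsers can run it
var double = function (x) {
  return x * 2;
};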

Build tools on steroids

Imagine using JavaScript to load images or CSS instead of doing it in HTML. That's what build tools like Webpack enable.

If this is the first time you've heard about this, you might be wondering how it can possibly be a good idea. It turns out that it enables scenarios that solve some common problems in web development. The same way we now have modules in JavaScript, we can apply the same solution to CSS: if we import CSS through JavaScript, that CSS can be scoped so that it does not interfere with any other CSS on the page.

Images referenced in CSS can automatically be converted to base64 and embedded in the CSS itself if they are below a certain size threshold.
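A sketch of what this looks like in application code, assuming Webpack has been configured with the usual loaders for these file types (e.g. style-loader/css-loader for CSS and url-loader for images; the file names below are hypothetical):

// with the right loaders configured, these imports become part of the build
import './player.css';          // the CSS is injected into the page and can be scoped
import logo from './logo.png';  // small images can be inlined as a base64 data URI

document.getElementById('logo').src = logo;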

These are just some examples of what Webpack enables. If you spend some time becoming familiar with it you'll recognize that the new version of Angular relies heavily on this type of functionality.

Conclusion

In this post I tried to describe how I perceived JavaScript evolving into what it is today. At the beginning JavaScript was a simple language, and it still is, but it didn't have this buzzing ecosystem around it. Most of that ecosystem grew out of addressing problems that were a consequence of how JavaScript was being used. With the amount of shareable work done in Node.js, and with ways to use it in a similar fashion in the browser (Browserify), the JavaScript ecosystem grew immensely. It continues to evolve with tools like Webpack that facilitate scenarios and practices that keep ever more complexity manageable.

Top comments (9)

Joseph Moore

Good news on the ES6 modules front - while you still have to use something like Webpack for other browsers, Safari has already shipped support, and Chrome is shipping them in Chrome 61, which is coming in the next couple of weeks. Firefox and Edge both have experimental versions hidden behind config flags. 🎉

Sylvain Marty

Really good article! It explains a lot of the internal concepts of popular JS frameworks (like Angular). The dependency injection from Angular (v1.6 in my case) looks a lot like the AMD way!

Thanks for sharing! :)

Anton Frattaroli

HTTP/2 is supposed to get around the concurrent requests issue. Although for IIS-hosted sites, you'll need Server 2016.

Rui Figueiredo

You are absolutely right. It's funny, yesterday I listened to a podcast about exactly this: devchat.tv/js-jabber/http-2-with-s...

It was very informative.

One of the other new things gaining momentum is WebAssembly. Check this video: youtu.be/MiLAE6HMr10

Anton Frattaroli

It poses some questions about current techniques. Is bundling good practice on an HTTP/2-enabled site? I suppose the focus would be on the waterfall: if your initial JS payload says it requires certain dependencies, then those are downloaded, and those files have dependencies and are downloaded... But on the other hand, you still want that minimum load for max performance on first load.

WebAssembly is scary; once it gains momentum the industry is going to take a long time to normalize. It'll make the JS ecosystem explosion look simple.

Rui Figueiredo

My understanding is that with HTTP/2 bundling is considered bad practice. The reason is that with HTTP/2 the limited number of connections is no longer an issue, so there's no reason not to take advantage of browser features like caching.

Also, now there's no need to regenerate bundles just because there's a change in a JavaScript file.

Joseph Moore • Edited

In a perfect HTTP/2 world, bundling becomes an antipattern. That, however, basically depends on properly-implemented server push, where each file that gets pushed also causes its dependencies to get pushed. If you don't have server push, you can get into situations where you have deeply-nested dependencies and, while you're transferring everything over a single TCP connection, you still have to wait for the browser to discover that it needs a resource so that it can download it.

So, it kind of becomes a 'measure and see' kind of thing. I suspect that, for those environments without server-side push (like Amazon Cloudfront), we'll see a pattern emerge of no longer bundling all of your direct dependencies into one file, but bundling all of their dependencies with them.

Anton Frattaroli

Just out of curiosity, I'd like to see the extent of the impact on Angular. Maybe they'd need another messy migration to a future version. I'd assume it would be less of an issue for a React app. I wouldn't really know regarding either.

Boris

Excellent! I'm starting to learn js and your article is going to be very useful.