Ben Taylor

Web Assembly should be a default binary target

WebAssembly (WASM) is no longer a toy environment for cool demos. It's a thriving ecosystem that enables high-performance web applications with rich functionality. It's a portable code system that can run binaries in the cloud, or on your local machine. This may seem like an extreme position, but hear me out: if you develop a compiled library, language, or other tool you should consider making WASM a default binary target.

Here are my main reasons:

  1. Front-end web apps can handle workloads we'd previously reserve for the backend, like image processing or file encoding. This reduces bandwidth and speeds up apps.

  2. WASM doesn't only run in the browser: you can use it from in-language bindings or an interpreter. This makes it a portable format that doesn't require compilation or platform-specific versions.

  3. WASM runs in a sandboxed environment controlled by the host. The host can be certain that a library will not have network or file access unless it's provided.

  4. With portability we get freedom. Freedom to run software anywhere we want without the arduous process of figuring out how to compile and run it.

Let's dive into that a bit further.

What's changed with WASM

When the whole concept of WASM started it was a big ol' hack. A neat trick to compile C programs so that they would run fast in a Javascript runtime. Emscripten really pushed the limits of what we thought was possible here and we saw some pretty amazing demos: some people compiled interpreted languages like Python to WASM, the legends at Epic got Unreal Engine running inside the browser, it was all very cool. Gary Bernhardt even did a talk about how WASM would eat the world. But WASM has changed a lot since then.

Gary talking about Unreal Engine running in WASM

The most important change to happen to WASM is the WebAssembly System Interface (WASI). WASI is a system interface for a conceptual OS. It allows system software developers to bind to POSIX-like system calls, and hosts to implement that functionality in a way that makes sense for their runtime. The result is flexible, but not so far from Linux, macOS, and Windows that it can't be practically implemented.

What's really neat about WASI is that it's based on a capability-oriented sandboxing system. The host who runs the binary can scope file access, proxy network access, and disable system calls as they see fit. This sort of sandboxing can prevent whole categories of security issues because it creates an extra layer between your application and its dependencies, reducing the possible impact scope if a dependency does get compromised.
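
To make that a bit more concrete, here's a minimal sketch of what capability scoping looks like from the host's side, using Node's built-in wasi module (Node 20 or later, still marked experimental; cool_tool.wasm is a hypothetical WASI binary):

import { readFile } from "node:fs/promises";
import { WASI } from "node:wasi";

// The guest only gets the capabilities we hand it. Here ./data is preopened
// as /sandbox; nothing else on the filesystem is visible to the module.
const wasi = new WASI({
  version: "preview1",
  args: ["cool-tool"],
  env: {},
  preopens: { "/sandbox": "./data" },
});

const bytes = await readFile("cool_tool.wasm"); // hypothetical WASI binary
const { instance } = await WebAssembly.instantiate(bytes, wasi.getImportObject());
wasi.start(instance); // runs the module's _start, confined to /sandbox

Leave out the preopen and the module simply has no filesystem access at all, which is exactly the point.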

Running WASM outside the Browser

With improvements in WASM has come an increased interest in running it outside the browser. This is a really interesting space of innovation that is made even easier with new tools for WASI. The obvious candidate is running WASM inside V8. For example, Cloudflare's Workers platform uses V8 (Chrome's Javascript engine) to run typical Javascript serverless functions as well as WASM-compiled binaries from other languages.

Diagram from Cloudflare: How Workers Works

But why run V8 at all? Good question! A WASM runtime is much simpler to implement than a Javascript runtime. You could absolutely implement one yourself. But rather than doing that, there are off-the-shelf solutions. Wasmer has runtime bindings for the major languages: Python, Ruby, PHP, Go, Rust, C/C++, and Javascript. The Bytecode Alliance has their own runtime, Wasmtime, which works in .NET, Python, Go, Rust, C, and C++. Plus I'm sure they're both working on more.

If you've got a WASM runtime for your language, you can load in a WASM binary, write some simple bindings to call into that binary and presto! You're running system code independent of OS or CPU. Just sling that bad boy on to whatever computer you want and press execute.
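
As a rough sketch of how thin those bindings can be (shown here with Node's standard WebAssembly API; the Wasmer and Wasmtime bindings look much the same in Python and the other languages, and cool_library.wasm with its add export is made up):

import { readFile } from "node:fs/promises";

// Load the compiled module and instantiate it. A real library might need
// WASI or other imports; this one is assumed to be a pure computation.
const bytes = await readFile("cool_library.wasm");
const { instance } = await WebAssembly.instantiate(bytes, {});

// The "bindings" are just typed wrappers around the module's exports.
const add = instance.exports.add as (a: number, b: number) => number;
console.log(add(2, 3)); // 5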

Now you know you could run WASM from your Python code, but why on earth would you? Here's an experience you might be familiar with:

$ install cool-library
Downloading cool-library...
Downloading dependencies: cool-dependency, ruby, gcc
Compiling gcc...
Compiling ruby...

You wanted to install cool-library but to get it, you need its dependencies. And those dependencies have dependencies. Meanwhile you're screaming, your computer's fan is screaming, and your package manager is compiling gcc, so that it can compile Ruby, so that some transitive dependency works. In most cases it's not really that bad, but it's familiar, right?

We've kind of solved this problem with Docker. Things are at least better than they were. But it's a lot of overhead. You're setting up this whole shared kernel isolation system just to run a single binary. Seems like a portable binary would be better.

Portable binaries
Go away James Gosling, inventor of Java. This is totally different.

This is where WASM shines! cool-library releases a WASM binary instead, we download it, and we run it from our favourite language. No need to install all of its dependencies, no need to compile gcc, no need to containerise an operating system. Just a binary you can run anywhere. It'll work on your Linux server, it'll work on Windows, it'll work on your shiny new M1 MacBook. Heck, it'll work on the little chip running in your lightbulb that forwards your darkest secrets to Amazon. It'll run in more places than Javascript does!

If you think the idea of managing binary versions sounds terrifying, don't even worry about it. There's WAPM, the WASM Package Manager, right there to solve that problem for you.

$ wapm install cool-library
[INFO] Installing _/cool-library@0.2.0
Package installed successfully to wapm_packages!

That was easier!

This doesn't just stop at running WASM from your favourite language. Why do you need a language at all? With WAPM you can run the binary directly, like it's native. That means if a CLI tool is available in WAPM, you can install it and run it on your machine. No dependencies, no compilation, just a binary you install and run.

And if you want to deploy that WASM binary to production, there's another suite of tools for doing that. WasmEdge is a runtime made for cloud computing. Or skip the runtime altogether and compile your WASM to a native binary with Lucet, giving you all the security and sandboxing benefits of WASM and all of the performance benefits of a native binary.

Running WASI inside the Browser

Okay, so we know we can run these system binaries on your machine, or on your servers and that all makes sense. But why would you run them in the browser? What are you going to do? Run ffmpeg on the client side?!

Yes, yes I bloody well am. I'm gonna run whatever I want! If I want to run ffmpeg in the browser I should be able to. If I want to run clang in the browser I should be able to. If I want to download some open source printer drivers and run them in the browser I should be able to!

Portability is a matter of freedom. The whole promise of computing is universality. Being able to run libraries wherever I want is extremely powerful. Maybe you can't imagine why I might want to, but that's the point of open source tools. You put them out there and then other people do amazing and incredible things with them - things you never dreamed of.

ffmpeg running in the browser

Because ffmpeg runs in the browser there are now video editors that run in the browser. That may seem like a silly idea, but even just the basics of video editing are useful for many people, and by running on the client-side you save a lot of bandwidth. The traditional way to process videos would be to transcode them server-side, then provide controls for the user to make small edits like trimming the video. Because the user is so "far away" from the video file, making this interactive is very difficult. Plus you have to upload the whole raw file first.

By running on the client-side we can trim and transcode before uploading. This means that the user gets an interactive trimming interface that responds instantly. Then when they're done, instead of uploading a massive raw file, they can transcode locally and then upload just what the site needs. This isn't a fantasy: Discourse uses a similar workflow to optimise images before uploading them.

Wouldn't you like to save bandwidth like Discourse did?
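
To make that concrete, here's a sketch of the trim-and-transcode-before-upload flow using the ffmpeg.wasm project (this assumes its older 0.x @ffmpeg/ffmpeg API, and the file names and trim points are placeholders):

import { createFFmpeg, fetchFile } from "@ffmpeg/ffmpeg";

const ffmpeg = createFFmpeg({ log: true });
const loading = ffmpeg.load(); // fetch and compile the WASM build of ffmpeg once

// Trim the first ten seconds of a user-selected file and transcode it to MP4,
// entirely in the browser, so only the small result ever gets uploaded.
async function trimForUpload(file: File): Promise<Blob> {
  await loading;
  ffmpeg.FS("writeFile", "input.webm", await fetchFile(file));
  await ffmpeg.run("-i", "input.webm", "-ss", "0", "-t", "10", "output.mp4");
  const data = ffmpeg.FS("readFile", "output.mp4");
  return new Blob([data], { type: "video/mp4" });
}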

But this isn't just for media. Have you ever used Jupyter Notebooks? It's a document format for data science where you can interweave text explanations with code that executes on data and shows intermediate results. It's a great way to communicate research, since the results are sent along with the code that produced them. If you want to check the working, it's right there!

Jupyter Notebooks require a connection to a server that can run the Python code. The server can run locally, but you have to fire it up yourself. Or you could pay someone to run a server for you, upload the document there and then run it. But what if you just want to have a quick look?

This isn't how we're used to browsing documents. Nowadays you just open up a website and read the document right there. That's exactly what JupyterLite allows you to do. It's an entirely in-browser version of Jupyter based on the Pyodide project from Mozilla. Pyodide is Python compiled to WASM, and packaged up with a bunch of standard scientific Python packages.

IPython running in the browser with JupyterLite
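
If you're curious how little glue it takes, this is roughly what booting Pyodide in a page looks like (a sketch with the pyodide npm package; depending on your bundler you may need to point it at an indexURL, and the numpy snippet is just an example):

import { loadPyodide } from "pyodide";

// Download the Python-compiled-to-WASM runtime and run real Python in the page.
const pyodide = await loadPyodide();
await pyodide.loadPackage("numpy"); // scientific packages are fetched on demand
const mean = pyodide.runPython(`
import numpy as np
float(np.arange(10).mean())
`);
console.log(mean); // 4.5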

Reproducible science is even more accessible because Mozilla put the effort into getting Python running in WASM. Imagine what other good could be done by getting the rest of our programming languages onto WASM. Library documentation could have runnable examples. Bug reports could include a runnable reproduction. Technical articles could run the real thing, in your browser, with no need to install locally. I could tweet a program, and you could run it inside the tweet!

Ultimately it's about freedom

One of the most frustrating parts of programming is finding good matches of tools and environments. You find a great library for solving your problem, but it's for another language. You find a database you want to use, but there's no binary for your architecture. You're familiar with a programming language, but it doesn't run where you want it. You deploy your code and it fails because your dev environment doesn't match your production environment!

We emulate, we containerise, we port, we put in a lot of effort to solve these problems. But all these barriers could go away. We could run software wherever we wanted, however we wanted.

WebAssembly should be a default binary target.

Top comments (4)

Lars Rye Jeppesen (spock123)

Great article.

A big issue is the large initialization time as the browser still needs to compile/interpret the wasm payloads.

𝓃𝓍𝓉𝑒.𝓅𝓎 🇯🇲 (nxtexe)

This may be less of a problem once interface types are fully integrated. Some, including myself, hope to see a script tag feature added where the type can be "application/wasm" and a wasm binary can be fed in directly, without JavaScript setting up an environment.

Gregor Mitscha-Baude (mitschabaude)

That would definitely be nice! But I wouldn't expect it to make a noticeable performance difference compared to what you can do today: inline a couple of lines of JS that instantiate the Wasm.

Ben Taylor (taybenlor)

I agree this is an important consideration but the overheads here are getting smaller and smaller. The Python binary (a whole language runtime!) on WAPM is sitting at 5MB. Right now the median desktop page weight on the HTTP Archive is about 2MB. So not that substantial an increase, plus it'll get cached.

The classic way to instantiate WASM was to grab all the bytes, then instantiate it over multiple steps. This can now be done in streaming mode, meaning it'll happen as the binary downloads. This is much simpler and faster!
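
Roughly, the difference looks like this (a sketch with the standard WebAssembly JS API and a placeholder URL):

// Old two-step approach: wait for the whole download, then compile and instantiate.
const bytes = await (await fetch("/cool-library.wasm")).arrayBuffer();
const slow = await WebAssembly.instantiate(bytes);

// Streaming approach: compilation overlaps with the download. The server just
// needs to serve the file with the application/wasm content type.
const fast = await WebAssembly.instantiateStreaming(fetch("/cool-library.wasm"));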

When you compare it to something like video playback it's not really that bad at all. Just feels bad because this isn't how things have been done historically.