If you haven't been following this series, I have been building a synthesizer in Rust and WebAssembly. The advantage of using WebAssembly is that I can take advantage of all of Rust's features while simultaneously avoiding the pitfalls of trying to use the operating system's complex libraries for playing audio.
My Last Solution Was Not Ideal
In the last post, I arrived at a working but less-than-ideal solution for importing WebAssembly into an `AudioWorkletProcessor`. Even though I couldn't initially figure out how to import a WebAssembly module into a worklet using `wasm-pack`, the process of doing so without `wasm-pack` enabled me to identify a potential solution that does use `wasm-pack`.
The next step was trying to implement it.
A Better Idea
Once I understood the process of importing a WebAssembly module into a worklet, doing so using `wasm-pack` proved to be fairly simple. There are two general steps:

- Compile the WebAssembly module using the `wasm-pack` `web` target. The `web` target creates JavaScript code that works without a bundler or Node.
- Copy the generated JavaScript into the `AudioWorkletProcessor` file and adapt it to the context of `AudioWorkletGlobalScope`.
Explanation
Since the `web` target is meant to work without a bundler, the resulting JavaScript code can be copied into the `AudioWorkletProcessor` file and will only require minor changes. This approach is currently the best option because the `AudioWorkletGlobalScope` lacks the features required to import WebAssembly modules.
Implementing the Solution
The first step is to build the Rust code with the `web` target:

```shell
wasm-pack build --target web
```

Once the code was compiled, I opened the generated JavaScript bindings file (the one whose name ends in `_bg.js`). This file contains the WebAssembly bindings generated by `wasm-pack`. I copied everything in this file and pasted it into my `AudioWorkletProcessor` file. This is the only way to get the JavaScript bindings into the `AudioWorkletGlobalScope`.
Next, I copied the `.wasm` file to a location where the browser could load it using `fetch()`. At the top of my worklet file, I had the following code, which I knew would need to be adjusted:
```javascript
// processor.js
let wasm;

const cachedTextDecoder = new TextDecoder('utf-8', { ignoreBOM: true, fatal: true });
cachedTextDecoder.decode();

let cachedUint8Memory0 = null;

function getUint8Memory0() {
    if (cachedUint8Memory0 === null || cachedUint8Memory0.byteLength === 0) {
        cachedUint8Memory0 = new Uint8Array(wasm.memory.buffer);
    }
    return cachedUint8Memory0;
}

// wasm-pack generated code ...

export { initSync }
export default init;

class WebSynthProcessor extends AudioWorkletProcessor {
    // ...
}
```
I replaced this code with the following:
```javascript
// processor.js
let wasm;

let cachedTextDecoder;
// The decode() call will be made later.

let cachedUint8Memory0 = null;

function getUint8Memory0() {
    if (cachedUint8Memory0 === null || cachedUint8Memory0.byteLength === 0) {
        cachedUint8Memory0 = new Uint8Array(wasm.memory.buffer);
    }
    return cachedUint8Memory0;
}

// wasm-pack generated JavaScript
// Notice, I removed the exports.

class WebSynthProcessor extends AudioWorkletProcessor {
    // ...
}
```
An instance of the `TextDecoder` class needs to be available in the `AudioWorkletGlobalScope`, but the audio worklet scope does not allow the `TextDecoder` constructor to be used. I used the `AudioWorkletProcessor`'s `MessagePort` to give the audio worklet a `TextDecoder` instance.
Setting Up the WebAssembly Module
In order to pass the WebAssembly module to the `AudioWorkletProcessor`, I knew I would need to compile it in the main thread and pass it to my audio processor. To do this, I needed to call both `fetch()` and `WebAssembly.compileStreaming()` like this:
```javascript
// index.js
// We will need the worklet node later.
let node = new AudioWorkletNode(context, 'web-synth-proto');

WebAssembly.compileStreaming(fetch('/path/to/library.wasm')).then(module => {
    // Pass the module to the AudioWorkletProcessor
});
```
I originally did this in an `async` function so that I could use `await` to get the return value directly. Here, I'm using a callback with a promise because this pattern is more common.
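For comparison, the `async` version would look something like the following sketch. The function name `initProcessorWasm` is a placeholder of mine, not a name from the original code, and the `.wasm` path is the same placeholder used throughout this post:

```javascript
// index.js
// Hypothetical async version of the setup described above. The name
// initProcessorWasm is illustrative; the processor name 'web-synth-proto'
// and the .wasm path come from this post's setup.
async function initProcessorWasm(context) {
    const node = new AudioWorkletNode(context, 'web-synth-proto');
    // await gives us the compiled module directly instead of a promise.
    const module = await WebAssembly.compileStreaming(fetch('/path/to/library.wasm'));
    // Pass the compiled module to the AudioWorkletProcessor.
    node.port.postMessage({ type: 'init-wasm', wasmData: module });
    return node;
}
```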
Once I had a compiled `WebAssembly.Module`, I passed it to the `AudioWorkletProcessor` using the `MessagePort` instance like this:
```javascript
// index.js
WebAssembly.compileStreaming(fetch('/path/to/library.wasm')).then(module => {
    // Now that the module has been created, it can be passed to the processor,
    // along with the TextDecoder instance the worklet cannot construct itself.
    node.port.postMessage({
        type: 'init-wasm',
        wasmData: module,
        decoder: new TextDecoder('utf-8', { ignoreBOM: true, fatal: true }),
    });
});
```
Using the Module in the AudioWorkletProcessor
To use the WebAssembly module, it needed to be instantiated from inside the audio worklet thread. The first step in this process was to set up a message listener on the `MessagePort` instance. I did this in the constructor of the `AudioWorkletProcessor`:
```javascript
// processor.js
class WebSynthProcessor extends AudioWorkletProcessor {
    constructor(options) {
        super(options);
        this.port.onmessage = event => this.onmessage(event.data);
    }

    onmessage(data) {
        // Check to make sure the message we receive is the correct type.
        if (data.type === 'init-wasm') {
            // Finish instantiating the WebAssembly module
        }
    }
}
```
Now that there is an `onmessage` listener for the `MessagePort`, the worklet can listen for the message sent by the main thread.
```javascript
// processor.js
class WebSynthProcessor extends AudioWorkletProcessor {
    constructor(options) {
        super(options);
        this.port.onmessage = event => this.onmessage(event.data);
    }

    onmessage(data) {
        // Check to make sure the message we receive is the correct type.
        if (data.type === 'init-wasm') {
            cachedTextDecoder = data.decoder;
            // load() returns a promise.
            load(data.wasmData, getImports()).then(mod => {
                // Once the WebAssembly module has been instantiated, it needs to be
                // finalized so that it can be accessed later.
                finalizeInit(mod.instance, mod.module);
            });
        }
    }
}
```
Once the worklet receives the WebAssembly module from the main thread, it needs to call the `load()` function copied from the `wasm-pack` output so that the WebAssembly module can be properly instantiated. Finally, the audio processor needs to call the `finalizeInit()` function to ensure that it can continue to access the WebAssembly code.
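For context, the `load()` helper in the copied bindings looks roughly like the following. This is an approximation, since the exact code differs between `wasm-pack` versions. The branch that matters here is the second one, because the worklet receives an already-compiled `WebAssembly.Module` rather than a `Response`:

```javascript
// Approximation of the load() helper emitted by wasm-pack's web target.
// The real generated code varies by wasm-pack version.
async function load(module, imports) {
    if (typeof Response === 'function' && module instanceof Response) {
        // In a normal web page, load() compiles and instantiates straight
        // from the network response.
        return await WebAssembly.instantiateStreaming(module, imports);
    }
    // In the worklet, the module was already compiled on the main thread,
    // so it only needs to be instantiated here.
    const instance = await WebAssembly.instantiate(module, imports);
    return { instance, module };
}
```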
Now that all the setup is complete, the audio thread can use the `SineOsc` struct by calling the JavaScript wrapper functions. Here is how I set up the oscillator:
```javascript
// processor.js
class WebSynthProcessor extends AudioWorkletProcessor {
    constructor(options) {
        super(options);
        this.port.onmessage = event => this.onmessage(event.data);
    }

    onmessage(data) {
        // Check to make sure the message we receive is the correct type.
        if (data.type === 'init-wasm') {
            cachedTextDecoder = data.decoder;
            // load() returns a promise.
            load(data.wasmData, getImports()).then(mod => {
                // Once the WebAssembly module has been instantiated, it needs to be
                // finalized so that it can be accessed later.
                finalizeInit(mod.instance, mod.module);
                // Create the oscillator instance. sampleRate is a global
                // provided by the AudioWorkletGlobalScope.
                this.osc = SineOsc.new(sampleRate);
            });
        }
    }
}
```
Now that the audio thread has created an oscillator, that oscillator can be called to do the actual sound generation.
```javascript
// processor.js
class WebSynthProcessor extends AudioWorkletProcessor {
    // ...

    process(inputs, outputs, parameters) {
        if (typeof this.osc !== 'undefined' && this.osc !== null) {
            // Get the first output.
            let output = outputs[0];
            output.forEach(channel => {
                // Populate the channel's buffer with samples from the WebAssembly code.
                for (let i = 0; i < channel.length; i++) {
                    // Remember, the first argument of the sample method is the pitch
                    // we want to synthesize and the second argument is the volume.
                    let sample = this.osc.sample(440, 0.5);
                    channel[i] = sample;
                }
            });
        } else {
            console.log('wasm not instantiated yet');
        }
        return true;
    }
}

// The processor must be registered under the same name used to construct
// the AudioWorkletNode on the main thread.
registerProcessor('web-synth-proto', WebSynthProcessor);
```
What Next?
At this point, I have demonstrated how to adapt `wasm-pack`'s output so that it can be used in an `AudioWorkletProcessor` file. The next step is to start building out the features necessary for a usable synthesizer.
If you want to read more about this project, feel free to follow me or read my other posts in this series.
Introduction to the series:
Building a Browser-Based Synthesizer Using Rust and WebAssembly
Andrew Luchuk ・ Apr 19 '23
Source code for this prototype:
speratus / web-synth-proto2
Web Synthesizer prototype 2