
Experiments && Progress

Context

In this post, I'm going to delve into some of what I've learned while experimenting with the idea of coded music synthesizers. I have created a relatively simple audio synthesizer using the Web Audio API, and in future posts I will deploy my synth as I enhance its interface and functions.

To get an idea of what I will be discussing, here is what the synth I've coded so far looks like!

Web Audio API based synthesizer


Objective

My primary objective in exploring this project is to utilize the knowledge I've gathered with HTML, CSS, and JavaScript to build a functional synthesizer, and furthermore to determine what parameters and displays I can add to make it an epic tool for creating algorithmic sound.

This synthesizer uses an oscillator as the source of its sound, which I will briefly touch on in my thoughts below.


Inspiration

Much of my inspiration stems from the live coding synthesizer Sonic Pi, created by the wonderful Sam Aaron. The program is free for anyone to use!

Sonic Pi is an incredible Ruby-based program in which you can build functions to create and manipulate sounds in front of live audiences, AND Sam has written a very in-depth guide for how to use it.

Below is a photo of the program, along with a bit of code I've put into it to create a basic techno beat.

Sonic Pi Live Coding Synthesizer


Intention

As I continue to work on this project, it will evolve in how it works and functions; my primary goal is to integrate a container that generates and displays code as parameters are adjusted within the synth.

I really appreciate what Sonic Pi has to offer its users, and I want to bring more focus to the "algoraving" scene.

This post mainly delves into some of the code I created with my novice understanding of JavaScript. As I continue working on this and other related projects, I hope to display the skills I am harnessing as I (hopefully) become a better coder.

The basic framework for a modular synth should include a master volume control, which I established with a JS variable:

const masterVolume = context.createGain();
masterVolume.connect(context.destination);
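The gain value here is a linear multiplier: 0 is silence and 1 is unity (full) volume. If you prefer thinking in decibels, a small conversion helper can bridge the two. Note that `dbToGain` below is my own addition for illustration, not part of the Web Audio API or my synth:

```javascript
// Convert decibels to the linear gain value a GainNode expects.
// 0 dB is unity gain (1.0); every -20 dB divides the gain by 10.
// `dbToGain` is an illustrative helper, not a Web Audio API function.
function dbToGain(db) {
  return Math.pow(10, db / 20);
}
```

For example, `masterVolume.gain.setValueAtTime(dbToGain(-6), context.currentTime)` would set the output to roughly half volume.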

Though for simplicity's sake I am only going to touch on what an oscillator is (I am not discussing my assignment of notes in this post), I will include the general definition:

An electronic oscillator is an electronic circuit that produces a periodic, oscillating electronic signal, often a sine wave, a square wave, or a triangle wave.

To add an oscillator as the sound source, I created this variable and connected it to the volume variable in the code above.


const oscillator = context.createOscillator();
oscillator.frequency.setValueAtTime(220, 0);
oscillator.connect(masterVolume);
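One quirk worth knowing: oscillator nodes are single-use, so once stopped they can't be restarted, and each note typically creates a fresh one. Here's a minimal sketch of that pattern; `playTone` and its parameters are my own illustration, not code from my synth:

```javascript
// Sketch: play a single tone for a given duration.
// Oscillator nodes are one-shot, so each note creates a fresh one.
// `playTone` is an illustrative helper, not code from the synth above.
function playTone(context, destination, frequency, seconds) {
  const oscillator = context.createOscillator();
  oscillator.type = 'sine'; // also 'square', 'sawtooth', or 'triangle'
  oscillator.frequency.setValueAtTime(frequency, context.currentTime);
  oscillator.connect(destination);
  oscillator.start(context.currentTime);
  oscillator.stop(context.currentTime + seconds); // schedule automatic cleanup
}
```

In a browser you'd call something like `playTone(context, masterVolume, 220, 1)` to play the A above the 220 Hz oscillator above for one second.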

ADSR

In sound and music, an envelope describes how a sound changes over time. It is essentially an amplitude effect typically made up of four values: (A)ttack time, (D)ecay time, (S)ustain level, and (R)elease time, or ADSR.

These are the four main controls of an envelope. Envelopes are what make a sound short and punchy, and also what enable you to fade a sound in and out.


Here is a visual that might simplify this concept.

ADSR

Within my code, I used JavaScript to create these variables for the synthesizer to call from.

The Web Audio API doesn't have an envelope built in, but it is possible to simulate the same effect using gain nodes. Here is the code I used to set initial parameters for the sound envelope.


// Envelope
let attackTime = 0.3;
let sustainLevel = 0.8;
let releaseTime = 0.3;
let noteLength = 1;

const attackControl = document.querySelector('#attack-control');
const releaseControl = document.querySelector('#release-control');
const noteLengthControl = document.querySelector('#note-length-control');

Within the synthesizer I've coded, I've included three interactive sliders that allow a user to adjust the Attack, Release, and Note Length.

The event listeners below register the levels a user chooses as values that can greatly change the dynamics of the chosen oscillator.


attackControl.addEventListener('input', function() {
  attackTime = Number(this.value);
});

releaseControl.addEventListener('input', function() {
  releaseTime = Number(this.value);
});

noteLengthControl.addEventListener('input', function() {
  noteLength = Number(this.value);
});
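To show how these values can actually shape a note, here is a sketch of an attack/sustain/release envelope driven by linear gain ramps. It assumes the `context`, `masterVolume`, and envelope variables from the snippets above; `playNote` is my own illustrative name, not necessarily how my synth triggers notes:

```javascript
// Sketch: apply the attack/sustain/release envelope to one note.
// Assumes `context`, `masterVolume`, `attackTime`, `sustainLevel`,
// `releaseTime`, and `noteLength` from the snippets above.
// `playNote` is an illustrative helper, not code from the synth.
function playNote(frequency) {
  const now = context.currentTime;
  const noteGain = context.createGain();

  // Attack: ramp from silence up to the sustain level.
  noteGain.gain.setValueAtTime(0, now);
  noteGain.gain.linearRampToValueAtTime(sustainLevel, now + attackTime);

  // Sustain, then release: hold, then ramp back down to silence.
  noteGain.gain.setValueAtTime(sustainLevel, now + noteLength - releaseTime);
  noteGain.gain.linearRampToValueAtTime(0, now + noteLength);

  const oscillator = context.createOscillator();
  oscillator.frequency.setValueAtTime(frequency, now);
  oscillator.connect(noteGain);
  noteGain.connect(masterVolume);
  oscillator.start(now);
  oscillator.stop(now + noteLength);
}
```

With the defaults above, a one-second note fades in over 0.3 s, holds at 0.8 gain, and fades out over the final 0.3 s (this sketch assumes `attackTime + releaseTime` stays below `noteLength`).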

Conclusion

It might have made sense to write this post on how the sounds the synthesizer plays were implemented, but I think that would be better suited to a more in-depth post lightly tapping into music theory and some of the ways Sonic Pi generates notes from numbers.

Though I am only a few weeks into my journey with JavaScript, I am looking forward to better understanding it through creating and contributing towards projects within my coding school curriculum at Flatiron School, as well as personal projects I find myself working on incessantly in my free time.

If you're interested in exploring the Web Audio API for yourself, you can find some documentation here.

ʕっ‒ α΄₯ β€’ ʔっ Thanks for reading!
