Uncodeable864

Create the Musicbot of Musicness with @magenta/music

If you're here, you either Googled a Magenta tutorial, or this was recommended to you. Before we start, I'll quickly explain what Magenta is. According to Magenta's website, Magenta lets you "make music and art using machine learning." Today, we'll make music.

Sidenote: Magenta is open-source, and the GitHub repo is here: https://github.com/magenta/magenta-js ("Magenta.js: Music and Art Generation with Machine Learning in the browser").

Side-sidenote: This article is about Magenta.js, the JS library, NOT the Python library.

Building the Basics

First off, I'll be making the website on Glitch. Second off, we'll need to bring in the script tag, with this code:

        <script src="https://cdn.jsdelivr.net/npm/@magenta/music@1.9.0"></script>


This loads Magenta.js into the webpage.

Uploading the note sequence

Magenta understands music through NoteSequence, which is an object holding notes with their start/stop times -- and it gets pretty big for a minute-long song. You could type one in manually, but that's boring and takes forever. Instead, what we'll do is convert a MIDI file to a NoteSequence. MIDI is short for Musical Instrument Digital Interface, and it's a file type that stores similar information to a NoteSequence, so the folks at Magenta let us convert between the two with ease. To convert them, we'll first need a Blob or a URL with the MIDI; I like to use the URL. In your JS file, you'll need to add this code:

 const mm = window.mm

This places the window's mm object in the variable mm. Next, make sure the script tag for your own JS file uses defer (something like <script src="script.js" defer></script>, if your file is called script.js), so it runs after Magenta has loaded. This code block will fetch the MIDI file and convert it to a NoteSequence:

const midi = "[MIDI FILE URL HERE]";

let ns = null;

(async () => {
  ns = await mm.urlToNoteSequence(midi);
})();

//More code will be added below this code block
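By the way, if you're curious what ends up inside ns, a NoteSequence is roughly an object like this (a hand-made sketch, not the real output of your MIDI file):

// A rough sketch of a NoteSequence -- a real one has many more notes and fields
const exampleNs = {
  notes: [
    { pitch: 60, startTime: 0.0, endTime: 0.5 }, // middle C for half a second
    { pitch: 64, startTime: 0.5, endTime: 1.0 }, // then an E
  ],
  totalTime: 1.0,
};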

Playing the NoteSequence

Creating the Player

Ok, we have the NoteSequence now. We can use a Player to play it!

This code will initialize a Player object:

const player = new mm.Player()

Now, the default player won't sound the best, so you can replace new mm.Player() with a new SoundFontPlayer. A SoundFont is a collection of sampled instrument sounds in a font-like format. The constructor takes in a URL, like this:

// const player = new mm.Player();
  const player = new mm.SoundFontPlayer('https://storage.googleapis.com/magentadata/js/soundfonts/sgm_plus');

Actually playing it!

Quick check-in: your JS should look like this:

const mm = window.mm;
const midi = "[MIDI FILE HERE]";
let ns = null;

(async () => {
  ns = await mm.urlToNoteSequence(midi);
})();

const player = new mm.SoundFontPlayer('https://storage.googleapis.com/magentadata/js/soundfonts/sgm_plus');

Now, unfortunately, we can only start the player after the user has interacted with the page; browsers require this to stop pages from randomly blasting music at people. So, we can just slap a button on the page:

  <button>play</button>

Now, to start a player we can use this code:

  player.start(ns);

This tells the SoundFontPlayer to start playing the NoteSequence ns. Because we didn't shove everything into that async function, ns and player are available globally, so we can just call it in the button's onclick attribute, like so:

    <button onclick="player.start(ns)">play</button>

If we want to stop the music, we can just use this:

player.stop()

If we put it in a button:

<button onclick="player.stop()">stop</button>

Now, if you click play, you should hear a sound! Pushing stop should stop it (if it doesn't, reload the page).

Bringing in Musicbot

Because of technical limitations, we won't be training a Magenta model in the browser, but we can use a pre-trained one.

Continuing your music file with MusicRNN

One of the "vanilla" Magenta models is MusicRNN. It continues a NoteSequence. So, how do we implement this power? First, we need to reel in the model, like so:

   const musicbot = new mm.MusicRNN('https://storage.googleapis.com/magentadata/js/checkpoints/music_rnn/basic_rnn');

This pulls in the bot from the dangers of the web. But the bot is asleep until we tell it to activate, with the initialize command. Turning on the bot is as simple as this:

musicbot.initialize();
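One thing worth knowing: initialize() returns a Promise, so the bot takes a moment to actually wake up. If you want to react when it's ready, you could write the call like this instead (the console.log is just a stand-in for whatever you'd really do, like enabling a button):

// initialize() returns a Promise, so we can react when loading finishes
musicbot.initialize().then(() => {
  console.log('musicbot is awake and ready');
});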

Great! We've turned on our bot. He is ready to continue our music.

Making the music

The musicfy function of creation

To make our music, we'll need to bring in the big guns -- an async function. We'll call it musicfy. So, the function code looks like this as of now:

async function musicfy() {
// Code goes here
}

Now, to create the music we first need to quantize the NoteSequence -- that is, convert the notes from seconds into beats (steps).

Quantizing the NoteSequence

To quantize the note sequence, we need to call the mm.sequences.quantizeNoteSequence function, like so:

  const qns = mm.sequences.quantizeNoteSequence(ns, 4);
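To make "beats instead of seconds" concrete, here's a hand-made sketch of what quantizing does to a single note, assuming the default tempo of 120 qpm (don't take the exact numbers as gospel):

// Before quantizing: times are in seconds
const before = { pitch: 60, startTime: 0.0, endTime: 0.5 };
// After quantizing at 4 steps per quarter note: times are in steps
const after = { pitch: 60, quantizedStartStep: 0, quantizedEndStep: 4 };
// The quantized sequence also gains a quantizationInfo: { stepsPerQuarter: 4 } field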

Great news! We can now generate the music.

Making the MUSIC!!

To make our music, we can use the continueSequence function, like so:

  const nns = await musicbot.continueSequence(qns, 20, 1.5);
  player.start(nns);

Let's talk about those parameters. The first one is, of course, our quantized note sequence. The second is how many new steps, or notes, musicbot should make. The third is the temperature, which controls how random the output is: lower values stay close to the original (and get repetitive), higher values get more adventurous. Now, you can hook up this function to a button or anything else your heart desires.
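Putting the pieces together, the finished musicfy looks roughly like this (same ns, player, and musicbot variables from earlier; I'm assuming the initialize call has finished loading by now, which it usually has):

// The finished musicfy(), assembled from the pieces above
async function musicfy() {
  // Convert the NoteSequence from seconds into steps (4 steps per quarter note)
  const qns = mm.sequences.quantizeNoteSequence(ns, 4);

  // Ask musicbot for 20 new steps at a temperature of 1.5
  const nns = await musicbot.continueSequence(qns, 20, 1.5);

  // Play what the bot came up with
  player.start(nns);
}

Then wire it up however you like, for example <button onclick="musicfy()">continue</button>.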

Three more things...

First

Magenta has more models you can use!
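For example, MusicVAE can dream up short melodies from scratch. A quick sketch, using one of the checkpoints Magenta hosts (mel_4bar_small_q2):

// A different model: MusicVAE generates new NoteSequences from scratch
const dreamer = new mm.MusicVAE('https://storage.googleapis.com/magentadata/js/checkpoints/music_vae/mel_4bar_small_q2');

(async () => {
  await dreamer.initialize();
  const samples = await dreamer.sample(1); // ask for one new NoteSequence
  player.start(samples[0]);
})();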

Second

The code is available here:

Third

Magenta has a tutorial here. It's completely interactive and really cool (its code was also used to help make this article).

Bye!
PS. You can also convert a NoteSequence into a Blob, which you can then convert to an MP3 with another tool, and listen to what your AI has made!
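If you want to try that, here's a rough sketch of the Blob part (downloadSequence and the file name are mine; mm.sequenceProtoToMidi is the magenta-js helper that turns a NoteSequence back into MIDI bytes):

// A sketch: turn a NoteSequence back into a MIDI Blob and download it
function downloadSequence(sequence) {
  const midiBytes = mm.sequenceProtoToMidi(sequence); // NoteSequence -> MIDI bytes
  const blob = new Blob([midiBytes], { type: 'audio/midi' });

  // Make a temporary link and click it to trigger the download
  const link = document.createElement('a');
  link.href = URL.createObjectURL(blob);
  link.download = 'musicbot.mid';
  link.click();
  URL.revokeObjectURL(link.href);
}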
PPS. You can get Magenta on NPM under @magenta/music
