Sending detailed MIDI messages from Ableton to the browser using OSC over UDP.

Max/MSP

For the last few months I have been working on a concept for generating live visuals on the web based on the music that I play, so that I can perform live with real-time generative visuals. I make music in Ableton, which is an advantage in this case, because Ableton Live Suite comes with Max4Live, which is Max/MSP: a visual programming language in which you can (for example) create your own devices like granular synthesizers, VJ tools, or anything else you can imagine that runs inside your DAW.



Normally I would have made use of this node module, which includes a pre-made Max4Live device that you can drop into your Live set. By importing this node module you get access to the LOM, the Live Object Model, a node-based representation of the Live interface. This lets you interact with the Live interface in various ways and read out a lot of values. Unfortunately, it does not tell you which notes are played at what time, which was a huge constraint for me, because that is exactly the information my project needs.

Sending messages over UDP

It turns out that receiving real-time messages in your JavaScript code is actually quite trivial! All you need is a simple Max4Live device that sends messages over UDP to a certain port, and you then listen for these messages in Node.js. You really do want UDP rather than a TCP-based protocol (like HTTP, for example), because it is much faster for this kind of streaming data; it does, however, require you to work with Buffers, as UDP carries raw binary data.
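Just to get a feel for the binary side before wiring up Ableton, here is a minimal sketch that sends a Buffer to port 9000 from Node itself; the 127.0.0.1 address and the three example bytes are placeholders for illustration, not what the Max4Live device actually sends:

const dgram = require('dgram');
const client = dgram.createSocket('udp4');

// A UDP payload is just a Buffer of raw bytes; these three values are
// arbitrary placeholders.
const message = Buffer.from([0x90, 60, 100]);

client.send(message, 9000, '127.0.0.1', (err) => {
  if (err) console.error(err);
  client.close();
});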

This would literally be all you need for the Max4Live device:

midiin receives all note data from the track that the Max4Live device sits on, and the device then sends this data over UDP to port 9000. In Node.js you can then listen on port 9000 for messages with the dgram module (which is part of Node.js itself), like so:

const dgram = require('dgram');
const server = dgram.createSocket('udp4');

server.on('error', (err) => {
  console.log(`server error:\n${err.stack}`);
  server.close();
});

server.on('message', (msg, rinfo) => {
  console.log(`server got: ${msg} from ${rinfo.address}:${rinfo.port}`);
});

server.on('listening', () => {
  const address = server.address();
  console.log(`server listening ${address.address}:${address.port}`);
});

server.bind(9000);

The downside of using basic UDP for MIDI messages

We now receive messages when a note is played, but the data itself is not very helpful: the raw buffer just looks like a string of meaningless numbers. MIDI is a very descriptive standard that can carry up to 16 channels of information, but a plain UDP dump does not readily tell us which note was played, what its velocity is, et cetera. And this only gets more complicated if we also want to act on CC messages, for example when you turn a knob on a MIDI controller.
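To illustrate how tedious that gets, here is a hypothetical decoding sketch that assumes, purely for illustration, that the payload is a raw three-byte MIDI message of the form [status, data1, data2]; that is not necessarily how the device above packs its data:

// Hypothetical: decode a raw three-byte MIDI message [status, data1, data2].
function decodeMidi(buffer) {
  const status = buffer[0];
  const messageType = status & 0xf0; // upper nibble: note on/off, CC, ...
  const channel = status & 0x0f;     // lower nibble: MIDI channel 0-15

  if (messageType === 0x90) return { type: 'noteon', channel, pitch: buffer[1], velocity: buffer[2] };
  if (messageType === 0x80) return { type: 'noteoff', channel, pitch: buffer[1], velocity: buffer[2] };
  if (messageType === 0xb0) return { type: 'cc', channel, controller: buffer[1], value: buffer[2] };
  return { type: 'unknown', status };
}

And that still ignores details like running status and note-on messages with velocity 0 doubling as note-offs, so hand-rolling a parser gets messy fast.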

Using OSC for more detailed MIDI messages

This is where OSC comes to the rescue. OSC stands for Open Sound Control, a protocol (typically transported over UDP) designed specifically for sending music performance data over a network. It is also very easy for humans to read, as it uses URL-style address paths for its messages. Luckily for us, we don't have to create our own Max4Live device that sends OSC from an Ableton track; there is already a perfect implementation that we can use, which you can find here.

This device lets you decide which messages get sent out and on which path you want to receive them.
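As a side note, if you ever want to test the receiver below without Ableton running, a small sketch like this could simulate the device using the same osc module; it assumes the device is configured to send on the /pitch and /velocity paths that the receiver listens for, and on port 9000:

const osc = require("osc");

const sender = new osc.UDPPort({
  localAddress: "127.0.0.1",
  localPort: 9001 // any free port; this socket is only used for sending
});

sender.on("ready", () => {
  // An OSC message is simply an address path plus typed arguments.
  sender.send({ address: "/pitch", args: [{ type: "i", value: 60 }] }, "127.0.0.1", 9000);
  sender.send({ address: "/velocity", args: [{ type: "i", value: 100 }] }, "127.0.0.1", 9000);
});

sender.open();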

Using the osc node module, receiving these messages is a piece of cake; see the code below:

// https://cycling74.com/forums/midi-over-udp
const app = require("express")();
const server = require("http").Server(app);
const io = require("socket.io")(server);
const osc = require("osc");

const UDP_PORT = 9000;
const SOCKET_PORT = 8000;

const udpPort = new osc.UDPPort({
  localAddress: "127.0.0.1",
  localPort: UDP_PORT
});
server.listen(SOCKET_PORT);

const requiredValuesForNote = 2;
let valueBuffer = {};

udpPort.on("ready", function() {
  console.log(`Listening for OSC over UDP on port ${UDP_PORT}.`);
  console.log(`Awaiting socket connection on port ${SOCKET_PORT}.`);

  io.on("connection", socket => {
    console.log("Socket connected!");

    udpPort.on("message", ({ address, args }) => {
      if (address === "/pitch") valueBuffer.pitch = args[0];
      if (address === "/velocity") valueBuffer.velocity = args[0];

      if (Object.keys(valueBuffer).length === requiredValuesForNote) {
        // Emit socket to (webGL) client
        io.emit("osc-message", valueBuffer);
        valueBuffer = {};
      }
    });
  });
});

udpPort.on("error", function(err) {
  console.log(err);
});

udpPort.open();

Unfortunately, every type of information (pitch, velocity, etc.) arrives as a separate OSC message. We only want to emit one socket event to our client (which handles the visuals in this case; how that is implemented is totally up to you) per note. Using a small buffer object, we wait until all values for a note are present, and only then fire our websocket event with the data we need.
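For completeness, the browser side could be as small as this sketch (assuming socket.io-client is loaded on the page and the server above runs on the same machine on port 8000):

// Browser side: listen for the buffered note data and drive the visuals.
const socket = io("http://localhost:8000");

socket.on("osc-message", ({ pitch, velocity }) => {
  // Replace this with whatever drives your (webGL) visuals.
  console.log(`note ${pitch} played with velocity ${velocity}`);
});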

And there you have it: we now receive detailed MIDI messages in real time!
