Sandip Basnet

HLS Audio Streaming in NodeJS

In this article, I will discuss the working mechanism of HLS and how we can use HLS (HTTP Live Streaming) for audio/music streaming in Node.js.

What is HLS?
According to Wikipedia: HLS is an HTTP-based adaptive bitrate streaming communications protocol developed by Apple Inc. It was released in 2009. Support for the protocol is widespread in media players, web browsers, mobile devices, and streaming media servers.

Why HLS?

  1. Fast:
    One large file is divided into segments a few seconds in length, and the player fetches only the segments it currently needs, in order, with the help of the manifest/index file.

  2. Adaptive Bitrate Streaming:
    It has the ability to adjust video quality in the middle of a stream as network conditions change. This ability allows videos to keep playing even if network conditions get worse; conversely, it also maximizes video quality to be as high as the network can support.
    If the network slows down, the user's video player detects this, and adaptive bitrate streaming lowers the quality of the stream so that the video does not stop playing. If more network bandwidth becomes available, adaptive bitrate streaming improves the quality of the stream.
    Adaptive bitrate streaming is possible because HLS creates several duplicate segmented streams at different quality levels during the segmentation process. The user's video player can switch from one of those streams to another one during playback (see the sample master playlist after this list).

fig: Illustration of Adaptive bitrate streaming. source cloudinary.com.

  3. Unwanted portions of the file (audio/video), i.e. unwatched or unlistened segments, won't be downloaded.

  4. Less server and client configuration, since it uses the plain HTTP protocol.
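
Since our example below generates only a single quality level, adaptive switching needs one more piece: a master playlist that lists each variant stream. As an illustration only (the variant paths and BANDWIDTH values here are made up, not produced by the code below), an audio master playlist could look like:

`#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=64000,CODECS="mp4a.40.2"
low/playlist.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=128000,CODECS="mp4a.40.2"
mid/playlist.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=256000,CODECS="mp4a.40.2"
high/playlist.m3u8`

The player starts with one variant and switches to a higher or lower BANDWIDTH entry as its throughput estimate changes.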

How does HLS work?
Server: An HLS stream originates from a server where (in on-demand streaming) the media file is stored, or where (in live streaming) the stream is created. Because HLS is based on HTTP, any ordinary web server can originate the stream.

Two main processes take place on the server:

Encoding: The audio/video data is reformatted so that any device can recognize and interpret the data. HLS must use H.264 or H.265 encoding.

Segmenting: The audio/video is divided up into segments a few seconds in length.

  • In addition to dividing the audio/video into segments, HLS creates an index file of the audio/video segments to record the order they belong in.

  • HLS will also create several duplicate sets of segments at different quality levels: 480p, 720p, 1080p, and so on.

Distribution: The encoded audio/video segments are pushed out to client devices over the Internet when client devices request the stream. Typically, a CDN or an object file storage (OFS) service like S3 will help distribute the stream to geographically diverse areas.

Client devices: The client device is the device that receives the stream and plays the video, for instance, a user's smartphone or laptop. The client device uses the index file (.m3u8) as a reference for assembling the audio/video in order, and it switches from higher quality to lower quality streams (and vice versa) as needed.
fig: HLS workflow

fig: HLS different levels of manifests, source https://www.streamingmedia.com/

Example: HLS Audio streaming(NodeJs)

  1. MP3 to chunks:

First of all, initiate a project with npm init and add utils/mp3tochunks.js; the contents of mp3tochunks.js should be:
`const util = require('util');
const exec = util.promisify(require('child_process').exec);
const fs = require('fs');
const path = require('path');

const dir = path.join(__dirname, '../songs');
const dest = path.join(__dirname, '../temp/chunks');

const startTime = new Date();
console.info('> Start reading files', startTime);

fs.readdir(dir, (readDirError, files) => {
    if (readDirError) {
        console.error(readDirError);

        return;
    }

    const countFiles = files.length;
    files.map(async (file, index) => {
        const fileName = path.join(dir, file);

        try {
            // ffmpeg writes the segment (.ts) files plus the .m3u8 manifest
            await exec(`ffmpeg -i ${fileName} -profile:v baseline -level 3.0 -s 640x360 -start_number 0 -hls_time 10 -hls_list_size 0 -f hls ${dest}/${index}.m3u8`);
        } catch (err) {
            // the promisified exec rejects when ffmpeg exits with an error
            console.error(err);
        }

        if (countFiles - 1 === index) {
            const endTime = new Date();
            console.info('< End Preparing files', endTime);
        }
    });
});`

In the above snippet, I have made a songs folder that contains the mp3 files that will later be converted into chunks; those chunks are stored in the temp/chunks directory.

The main task in this util is done by ffmpeg, which should be installed on your machine.

Command used:
ffmpeg -i ${fileName} -profile:v baseline -level 3.0 -s 640x360 -start_number 0 -hls_time 10 -hls_list_size 0 -f hls ${dest}/${index}.m3u8

where,
-i: specifies the input audio/video file,
-profile:v: sets the video encoding codec profile; baseline is chosen as it supports lower-cost applications with limited computing resources,
-level: level as mentioned in Annex A of the H.264 standard,
-s: sets the frame size,
-start_number: sets the number that segment file names start counting from; the default value is 0,
-hls_time: segment duration in seconds,
-hls_list_size: sets the maximum number of playlist entries; if set to 0 the list file will contain all the segments (the default value is 5),
-f: sets the file format,
${dest}/${index}.m3u8: sets the path of the generated chunks and the m3u8 manifest file
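
Note that -profile:v, -level, and -s are video options; for pure audio streaming they have no effect on the sound. A leaner audio-only variant of the command (a sketch, assuming AAC at 128 kbps is acceptable for your use case) would be:

ffmpeg -i ${fileName} -vn -c:a aac -b:a 128k -start_number 0 -hls_time 10 -hls_list_size 0 -f hls ${dest}/${index}.m3u8

where -vn drops any video stream and -c:a/-b:a select the audio codec and bitrate.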

The content of the m3u8 file should be something like this:

`#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10.008778,
00.ts
#EXTINF:10.008778,
01.ts
#EXTINF:9.985556,
02.ts
#EXTINF:10.008778,
03.ts
#EXTINF:10.008778,
04.ts
#EXTINF:9.985556,
05.ts
#EXTINF:10.008778,
06.ts
#EXTINF:10.008778,
07.ts
#EXTINF:9.985556,
08.ts
#EXTINF:10.008778,
09.ts
#EXTINF:10.008778,
010.ts
#EXTINF:9.985556,
011.ts
#EXT-X-ENDLIST`

where,
#EXTM3U: indicates that the file is an extended m3u file. Every HLS playlist must start with this tag.

#EXT-X-VERSION: indicates the compatibility version of the playlist file.

#EXT-X-TARGETDURATION: specifies the maximum media segment duration in seconds.

#EXT-X-MEDIA-SEQUENCE: indicates the sequence number of the first URL that appears in a playlist file. Each media file URL in a playlist has a unique integer sequence number. The sequence number of a URL is higher by 1 than the sequence number of the URL that preceded it. The media sequence numbers have no relation to the names of the files.

#EXTINF: specifies the duration of a media segment. It should be followed by the URI of the associated media segment; this is mandatory. You should ensure that the EXTINF value is less than or equal to the actual duration of the media segment it refers to.

Once the manifest file is generated, we have to host it. Let's add a Node server and test it with an HLS player, i.e. add a main.js file that should have:
`var http = require('http');
var fs = require('fs');

const port = 8000;

http.createServer(function (request, response) {
    console.log('request starting...');

    var filePath = './temp/chunks' + request.url;

    fs.readFile(filePath, function (error, content) {
        if (error) {
            if (error.code == 'ENOENT') {
                // chunk or manifest not found: serve the 404 page instead
                fs.readFile('./404.html', function (error, content) {
                    response.writeHead(404, { 'Access-Control-Allow-Origin': '*' });
                    response.end(content, 'utf-8');
                });
            } else {
                response.writeHead(500);
                response.end('Sorry, check with the site admin for error: ' + error.code + ' ..\n');
            }
        } else {
            // headers are written once, after we know the read succeeded
            response.writeHead(200, { 'Access-Control-Allow-Origin': '*' });
            response.end(content, 'utf-8');
        }
    });
}).listen(port);
console.log(`Server running at http://127.0.0.1:${port}/`);`

This basically takes the request URL and looks for the corresponding file among the generated chunks; if it exists, the server responds with the contents of that file. For example, for the request http://127.0.0.1:8000/0.m3u8 the manifest file will be served. To test playback in the browser you have to add an extension, i.e. Native HLS Playback, which should then play the media you used. In the meantime, you can inspect the sequence of chunks the client/browser requests from the server, roughly one every 10 seconds as each loaded chunk finishes playing.
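
If you would rather not install an extension, hls.js (see the references) can play the same stream in any browser that supports Media Source Extensions. Here is a minimal sketch, assuming the server above is running and serving 0.m3u8:

`<audio id="player" controls></audio>
<script src="https://cdn.jsdelivr.net/npm/hls.js@latest"></script>
<script>
  const audio = document.getElementById('player');
  const src = 'http://127.0.0.1:8000/0.m3u8';
  if (Hls.isSupported()) {
    // MSE-based playback (Chrome, Firefox, Edge)
    const hls = new Hls();
    hls.loadSource(src);
    hls.attachMedia(audio);
  } else if (audio.canPlayType('application/vnd.apple.mpegurl')) {
    // Safari can play HLS natively
    audio.src = src;
  }
</script>`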

That sums up HLS audio/video streaming with Node.js. Here all the files are served locally; in an ideal case they are served through an online file storage service like AWS S3. For that, I have added an uploadchunkstos3.js file to the repository of this article.
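
As a rough sketch of that idea (not the repository file itself; it assumes the AWS SDK v2, a hypothetical bucket name, and credentials configured in your environment), uploading the generated chunks could look like:

`const fs = require('fs');
const path = require('path');
const AWS = require('aws-sdk');

const s3 = new AWS.S3();
const chunksDir = path.join(__dirname, 'temp/chunks');

fs.readdirSync(chunksDir).forEach(async (file) => {
    await s3.upload({
        Bucket: 'my-hls-bucket', // hypothetical bucket name
        Key: file,
        Body: fs.createReadStream(path.join(chunksDir, file)),
        // serve manifests and segments with their proper MIME types
        ContentType: file.endsWith('.m3u8')
            ? 'application/vnd.apple.mpegurl'
            : 'video/mp2t',
    }).promise();
});`

The player then points at the manifest's S3 (or CDN) URL instead of the local server.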

References

  1. https://github.com/mondyfy/hls_demo
  2. https://blog.boot.dev/javascript/hls-video-streaming-node/
  3. https://hls-js.netlify.app/demo/
  4. https://github.com/video-dev/hls.js/
  5. https://scanskill.com/programming/how-to-use-hlsjs-for-video-streaming/
  6. https://www.cloudflare.com/learning/video/what-is-http-live-streaming/
  7. https://medium.com/sharma02gaurav/adaptive-bitrate-streaming-hls-vod-service-in-nodejs-8df0d91d2eb4
  8. http://blog.mediacoderhq.com/h264-profiles-and-levels/
