Hamit Demir

Most used streaming protocols 2022

We will learn about the following streaming protocols in order:

RTMP
RTSP
WebRTC
HLS
SRT
CMAF

RTMP (Real-Time Messaging Protocol):

The RTMP streaming protocol is a TCP-based technology developed by Macromedia for streaming audio, video, and data over the Internet between a Flash player and a server. Macromedia was acquired by its rival Adobe Inc. on December 3, 2005, and with the phasing out of Flash at the end of 2020, RTMP is now used less for viewer-facing delivery of content and more for ingesting live streams into a platform through RTMP-enabled encoders.

RTMP streaming protocol technical specifications:

Audio Codecs: AAC, AAC-LC, HE-AAC+ v1 & v2, MP3, Speex
Video Codecs: H.264, VP8, VP6, Sorenson Spark®, Screen Video v1 & v2
Playback compatibility:
Not widely supported anymore
Limited to Flash Player, Adobe AIR, RTMP-compatible players
No longer accepted by iOS, Android, most browsers, and most embeddable players
Benefits: Low latency and minimal buffering
Drawbacks: Not optimized for quality of experience or scalability
Latency: 5 seconds
RTMP variations:
RTMP: The plain TCP-based protocol.
RTMPS: RTMP over a secure SSL/TLS connection, minimizing the risk to cloud-based streaming workflows.
RTMPE: Uses Adobe’s proprietary security encryption and is a lighter-weight encryption layer than RTMPS.
RTMPT: Encapsulated with HTTP to bypass firewalls and corporate traffic filtering.
RTMFP: Uses UDP instead of TCP.
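
Since RTMP today is used mainly for first-mile ingest, here is a minimal sketch of that workflow: pushing a local file to an RTMP ingest endpoint by invoking ffmpeg from Python. It assumes ffmpeg is installed; the ingest URL, stream key, and input filename are placeholders for your platform's values.

```python
# Minimal RTMP ingest sketch: push a local file to an RTMP endpoint via ffmpeg.
import subprocess

ingest_url = "rtmp://live.example.com/app/STREAM_KEY"  # hypothetical ingest endpoint

subprocess.run([
    "ffmpeg",
    "-re",                   # read the input at its native frame rate (live-like pacing)
    "-i", "input.mp4",       # local source file (placeholder)
    "-c:v", "libx264",       # encode video to H.264, which RTMP supports
    "-preset", "veryfast",
    "-c:a", "aac",           # encode audio to AAC
    "-f", "flv",             # RTMP carries an FLV container
    ingest_url,
], check=True)
```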

RTSP (Real-Time Streaming Protocol):

RTSP establishes and controls either a single or several time-synchronized streams of continuous media such as audio and video. It does not typically deliver the continuous streams itself, although interleaving the continuous media stream with the control stream is possible. In other words, RTSP acts as a “network remote control” for multimedia servers.

Because most IP cameras still support RTSP, it remains a standard used in surveillance and CCTV systems.

RTSP technical specifications:
Audio codecs: AAC, AAC-LC, HE-AAC+ v1 & v2, MP3, Speex, Opus, Vorbis
Video codecs: H.265 (preview), H.264, VP9, VP8
Playback compatibility:
Not widely supported and rarely used for playback (QuickTime Player and other RTSP/RTP-compliant players, VideoLAN VLC media player, 3GPP-compatible mobile devices)
Benefits: Low-latency and ubiquitous in IP cameras
Drawbacks: Not optimized for quality of experience and scalability
Latency: 2 seconds
RTSP variations:
The entire stack of RTP, RTCP (Real-Time Control Protocol), and RTSP is often referred to collectively as RTSP.
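
Because RTSP's main remaining niche is IP cameras, a minimal sketch of pulling a camera feed is shown below: it uses ffmpeg from Python to record one minute of an RTSP stream to disk. It assumes ffmpeg is installed; the camera URL, credentials, and output filename are placeholders.

```python
# Minimal RTSP pull sketch: record 60 seconds from an IP camera via ffmpeg.
import subprocess

camera_url = "rtsp://user:password@192.168.1.10:554/stream1"  # hypothetical camera URL

subprocess.run([
    "ffmpeg",
    "-rtsp_transport", "tcp",  # interleave RTP media over the RTSP TCP connection
    "-i", camera_url,
    "-c", "copy",              # remux without re-encoding
    "-t", "60",                # record 60 seconds
    "recording.mp4",
], check=True)
```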

WebRTC (Web Real-Time Communications):

WebRTC stands for Web Real-Time Communications, and it is a powerful, highly disruptive, cutting-edge streaming technology and protocol.

It is HTML5-compatible, and you can use it to add real-time media communications directly between browsers and devices without requiring any plugins to be installed in the browser. It is now supported by all major modern browsers, including Safari, Google Chrome, Firefox, Opera, and others.

Thanks to WebRTC video streaming technology, you can embed real-time video directly into your browser-based solution to create an engaging and interactive streaming experience for your audience without worrying about delay.

WebRTC streaming protocol features:
Ultra-low-latency video streaming – latency of around 0.5 seconds
Platform and device independence
Advanced voice and video quality
Secure voice and video
Easy to scale
Adaptive to network conditions
WebRTC Data Channels
Pros:

Easy, browser-based contribution.
Peers open connections directly to each other.
Low latency, supporting interactivity with roughly 500-millisecond delivery.
Can be used end-to-end for some use cases.

Cons:

Not the best option for broadcast-quality streaming, because some quality features are traded off to enable near-real-time delivery.
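
In the browser, WebRTC is exposed through the native RTCPeerConnection JavaScript API. As a rough server-side sketch of the same offer/answer flow, the snippet below uses the third-party Python library aiortc: it creates a peer connection, attaches tracks from a local file, and prints the SDP offer. The input filename is a placeholder, and the signaling step (sending this offer to the remote peer and applying its answer) is omitted.

```python
# Minimal WebRTC offer sketch using aiortc (pip install aiortc).
import asyncio
from aiortc import RTCPeerConnection
from aiortc.contrib.media import MediaPlayer

async def main() -> None:
    pc = RTCPeerConnection()

    # Read audio/video frames from a local file; a webcam or capture device also works.
    player = MediaPlayer("input.mp4")  # placeholder local file
    pc.addTrack(player.video)
    pc.addTrack(player.audio)

    # Create and apply the SDP offer. In a real application this offer is sent
    # to the remote peer over a signaling channel (WebSocket, HTTP, etc.) and
    # the peer's answer is applied with pc.setRemoteDescription(...).
    offer = await pc.createOffer()
    await pc.setLocalDescription(offer)
    print(pc.localDescription.sdp)

    await pc.close()

asyncio.run(main())
```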

HLS (HTTP Live Streaming):

HLS is an adaptive HTTP-based protocol used for transporting video and audio data/content from media servers to the end user’s device.

HLS was created by Apple in 2009 and announced at about the same time as the legendary iPhone 3GS. Earlier iPhones had live streaming playback problems, and Apple wanted to fix this with HLS.

Features of HLS video streaming protocol:
Closed captions
Fast forward and rewind
Alternate audio and video
Fallback alternatives
Timed metadata
Ad insertion
Content protection
HLS technical specifications:
Audio codecs: AAC-LC, HE-AAC+ v1 & v2, xHE-AAC, Apple Lossless, FLAC
Video codecs: H.265, H.264
Playback compatibility: It was created for iOS devices; however, Google Chrome browsers, Android, Linux, Microsoft, and macOS devices, as well as several set-top boxes, smart TVs, and other players, now support HLS, making it effectively a universal protocol.
Benefits: Supports adaptive bitrate, reliable, and widely supported.
Drawbacks: Video quality and viewer experience are prioritized over latency.
Latency: HLS latency is typically 5-20 seconds, but the Low-Latency HLS extension has now been incorporated as a feature set of HLS, promising sub-2-second latency.
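
Because HLS is just segmented media plus an .m3u8 playlist served over plain HTTP, packaging is straightforward. Below is a minimal sketch that uses ffmpeg from Python to transcode a local file into 6-second MPEG-TS segments and a playlist that any HLS-capable player can fetch. It assumes ffmpeg is installed; all filenames are examples.

```python
# Minimal HLS packaging sketch: produce .ts segments plus an .m3u8 playlist via ffmpeg.
import subprocess

subprocess.run([
    "ffmpeg",
    "-i", "input.mp4",                        # placeholder source file
    "-c:v", "libx264",
    "-c:a", "aac",
    "-f", "hls",
    "-hls_time", "6",                         # target segment duration in seconds
    "-hls_list_size", "0",                    # keep every segment in the playlist (VOD-style)
    "-hls_segment_filename", "seg_%03d.ts",   # segment naming pattern
    "playlist.m3u8",
], check=True)
```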

SRT (Secure Reliable Transport):

SRT is an open-source technology designed for reliable and low-latency streaming over unpredictable public networks. It competes directly with RTMP and RTSP as a first-mile solution but is still being adopted as encoders, decoders, and players add support. One interactive use case that SRT proved instrumental for in 2020 was the first virtual NFL draft — ensuring high-quality streaming and operational flexibility from anywhere with an internet connection.

SRT benefits:
An open-source alternative to proprietary protocols.
High-quality and low-latency.
Designed for live video transmission across unpredictable public networks.
Accounts for packet loss and jitter.
SRT limitations:
Not natively supported by all encoders.
Still being adopted as a newer technology.
Not widely supported for playback.
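
As a rough sketch of SRT contribution, the snippet below uses ffmpeg from Python to push an MPEG-TS stream to an SRT listener. It assumes an ffmpeg build with libsrt enabled; the destination host, port, and input file are placeholders.

```python
# Minimal SRT contribution sketch: send an MPEG-TS stream over SRT via ffmpeg.
import subprocess

srt_url = "srt://ingest.example.com:9000?mode=caller"  # hypothetical SRT destination

subprocess.run([
    "ffmpeg",
    "-re",                 # pace the input like a live source
    "-i", "input.mp4",     # placeholder source file
    "-c:v", "libx264",
    "-c:a", "aac",
    "-f", "mpegts",        # SRT typically carries an MPEG-TS payload
    srt_url,
], check=True)
```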

CMAF (Common Media Application Format):

CMAF is a new format that simplifies the delivery of HTTP-based streaming media. It is an emerging standard that helps reduce cost and complexity and provides latency of around 3-5 seconds. CMAF can be used with DASH or HLS.

As a result of RTMP's declining status, other HTTP-based (Hypertext Transfer Protocol) technologies for adaptive bitrate streaming have emerged. However, different streaming standards require different file containers: while MPEG-DASH uses .mp4 containers, HLS streams have traditionally been delivered in the .ts format.

Therefore, a broadcaster who wants to reach a wider audience must encode and store the same video file twice, because the two container formats produce completely different sets of files.

These two versions of the same stream must be produced either in advance or on the fly, and both procedures add storage and processing costs.

Apple and Microsoft proposed that the Moving Picture Experts Group (MPEG) create a new uniform standard, the Common Media Application Format (CMAF), to reduce this complexity when transmitting video online.

CMAF benefits:
Cutting costs
Minimizing workflow complexity
Reducing latency
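
As a rough sketch of CMAF-style packaging, the snippet below uses ffmpeg's DASH muxer from Python to write fragmented MP4 (fMP4) segments with a DASH manifest, and additionally emits an HLS playlist that references the same segments, so one set of media files serves both players. It assumes a reasonably recent ffmpeg build; filenames are examples, and the -hls_playlist option depends on your ffmpeg version.

```python
# Minimal CMAF-style packaging sketch: fMP4 segments shared by DASH and HLS via ffmpeg.
import subprocess

subprocess.run([
    "ffmpeg",
    "-i", "input.mp4",         # placeholder source file
    "-c:v", "libx264",
    "-c:a", "aac",
    "-f", "dash",
    "-seg_duration", "4",      # 4-second fMP4 segments
    "-hls_playlist", "1",      # also write an HLS playlist pointing at the same segments
    "manifest.mpd",
], check=True)
```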
