RTSP streaming has been around for quite a long time. The protocol was first developed and delivered in 1996-97 by a partnership between RealNetworks, Netscape, and Columbia University, and it grew out of hands-on streaming experience with RealNetworks’ RealAudio and Netscape’s LiveMedia. Its main purpose is “VCR-like control” over media streams: the ability to play, pause, rewind, and otherwise direct the viewing experience. That was pretty cool in the late ’90s, even if it doesn’t sound exciting now.
RTSP was standardized in 1998 as RFC 2326 and immediately became useful as a way for users to play audio and video directly from the internet without downloading the files to their device first. People really liked it!
It was built on existing standards of the time: it resembles HTTP in operation (and is therefore easily compatible with existing HTTP networks) and uses SDP (Session Description Protocol) to describe multimedia communication sessions.
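To make that concrete, here is a sketch of the kind of SDP description an RTSP server might return for a single H.264 video stream. The addresses, session name, and track identifier are illustrative, not taken from any real server:

```
v=0
o=- 0 0 IN IP4 192.0.2.10
s=Example Stream
c=IN IP4 192.0.2.10
t=0 0
m=video 0 RTP/AVP 96
a=rtpmap:96 H264/90000
a=control:track1
```

The `m=` line announces the media type and RTP payload format, and the `a=control` attribute gives the client a URL fragment to use when setting up that track over RTSP.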
RTSP is an application-layer protocol that communicates with a media server to establish a session and issue commands such as PLAY and PAUSE; it does not transmit the streaming data itself. Traditionally, most RTSP servers use RTP (Real-Time Transport Protocol) and RTCP (RTP Control Protocol) to transmit the media streams.
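To illustrate that command-based design, here is a minimal, hypothetical sketch in Python that builds the raw text of a few RTSP/1.0 requests. The URL, session ID, and client ports are made up for the example, and a real client would of course also send these over TCP and parse the server's replies:

```python
def rtsp_request(method, url, cseq, headers=None):
    """Build the raw text of a minimal RTSP/1.0 request (illustrative sketch)."""
    lines = [f"{method} {url} RTSP/1.0", f"CSeq: {cseq}"]
    for name, value in (headers or {}).items():
        lines.append(f"{name}: {value}")
    # RTSP, like HTTP, ends each header line with CRLF and the
    # header block with a blank line.
    return "\r\n".join(lines) + "\r\n\r\n"

# A typical session walks through DESCRIBE -> SETUP -> PLAY -> TEARDOWN.
url = "rtsp://example.com/stream"  # hypothetical URL

describe = rtsp_request("DESCRIBE", url, 1, {"Accept": "application/sdp"})
setup = rtsp_request("SETUP", url + "/track1", 2,
                     {"Transport": "RTP/AVP;unicast;client_port=8000-8001"})
play = rtsp_request("PLAY", url, 3, {"Session": "12345678", "Range": "npt=0-"})
```

Note that SETUP only negotiates where the RTP/RTCP packets should go (the `client_port` pair); the media itself then flows outside the RTSP connection, which is exactly the separation of control and transport described above.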
As I said above, RTSP was once one of the leading technologies for internet audio and video streaming. Over time, HTTP-based streaming and adaptive bitrate solutions began to eclipse older technologies such as RTSP and RTMP (R.I.P.). Original authors Anup Rao, Rob Lanphier, and others proposed RTSP version 2.0 in 2016 (RFC 7826), with updates intended to shorten round-trip communications with the media server and address some issues with network address translation (NAT).
RTSP also remains the protocol of choice for IP cameras, which are used in the majority of surveillance, CCTV, and video conferencing systems, any of which might serve as a source for a live broadcast.