re: DEV feature idea: Self-serve live broadcasting


Building this kind of thing open source is easier than you might think. WebRTC is p2p, so the only server-side infrastructure you would need is a signaling server, possibly a messaging pipeline (WebSocket), and storage for the metadata displayed about each stream. Recording the live stream is a whole other can of worms, but it could be possible to stream WebRTC to a server and record it there, then compress and serve it. A lo-fi solution would be to let the user record on the client: the WebRTC MediaStream can be recorded in the browser and then uploaded to a server as a Blob.
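For what it's worth, here is a minimal sketch of that lo-fi client-side approach using the browser's MediaRecorder API. The `/api/recordings` endpoint and the `video/webm` mime type are just assumptions for illustration, not anything DEV actually has:

```typescript
// Record a WebRTC MediaStream in the browser for a fixed duration,
// then upload the result to a (hypothetical) server endpoint as a Blob.
async function recordAndUpload(stream: MediaStream, durationMs: number): Promise<void> {
  const chunks: Blob[] = [];
  const recorder = new MediaRecorder(stream, { mimeType: "video/webm" });

  // Collect encoded chunks as the recorder produces them.
  recorder.ondataavailable = (event) => {
    if (event.data.size > 0) chunks.push(event.data);
  };

  const stopped = new Promise<void>((resolve) => {
    recorder.onstop = () => resolve();
  });

  recorder.start();
  setTimeout(() => recorder.stop(), durationMs);
  await stopped;

  // Combine the chunks into a single Blob and upload it.
  const recording = new Blob(chunks, { type: "video/webm" });
  const body = new FormData();
  body.append("video", recording, "broadcast.webm");
  await fetch("/api/recordings", { method: "POST", body });
}
```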

 

P.S. If this feature goes live, I will rig up a green screen for my home office and live-stream content all the time.

 

This is music to my ears.

But is WebRTC the solution for mass streaming? I was thinking that was more for group video chat. It seems like video streaming is typically implemented by uploading to a server which distributes the content via a CDN.

I'm all for taking the right approach, we just need to figure out what that is!

The way I was thinking about implementing this feature follows these steps:

  • Live broadcast over WebRTC
  • WebRTC MediaStream is recorded on server or client
  • Recording is transmitted to CDN
  • Recording is streamed from CDN

WebRTC would only be used for the live broadcast; multiple people could participate, or it could be a one-to-many broadcast (easier for an MVP).
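To make the one-to-many MVP concrete, here's a rough sketch of the broadcaster side: one RTCPeerConnection per viewer, all fed from the same MediaStream. The signaling helpers (`sendSignal`, `onViewerAnswer`) are hypothetical and would ride on whatever WebSocket pipeline the signaling server provides:

```typescript
// One peer connection per viewer, keyed by a viewer id from signaling.
const peers = new Map<string, RTCPeerConnection>();

async function broadcastTo(viewerId: string, stream: MediaStream): Promise<void> {
  const pc = new RTCPeerConnection({
    iceServers: [{ urls: "stun:stun.l.google.com:19302" }],
  });
  peers.set(viewerId, pc);

  // Push every local track (camera, mic, screen) to this viewer.
  for (const track of stream.getTracks()) {
    pc.addTrack(track, stream);
  }

  // Relay ICE candidates to the viewer over the signaling channel.
  pc.onicecandidate = (event) => {
    if (event.candidate) sendSignal(viewerId, { candidate: event.candidate });
  };

  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  sendSignal(viewerId, { sdp: pc.localDescription });
}

// Called when the viewer's SDP answer arrives via signaling.
async function onViewerAnswer(viewerId: string, sdp: RTCSessionDescriptionInit): Promise<void> {
  await peers.get(viewerId)?.setRemoteDescription(sdp);
}

// Hypothetical signaling transport, e.g. backed by a WebSocket.
declare function sendSignal(viewerId: string, message: unknown): void;
```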

Would love for you to weigh in about WebRTC on this GitHub issue I just made! It sounds like a really cool thing to do, but I don't have much experience with it to give it a good evaluation.

github.com/thepracticaldev/dev.to/...

 

There is a mature project around WebRTC: Jitsi (jitsi.org).

But I don't think WebRTC will be the right tool for this.
By default, each node opens a connection with every other node connected to the session, so for mass streaming it will burn the computers ;)
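A quick back-of-the-envelope sketch of why that full-mesh topology hurts; the 50 participants and 2 Mbps per stream are purely illustrative assumptions:

```typescript
// In a full WebRTC mesh every participant uploads its stream to every other
// participant, so upload cost grows linearly with audience size.
const participants = 50;
const bitratePerStreamMbps = 2; // assumed ~720p stream

// Each node maintains (n - 1) peer connections...
const connectionsPerNode = participants - 1;
// ...and uploads its own stream once per connection.
const uploadPerNodeMbps = connectionsPerNode * bitratePerStreamMbps;

console.log(`${connectionsPerNode} connections, ~${uploadPerNodeMbps} Mbps upload per node`);
// 49 connections, ~98 Mbps upload per node — far beyond a typical home uplink,
// which is why one-to-many broadcasts usually go through a media server or CDN
// rather than a pure mesh.
```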
