Angelik Laboy for Dolby.io

Interactive Live Streaming Through Virtual Cameras on Unity

When it comes to the world of streaming, we have become accustomed to its widespread use, with various technologies striving for lower latency and faster integration. Amidst these advancements, one aspect remains to be tackled: interactivity.

I mean, look at Netflix's Black Mirror: Bandersnatch or Kaleidoscope, both part of their endeavor to immerse audiences in the stories, making them an integral part of the narrative. And if you observe closely, that is where we find the key to interactivity: integration. We have come so far in entertainment that we want to belong to the stories and live the adventures of our favorite characters. Wouldn't it be cool to be in the kitchen with Carmy and Sydney from The Bear? Or to be one of the fans cheering for AFC Richmond in Ted Lasso?

I would personally want to do that too, but the challenge so far has been the immersion part. We haven't really been able to dissociate from everyday life enough to believe we are actually in those stories. As of today (August 3, 2023), the stage is still being set for this new era of interactive streaming. By now, the XR community is ready to fire back saying their [insert product here] is doing it, and I genuinely believe they have the potential to deliver on this promise.

For example, at the 2021 Annecy International Animation Film Festival, "On The Morning You Wake (To The End Of The World)" was shown as an official selection. It recounts the experiences of Hawaiians who received an SMS from the state's Emergency Management Agency about an incoming missile threat. Rather than simply depicting a turbulent situation of unceasing chaos, the documentary gives us a reflective look at the fragility of life. Visually, its pointillism technique reinforces the sense that this experience was shared not by one person, but by thousands of residents at the same time.

Ultimately, one gets to feel like a part of the story.

Now, this experience is exclusively offered through Meta's Quest lineup, but what if there were a way to experience someone else's perspective from a headset like that? What if there were a way to let us, the audience, control how we want to watch the film?

Live Stream from Unity

I have talked your ear off about interactivity and being the main protagonist. Now, let's talk about the tech. Working at Dolby, I have had the chance to experiment with our virtual-worlds plugins. As part of Dolby.io, the Unity/Unreal plugins offer the exciting capability to stream content from inside the engines. Whether it's viewing the third-person POV of the main character or streaming from a static virtual camera, the plugin introduces a groundbreaking way to actualize experiences in this space.

In this post, I aim to demonstrate how you can set up multiple cameras inside the engine and display them all through a single stream using a multi-view configuration.

Installing the Unity Plugin

To start, I will be using the Bitesize Samples from Unity (Client Driven), a multiplayer game where users pick up orbs and deposit them in their designated colors. With a fully actualized world, our focus can now shift to enhancing the cinematics.

To get started, follow these steps:

  1. Open the project through Unity Hub.
  2. Go to Window > Package Manager.
  3. In the Package Manager window, change the view to show only the "In-Project" packages.
  4. To the left of that dropdown, click on the "+" icon and select the option to Add package from git URL....
  5. Insert the following URL into the space:

https://github.com/millicast/millicast-unity-sdk.git

The installed package should appear as Millicast v1.1.0.

Establishing Credentials and Stream Settings

Before starting with the stream, we will need to do one more thing. In the Assets folder, create a new folder and title it "Credentials". Inside it, right-click and navigate to Create > Millicast > Credentials. By completing this step, you set up your credentials once and associate them with the plugin's components, eliminating the need to re-enter them for each instance.

The next step requires connecting your Dolby.io credentials, so let's check that off. Go to Dolby.io and sign up to receive 50 GB each month for free to get started with a few examples. Once you're inside the Dolby.io Dashboard, ensure that the Streaming tab is selected on the top left. This is where stream tokens are created to enable a broadcast; press the "+ Create" button and name the token label as desired. For Add stream names, click on Allow any stream name. This option is selected so that all of the cameras can publish under the same stream token.

With the token created, open its settings and view the API tab to collect your Account ID and Publishing Token.

In the same way Credentials was created, two other assets can be added: Video Configuration and Audio Configuration. The first lets you control the codec, resolution, framerate, and more, while the audio asset controls the distance at which sound is picked up as well as the volume of the stream. Customize the settings to your liking!

Publisher: How You Can Stream Out

Now, let's get started! In your Playground, begin by creating an empty GameObject and naming it "Production." Within "Production," we will craft four different cameras, each serving a unique purpose. Position these cameras strategically throughout the environment, ensuring they capture crucial action points that the audience would want to witness. Think of these cameras as virtual security cameras, recording the exciting moments at the scoring platforms.

By incorporating a variety of camera perspectives, you'll create an immersive experience for the audience, drawing them deeper into the virtual world and enhancing their enjoyment of the content.
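If you would rather build this rig from code than by dragging objects around the Hierarchy, here is a minimal sketch using only standard Unity APIs. The camera names, the ring-around-the-origin layout, and the positions are all placeholders to adapt to your own scene.

```csharp
using UnityEngine;

// Minimal sketch: builds a "Production" parent with four static virtual cameras.
// Positions are placeholders; aim each camera at a scoring platform or other
// action point you want the audience to see.
public class ProductionRig : MonoBehaviour
{
    void Awake()
    {
        var production = new GameObject("Production");

        for (int i = 0; i < 4; i++)
        {
            var camObject = new GameObject($"VirtualCam{i + 1}");
            camObject.transform.SetParent(production.transform);

            // Ring the cameras around the origin, slightly raised, looking inward.
            camObject.transform.position =
                Quaternion.Euler(0f, i * 90f, 0f) * new Vector3(0f, 5f, -10f);
            camObject.transform.LookAt(Vector3.zero);

            camObject.AddComponent<Camera>();
        }
    }
}
```

Whether you script it or place the cameras by hand, the result is the same: four child cameras under "Production" for the publisher components to reference.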

With those set, visit the first camera's Inspector view and click Add Component. Next, search for Mc Publisher; this is the component from Dolby.io that allows the game to be streamed out. The first piece of information we need to give is a Stream Name. Since the stream token was created as a wildcard, any name can be used. However, if the stream token was given a specific name, return to the streaming dashboard and find the token's API information, where you will see the Stream Name (e.g., "leqgbgh9"). In the Credentials field, drag and drop the asset containing all the necessary information (previously named Credentials). Do the same in the field asking for the Video Config Data, providing the video configuration asset.

Check Publish On Start so that entering Play mode in the engine also starts the stream on Dolby.io. For Stream Type, there are three options: Video + Audio, Video Only, or Audio Only. As the options read, this is the field where you select how you would like your stream to be delivered.

For the next two settings, Video Source Type and Video Source Camera, simply select "Camera" and then the specific camera you want this publisher to stream from. These fields let you control the stream without adding the component directly to the camera itself; for example, if you prefer to manage all four cameras from a single GameObject, that is also a viable option.

Next, you will notice that it asks whether to use the Audio Listener of the camera. Selecting this option transmits the audio from the camera's perspective.

  • Note: Make sure the camera's Audio Listener component is checked on in order to allow any audio transmission (see the snippet below).
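If you want a belt-and-suspenders guarantee that the listener is on when the scene starts, a small snippet like the one below (standard Unity API only) can enable it at runtime. Keep in mind that Unity itself expects a single active Audio Listener per scene, so attach it only to the one camera whose audio you intend to stream.

```csharp
using UnityEngine;

// Makes sure this camera's Audio Listener is enabled so the publisher
// has audio to transmit. Attach it only to the one camera whose audio
// you want to stream: Unity warns if several listeners are active at once.
[RequireComponent(typeof(AudioListener))]
public class EnableStreamAudioListener : MonoBehaviour
{
    void Awake()
    {
        GetComponent<AudioListener>().enabled = true;
    }
}
```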

Lastly, it asks whether to Enable Multi Source, and we will apply this option. Multisource is the feature that allows a single stream (WebRTC or RTMP) to broadcast multiple video and audio feeds. Once rendered, these feeds can be switched between, giving the viewer the ability to control how they view and listen to the content. Upon selecting this option, a new field called Source Id will appear. The sourceId identifies the source being published; when you use multiple cameras, this parameter specifies which one a feed is coming from. For simplicity's sake, I am going to name the first camera "camOne".

Follow this procedure for the next three cameras: keep the Stream Name the same for all, change the Video Source Camera to the camera carrying the component, and set the sourceId to the next incrementing label, e.g., "camTwo", "camThree", "camFour".
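To avoid clicking through the same Inspector fields four times, the repetitive part can also be scripted. The sketch below is a rough illustration only: it assumes the component class is called McPublisher in a Dolby.Millicast namespace and exposes fields matching the Inspector labels above (streamName, credentials, sourceId, and so on). Those names and types are assumptions, not the verified SDK API, so check the plugin's own classes for the exact property names before using it.

```csharp
using UnityEngine;
using Dolby.Millicast; // assumed namespace; adjust to the installed SDK

// Rough sketch: attaches one Mc Publisher per camera so all four feeds
// publish into the same multisource stream. Field names below mirror the
// Inspector labels and are assumptions, not the verified SDK API.
public class MultiCamStreamSetup : MonoBehaviour
{
    public Camera[] cameras;          // the four virtual cameras under "Production"
    public McCredentials credentials; // the Credentials asset created earlier (assumed type name)

    private static readonly string[] SourceIds = { "camOne", "camTwo", "camThree", "camFour" };

    void Awake()
    {
        for (int i = 0; i < cameras.Length && i < SourceIds.Length; i++)
        {
            var publisher = cameras[i].gameObject.AddComponent<McPublisher>();

            // Same stream name everywhere; a wildcard token accepts any value.
            publisher.streamName = "myUnityStream";     // assumed field
            publisher.credentials = credentials;        // assumed field
            publisher.publishOnStart = true;            // assumed field
            publisher.videoSourceCamera = cameras[i];   // assumed field
            publisher.enableMultiSource = true;         // assumed field
            publisher.sourceId = SourceIds[i];          // assumed field: camOne..camFour
        }
    }
}
```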

BONUS:

This Bitesize sample already has a camera following the main player. If you wish to stream this perspective, follow the same steps for the mainCamera object. The only difference is that you do not need to check Enable Multi Source in this case. When no specific sourceId is labeled and the video goes into the same stream, it will be identified as the main source by default.

Viewing the Multi-View

With the Mc Publishers finished, let us verify our work. Using the engine's centered playback buttons, press Play to start the experience. Congrats, your game has started streaming! But where is it being streamed out to?

Let's pay another visit to the token's API information. On the Playback tab, the hosted player path URL will be present. This URL is the link needed to view the ongoing livestream. In the URL, you will be asked to fill in YourPublishName, which is the stream name of your token. If you decided to add the secure viewer, it will also ask for your subscribe token information.

https://viewer.millicast.com?streamId=accountID/McPublisherStreamName
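For a quick check during development, you can even open that hosted viewer straight from Unity. The snippet below only uses Application.OpenURL; the account ID and stream name are placeholders to replace with your own values.

```csharp
using UnityEngine;

// Opens the Dolby.io hosted viewer for this stream in the default browser.
// Replace the two placeholder values with your Account ID and stream name.
public class OpenStreamViewer : MonoBehaviour
{
    public string accountId = "YOUR_ACCOUNT_ID";
    public string streamName = "myUnityStream";

    [ContextMenu("Open Hosted Viewer")]
    public void OpenViewer()
    {
        Application.OpenURL($"https://viewer.millicast.com?streamId={accountId}/{streamName}");
    }
}
```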

Voilà, your stream is up and running!

You might notice that it is only showing the mainCamera's view and not the other four cameras we set up. Go to the gear icon on the bottom right and click Show Multi View. This activates the multi-view layout in the web viewer. Now you can click between the different camera feeds to change which one is featured first. Equally, if all feeds are important, the gear icon also offers the option to change the layout.

Final Thoughts

We have just accomplished a stream from inside an engine. What now? The idea of offering a streaming solution for XR experiences can mean a lot of different things. Not only is it an accessibility bonus for your product, giving users who can't wear a headset a chance to view the experience; it also makes us think about what can be done now that more dimensions have been added to these experiences.

Streaming XR experiences can allow audiences to actively participate in virtual worlds, creating a deeper level of engagement and immersion. This interactivity lets users control their actions and decisions within the XR environment, leading to a more personalized and captivating experience. XR streaming invites real-time participation in events, performances, and social gatherings. For example, esports viewers can get closer to the action by choosing where they want to look. This personalization allows users to shape the storyline, outcomes, and interactions, tailoring the experience to their liking. As a result, users feel more invested in the content and are more likely to return for further engagement. Not to mention, it can foster a sense of community and social interaction even when physically distant.

All I am saying is that... you should watch out for this field.
