Introduction
Integrating an RTMP (Real-Time Messaging Protocol) live stream into your iOS video call app enables seamless broadcasting of live video content. With this integration, users can easily share live streams during video calls, enhancing communication and collaboration. By leveraging RTMP, your app can efficiently transmit real-time video data to streaming servers, ensuring a high-quality, low-latency streaming experience.
Benefits of integrating an RTMP live stream in an iOS app:
- Enhanced Communication: Integrating an RTMP live stream enables users to share live video content during video calls, enhancing communication and collaboration.
- Real-Time Interaction: Users can engage in real-time discussions and interactions while streaming live video, fostering dynamic communication.
- Versatility: The integration adds versatility to your iOS video call app, allowing users to utilize it for both regular video calls and live streaming purposes.
Use cases of integrating an RTMP live stream in an iOS app:
- Training Sessions: Various departments conduct training sessions during a conference. The RTMP live stream feature enables remote employees to join these sessions, ensuring everyone receives the necessary training regardless of their physical location.
- Keynote Address: The CEO delivers the keynote address via video call, and the RTMP live stream feature allows all employees to watch the address in real time, irrespective of their location.
- Product Demonstrations: The company showcases new products through live demonstrations. Employees can watch the live stream and interact with the presenters, providing feedback and asking questions.
This comprehensive guide will lead you through the step-by-step process of integrating RTMP live streaming into your iOS video call app using VideoSDK.
How to Build an iOS Live Streaming App Using RTMP
Getting Started with VideoSDK
VideoSDK lets you integrate video and audio calling into web, Android, and iOS applications across many different frameworks. It is an infrastructure solution that provides programmable SDKs and REST APIs to build scalable video conferencing applications. This guide will get you running with VideoSDK video and audio calling in minutes.
Create a VideoSDK Account
Go to your VideoSDK dashboard and sign up if you don't have an account. This account gives you access to the required Video SDK token, which acts as an authentication key that allows your application to interact with VideoSDK functionality.
Generate your Auth Token
Visit your VideoSDK dashboard and navigate to the "API Key" section to generate your auth token. This token is crucial in authorizing your application to use VideoSDK features. For a more visual understanding of the account creation and token generation process, consider referring to the provided tutorial.
Prerequisites and Setup
- iOS 11.0+
- Xcode 12.0+
- Swift 5.0+
This app will contain two screens:
Join Screen: This screen allows the user to either create a meeting or join a predefined meeting.
Meeting Screen: This screen contains the local and remote participant views and meeting controls, such as enabling/disabling the mic and camera, and leaving the meeting.
Integrate VideoSDK
To install VideoSDK, initialize CocoaPods in your project by running the following command:
pod init
This creates a Podfile in your project folder. Open that file and add the dependency for VideoSDK, as below:
pod 'VideoSDKRTC', :git => 'https://github.com/videosdk-live/videosdk-rtc-ios-sdk.git'
Then run the command below to install the pod:
pod install
Then declare the camera and microphone permissions in Info.plist:
<key>NSCameraUsageDescription</key>
<string>Camera permission description</string>
<key>NSMicrophoneUsageDescription</key>
<string>Microphone permission description</string>
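iOS will also prompt the user for these permissions at runtime. If you prefer to trigger the prompts up front rather than on first camera or mic use, a sketch using AVFoundation (this is standard iOS API, not part of VideoSDK) looks like this:

```swift
import AVFoundation

// Ask for camera access, then microphone access; each closure receives
// a Bool indicating whether the user granted the permission.
AVCaptureDevice.requestAccess(for: .video) { videoGranted in
    AVCaptureDevice.requestAccess(for: .audio) { audioGranted in
        print("camera granted: \(videoGranted), mic granted: \(audioGranted)")
    }
}
```

If either permission is denied, VideoSDK will not be able to capture the corresponding media, so it is worth checking these states before joining a meeting.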
Project Structure
iOSQuickStartDemo
├── Models
│   ├── RoomStruct.swift
│   └── MeetingData.swift
├── ViewControllers
│   ├── StartMeetingViewController.swift
│   └── MeetingViewController.swift
├── AppDelegate.swift // Default
├── SceneDelegate.swift // Default
├── APIService
│   └── APIService.swift
├── Main.storyboard // Default
├── LaunchScreen.storyboard // Default
└── Info.plist // Default
Pods
└── Podfile
Create models
Create a Swift file each for the MeetingData and RoomStruct models, which hold the meeting data as objects.
import Foundation

struct MeetingData {
    let token: String
    let name: String
    let meetingId: String
    let micEnabled: Bool
    let cameraEnabled: Bool
}
MeetingData.swift
import Foundation

struct RoomsStruct: Codable {
    let createdAt, updatedAt, roomID: String?
    let links: Links?
    let id: String?

    enum CodingKeys: String, CodingKey {
        case createdAt, updatedAt
        case roomID = "roomId"
        case links, id
    }
}

// MARK: - Links
struct Links: Codable {
    let getRoom, getSession: String?

    enum CodingKeys: String, CodingKey {
        case getRoom = "get_room"
        case getSession = "get_session"
    }
}
RoomStruct.swift
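To see the CodingKeys mapping in action, you can decode a sample payload (the snippet below uses a trimmed copy of the model so it is self-contained, and the field value is made up for illustration):

```swift
import Foundation

// Trimmed copy of the RoomsStruct model above, kept minimal for this example.
struct RoomsStruct: Codable {
    let roomID: String?

    enum CodingKeys: String, CodingKey {
        case roomID = "roomId"  // maps the JSON key "roomId" to the Swift property
    }
}

let sampleJSON = #"{"roomId": "abcd-efgh-ijkl"}"#.data(using: .utf8)!
let room = try JSONDecoder().decode(RoomsStruct.self, from: sampleJSON)
print(room.roomID ?? "")  // abcd-efgh-ijkl
```

The same decoding happens inside APIService when the create-room response arrives.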
Essential Steps for Building Video Calling
This section walks you through the process of integrating video calling with VideoSDK. We'll cover everything from setting up the SDK to wiring up your app's interface, ensuring a smooth and efficient implementation.
Step 1: Get started with APIService
Before jumping into anything else, we have to write an API call to generate a unique meetingId. You will require an authentication token; you can generate it either using videosdk-server-api-example or from the VideoSDK Dashboard for developers.
import Foundation

let TOKEN_STRING: String = "<AUTH_TOKEN>"

class APIService {

    class func createMeeting(token: String, completion: @escaping (Result<String, Error>) -> Void) {
        let url = URL(string: "https://api.videosdk.live/v2/rooms")!
        var request = URLRequest(url: url)
        request.httpMethod = "POST"
        // use the token passed in, rather than the global constant
        request.addValue(token, forHTTPHeaderField: "authorization")

        URLSession.shared.dataTask(
            with: request,
            completionHandler: { (data: Data?, response: URLResponse?, error: Error?) in
                DispatchQueue.main.async {
                    if let data = data {
                        do {
                            let room = try JSONDecoder().decode(RoomsStruct.self, from: data)
                            completion(.success(room.roomID ?? ""))
                        } catch {
                            print("Error while creating a meeting: \(error)")
                            completion(.failure(error))
                        }
                    }
                }
            }
        ).resume()
    }
}
APIService.swift
Step 2: Implement Join Screen
The Join Screen will serve as a medium to either schedule a new meeting or join an existing one.
import Foundation
import UIKit

class StartMeetingViewController: UIViewController, UITextFieldDelegate {

    private var serverToken = ""

    // MARK: - Outlets
    // create meeting button
    @IBOutlet weak var btnCreateMeeting: UIButton!
    // join meeting button
    @IBOutlet weak var btnJoinMeeting: UIButton!
    // meetingId textfield
    @IBOutlet weak var txtMeetingId: UITextField!

    // MARK: - Lifecycle
    // Initialize the token and pre-fill the meeting id textfield
    override func viewDidLoad() {
        super.viewDidLoad()
        txtMeetingId.delegate = self
        serverToken = TOKEN_STRING
        txtMeetingId.text = "PROVIDE-STATIC-MEETING-ID"
    }

    // MARK: - Join flow
    // Joins the meeting through the segue named "StartMeeting",
    // after validating that the server token is not empty
    func joinMeeting() {
        txtMeetingId.resignFirstResponder()

        if !serverToken.isEmpty {
            DispatchQueue.main.async {
                self.dismiss(animated: true) {
                    self.performSegue(withIdentifier: "StartMeeting", sender: nil)
                }
            }
        } else {
            print("Please provide an auth token to start the meeting.")
        }
    }

    // MARK: - Actions
    // create meeting button tap event
    @IBAction func btnCreateMeetingTapped(_ sender: Any) {
        print("show loader while meeting gets connected with server")
        joinRoom()
    }

    // join meeting button tap event
    @IBAction func btnJoinMeetingTapped(_ sender: Any) {
        if (txtMeetingId.text ?? "").isEmpty {
            print("Please provide a meeting id to start the meeting.")
            txtMeetingId.resignFirstResponder()
        } else {
            joinMeeting()
        }
    }

    // MARK: - Create-room API call; fetches a meetingId for joining
    func joinRoom() {
        APIService.createMeeting(token: self.serverToken) { result in
            if case .success(let meetingId) = result {
                DispatchQueue.main.async {
                    self.txtMeetingId.text = meetingId
                    self.joinMeeting()
                }
            }
        }
    }

    // MARK: - Navigation
    // Pass the meeting data on to MeetingViewController
    override func prepare(for segue: UIStoryboardSegue, sender: Any?) {
        guard let navigation = segue.destination as? UINavigationController,
              let meetingViewController = navigation.topViewController as? MeetingViewController
        else {
            return
        }

        meetingViewController.meetingData = MeetingData(
            token: serverToken,
            name: txtMeetingId.text ?? "Guest",
            meetingId: txtMeetingId.text ?? "",
            micEnabled: true,
            cameraEnabled: true
        )
    }
}
StartMeetingViewController.swift
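The button handlers above only check for an empty field. A small helper (hypothetical, not part of the SDK) can also trim stray whitespace from the meeting ID before joining:

```swift
import Foundation

// Hypothetical helper: returns a cleaned meeting ID, or nil if the
// field is effectively empty after trimming whitespace.
func sanitizedMeetingId(_ raw: String?) -> String? {
    guard let trimmed = raw?.trimmingCharacters(in: .whitespacesAndNewlines),
          !trimmed.isEmpty else { return nil }
    return trimmed
}

print(sanitizedMeetingId("  abcd-efgh-ijkl ") ?? "nil")  // abcd-efgh-ijkl
print(sanitizedMeetingId("   ") ?? "nil")                // nil
```

Calling this in btnJoinMeetingTapped avoids failed joins caused by copy-pasted IDs with trailing spaces.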
Step 3: Initialize and Join Meeting
Using the provided token and meetingId, we will configure and initialize the meeting in viewDidLoad(). Then we'll add @IBOutlets for localParticipantVideoView and remoteParticipantVideoView, which render the local and remote participants' media, respectively.
import UIKit
import VideoSDKRTC
import WebRTC
import AVFoundation

class MeetingViewController: UIViewController {

    // MARK: - Outlets
    // local participant container view
    @IBOutlet weak var localParticipantViewContainer: UIView!
    // label for the meeting id
    @IBOutlet weak var lblMeetingId: UILabel!
    // local participant video view
    @IBOutlet weak var localParticipantVideoView: RTCMTLVideoView!
    // remote participant video view
    @IBOutlet weak var remoteParticipantVideoView: RTCMTLVideoView!
    // remote participant "no media" label
    @IBOutlet weak var lblRemoteParticipantNoMedia: UILabel!
    // remote participant container view
    @IBOutlet weak var remoteParticipantViewContainer: UIView!
    // local participant "no media" label
    @IBOutlet weak var lblLocalParticipantNoMedia: UILabel!

    // MARK: - Properties
    // meeting data - required to start
    var meetingData: MeetingData!
    // current meeting reference
    private var meeting: Meeting?
    // video participants, including self, to show in the UI
    private var participants: [Participant] = []

    // MARK: - Lifecycle Events
    override func viewDidLoad() {
        super.viewDidLoad()
        // configure the VideoSDK with the token
        VideoSDK.config(token: meetingData.token)
        // initialize the meeting
        initializeMeeting()
        // show the meeting id in the label
        lblMeetingId.text = "Meeting Id: \(meetingData.meetingId)"
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        navigationController?.navigationBar.isHidden = true
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        navigationController?.navigationBar.isHidden = false
        NotificationCenter.default.removeObserver(self)
    }

    // MARK: - Meeting
    private func initializeMeeting() {
        // initialize the meeting with the VideoSDK
        meeting = VideoSDK.initMeeting(
            meetingId: meetingData.meetingId,
            participantName: meetingData.name,
            micEnabled: meetingData.micEnabled,
            webcamEnabled: meetingData.cameraEnabled
        )
        // add the event listener to the meeting
        meeting?.addEventListener(self)
        // join the meeting
        meeting?.join()
    }
}
MeetingViewController.swift
Step 4: Implement Controls
After initializing the meeting in the previous step, we will now add @IBOutlets for btnLeave, btnToggleVideo, and btnToggleMic, which control the media in the meeting.
class MeetingViewController: UIViewController {
    ...

    // leave button
    @IBOutlet weak var btnLeave: UIButton!
    // toggle video button
    @IBOutlet weak var btnToggleVideo: UIButton!
    // toggle mic button
    @IBOutlet weak var btnToggleMic: UIButton!

    // current mic state
    var micEnabled = true
    // current video state
    var videoEnabled = true

    // leave button tap event
    @IBAction func btnLeaveTapped(_ sender: Any) {
        DispatchQueue.main.async {
            self.meeting?.leave()
            self.dismiss(animated: true)
        }
    }

    // toggle mic button tap event
    @IBAction func btnToggleMicTapped(_ sender: Any) {
        if micEnabled {
            self.meeting?.muteMic()
        } else {
            self.meeting?.unmuteMic()
        }
        micEnabled.toggle()
    }

    // toggle video button tap event
    @IBAction func btnToggleVideoTapped(_ sender: Any) {
        if videoEnabled {
            self.meeting?.disableWebcam()
        } else {
            self.meeting?.enableWebcam()
        }
        videoEnabled.toggle()
    }

    ...
}
MeetingViewController.swift
Step 5: Implement MeetingEventListener
In this step, we'll create an extension on MeetingViewController that conforms to MeetingEventListener, implementing the onMeetingJoined, onMeetingLeft, onParticipantJoined, onParticipantLeft, onParticipantChanged, onSpeakerChanged, etc. methods.
extension MeetingViewController: MeetingEventListener {

    /// Meeting started
    func onMeetingJoined() {
        // handle the local participant on start
        guard let localParticipant = self.meeting?.localParticipant else { return }
        // add to list
        participants.append(localParticipant)
        // add event listener
        localParticipant.addEventListener(self)
        localParticipant.setQuality(.high)

        if localParticipant.isLocal {
            self.localParticipantViewContainer.isHidden = false
        } else {
            self.remoteParticipantViewContainer.isHidden = false
        }
    }

    /// Meeting ended
    func onMeetingLeft() {
        // remove listeners
        meeting?.localParticipant.removeEventListener(self)
        meeting?.removeEventListener(self)
    }

    /// A new participant joined
    func onParticipantJoined(_ participant: Participant) {
        participants.append(participant)
        // add listener
        participant.addEventListener(self)
        participant.setQuality(.high)

        if participant.isLocal {
            self.localParticipantViewContainer.isHidden = false
        } else {
            self.remoteParticipantViewContainer.isHidden = false
        }
    }

    /// A participant left the meeting
    /// - Parameter participant: participant object
    func onParticipantLeft(_ participant: Participant) {
        participant.removeEventListener(self)
        guard let index = self.participants.firstIndex(where: { $0.id == participant.id }) else {
            return
        }
        // remove the participant from the list
        participants.remove(at: index)
        // hide from the UI
        UIView.animate(withDuration: 0.5) {
            if !participant.isLocal {
                self.remoteParticipantViewContainer.isHidden = true
            }
        }
    }

    /// Called when the active speaker changes
    /// - Parameter participantId: participant id of the speaker; nil when no one is speaking
    func onSpeakerChanged(participantId: String?) {
        // show an indicator for the active speaker
        if let participant = participants.first(where: { $0.id == participantId }) {
            self.showActiveSpeakerIndicator(
                participant.isLocal ? localParticipantViewContainer : remoteParticipantViewContainer,
                true)
        }
        // hide the indicator for the other participants
        let otherParticipants = participants.filter { $0.id != participantId }
        for participant in otherParticipants {
            if participants.count > 1 && participant.isLocal {
                showActiveSpeakerIndicator(localParticipantViewContainer, false)
            } else {
                showActiveSpeakerIndicator(remoteParticipantViewContainer, false)
            }
        }
    }

    func showActiveSpeakerIndicator(_ view: UIView, _ show: Bool) {
        view.layer.borderWidth = 4.0
        view.layer.borderColor = show ? UIColor.blue.cgColor : UIColor.clear.cgColor
    }
}
MeetingViewController.swift
Step 6: Implement ParticipantEventListener
In this step, we'll add an extension on MeetingViewController that conforms to ParticipantEventListener, implementing the onStreamEnabled and onStreamDisabled methods, which fire when a participant's audio or video MediaStream is enabled or disabled.
The updateUI function is used to update the user interface (show or hide video, flag a muted mic) according to the MediaStream state.
extension MeetingViewController: ParticipantEventListener {

    /// Participant has enabled mic, video, or screen share
    /// - Parameters:
    ///   - stream: enabled stream object
    ///   - participant: participant object
    func onStreamEnabled(_ stream: MediaStream, forParticipant participant: Participant) {
        updateUI(participant: participant, forStream: stream, enabled: true)
    }

    /// Participant has disabled mic, video, or screen share
    /// - Parameters:
    ///   - stream: disabled stream object
    ///   - participant: participant object
    func onStreamDisabled(_ stream: MediaStream, forParticipant participant: Participant) {
        updateUI(participant: participant, forStream: stream, enabled: false)
    }
}

private extension MeetingViewController {

    func updateUI(participant: Participant, forStream stream: MediaStream, enabled: Bool) {
        switch stream.kind {
        case .state(value: .video):
            if let videotrack = stream.track as? RTCVideoTrack {
                if enabled {
                    DispatchQueue.main.async {
                        UIView.animate(withDuration: 0.5) {
                            if participant.isLocal {
                                self.localParticipantViewContainer.isHidden = false
                                self.localParticipantVideoView.isHidden = false
                                self.localParticipantVideoView.videoContentMode = .scaleAspectFill
                                self.localParticipantViewContainer.bringSubviewToFront(self.localParticipantVideoView)
                                videotrack.add(self.localParticipantVideoView)
                                self.lblLocalParticipantNoMedia.isHidden = true
                            } else {
                                self.remoteParticipantViewContainer.isHidden = false
                                self.remoteParticipantVideoView.isHidden = false
                                self.remoteParticipantVideoView.videoContentMode = .scaleAspectFill
                                self.remoteParticipantViewContainer.bringSubviewToFront(self.remoteParticipantVideoView)
                                videotrack.add(self.remoteParticipantVideoView)
                                self.lblRemoteParticipantNoMedia.isHidden = true
                            }
                        }
                    }
                } else {
                    UIView.animate(withDuration: 0.5) {
                        if participant.isLocal {
                            self.localParticipantViewContainer.isHidden = false
                            self.localParticipantVideoView.isHidden = true
                            self.lblLocalParticipantNoMedia.isHidden = false
                            videotrack.remove(self.localParticipantVideoView)
                        } else {
                            self.remoteParticipantViewContainer.isHidden = false
                            self.remoteParticipantVideoView.isHidden = true
                            self.lblRemoteParticipantNoMedia.isHidden = false
                            videotrack.remove(self.remoteParticipantVideoView)
                        }
                    }
                }
            }

        case .state(value: .audio):
            // flag a muted mic with a red border on the participant's container
            if participant.isLocal {
                localParticipantViewContainer.layer.borderWidth = 4.0
                localParticipantViewContainer.layer.borderColor = enabled ? UIColor.clear.cgColor : UIColor.red.cgColor
            } else {
                remoteParticipantViewContainer.layer.borderWidth = 4.0
                remoteParticipantViewContainer.layer.borderColor = enabled ? UIColor.clear.cgColor : UIColor.red.cgColor
            }

        default:
            break
        }
    }
}
Known Issue
If your video renders outside its container view, add the following lines to the viewDidLoad method in MeetingViewController.swift:
override func viewDidLoad() {
    localParticipantVideoView.frame = CGRect(x: 10, y: 0,
                                             width: localParticipantViewContainer.frame.width,
                                             height: localParticipantViewContainer.frame.height)
    localParticipantVideoView.bounds = CGRect(x: 10, y: 0,
                                              width: localParticipantViewContainer.frame.width,
                                              height: localParticipantViewContainer.frame.height)
    localParticipantVideoView.clipsToBounds = true

    remoteParticipantVideoView.frame = CGRect(x: 10, y: 0,
                                              width: remoteParticipantViewContainer.frame.width,
                                              height: remoteParticipantViewContainer.frame.height)
    remoteParticipantVideoView.bounds = CGRect(x: 10, y: 0,
                                               width: remoteParticipantViewContainer.frame.width,
                                               height: remoteParticipantViewContainer.frame.height)
    remoteParticipantVideoView.clipsToBounds = true
}
MeetingViewController.swift
After following the steps, you will be able to seamlessly integrate live streaming into your application, allowing users to broadcast their video calls directly to the RTMP server. This opens the door to a variety of applications, from hosting live Q&A sessions to creating interactive workshops.
TIP:
Stuck anywhere? Check out this example code on GitHub.
Integrate RTMP Live Stream in an iOS Video App
RTMP is a widely used protocol for live streaming video content to platforms like YouTube, Twitch, and Facebook. VideoSDK lets you live stream your meetings to any platform that supports RTMP ingestion: provide the platform-specific stream key and URL, and VideoSDK connects to that platform's RTMP server and transmits the live video stream.
The guide below provides an overview of starting and stopping an RTMP live stream.
Start Live Stream
The startLivestream() method, available on the Meeting class, starts an RTMP live stream of the meeting. Its outputs parameter accepts a list of LivestreamOutput objects, each containing the RTMP url and streamKey of a platform you want to live stream to.
class MeetingViewController {

    private let platformUrl = "<url-of-the-platform>"
    private let streamKey = "<stream-key>"

    // button to start the livestream
    @IBAction func startLiveStreamButtonTapped(_ sender: Any) {
        self.meeting?.startLivestream(outputs: [LivestreamOutput(url: platformUrl, streamKey: streamKey)])
    }
}
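Because the outputs parameter is a list, you can broadcast to several platforms at once. A sketch, assuming the meeting reference from the controller above (the ingest URLs and stream keys below are placeholders you would replace with your platforms' values):

```swift
import VideoSDKRTC

// Illustrative only: substitute your own ingest URLs and stream keys.
let outputs = [
    LivestreamOutput(url: "rtmp://a.rtmp.youtube.com/live2", streamKey: "<youtube-stream-key>"),
    LivestreamOutput(url: "rtmps://live-api-s.facebook.com:443/rtmp/", streamKey: "<facebook-stream-key>")
]
meeting?.startLivestream(outputs: outputs)
```

Each entry opens a separate RTMP connection, so the meeting is streamed to every listed platform simultaneously.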
Stop Live Stream
The stopLivestream() method, also available on the Meeting class, stops the meeting's live stream.
class MeetingViewController {

    // button to stop the livestream
    @IBAction func stopLiveStreamButtonTapped(_ sender: Any) {
        self.meeting?.stopLivestream()
    }
}
Event associated with Livestream
Whenever the livestream state changes, the onLivestreamStateChanged event is triggered.
extension MeetingViewController: MeetingEventListener {

    // livestream state event
    func onLivestreamStateChanged(state: LiveStreamState) {
        switch state {
        case .LIVESTREAM_STARTING:
            print("livestream starting")
        case .LIVESTREAM_STARTED:
            print("livestream started")
        case .LIVESTREAM_STOPPING:
            print("livestream stopping")
        case .LIVESTREAM_STOPPED:
            print("livestream stopped")
        }
    }
}
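For the UI, it can help to map each state to a short status label instead of just printing. A sketch (the enum here is a local stand-in mirroring the SDK's LiveStreamState cases, so the snippet is self-contained; the label strings are our own choices):

```swift
// Local stand-in for the SDK's LiveStreamState, for illustration only.
enum LiveStreamState {
    case LIVESTREAM_STARTING, LIVESTREAM_STARTED, LIVESTREAM_STOPPING, LIVESTREAM_STOPPED
}

// Maps a livestream state to a label you might show on screen.
func statusLabel(for state: LiveStreamState) -> String {
    switch state {
    case .LIVESTREAM_STARTING: return "Going live..."
    case .LIVESTREAM_STARTED:  return "Live"
    case .LIVESTREAM_STOPPING: return "Stopping..."
    case .LIVESTREAM_STOPPED:  return "Offline"
    }
}

print(statusLabel(for: .LIVESTREAM_STARTED))  // Live
```

Calling such a helper from onLivestreamStateChanged keeps the UI update in one place.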
Custom Template
With VideoSDK, you can also use your own custom-designed layout template to live stream meetings. To use a custom template, create a template by following this guide; once the template is set, use the REST API to start the live stream with the templateURL parameter.
Conclusion
Integrating RTMP live streaming into an iOS video call app using VideoSDK is a powerful way to enhance real-time communication. With this integration, users can seamlessly broadcast live video content alongside their video calls, expanding the app's functionality.
Unlock the full potential of VideoSDK today and craft seamless video experiences! Sign up now to receive 10,000 free minutes and take your video app to new heights.