
Chetan Sandanshiv for Video SDK


How to Integrate Screen Share in iOS Video Call App?


📌 Introduction

Integrating screen share in an iOS video call app enhances user experience and collaboration. With this feature, users can seamlessly share their screen during calls, facilitating presentations, demonstrations, and remote assistance. Implementing screen sharing requires integrating APIs for capturing device screens, ensuring smooth transmission of visuals, and maintaining privacy controls.

Benefits of Screen Share in iOS Video Call App:

  1. Enhanced Collaboration: Screen share enables users to collaborate more effectively by sharing documents, presentations, or designs during video calls, fostering better understanding and teamwork.
  2. Improved Communication: Visual aids facilitate clearer communication, especially for technical support, education, or remote work scenarios, leading to faster issue resolution and knowledge transfer.
  3. Increased Productivity: With real-time sharing, teams can discuss projects or review documents without the need for additional tools or meetings, saving time and boosting productivity.

Use Cases of Screen Share in iOS Video Call App:

  1. Business Meetings: Sales teams can share presentations or product demos with clients, enhancing engagement and closing deals more effectively.
  2. Remote Work: Colleagues can collaborate on projects by sharing screens to discuss documents, designs, or code, replicating in-person collaboration remotely.
  3. Technical Support: Customer support agents can visually guide users through troubleshooting steps by sharing screens, resolving issues more efficiently.

This tutorial guides you through integrating this valuable feature into your iOS video call application using VideoSDK. We'll cover the steps required to leverage VideoSDK's capabilities and implement screen sharing within your app's interface.

🚀 Getting Started with VideoSDK

VideoSDK lets you integrate video & audio calling into Web, Android, and iOS applications across many different frameworks. It is an infrastructure solution that provides programmable SDKs and REST APIs for building scalable video conferencing applications. This guide will get you up and running with VideoSDK video & audio calling in minutes.

Create a VideoSDK Account

Go to your VideoSDK dashboard and sign up if you don't have an account. This account gives you access to the required Video SDK token, which acts as an authentication key that allows your application to interact with VideoSDK functionality.

Generate your Auth Token

Visit your VideoSDK dashboard and navigate to the "API Key" section to generate your auth token. This token is crucial in authorizing your application to use VideoSDK features. For a more visual understanding of the account creation and token generation process, consider referring to the provided tutorial.

Prerequisites and Setup

  • iOS 11.0+
  • Xcode 12.0+
  • Swift 5.0+

This app will contain two screens:

Join Screen : This screen allows the user to either create a meeting or join a predefined meeting.

Meeting Screen : This screen contains the local and remote participant views and meeting controls such as enabling/disabling the mic and camera and leaving the meeting.

๐Ÿ› ๏ธ Integrate VideoSDKโ€‹

To install VideoSDK, initialize CocoaPods in the project by running the following command:

pod init

This creates a Podfile in your project folder. Open that file and add the dependency for VideoSDK, as shown below:

pod 'VideoSDKRTC', :git => 'https://github.com/videosdk-live/videosdk-rtc-ios-sdk.git'
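
For reference, here's a minimal sketch of what the complete Podfile might look like, assuming your app target is named iOSQuickStartDemo (matching the project structure shown later):

platform :ios, '11.0'

target 'iOSQuickStartDemo' do
  use_frameworks!

  # VideoSDK real-time communication SDK
  pod 'VideoSDKRTC', :git => 'https://github.com/videosdk-live/videosdk-rtc-ios-sdk.git'
end

Podfile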


Then run the command below to install the pods:

pod install

Then declare the camera and microphone permissions in Info.plist:

<key>NSCameraUsageDescription</key>
<string>Camera permission description</string>
<key>NSMicrophoneUsageDescription</key>
<string>Microphone permission description</string>

Project Structure

iOSQuickStartDemo
├── Models
│   ├── RoomStruct.swift
│   └── MeetingData.swift
├── ViewControllers
│   ├── StartMeetingViewController.swift
│   └── MeetingViewController.swift
├── AppDelegate.swift // Default
├── SceneDelegate.swift // Default
├── APIService
│   └── APIService.swift
├── Main.storyboard // Default
├── LaunchScreen.storyboard // Default
└── Info.plist // Default
BroadcastExtension
├── SampleHandler.swift // Default
├── Atomic.swift
├── SocketConnection.swift
├── DarwinNotification.swift
├── SampleUploader.swift
└── Info.plist // Default
Pods
└── Podfile

Create models

Create Swift files for the MeetingData and RoomStruct models to hold the meeting and room data.

import Foundation
struct MeetingData {
    let token: String
    let name: String
    let meetingId: String
    let micEnabled: Bool
    let cameraEnabled: Bool
}

MeetingData.swift

import Foundation
struct RoomsStruct: Codable {
    let createdAt, updatedAt, roomID: String?
    let links: Links?
    let id: String?
    enum CodingKeys: String, CodingKey {
        case createdAt, updatedAt
        case roomID = "roomId"
        case links, id
    }
}

// MARK: - Links
struct Links: Codable {
    let getRoom, getSession: String?
    enum CodingKeys: String, CodingKey {
        case getRoom = "get_room"
        case getSession = "get_session"
    }
}

RoomStruct.swift

🎥 Essential Steps for Building the Video Call Functionality

This guide is designed to walk you through the process of integrating Screen Share with VideoSDK. We'll cover everything from setting up the SDK to enabling screen sharing in your app's interface, ensuring a smooth and efficient implementation process.

Step 1: Get started with APIClient

Before jumping into anything else, we have to write an API call that generates a unique meetingId. You will need an authentication token; you can generate it either using the videosdk-server-api-example or from the VideoSDK Dashboard for developers.

import Foundation

let TOKEN_STRING: String = "<AUTH_TOKEN>"

class APIService {

  class func createMeeting(token: String, completion: @escaping (Result<String, Error>) -> Void) {

    let url = URL(string: "https://api.videosdk.live/v2/rooms")!

    var request = URLRequest(url: url)
    request.httpMethod = "POST"
    request.addValue(TOKEN_STRING, forHTTPHeaderField: "authorization")

    URLSession.shared.dataTask(
      with: request,
      completionHandler: { (data: Data?, response: URLResponse?, error: Error?) in

        DispatchQueue.main.async {

          // fail early if the request itself errored out
          if let error = error {
            print("Error while creating a meeting: \(error)")
            completion(.failure(error))
            return
          }

          if let data = data {
            do {
              // decode the room response and return the generated roomId
              let room = try JSONDecoder().decode(RoomsStruct.self, from: data)
              completion(.success(room.roomID ?? ""))
            } catch {
              print("Error while creating a meeting: \(error)")
              completion(.failure(error))
            }
          }
        }
      }
    ).resume()
  }
}


APIService.swift

Step 2: Implement Join Screen

The Join Screen serves as the entry point to either create a new meeting or join an existing one.

import Foundation
import UIKit

class StartMeetingViewController: UIViewController, UITextFieldDelegate {

  private var serverToken = ""

  /// MARK: outlet for create meeting button
  @IBOutlet weak var btnCreateMeeting: UIButton!

  /// MARK: outlet for join meeting button
  @IBOutlet weak var btnJoinMeeting: UIButton!

  /// MARK: outlet for meetingId textfield
  @IBOutlet weak var txtMeetingId: UITextField!

  /// MARK: Initialize the private variable with TOKEN_STRING &
  /// set the meeting id in the textfield
  override func viewDidLoad() {
    super.viewDidLoad()
    txtMeetingId.delegate = self
    serverToken = TOKEN_STRING
    txtMeetingId.text = "PROVIDE-STATIC-MEETING-ID"
  }

  /// MARK: method for joining a meeting through the segue named "StartMeeting"
  /// after validating that the serverToken is not empty
  func joinMeeting() {

    txtMeetingId.resignFirstResponder()

    if !serverToken.isEmpty {
      DispatchQueue.main.async {
        self.dismiss(animated: true) {
          self.performSegue(withIdentifier: "StartMeeting", sender: nil)
        }
      }
    } else {
      print("Please provide auth token to start the meeting.")
    }
  }

  /// MARK: outlet for create meeting button tap event
  @IBAction func btnCreateMeetingTapped(_ sender: Any) {
    print("show loader while meeting gets connected with server")
    joinRoom()
  }

  /// MARK: outlet for join meeting button tap event
  @IBAction func btnJoinMeetingTapped(_ sender: Any) {
    if (txtMeetingId.text ?? "").isEmpty {

      print("Please provide meeting id to start the meeting.")
      txtMeetingId.resignFirstResponder()
    } else {
      joinMeeting()
    }
  }

  // MARK: - method for creating room api call and getting meetingId for joining meeting

  func joinRoom() {

    APIService.createMeeting(token: self.serverToken) { result in
      if case .success(let meetingId) = result {
        DispatchQueue.main.async {
          self.txtMeetingId.text = meetingId
          self.joinMeeting()
        }
      }
    }
  }

  /// MARK: preparing to animate to meetingViewController screen
  override func prepare(for segue: UIStoryboardSegue, sender: Any?) {

    guard let navigation = segue.destination as? UINavigationController,

      let meetingViewController = navigation.topViewController as? MeetingViewController
    else {
      return
    }

    meetingViewController.meetingData = MeetingData(
      token: serverToken,
      name: txtMeetingId.text ?? "Guest",
      meetingId: txtMeetingId.text ?? "",
      micEnabled: true,
      cameraEnabled: true
    )
  }
}


StartMeetingViewController.swift

Step 3: Initialize and Join Meeting

Using the provided token and meetingId, we will configure and initialize the meeting in viewDidLoad().

Then, we'll add @IBOutlet for localParticipantVideoView and remoteParticipantVideoView, which can render local and remote participant media, respectively.

import UIKit
import VideoSDKRTC
import WebRTC
import AVFoundation

class MeetingViewController: UIViewController {

// MARK: - Properties
// outlet for local participant container view
   @IBOutlet weak var localParticipantViewContainer: UIView!

// outlet for label for meeting Id
   @IBOutlet weak var lblMeetingId: UILabel!

// outlet for local participant video view
   @IBOutlet weak var localParticipantVideoView: RTCMTLVideoView!

// outlet for remote participant video view
   @IBOutlet weak var remoteParticipantVideoView: RTCMTLVideoView!

// outlet for remote participant no media label
   @IBOutlet weak var lblRemoteParticipantNoMedia: UILabel!

// outlet for remote participant container view
   @IBOutlet weak var remoteParticipantViewContainer: UIView!

// outlet for local participant no media label
   @IBOutlet weak var lblLocalParticipantNoMedia: UILabel!

// Meeting data - required to start
   var meetingData: MeetingData!

// current meeting reference
   private var meeting: Meeting?

    // MARK: - video participants including self to show in UI
    private var participants: [Participant] = []

    // MARK: - Lifecycle Events

    override func viewDidLoad() {
        super.viewDidLoad()
        // configure the VideoSDK with token
        VideoSDK.config(token: meetingData.token)

        // init meeting
        initializeMeeting()

        // set meeting id in label text
        lblMeetingId.text = "Meeting Id: \(meetingData.meetingId)"
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        navigationController?.navigationBar.isHidden = true
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        navigationController?.navigationBar.isHidden = false
        NotificationCenter.default.removeObserver(self)
    }

    // MARK: - Meeting

    private func initializeMeeting() {

        // initialize the meeting with the VideoSDK
        meeting = VideoSDK.initMeeting(
            meetingId: meetingData.meetingId,
            participantName: meetingData.name,
            micEnabled: meetingData.micEnabled,
            webcamEnabled: meetingData.cameraEnabled
        )

        // add the event listener to the meeting
        meeting?.addEventListener(self)

        // join the meeting
        meeting?.join()
    }
}

MeetingViewController.swift

Step 4: Implement Controls

After initializing the meeting in the previous step, we will now add @IBOutlet for btnLeave, btnToggleVideo and btnToggleMic which can control the media in the meeting.

class MeetingViewController: UIViewController {

...

    // outlet for leave button
    @IBOutlet weak var btnLeave: UIButton!

    // outlet for toggle video button
    @IBOutlet weak var btnToggleVideo: UIButton!

    // outlet for toggle audio button
    @IBOutlet weak var btnToggleMic: UIButton!

    // bool for mic
    var micEnabled = true
    // bool for video
    var videoEnabled = true

    // outlet for leave button click event
    @IBAction func btnLeaveTapped(_ sender: Any) {
        DispatchQueue.main.async {
            self.meeting?.leave()
            self.dismiss(animated: true)
        }
    }

    // outlet for toggle mic button click event
    @IBAction func btnToggleMicTapped(_ sender: Any) {
        if micEnabled {
            micEnabled = !micEnabled // false
            self.meeting?.muteMic()
        } else {
            micEnabled = !micEnabled // true
            self.meeting?.unmuteMic()
        }
    }

    // outlet for toggle video button click event
    @IBAction func btnToggleVideoTapped(_ sender: Any) {
        if videoEnabled {
            videoEnabled = !videoEnabled // false
            self.meeting?.disableWebcam()
        } else {
            videoEnabled = !videoEnabled // true
            self.meeting?.enableWebcam()
        }
    }

...

}

MeetingViewController.swift

Step 5: Implementing MeetingEventListener

In this step, we'll create an extension for the MeetingViewController that conforms to MeetingEventListener and implements the onMeetingJoined, onMeetingLeft, onParticipantJoined, onParticipantLeft, onParticipantChanged, onSpeakerChanged, etc. methods.


extension MeetingViewController: MeetingEventListener {

        /// Meeting started
        func onMeetingJoined() {

            // handle local participant on start
            guard let localParticipant = self.meeting?.localParticipant else { return }
            // add to list
            participants.append(localParticipant)

            // add event listener
            localParticipant.addEventListener(self)

            localParticipant.setQuality(.high)

            if(localParticipant.isLocal){
                self.localParticipantViewContainer.isHidden = false
            } else {
                self.remoteParticipantViewContainer.isHidden = false
            }
        }

        /// Meeting ended
        func onMeetingLeft() {
            // remove listeners
            meeting?.localParticipant.removeEventListener(self)
            meeting?.removeEventListener(self)
        }

        /// A new participant joined
        func onParticipantJoined(_ participant: Participant) {
            participants.append(participant)

            // add listener
            participant.addEventListener(self)

            participant.setQuality(.high)

            if(participant.isLocal){
                self.localParticipantViewContainer.isHidden = false
            } else {
                self.remoteParticipantViewContainer.isHidden = false
            }
        }

        /// A participant left the meeting
        /// - Parameter participant: participant object
        func onParticipantLeft(_ participant: Participant) {
            participant.removeEventListener(self)
            guard let index = self.participants.firstIndex(where: { $0.id == participant.id }) else {
                return
            }
            // remove participant from list
            participants.remove(at: index)
            // hide from ui
            UIView.animate(withDuration: 0.5){
                if(!participant.isLocal){
                    self.remoteParticipantViewContainer.isHidden = true
                }
            }
        }

        /// Called when speaker is changed
        /// - Parameter participantId: participant id of the speaker, nil when no one is speaking.
        func onSpeakerChanged(participantId: String?) {

            // show indication for active speaker
            if let participant = participants.first(where: { $0.id == participantId }) {
                self.showActiveSpeakerIndicator(participant.isLocal ? localParticipantViewContainer : remoteParticipantViewContainer, true)
            }

            // hide indication for others participants
            let otherParticipants = participants.filter { $0.id != participantId }
            for participant in otherParticipants {
                if participants.count > 1 && participant.isLocal {
                    showActiveSpeakerIndicator(localParticipantViewContainer, false)
                } else {
                    showActiveSpeakerIndicator(remoteParticipantViewContainer, false)
                }
            }
        }

        func showActiveSpeakerIndicator(_ view: UIView, _ show: Bool) {
            view.layer.borderWidth = 4.0
            view.layer.borderColor = show ? UIColor.blue.cgColor : UIColor.clear.cgColor
        }

}


MeetingViewController.swift

Step 6: Implementing ParticipantEventListener

In this step, we'll add an extension for the MeetingViewController that conforms to ParticipantEventListener and implements the onStreamEnabled and onStreamDisabled methods, which are called when a participant's audio or video MediaStream is enabled or disabled.

The updateUI function updates the user interface (showing or hiding the camera feed and mic indicator) according to the MediaStream state.

extension MeetingViewController: ParticipantEventListener {

/// Participant has enabled mic, video or screenshare
/// - Parameters:
/// - stream: enabled stream object
/// - participant: participant object
func onStreamEnabled(_ stream: MediaStream, forParticipant participant: Participant) {
    updateUI(participant: participant, forStream: stream, enabled: true)
 }

/// Participant has disabled mic, video or screenshare
/// - Parameters:
/// - stream: disabled stream object
/// - participant: participant object

func onStreamDisabled(_ stream: MediaStream, forParticipant participant: Participant) {
    updateUI(participant: participant, forStream: stream, enabled: false)
 }

}

private extension MeetingViewController {

    func updateUI(participant: Participant, forStream stream: MediaStream, enabled: Bool) {
        switch stream.kind {
        case .state(value: .video):
            if let videotrack = stream.track as? RTCVideoTrack {
                if enabled {
                    DispatchQueue.main.async {
                        UIView.animate(withDuration: 0.5) {
                            if participant.isLocal {
                                self.localParticipantViewContainer.isHidden = false
                                self.localParticipantVideoView.isHidden = false
                                self.localParticipantVideoView.videoContentMode = .scaleAspectFill
                                self.localParticipantViewContainer.bringSubviewToFront(self.localParticipantVideoView)
                                videotrack.add(self.localParticipantVideoView)
                                self.lblLocalParticipantNoMedia.isHidden = true
                            } else {
                                self.remoteParticipantViewContainer.isHidden = false
                                self.remoteParticipantVideoView.isHidden = false
                                self.remoteParticipantVideoView.videoContentMode = .scaleAspectFill
                                self.remoteParticipantViewContainer.bringSubviewToFront(self.remoteParticipantVideoView)
                                videotrack.add(self.remoteParticipantVideoView)
                                self.lblRemoteParticipantNoMedia.isHidden = true
                            }
                        }
                    }
                } else {
                    UIView.animate(withDuration: 0.5) {
                        if participant.isLocal {
                            self.localParticipantViewContainer.isHidden = false
                            self.localParticipantVideoView.isHidden = true
                            self.lblLocalParticipantNoMedia.isHidden = false
                            videotrack.remove(self.localParticipantVideoView)
                        } else {
                            self.remoteParticipantViewContainer.isHidden = false
                            self.remoteParticipantVideoView.isHidden = true
                            self.lblRemoteParticipantNoMedia.isHidden = false
                            videotrack.remove(self.remoteParticipantVideoView)
                        }
                    }
                }
            }

        case .state(value: .audio):
            if participant.isLocal {
                localParticipantViewContainer.layer.borderWidth = 4.0
                localParticipantViewContainer.layer.borderColor = enabled ? UIColor.clear.cgColor : UIColor.red.cgColor
            } else {
                remoteParticipantViewContainer.layer.borderWidth = 4.0
                remoteParticipantViewContainer.layer.borderColor = enabled ? UIColor.clear.cgColor : UIColor.red.cgColor
            }

        default:
            break
        }
    }
}


Known Issue

If your video renders outside its container, add the following lines to the viewDidLoad method in MeetingViewController.swift.

override func viewDidLoad() {

  localParticipantVideoView.frame = CGRect(x: 10, y: 0, 
            width: localParticipantViewContainer.frame.width, 
        height: localParticipantViewContainer.frame.height)

  localParticipantVideoView.bounds = CGRect(x: 10, y: 0, 
        width: localParticipantViewContainer.frame.width, 
            height: localParticipantViewContainer.frame.height)

  localParticipantVideoView.clipsToBounds = true

  remoteParticipantVideoView.frame = CGRect(x: 10, y: 0, 
        width: remoteParticipantViewContainer.frame.width, 
            height: remoteParticipantViewContainer.frame.height)

  remoteParticipantVideoView.bounds = CGRect(x: 10, y: 0, 
        width: remoteParticipantViewContainer.frame.width, 
            height: remoteParticipantViewContainer.frame.height)

    remoteParticipantVideoView.clipsToBounds = true
}


MeetingViewController.swift

TIP:

Stuck anywhere? Check out this example code on GitHub.

📸 Integrate Screen Share in Video App

Step 1: Open Target

Open your project in Xcode, then select File > New > Target in the menu bar.


Step 2: Select Target

Select Broadcast Upload Extension and click Next.


Step 3: Configure Broadcast Upload Extension

Enter the extension's name in the Product Name field, choose the team from the dropdown, uncheck the "Include UI extension" field, and click "Finish."


Step 4: Activate Extension Scheme

You will be prompted with a popup: Activate "Your-Extension-Name" scheme? Click Activate.


Now, the "Broadcast" folder will appear in the Xcode left sidebar.


Step 5: Add External Files to the Created Extension

Open the videosdk-rtc-ios-sdk-example repository, and copy the following files: SampleUploader.swift, SocketConnection.swift, DarwinNotificationCenter.swift, and Atomic.swift to your extension's folder. Ensure that these files are added to the target.

Step 6: Update SampleHandler.swift file

Open the SampleHandler.swift file in the example repository and copy its contents. Paste this content into your extension's SampleHandler.swift file.

Step 7: Add Capability to App

In Xcode, navigate to YourAppName > Signing & Capabilities, and click on + Capability to configure the App Group.


Choose App Groups from the list.


After that, select or add the App Group ID that you created earlier.


Step 8: Add Capability in Extension

Go to Your-Extension-Name > Signing & Capabilities and configure the App Group capability as performed in the previous step. (The group ID should be the same for both targets.)


Step 9: Add App Group ID in Extension File

Go to the extension's SampleHandler.swift file and paste your group ID in the appGroupIdentifier constant.
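
For illustration, the constant might look something like this (the exact declaration in the example file may differ; the group ID below is a placeholder for your own value):

// in the extension's SampleHandler.swift
// Use the same App Group ID you configured for both the app and the extension targets.
let appGroupIdentifier = "group.com.yourcompany.yourapp"

SampleHandler.swift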


Step 10: Update the App-Level Info.plist File

  1. Add a new key, RTCScreenSharingExtension, in Info.plist with the extension's Bundle Identifier as the value.
  2. Add a new key, RTCAppGroupIdentifier, in Info.plist with the extension's App Group ID as the value.

Note: For the extension's Bundle Identifier, go to TARGETS > Your-Extension-Name > Signing & Capabilities.
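
For reference, the two entries might look like this in the app's Info.plist (the bundle identifier and group ID below are placeholders for your own values):

<key>RTCScreenSharingExtension</key>
<string>com.yourcompany.yourapp.BroadcastExtension</string>
<key>RTCAppGroupIdentifier</key>
<string>group.com.yourcompany.yourapp</string>

Info.plist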


NOTE:

You can also check out the extension's example code on GitHub.

Integrate ScreenShare in your App

After successfully creating the Broadcast Upload Extension using the steps listed above, we can start using the enableScreenShare and disableScreenShare functions of the Meeting class.

How to use the ScreenShare functions

Use these functions in your app's Meeting Screen.

@IBAction func ScreenShareButtonTapped(_ sender: Any) {
    Task {
      self.meeting?.enableScreenShare()
    }
}

@IBAction func StopScreenShareButtonTapped(_ sender: Any) {
    Task {
      self.meeting?.disableScreenShare()
    }
}

CAUTION:

The enableScreenShare and disableScreenShare functions are async; therefore, use the syntax above to call them.

Calling enableScreenShare() will present the system broadcast picker (RPSystemBroadcastPickerView) with the extension that was created using the above steps.


After clicking the Start Broadcast button, you will be able to get the screen share stream in the session.

  • When the broadcast is started, it creates a Stream that has MediaStream.kind = .share. Using the stream kind, you can prompt a ScreenShare view for remote peers when ScreenShare is started by the local peer.
  • Similarly, you can use the same kind to dismiss the ScreenShare view on the remote peer when the ScreenShare is stopped, as shown in the extended listener below.

extension MeetingViewController: ParticipantEventListener {
    /// Participant has enabled mic, video or screenshare
    /// - Parameters:
    /// - stream: enabled stream object
    /// - participant: participant object
    func onStreamEnabled(_ stream: MediaStream, forParticipant participant: Participant) {

        if stream.kind == .share {
        // show screen share
            showScreenSharingView(true)
            screenSharingView.showMediastream(stream)
            return
        }

        // show stream in cell
        if let cell = self.cellForParticipant(participant) {
            cell.updateView(forStream: stream, enabled: true)
        }

        if participant.isLocal {
        // turn on controls for local participant
            self.buttonControlsView.updateButtons(forStream: stream, enabled: true)
        }
    }

    /// Participant has disabled mic, video or screenshare
    /// - Parameters:
    /// - stream: disabled stream object
    /// - participant: participant object
    func onStreamDisabled(_ stream: MediaStream, forParticipant participant: Participant) {

        if stream.kind == .share {
        // remove screen share
            showScreenSharingView(false)
            screenSharingView.hideMediastream(stream)
            return
        }

        // hide stream in cell
        if let cell = self.cellForParticipant(participant) {
            cell.updateView(forStream: stream, enabled: false)
        }

        if participant.isLocal {
        // turn off controls for local participant
            self.buttonControlsView.updateButtons(forStream: stream, enabled: false)
        }
    }
}

🔚 Conclusion

Integrating screen sharing into your iOS video call app with VideoSDK is a straightforward process that unlocks a powerful new feature for your users. This functionality can streamline communication, boost productivity, and open doors for innovative use cases within your app.

Unlock the power of seamless video communication with VideoSDK! Sign up now and dive into an extraordinary world of interactive video calling experiences.

With VideoSDK, you get a generous 10,000 free minutes to kickstart your journey towards creating engaging and immersive connections. Whether you're building a social app, a collaboration tool, or an e-learning platform, our platform provides the tools you need to elevate your user experience to the next level.
