It is a strange time to be a developer. With most countries going into lockdown, the need for communication technologies has spiked.
One of the technologies readily available for web developers to build these solutions is the WebRTC API, which web browsers have implemented almost across the board. The next step for these kinds of technologies is the jump to mobile, either as a PWA or a native application. The former could already be considered viable, but the features the public expects from video apps have only become more complex.
For React Native developers, WebRTC solutions have historically had a high barrier to entry. Expo has not yet (as of May 2020) integrated the
react-native-webrtc native module into their framework, despite the demand for it. Of course, it is not Expo's responsibility to implement the things we clamor for; that is not how open source works.
If you want WebRTC in your React Native project, you must go with the official
react-native-webrtc module. Its documentation has been neglected by the community; documentation is one of the least favourite parts of a developer's tasks, but an essential one nonetheless. You can follow the installation guides for both Android and iOS, but in this post I will guide you through a much simpler process, made possible by the auto-linking features that came with React Native 0.60.
My purpose here is to lower the barrier to entry for people exploring WebRTC technologies. For this reason I will approach this from the perspective of a greenfield project; the typical
npx react-native init myApp will do.
Once you have a boilerplate project, we need to add
react-native-webrtc as a dependency:
npm install --save react-native-webrtc
From here on, our next step is to integrate
react-native-webrtc with each platform.
The high-level overview of this setup is:
- glue the native module to each platform
- ask for the necessary permissions
I highly recommend you use a physical device to debug.
The iOS integration is the simplest one because CocoaPods will do most of the legwork.
From the project's root, locate the Podfile under
./ios/Podfile. The first line is where we set the platform version; we will change this to version 10:
platform :ios, '10.0'
In the Podfile, locate your project's target pods and add the following line:
pod 'react-native-webrtc', :path => '../node_modules/react-native-webrtc'
Once you've made these changes, save everything, open a terminal at your project's root, and run cd ios && pod install.
Now that you've completed the first step, it's time to ask (nicely) for some permissions. Locate the
Info.plist file under
./ios/myApp and add the following lines after the first <dict> tag:
```xml
<key>NSCameraUsageDescription</key>
<string>Camera Permission</string>
<key>NSMicrophoneUsageDescription</key>
<string>Microphone Permission</string>
```
And that completes the iOS setup portion. Easy, right? If you only care about the iOS platform, all you need to do now is run the project on an iPhone.
The setup process for Android is inherently more complex but we will try to keep it as simple as possible.
On the project's root, locate the Gradle settings under
./android/settings.gradle and replace the last line with the following:

```groovy
include ':WebRTCModule', ':app'
project(':WebRTCModule').projectDir = new File(rootProject.projectDir, '../node_modules/react-native-webrtc/android')
```
In an ideal world we would add the permissions and be done with it, but bear with me.
Locate the project's Gradle properties under
./android/gradle.properties and add the following line:
Locate Android's build properties under
./android/build.gradle, look for the buildscript dependencies, and make sure the Android Gradle plugin is on version 3.5.2:
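The snippet itself did not survive in this copy of the post; for reference, the relevant buildscript section in ./android/build.gradle typically looks like this (a sketch based on the standard React Native template, pinned to the 3.5.2 version mentioned above):

```groovy
// ./android/build.gradle (project-level)
buildscript {
    dependencies {
        // Pin the Android Gradle plugin to the version the guide expects
        classpath("com.android.tools.build:gradle:3.5.2")
    }
}
```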
In the app-level build properties located under
./android/app/build.gradle, locate the dependencies block and add the following line within its scope:
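The line itself is missing from this copy of the post; given the ':WebRTCModule' project declared in settings.gradle above, it is presumably the Gradle dependency that links it in (a sketch, not confirmed by the original):

```groovy
// ./android/app/build.gradle (app-level)
dependencies {
    // ...existing dependencies...
    implementation project(':WebRTCModule') // links the module declared in settings.gradle
}
```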
Go into the project's Android
MainApplication.java located at
./android/app/src/main/java/com/all/the/things/ and add the namespace for
react-native-webrtc by adding the following import:
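The import did not survive in this copy; react-native-webrtc's Android classes live under the com.oney.WebRTCModule namespace, so the addition presumably looks like this (a sketch of the manual-linking step):

```java
// MainApplication.java
import com.oney.WebRTCModule.WebRTCModulePackage;

// ...and, if auto-linking does not pick the module up, register it in getPackages():
// packages.add(new WebRTCModulePackage());
```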
Finally, we can ask (nicely) for permissions. Locate the project's AndroidManifest.xml under
./android/app/src/main and add the permissions in the same scope as the
<application> tag with the following lines:
```xml
<uses-permission android:name="android.permission.INTERNET" />
<uses-permission android:name="android.permission.ACCESS_NETWORK_STATE" />
<uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />
<uses-permission android:name="android.permission.RECORD_AUDIO" />
<uses-permission android:name="android.permission.WAKE_LOCK" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.CAMERA" />
<uses-feature android:name="android.hardware.camera" />
<uses-feature android:name="android.hardware.camera.autofocus" />
```
Once this is in place we are ready to run the project on a device for testing.
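With both platforms wired up, a quick way to smoke-test the setup is to request a stream from the device. This is only a sketch using react-native-webrtc's mediaDevices API; it has to run inside the app on a physical device, and the logging is illustrative:

```javascript
// Smoke test: request camera + mic through react-native-webrtc.
// Call this from your app code (e.g. App.js) on a physical device.
import { mediaDevices } from 'react-native-webrtc';

async function smokeTest() {
  try {
    const stream = await mediaDevices.getUserMedia({
      audio: true,
      video: { facingMode: 'user' },
    });
    console.log('got stream with', stream.getTracks().length, 'tracks');
    stream.getTracks().forEach((t) => t.stop()); // release camera/mic
  } catch (err) {
    console.warn('WebRTC setup problem:', err);
  }
}
```

If the permission prompts appear and a stream comes back, the native module is linked correctly.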
react-native-webrtc's installation guides, while not completely up to date, still contain valuable information. I've made a template application that is set up for WebRTC broadcasting and includes a simple socket implementation of a signaling server (required to start the peer connections). The sample itself is written using hooks and is linked below:
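Signaling itself is transport-agnostic: the server's only job is to relay SDP offers/answers and ICE candidates between peers so they can establish a connection. As an illustration of that core job, here is a minimal, hypothetical message router in plain JavaScript; the function names and message shape are my own assumptions, not the template's API, and the transport (e.g. a WebSocket per peer) is left out:

```javascript
// Minimal in-memory signaling router: relays offer/answer/candidate
// messages between named peers. `send` is any callback that delivers
// a message to a connected peer (e.g. a WebSocket send).
function createSignalingRouter() {
  const peers = new Map(); // peerId -> send callback

  return {
    join(peerId, send) {
      peers.set(peerId, send);
    },
    leave(peerId) {
      peers.delete(peerId);
    },
    // Forward an SDP offer/answer or ICE candidate to its target peer.
    // Returns false when the target is not connected.
    relay({ from, to, type, payload }) {
      const send = peers.get(to);
      if (!send) return false;
      send({ from, type, payload });
      return true;
    },
  };
}
```

Real signaling servers add rooms, authentication, and reconnection handling on top, but the relay step above is the part the peer connection cannot start without.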
What do you think about video broadcasting technologies? What other uses could Real Time Communication technologies be applied to? Have you baked bread during this quarantine yet? Thanks for reading!