A simple native WebRTC demo iOS app using Swift.
This demo app's purpose is to demonstrate the bare minimum required to establish a peer-to-peer connection with WebRTC. This is not production-ready code! In order to build a production VoIP app you will need a real signaling server (not a simple broadcast server like the one in this example), your own deployed TURN server(s), and probably CallKit and push notification integration.
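For context on the TURN point, ICE/TURN servers are handed to WebRTC through `RTCConfiguration` when the peer connection is created. The sketch below is only illustrative and is not code from this project; the `turn.example.com` host, port, and credentials are placeholders for your own deployment.

```swift
import WebRTC

// Illustrative sketch: supplying STUN/TURN servers via RTCConfiguration.
// The TURN host and credentials are placeholders, not part of this demo.
func makePeerConnection(factory: RTCPeerConnectionFactory,
                        delegate: RTCPeerConnectionDelegate?) -> RTCPeerConnection? {
    let config = RTCConfiguration()
    config.iceServers = [
        RTCIceServer(urlStrings: ["stun:stun.l.google.com:19302"]),
        RTCIceServer(urlStrings: ["turn:turn.example.com:3478"],
                     username: "user",
                     credential: "secret")
    ]
    let constraints = RTCMediaConstraints(mandatoryConstraints: nil,
                                          optionalConstraints: nil)
    return factory.peerConnection(with: config, constraints: constraints, delegate: delegate)
}
```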
- Xcode 10.0 or newer
- Cocoapods
- Node.js + npm
Xcode 9.4 users can use the `swift-4.0` branch.
- Start the signaling server:
  - Navigate to the `signaling` folder.
  - Run `npm install` to install all dependencies.
  - Run `node app.js` to start the server.
- Navigate to the WebRTC app folder (where you find the `WebRTC.xcworkspace` file) and run `pod install`.
- Modify `SignalClient.swift` and set the `serverUrl` variable to your signaling server's IP/host (see the sketch after this list). Don't use `localhost` or `127.0.0.1` if you plan to connect other devices in your network to your Mac.
- Build and run on devices or a simulator (video capture is not supported on a simulator).
1. Run the app on two devices with the signaling server running.
2. Make sure both devices are connected to the signaling server.
3. On the first device, tap 'Send offer'. This generates a local offer SDP and sends it to the other client through the signaling server (see the sketch after this list).
4. Wait until the second device receives the offer from the first device (you should see that a remote SDP has arrived).
5. Tap 'Send answer' on the second device.
6. When the answer arrives at the first device, both devices should now be connected to each other over WebRTC. Try talking, or tap the 'video' button to start capturing video.
7. To restart the process, kill both apps and repeat steps 1-6.
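For context, the two buttons map onto the standard WebRTC offer/answer calls. The sketch below is a simplified, hypothetical illustration of that flow; the demo's actual client classes and the way the SDP is serialized for the signaling server may differ.

```swift
import WebRTC

// Hypothetical helper illustrating what 'Send offer' / 'Send answer' do.
final class OfferAnswerExample {
    let peerConnection: RTCPeerConnection
    let sendToSignalingServer: (RTCSessionDescription) -> Void

    init(peerConnection: RTCPeerConnection,
         sendToSignalingServer: @escaping (RTCSessionDescription) -> Void) {
        self.peerConnection = peerConnection
        self.sendToSignalingServer = sendToSignalingServer
    }

    // 'Send offer': create a local offer SDP and push it through the signaling server.
    func sendOffer() {
        let constraints = RTCMediaConstraints(mandatoryConstraints: nil, optionalConstraints: nil)
        peerConnection.offer(for: constraints) { sdp, _ in
            guard let sdp = sdp else { return }
            self.peerConnection.setLocalDescription(sdp) { _ in
                self.sendToSignalingServer(sdp)
            }
        }
    }

    // 'Send answer': after the remote offer has been set, answer it and send the SDP back.
    func sendAnswer() {
        let constraints = RTCMediaConstraints(mandatoryConstraints: nil, optionalConstraints: nil)
        peerConnection.answer(for: constraints) { sdp, _ in
            guard let sdp = sdp else { return }
            self.peerConnection.setLocalDescription(sdp) { _ in
                self.sendToSignalingServer(sdp)
            }
        }
    }
}
```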
Disclaimer: I am not sure if this is the best way of integrating CallKit, but it has worked for me so far (a full sketch of the delegate follows these steps):
- Configure the WebRTC audio session to use manual audio and disable audio:
  - `RTCAudioSession.sharedInstance().useManualAudio = true`
  - `RTCAudioSession.sharedInstance().isAudioEnabled = false`
- On your `CXProvider` delegate's `provider(CXProvider, didActivate: AVAudioSession)` method:
  - Call `RTCAudioSession.sharedInstance().audioSessionDidActivate` with the `AVAudioSession` from the `CXProvider`.
  - Enable audio: `RTCAudioSession.sharedInstance().isAudioEnabled = true`.
- On your `CXProvider` delegate's `provider(CXProvider, didDeactivate: AVAudioSession)` method, call `RTCAudioSession.sharedInstance().audioSessionDidDeactivate` with the `AVAudioSession` from the `CXProvider`.
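Putting those steps together, a `CXProviderDelegate` along these lines is one way to wire it up. This is a minimal sketch, not code from this repository; the `CallProviderDelegate` class name and the place where manual audio is configured are assumptions.

```swift
import AVFoundation
import CallKit
import WebRTC

// Hypothetical delegate; the demo app itself does not ship CallKit code.
final class CallProviderDelegate: NSObject, CXProviderDelegate {

    override init() {
        super.init()
        // Take manual control of the audio unit and keep audio off until
        // CallKit activates the AVAudioSession.
        RTCAudioSession.sharedInstance().useManualAudio = true
        RTCAudioSession.sharedInstance().isAudioEnabled = false
    }

    func providerDidReset(_ provider: CXProvider) {
        // Clean up any ongoing calls here.
    }

    // CallKit activated the audio session: hand it to WebRTC, then enable audio.
    func provider(_ provider: CXProvider, didActivate audioSession: AVAudioSession) {
        RTCAudioSession.sharedInstance().audioSessionDidActivate(audioSession)
        RTCAudioSession.sharedInstance().isAudioEnabled = true
    }

    // CallKit deactivated the audio session: disable audio and notify WebRTC.
    func provider(_ provider: CXProvider, didDeactivate audioSession: AVAudioSession) {
        RTCAudioSession.sharedInstance().isAudioEnabled = false
        RTCAudioSession.sharedInstance().audioSessionDidDeactivate(audioSession)
    }
}
```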
WebRTC and CallKit talk from 2016: https://youtu.be/JB2MdcY1MKs?t=6m23s
- WebRTC website: https://webrtc.org/
- WebRTC iOS compile guide: https://webrtc.org/native-code/ios/
- appear.in dev blog post: https://github.com/appearin/tech.appear.in/blob/master/source/_posts/Getting-started-with-WebRTC-on-iOS.md (it uses an older WebRTC API but is still very informative)
- AppRTC - a more detailed app to demonstrate WebRTC: https://github.com/ISBX/apprtc-ios
- Useful information from pexip: https://pexip.github.io/pexkit-sdk/ios_media