webrtc-ios's People

Contributors

dependabot[bot], kmadiar, stasel


webrtc-ios's Issues

No video after first call

I do not see the video when I send an offer again. After the first sendOffer, remoteCandidateCount = 52; after the second, 104, but I do not see anything in VideoViewController. The local candidate count is 3 on the first call and 0 on the second. I receive 3 local candidates roughly every minute even if I do nothing.

Use of undeclared...

This is strange behaviour: shortly after opening the project and a first successful compilation, tons of errors are reported, even though the code still compiles and runs fine.

I can easily "go to declaration" from each of the reported errors, so it might be a Swift-to-C problem, maybe a missing header or so.

Any ideas? Xcode 10.1 on macOS Mojave

(screenshot attached)

After turning on video, it can't be turned off

After the video is turned on, it can't be turned off.
If one user taps back, he can no longer see his own video, but the other user still sees it, so the video should be removed from the call when one of the users taps back or turns the video off.
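Since several people hit this: below is a sketch of how the local video could be toggled off, assuming WebRTCClient keeps a reference to the local RTCVideoTrack it adds to the peer connection (the `localVideoTrack` property and the `setVideoEnabled` helper name are hypothetical; the demo may store the track differently):

```swift
// Hypothetical toggle on WebRTCClient; `localVideoTrack` is assumed to be
// the RTCVideoTrack that was added to the peer connection.
func setVideoEnabled(_ enabled: Bool) {
    localVideoTrack?.isEnabled = enabled // stops sending camera frames when false
}
```

Disabling the track stops outgoing frames; to also release the camera, the RTCCameraVideoCapturer's stopCapture() would need to be called as well.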

Compilation on iPhone fails

Many thanks for making your work available to a large audience. It is certainly a big time saver for folks starting to look into integrating WebRTC into their iOS projects.
The project successfully compiles and runs when the target is the simulator. However, when the target is a real device, e.g. an iPhone, it does not compile. Xcode throws 2 errors on lines 30 and 32 of VideoViewController.swift.

Would you please look into this issue?

Video call hangs from Simulator to iPhone

Does anybody have an idea why my video call gets stuck on the simulator after 20-30 seconds, while it keeps working fine on the iPhone?
I am calling from the simulator to an iPhone 6.

Using my own peerconnection server

Where do I change the code to point to my own server for the peer connection? I assume the Config.swift file gets changed to use my TURN server, but is the signaling server the same as a peer connection server?
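For reference, Config.swift in the demo exposes a signaling-server URL and a list of ICE servers (code elsewhere in these issues reads `config.signalingServerUrl` and `config.webRTCIceServers`). Pointing it at your own infrastructure could look roughly like this sketch; the host names are placeholders, and the field types are inferred from how the rest of the code uses them rather than copied from the actual file:

```swift
import Foundation

// Sketch of Config.swift pointing at your own servers (hosts are placeholders).
struct Config {
    let signalingServerUrl: URL    // your WebSocket signaling server
    let webRTCIceServers: [String] // your STUN/TURN URLs

    static let `default` = Config(
        signalingServerUrl: URL(string: "wss://signaling.example.com:8080")!,
        webRTCIceServers: [
            "stun:stun.example.com:3478",
            "turn:turn.example.com:3478"
        ]
    )
}
```

There is no separate "peer connection server": the demo only needs the signaling server (to exchange SDP and ICE candidates) plus STUN/TURN; TURN credentials, if any, are attached where the RTCIceServer objects are built.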

iPhone 11 Pro video call freezes

I noticed that with an iPhone 11 Pro the video freezes after about 15 seconds, while with an iPhone 7 everything works fine.

Lowering the quality of the video call in WebRTCClient.swift instead makes it work well on the iPhone 11 Pro too.

The change I made was to replace:

        let format = (RTCCameraVideoCapturer.supportedFormats(for: frontCamera).sorted { (f1, f2) -> Bool in
            let width1 = CMVideoFormatDescriptionGetDimensions(f1.formatDescription).width
            let width2 = CMVideoFormatDescriptionGetDimensions(f2.formatDescription).width
            return width1 < width2
        }).last,

with:

        let format = (RTCCameraVideoCapturer.supportedFormats(for: frontCamera).sorted { (f1, f2) -> Bool in
            let width1 = CMVideoFormatDescriptionGetDimensions(f1.formatDescription).width
            let width2 = CMVideoFormatDescriptionGetDimensions(f2.formatDescription).width
            return width1 < width2
        }).first,
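Rather than jumping from the highest-resolution format (.last) to the lowest (.first), a middle ground would be to pick the smallest format at or above a target width. A hedged sketch building on the same sorting code (the 640-pixel target is an arbitrary example, not a recommendation from the project):

```swift
// Sketch: choose the smallest supported format that is at least `targetWidth`
// wide, falling back to the lowest-resolution format if none qualifies.
let sortedFormats = RTCCameraVideoCapturer.supportedFormats(for: frontCamera).sorted { f1, f2 in
    CMVideoFormatDescriptionGetDimensions(f1.formatDescription).width <
        CMVideoFormatDescriptionGetDimensions(f2.formatDescription).width
}
let targetWidth: Int32 = 640 // arbitrary example target
let format = sortedFormats.first {
    CMVideoFormatDescriptionGetDimensions($0.formatDescription).width >= targetWidth
} ?? sortedFormats.first
```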

Offer is not arriving at the receiver side

Hey stasel,
Hope you are well. I've cloned this repo. I'm creating an offer from the initiator, but it is not arriving at the other end. Can you please tell me whether the problem is related to the socket connection you're using?

LTE-to-WiFi connections not working for WebRTC

LTE-to-WiFi connections are not working for WebRTC in Swift. I have your code in a production app with my own TURN server. Do you know why it's not working?
My production app is not working.
The app name is AddaMe on the App Store.

example of RTCDataChannel implementation

Please add an example of RTCDataChannel.

func createDataChannel() -> RTCDataChannel? {
    let dataChannelConfiguration = RTCDataChannelConfiguration()
    let dataChannel = self.peerConnection.dataChannel(forLabel: "exampleDataChannel", configuration: dataChannelConfiguration)
    dataChannel?.delegate = self
    return dataChannel
}

...

extension WebRTCClient: RTCDataChannelDelegate {
    func dataChannelDidChangeState(_ dataChannel: RTCDataChannel) {
        switch dataChannel.readyState {
        case .open:
            print("open datachannel")
        case .connecting:
            print("connecting datachannel")
        case .closing:
            print("closing datachannel")
        case .closed:
            print("closed datachannel")
        @unknown default:
            print("unknown datachannel state")
        }
    }

    func dataChannel(_ dataChannel: RTCDataChannel, didReceiveMessageWith buffer: RTCDataBuffer) {
        print(String(data: buffer.data, encoding: .utf8) ?? "(binary message, \(buffer.data.count) bytes)")
    }
}
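For completeness, sending over the channel could look like the sketch below; `RTCDataBuffer` and `sendData(_:)` are the GoogleWebRTC APIs, while the `dataChannel` property and the `send(text:)` helper name are assumptions for illustration:

```swift
// Hypothetical helper on WebRTCClient; assumes `self.dataChannel` holds the
// channel returned by createDataChannel() and that it is currently open.
func send(text: String) {
    guard let channel = self.dataChannel, channel.readyState == .open else {
        debugPrint("Warning: data channel is not open")
        return
    }
    let buffer = RTCDataBuffer(data: Data(text.utf8), isBinary: false)
    channel.sendData(buffer)
}
```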

1 to many?

Hello,
Is it possible to use this library to share one video with a few devices?
For example: one device broadcasts and two devices watch the video from that device?

If it's possible, can you provide an example?

Stream local screen

This is more of a suggestion request than an issue. I am trying to make a TeamViewer Pilot-type application where I can stream my screen to another person so that they can see it.
I see you are hooking the camera feed directly; what would you recommend for streaming the screen instead?

I am thinking of recording the screen to a local file in the bundle and using RTCFileVideoCapturer's startCapturing with the file name, to capture the screen and send it over to the other person at the same time.

Am I overlooking a more efficient way?
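A possibly more direct route than recording to a file is to feed ReplayKit frames straight into a WebRTC video source. A minimal sketch, assuming the capturer is created with the demo's RTCVideoSource as its delegate (the `ScreenCapturer` class and its `startCapture()` method are hypothetical):

```swift
import CoreMedia
import ReplayKit
import WebRTC

// Hypothetical in-app screen capturer: pushes ReplayKit sample buffers into
// an RTCVideoSource (the delegate) instead of the camera feed. Note that
// RPScreenRecorder only captures your own app's screen; capturing other apps
// would need a Broadcast Upload Extension.
final class ScreenCapturer: RTCVideoCapturer {
    func startCapture() {
        RPScreenRecorder.shared().startCapture(handler: { [weak self] sampleBuffer, type, error in
            guard let self = self, error == nil, type == .video,
                  let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
            let rtcBuffer = RTCCVPixelBuffer(pixelBuffer: pixelBuffer)
            let seconds = CMTimeGetSeconds(CMSampleBufferGetPresentationTimeStamp(sampleBuffer))
            let frame = RTCVideoFrame(buffer: rtcBuffer,
                                      rotation: ._0,
                                      timeStampNs: Int64(seconds * 1_000_000_000))
            // The delegate is the RTCVideoSource this capturer was created with.
            self.delegate?.capturer(self, didCapture: frame)
        }, completionHandler: nil)
    }
}
```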

Signalling status: Not connected

Hello,

I have installed your app and ran the signaling server on port 8080, but when I launch the app it says
Signaling status: Not connected

(screenshot attached)

thanks,
-Kota

Using of protocols for signaling channel

Thank you for your work! It is a day saver!

Could you please consider using protocols for the signaling channel?

It would help a lot to integrate your code to other projects.

Example:

protocol SignalingChannelMessageSender {
    func send(message: String)
}

protocol SignalingChannelReceiver {
    func didReceive(messageData: Data)
}

final class SignalingClient {

    private let socket: SignalingChannelMessageSender
    private let decoder = JSONDecoder()
    private let encoder = JSONEncoder()
    weak var delegate: SignalClientDelegate?

    init(socket: SignalingChannelMessageSender) {
        self.socket = socket
    }

    func send(sdp rtcSdp: RTCSessionDescription) {
        let message = Message.sdp(SessionDescription(from: rtcSdp))
        do {
            let dataMessage = try self.encoder.encode(message)
            guard let dataMessageString = String(data: dataMessage, encoding: .utf8) else {
                debugPrint("Error - cannot create dataMessageString")
                return
            }
            socket.send(message: dataMessageString)
        } catch {
            debugPrint("Warning: Could not encode sdp: \(error)")
        }
    }

    func send(candidate rtcIceCandidate: RTCIceCandidate) {
        let message = Message.candidate(IceCandidate(from: rtcIceCandidate))
        do {
            let dataMessage = try self.encoder.encode(message)
            guard let dataMessageString = String(data: dataMessage, encoding: .utf8) else {
                debugPrint("Error - cannot create dataMessageString")
                return
            }
            socket.send(message: dataMessageString)
        } catch {
            debugPrint("Warning: Could not encode candidate: \(error)")
        }
    }
}

extension SignalingClient: SignalingChannelReceiver {
    func didReceive(messageData: Data) {
        let message: Message
        do {
            message = try self.decoder.decode(Message.self, from: messageData)
        } catch {
            debugPrint("Warning: Could not decode incoming message: \(error)")
            return
        }

        switch message {
        case .candidate(let iceCandidate):
            self.delegate?.signalClient(self, didReceiveCandidate: iceCandidate.rtcIceCandidate)
        case .sdp(let sessionDescription):
            self.delegate?.signalClient(self, didReceiveRemoteSdp: sessionDescription.rtcSessionDescription)
        }
    }
}

Thanks!

Code improvement

What do you think about this code improvement?
Something like this:

import Foundation
import WebRTC

enum SdpType: String, Codable {
    case offer
    case answer
    case pranswer
}

struct SessionDescription: Codable {
    let type: SdpType
    let sdp: String
    
    init(from rtcSessionDescription: RTCSessionDescription) {
        self.sdp = rtcSessionDescription.sdp
        
        switch rtcSessionDescription.type {
        case .offer:
            self.type = .offer
        case .prAnswer:
            self.type = .pranswer
        case .answer:
            self.type = .answer
        @unknown default:
            fatalError("Unknown RTCSdpType: \(rtcSessionDescription.type.rawValue)")
        }
    }
    
    func toRTCSessionDescription() -> RTCSessionDescription {
        let rtcSdpType: RTCSdpType
        
        switch type {
        case .offer:
            rtcSdpType = .offer
        case .answer:
            rtcSdpType = .answer
        case .pranswer:
            rtcSdpType = .prAnswer
        }
        return RTCSessionDescription(type: rtcSdpType, sdp: sdp)
    }
}

struct AddaCandidate: Codable {
    let type: String = "candidate"
    let sdpMLineIndex: Int32
    let sdpMid: String?
    let candidate: String
    
    init(from iceCandidate: RTCIceCandidate) {
        self.sdpMLineIndex = iceCandidate.sdpMLineIndex
        self.sdpMid = iceCandidate.sdpMid
        self.candidate = iceCandidate.sdp
    }
    
    func toRtcCandidate() -> RTCIceCandidate {
        return RTCIceCandidate(sdp: candidate, sdpMLineIndex: sdpMLineIndex, sdpMid: sdpMid)
    }
}

Video problems at ios13

Hi!
After migrating my app to iOS 13, my video calls just disappear. Sound is OK, but localView and remoteView are just white. On iOS 12 everything is fine. It looks like this problem: #27
What did you change for the demo?

Changing volume for different participants

Thank you so much for putting this together. The demo runs flawlessly and was easy to set up. I've been looking through the code for the past few hours and can't seem to figure this out: is it possible to change the audio level or volume for different participants in the call? By this I don't mean muting them completely, but rather scaling the audio to 0.2 or 0.6 instead of 1. Thanks!
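One lever worth checking is the `volume` property on `RTCAudioSource`, which GoogleWebRTC documents as a gain in the range [0, 10]. A sketch, assuming unified-plan transceivers and that the remote audio tracks are reachable through the peer connection (the helper name is hypothetical):

```swift
// Hypothetical helper on WebRTCClient: scale the remote participant's audio.
// `gain` uses GoogleWebRTC's documented range [0, 10]; 1.0 leaves it unchanged.
func setRemoteAudioVolume(_ gain: Double) {
    peerConnection.transceivers
        .compactMap { $0.receiver.track as? RTCAudioTrack }
        .forEach { $0.source.volume = gain }
}
```

With that in place, `setRemoteAudioVolume(0.2)` would correspond to the "x 0.2" case asked about here.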

Video streams not showing up

Hi,

I'm trying to implement a video calling feature as a prototype. I'm using your project as the base. In my app, there is only one view controller. So both local and remote video views are on the same view controller where connection establishing stuff happens too.

import UIKit
import AVFoundation
import WebRTC
import Toaster

class VideoCallViewController: UIViewController {
    @IBOutlet weak private var doctorNameLabel: UILabel!
    @IBOutlet weak private var callDurationLabel: UILabel!
    @IBOutlet weak private var localVideoView: UIView!
    @IBOutlet weak private var remoteVideoView: UIView!
    @IBOutlet weak private var webRTCStatusLabel: UILabel!
    @IBOutlet weak private var acceptButton: UIButton!
    @IBOutlet weak private var callButton: UIButton!
    @IBOutlet weak private var declineButton: UIButton!
    @IBOutlet weak private var muteButton: UIButton!
    
    private let config = Config.default
    
    private var webRTCClient: WebRTCClient!
    private var signalClient: SignalingClient!
    
    
    private var signalingConnected: Bool = false {
        didSet {
            DispatchQueue.main.async {
                if self.signalingConnected {
                    self.callButton.isEnabled = true
                    Toast(text: "Connected to Server").show()
                } else {
                    self.callButton.isEnabled = false
                    Toast(text: "Disconnected from Server").show()
                }
            }
        }
    }
    
    private var mute: Bool = false {
        didSet {
            let iconName = mute ? "mute-off" : "mute-on"
            let iconImage = UIImage(named: iconName)
            muteButton.setImage(iconImage, for: .normal)
        }
    }
    
    
    override func viewDidLoad() {
        super.viewDidLoad()
        NotificationCenter.default.addObserver(self, selector: #selector(VideoCallViewController.toggleSpeakers(_:)), name: AVAudioSession.routeChangeNotification, object: nil)
        
        webRTCClient = WebRTCClient(iceServers: config.webRTCIceServers)
        webRTCClient.delegate = self
        signalClient = self.buildSignalingClient()
        signalClient.delegate = self
        
        doctorNameLabel.isHidden = true
        callDurationLabel.isHidden = true
        localVideoView.isHidden = true
        webRTCStatusLabel.isHidden = true
        callButton.isEnabled = false
        acceptButton.isHidden = true
        declineButton.isHidden = true
        muteButton.isHidden = true
        
        mute = false
        
        signalClient.connect()
    }
    
    deinit {
        NotificationCenter.default.removeObserver(self)
    }
    
    private func buildSignalingClient() -> SignalingClient {
        // iOS 13 has native websocket support. For iOS 12 or lower we will use 3rd party library.
        let webSocketProvider: WebSocketProvider
        if #available(iOS 13.0, *) {
            webSocketProvider = NativeWebSocket(url: config.signalingServerUrl)
        } else {
            webSocketProvider = StarscreamWebSocket(url: config.signalingServerUrl)
        }
        return SignalingClient(webSocket: webSocketProvider)
    }
    
    @objc private func toggleSpeakers(_ notification: Notification) {
        guard
            let userInfo = notification.userInfo,
            let reasonKey = userInfo[AVAudioSessionRouteChangeReasonKey] as? UInt,
            let reason = AVAudioSession.RouteChangeReason(rawValue: reasonKey) else {
            return
        }
        
        switch reason {
        case .newDeviceAvailable:
            let session = AVAudioSession.sharedInstance()
            for output in session.currentRoute.outputs where output.portType == AVAudioSession.Port.headphones {
                // Headphones plugged in
                webRTCClient.speakerOff()
                break
            }
        case .oldDeviceUnavailable:
            if let previousRoute = userInfo[AVAudioSessionRouteChangePreviousRouteKey] as? AVAudioSessionRouteDescription {
                for output in previousRoute.outputs where output.portType == AVAudioSession.Port.headphones {
                    // Headphones pulled out
                    webRTCClient.speakerOn()
                    break
                }
            }
        default:
            ()
        }
    }
    
    private func captureVideo() {
        #if arch(arm64)
            // Using metal (arm64 only)
            let localRenderer = RTCMTLVideoView(frame: localVideoView?.frame ?? CGRect.zero)
            let remoteRenderer = RTCMTLVideoView(frame: remoteVideoView.frame)
            localRenderer.videoContentMode = .scaleAspectFill
            remoteRenderer.videoContentMode = .scaleAspectFill
        #else
            // Using OpenGLES for the rest
            let localRenderer = RTCEAGLVideoView(frame: localVideoView?.frame ?? CGRect.zero)
            let remoteRenderer = RTCEAGLVideoView(frame: remoteVideoView.frame)
        #endif

        webRTCClient.startCaptureLocalVideo(cameraPositon: .front, renderer: localRenderer)
        webRTCClient.renderRemoteVideo(to: remoteRenderer)
    }
    
    // MARK: - Actions
    @IBAction func didTapAcceptButton(_ sender: UIButton) {
        webRTCClient.answer { localSdp in
            self.signalClient.send(sdp: localSdp)
        }
    }
    
    @IBAction func didTapCallButton(_ sender: UIButton) {
        webRTCClient.offer { sdp in
            self.signalClient.send(sdp: sdp)
        }
    }
    
    @IBAction func didTapDeclineButton(_ sender: UIButton) {
        signalClient.disconnect()
        callButton.isHidden = false
    }
    
    @IBAction func didTapMuteButton(_ sender: UIButton) {
        mute = !mute
        if mute {
            webRTCClient.muteAudio()
        } else {
            webRTCClient.unmuteAudio()
        }
    }
}

extension VideoCallViewController: SignalClientDelegate {
    func signalClientDidConnect(_ signalClient: SignalingClient) {
        signalingConnected = true
    }
    
    func signalClientDidDisconnect(_ signalClient: SignalingClient) {
        signalingConnected = false
    }
    
    func signalClient(_ signalClient: SignalingClient, didReceiveRemoteSdp sdp: RTCSessionDescription) {
        webRTCClient.set(remoteSdp: sdp) { error in }
    }
    
    func signalClient(_ signalClient: SignalingClient, didReceiveCandidate candidate: RTCIceCandidate) {
        webRTCClient.set(remoteCandidate: candidate)
    }
}

extension VideoCallViewController: WebRTCClientDelegate {
    func webRTCClient(_ client: WebRTCClient, didDiscoverLocalCandidate candidate: RTCIceCandidate) {
        self.signalClient.send(candidate: candidate)
    }
    
    func webRTCClient(_ client: WebRTCClient, didChangeSignalingState state: RTCSignalingState) {
        switch state {
        case .haveLocalOffer:
            print("SENDING OFFER")
        case .haveRemoteOffer:
            print("RECEIVED REMOTE OFFER")
            DispatchQueue.main.async {
                self.callButton.isHidden = true
                self.acceptButton.isHidden = false
                self.declineButton.isHidden = false
            }
        default:
            print("-------------")
        }
    }
    
    func webRTCClient(_ client: WebRTCClient, didChangeConnectionState state: RTCIceConnectionState) {
        let textColor: UIColor
        switch state {
        case .connected, .completed:
            textColor = .green
            DispatchQueue.main.async {
                self.localVideoView.isHidden = false
                self.callButton.isHidden = true
                self.acceptButton.isHidden = true
                self.declineButton.isHidden = true
                self.muteButton.isHidden = false
                
                self.captureVideo()
            }
        case .disconnected:
            textColor = .orange
        case .failed, .closed:
            textColor = .red
        case .new, .checking, .count:
            textColor = .black
        @unknown default:
            textColor = .black
        }
        DispatchQueue.main.async {
            self.webRTCStatusLabel.isHidden = false
            self.webRTCStatusLabel.text = state.description.capitalized
            self.webRTCStatusLabel.textColor = textColor
        }
    }
    
    func webRTCClient(_ client: WebRTCClient, didReceiveData data: Data) {
        DispatchQueue.main.async {
            let message = String(data: data, encoding: .utf8) ?? "(Binary: \(data.count) bytes)"
            let alert = UIAlertController(title: "Message from WebRTC", message: message, preferredStyle: .alert)
            alert.addAction(UIAlertAction(title: "OK", style: .cancel, handler: nil))
            self.present(alert, animated: true, completion: nil)
        }
    }
}

I moved the code from the original VideoViewController in your project into a method called captureVideo() in my view controller, and I call that method once the peer connection is successfully established.

However, neither the local nor the remote video stream shows up in its respective view. I can't figure out what I'm missing.

I attached my demo project here in case it's helpful. I also hosted your NodeJS server on Heroku, so there is no need to run it manually.

DoctorAppPrototype.zip

Speaker does not work

Hi again,

I came across an issue with the speaker. Even though I have turned it on from the app (the device is not muted and the volume is turned all the way up), sound still does not come from the device's loudspeaker. It comes from the earpiece instead, so I have to listen to it as if taking a normal phone call.

I tested this on an iPhone 6 and an iPhone 11 Pro; it only occurs on the iPhone 11 Pro. Any idea why this is happening?
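For anyone debugging this: routing to the loudspeaker usually comes down to AVAudioSession's output override. Below is a plain-AVAudioSession sketch of what a speakerOn implementation typically does; note that GoogleWebRTC manages the session through RTCAudioSession, so in the demo the equivalent change may need to be wrapped in lockForConfiguration()/unlockForConfiguration(), and the category options shown are assumptions rather than what WebRTCClient actually uses:

```swift
import AVFoundation

// Minimal sketch: force audio output to the built-in loudspeaker.
func routeToSpeaker() {
    let session = AVAudioSession.sharedInstance()
    do {
        try session.setCategory(.playAndRecord,
                                options: [.defaultToSpeaker, .allowBluetooth])
        try session.overrideOutputAudioPort(.speaker)
        try session.setActive(true)
    } catch {
        debugPrint("Warning: could not route audio to speaker: \(error)")
    }
}
```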

Use of undeclared...Solved

This is more of a comment for anyone who ran into the "Use of undeclared..." issue that was previously closed (#9). I was still getting that message even after the bridging header was set properly. Also, right-clicking and "jump to definition" always returned a "?" for any method inside the WebRTC framework.

I solved it by renaming the project in Xcode from "WebRTC" to "WebRTC-add-your-own-value".

I suspect that references inside the WebRTC framework could not be linked properly because the project name and the framework name were the same, creating ambiguity. Anyway, this is just a guess; now everything works correctly and I am able to jump to definitions inside the framework.

@stasel once you figure out the exact reason why this was happening, I would suggest a feature request to rename this project to something other than WebRTC.

GoogleWebRTC library conflict on iOS 12 to 13.3

GoogleWebRTC
ios 12.2
objc[287]: Class RTCCVPixelBuffer is implemented in both /System/Library/PrivateFrameworks/WebCore.framework/Frameworks/libwebrtc.dylib (0x1c29eaf98) and /private/var/containers/Bundle/Application/3057B118-2044-4852-8672-13C9DF6DEC00/WebRTC-Demo.app/Frameworks/WebRTC.framework/WebRTC (0x101604a10). One of the two will be used. Which one is undefined.
objc[287]: Class RTCWrappedNativeVideoDecoder is implemented in both /System/Library/PrivateFrameworks/WebCore.framework/Frameworks/libwebrtc.dylib (0x1c0863458) and /private/var/containers/Bundle/Application/3057B118-2044-4852-8672-13C9DF6DEC00/WebRTC-Demo.app/Frameworks/WebRTC.framework/WebRTC (0x101604a60). One of the two will be used. Which one is undefined.
objc[287]: Class RTCWrappedNativeVideoEncoder is implemented in both /System/Library/PrivateFrameworks/WebCore.framework/Frameworks/libwebrtc.dylib (0x1c08634a8) and /private/var/containers/Bundle/Application/3057B118-2044-4852-8672-13C9DF6DEC00/WebRTC-Demo.app/Frameworks/WebRTC.framework/WebRTC (0x101604ab0). One of the two will be used. Which one is undefined.
objc[287]: Class RTCVideoDecoderVP8 is implemented in both /System/Library/PrivateFrameworks/WebCore.framework/Frameworks/libwebrtc.dylib (0x1c08631b0) and /private/var/containers/Bundle/Application/3057B118-2044-4852-8672-13C9DF6DEC00/WebRTC-Demo.app/Frameworks/WebRTC.framework/WebRTC (0x101604b28). One of the two will be used. Which one is undefined.

Signaling server

Can you please tell me which signaling server is being used in your demo? For example: SIP or XMPP?

How to use this code for calling many times

Hello,
thanks for sharing this code!

I have checked everything and can't find a way to call multiple times.
It looks like the list of ICE candidates is sent only once.

Do I need to re-create webRTCClient for every new call?

The file couldn’t be opened.

When trying to open the demo project, the error "The file couldn't be opened." pops up. The only project that opens is the Swift signaling server. Any help? Or is this just an issue with the recent commit?

Group audio or video calling supported ?

Hello ,

I tried to connect three devices with this code. The connection is successful, but while offering a call I can't answer on two devices.
I have to end the call on one device to send an answer from the other device.

Is group calling possible with this code?
Thanks.

Remote SDP is not received

Hello, I downloaded your project and installed it on my devices. I also set up my signaling server; it runs on a remote NodeJS server, the SSL certificates are OK, and I have a 'wss' connection (I changed the default signaling server in Config.swift to wss://xxxxxx.com:8484). Everything looks OK and the device statuses are connected, but when I click the 'Send offer' button, the 'Local sdp' count goes up on that device and I can see the offer and SDP data broadcast in my server's terminal, yet nothing happens on the other device. I tried from both devices, but the result is the same. Can you help me solve this problem?

Thank you for your works.

Cannot create connection with different network

Hi stasel.
Thanks for the great example.
It worked successfully when both devices were on the same WiFi.
But on different networks I get RTCIceConnectionStateFailed.
I tried replacing the Google STUN server with my own TURN server, but I still get the same error.
I hope to receive some help.
Thanks

WebRTC with Wowza Signaling server.

Hello,

I'm trying to use this code with a Wowza signaling server URL, but I'm not able to get the remote SDP. Can you please help me with that?

Thanks

pod install does not respond

Analyzing dependencies
Cloning spec repo cocoapods from https://github.com/CocoaPods/Specs.git
.....

It freezes at this step.
Please help me, many thanks.

Identifying online users in WebRTC

Hi 👋🏼 Thanks for putting together this project. It's really helpful.

I have a question. Is there a way to do something like getting a list of users connected to the server? I want to implement a feature like showing a list of online users.

Is there a way to do that through websockets itself?

Connections with LTE are not working

It looks like connections over LTE are not working. I have tested:
LTE -> WIFI
WIFI -> LTE
LTE -> LTE

None of them work. The connection is made immediately when I switch from WiFi to LTE.

This is my config regarding the TURN service credentials.

func createPeerConnection() {
    let config = RTCConfiguration()

    config.iceServers = [
        RTCIceServer(urlStrings: [
            "turn:eu-turn4.xirsys.com:80?transport=udp",
            "turn:eu-turn4.xirsys.com:3478?transport=udp",
            "turn:eu-turn4.xirsys.com:80?transport=tcp",
            "turn:eu-turn4.xirsys.com:3478?transport=tcp",
            "turns:eu-turn4.xirsys.com:443?transport=tcp",
            "turns:eu-turn4.xirsys.com:5349?transport=tcp"
        ], username: "my username", credential: "my password"),
        RTCIceServer(urlStrings: ["stun:eu-turn4.xirsys.com"])
    ]

    // Unified plan is preferred over plan B
    config.sdpSemantics = .unifiedPlan

    // gatherContinually lets WebRTC listen for network changes and send any new candidates to the other client
    config.continualGatheringPolicy = .gatherContinually

    let constraints = RTCMediaConstraints(
        mandatoryConstraints: nil,
        optionalConstraints: ["DtlsSrtpKeyAgreement": kRTCMediaConstraintsValueTrue])
    self.peerConnection = WebRTCClient.factory.peerConnection(with: config, constraints: constraints, delegate: self)

    self.createMediaSenders()
    self.configureAudioSession()
}

Also, it seems that it generates candidates where sdpMLineIndex and sdpMid always have the value 0. I don't know if that is related?
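A quick way to check whether the TURN credentials actually relay traffic is to force relay-only candidates; if the call still connects, the TURN server is working. `iceTransportPolicy` is a real RTCConfiguration property; the snippet below assumes it is added inside the createPeerConnection() shown above:

```swift
// Debug aid: only allow relayed (TURN) candidates, so the call can succeed
// only if the TURN server is reachable and the credentials are valid.
config.iceTransportPolicy = .relay
```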

Consumes more battery on iPhone

I am using the WebRTC framework for a video-call feature. It works properly, but the battery drains very fast on iPhone:
in a 9-minute call it drains around 30% of the battery.

So, any idea what the issue is here?

App crashes on iOS 13, Xcode 11.4

dyld: Library not loaded: @rpath/WebRTC.framework/WebRTC
  Referenced from: /private/var/containers/Bundle/Application/5E137696-4DBD-4B6B-A460-0B4900939006/WebRTC-Demo.app/WebRTC-Demo
  Reason: no suitable image found.  Did find:
	/private/var/containers/Bundle/Application/5E137696-4DBD-4B6B-A460-0B4900939006/WebRTC-Demo.app/Frameworks/WebRTC.framework/WebRTC: code signature invalid for '/private/var/containers/Bundle/Application/5E137696-4DBD-4B6B-A460-0B4900939006/WebRTC-Demo.app/Frameworks/WebRTC.framework/WebRTC'

	/private/var/containers/Bundle/Application/5E137696-4DBD-4B6B-A460-0B4900939006/WebRTC-Demo.app/Frameworks/WebRTC.framework/WebRTC: code signature invalid for '/private/var/containers/Bundle/Application/5E137696-4DBD-4B6B-A460-0B4900939006/WebRTC-Demo.app/Frameworks/WebRTC.framework/WebRTC'

	/private/var/containers/Bundle/Application/5E137696-4DBD-4B6B-A460-0B4900939006/WebRTC-Demo.app/Frameworks/WebRTC.framework/WebRTC: stat() failed with errno=1
	/private/var/containers/Bundle/Application/5E137696-4DBD-4B6B-A460-0B4900939006/WebRTC-Demo.app/Frameworks/WebRTC.framework/WebRTC: code signature invalid for '/private/var/containers/Bundle/Application/5E137696-4DBD-4B6B-A460-0B4900939006/WebRTC-Demo.app/Frameworks/WebRTC.framework/WebRTC'

	/private/var/containers/Bundle/Application/5E137696-4DBD-4B6B-A460-0B4900939006/WebRTC-Demo.app/Frameworks/WebRTC.framework/WebRTC: stat() failed with errno=1
(lldb) 

Type 'SignalClient' does not conform to protocol 'WebSocketDelegate'

Hi! I downloaded your project and adapted it for my server. When I copy/paste the files WebRTCClient, WebRTC-Bridging-Header, SignalClient, RTCStates, RTCSessionDescription+JSON, and RTCIceCandidate+JSON into my project, I get a problem in the SignalClient file:
"Type 'SignalClient' does not conform to protocol 'WebSocketDelegate'"
http://joxi.ru/BA0Leg0HJw1zXA
Xcode only offers to add
func websocketDidReceiveData(socket: WebSocketClient, data: Data)
but all protocol methods are implemented.
Can you help me, please?
