How to display the local stream in a video view with the latest WebRTC framework (&lt;Anakros/WebRTC&gt;)? (iOS)

Tags: ios, objective-c, webrtc, video-capture, apprtc



After updating to the latest WebRTC framework, I don't know how to display the local stream to the user: the methods have changed, and there is no example in the repository's "iOS" folder.

In the old code:

   RTCVideoCapturer *capturer = [RTCVideoCapturer capturerWithDeviceName:cameraID];
   RTCMediaConstraints *mediaConstraints = [self defaultMediaStreamConstraints];
   RTCVideoSource *videoSource = [_factory videoSourceWithCapturer:capturer constraints:mediaConstraints];
   localVideoTrack = [_factory videoTrackWithID:@"ARDAMSv0" source:videoSource];
The RTCVideoCapturer object and the RTCVideoSource object are linked to each other here.

But in the new code:

  RTCVideoSource *source = [_factory videoSource];
  RTCCameraVideoCapturer *capturer = [[RTCCameraVideoCapturer alloc] initWithDelegate:source];
  [_delegate appClient:self didCreateLocalCapturer:capturer];
    localVideoTrack = [_factory videoTrackWithSource:source
                                             trackId:kARDVideoTrackId];
they are not linked to each other. So what does the delegate call

    [_delegate appClient:self didCreateLocalCapturer:capturer];

do? I don't understand it. [Help needed!]

Implement this delegate method in your video call view controller:

- (void)appClient:(ARDAppClient *)client didCreateLocalCapturer:(RTCCameraVideoCapturer *)localCapturer {
    NSLog(@"%s %@", __PRETTY_FUNCTION__, localCapturer);

    _captureController = [[ARDCaptureController alloc] initWithCapturer:localCapturer
                                                               settings:[[ARDSettingsModel alloc] init]];
    [_captureController startCapture];
}
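ARDCaptureController is an AppRTC sample helper that chooses a camera, format, and frame rate and then starts the capturer. If you are not using the AppRTC sample classes, a minimal sketch of the same step using RTCCameraVideoCapturer's own API looks like this (the device/format/fps choices here are illustrative assumptions, not the sample's exact logic):

```objc
// Sketch of what ARDCaptureController's startCapture roughly does,
// using RTCCameraVideoCapturer's public API directly.
// Device, format, and fps choices are assumptions for illustration.
AVCaptureDevice *device =
    [RTCCameraVideoCapturer captureDevices].firstObject;
AVCaptureDeviceFormat *format =
    [RTCCameraVideoCapturer supportedFormatsForDevice:device].lastObject;
[localCapturer startCaptureWithDevice:device format:format fps:30];
```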
Then... that delegate is invoked from the following method, which creates the local video track:

 - (RTCVideoTrack *)createLocalVideoTrack {
     RTCVideoTrack *localVideoTrack = nil;
     // The iOS simulator doesn't provide any sort of camera capture,
     // so don't bother trying to open a local stream.
 #if !TARGET_IPHONE_SIMULATOR
     if (![_settings currentAudioOnlySettingFromStore]) {
         RTCVideoSource *source = [_factory videoSource];
         RTCCameraVideoCapturer *capturer = [[RTCCameraVideoCapturer alloc] initWithDelegate:source];
         [_delegate appClient:self didCreateLocalCapturer:capturer];
         localVideoTrack = [_factory videoTrackWithSource:source
                                                  trackId:kARDVideoTrackId];

         [_delegate appClient:self didReceiveLocalVideoTrack:localVideoTrack];
     }
 #endif
     return localVideoTrack;
 }
Then call

_localVideoTrack = [self createLocalVideoTrack]; 
in your init method:

- (void)initCall {
    NSLog(@"%s",__PRETTY_FUNCTION__);
    if (!_isTurnComplete) {
        return;
    }
    self.state = kARDAppClientStateConnected;
    _localVideoTrack = [self createLocalVideoTrack];
    // Create peer connection.
    _constraints = [self defaultPeerConnectionConstraints];

}
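createLocalVideoTrack only creates the track; to send it to the remote peer it still has to be attached to the peer connection. A hedged sketch using the stream-based API of that WebRTC revision (kARDMediaStreamId is an AppRTC sample constant; substitute your own stream id if you are not using the sample):

```objc
// Sketch: attach the local video track to the peer connection.
// kARDMediaStreamId ("ARDAMS" in the AppRTC sample) is an assumption here;
// _peerConnection is assumed to have been created already.
RTCMediaStream *localStream =
    [_factory mediaStreamWithStreamId:kARDMediaStreamId];
if (_localVideoTrack) {
    [localStream addVideoTrack:_localVideoTrack];
}
[_peerConnection addStream:localStream];
```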

This code made it work for me.
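To actually display the track on screen (the issue raised in the comments below), attach a renderer to it. A minimal sketch, assuming `_localVideoView` is an RTCEAGLVideoView you created in your view controller and added to the view hierarchy:

```objc
// Sketch: render the local track into an RTCEAGLVideoView.
// RTCEAGLVideoView conforms to RTCVideoRenderer, so it can be added
// directly to the track. _localVideoView is an assumed property,
// created elsewhere (e.g. in viewDidLoad).
_localVideoView.delegate = self;  // RTCVideoViewDelegate, for size changes
[_localVideoTrack addRenderer:_localVideoView];
```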

Comments:

- Hi, did you solve this issue? I'm stuck on the same thing.
- Yes @WorieN, I solved it... I'll post the answer to this question in a while...
- @WorieN posted the answer! Please upvote if you find it helpful.
- Could you share the code that helps render the video track into an RTCEAGLVideoView? I don't know why, but I get no video there. Perhaps you store the RTCEAGLVideoView in some video view that helps render it?
- I subclass my video views from RTCEAGLVideoView and identify each view by the caller's user id, and you also have to assign the delegate to self. I'm not sure exactly what problem you are facing... I've told you as much as I could. @WorieN