iPhone: How to set up AVCaptureVideoDataOutput in a library

Tags: iphone, objective-c, augmented-reality, unity3d

I am trying to create a library for the iPhone, so I am trying to initialize the camera with a single call. The problem comes when I refer to "self" in this statement:

"[captureOutput setSampleBufferDelegate:self queue:queue];"

The compiler says: "self was not declared in this scope". What do I need to do to set this same class as the "AVCaptureVideoDataOutputSampleBufferDelegate"? At least point me in the right direction :P

Thank you!

Here is the complete function:

bool VideoCamera_Init(){


    //Initialize capture from the camera and show the camera preview


    /*We set up the input*/
    AVCaptureDeviceInput *captureInput = [AVCaptureDeviceInput 
                                          deviceInputWithDevice:[AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo] 
                                          error:nil];
    /*We set up the output*/
    AVCaptureVideoDataOutput *captureOutput = [[AVCaptureVideoDataOutput alloc] init];
    /*While a frame is being processed in the -captureOutput:didOutputSampleBuffer:fromConnection: delegate method, no other frames are added to the queue.
     If you don't want this behaviour, set the property to NO */
    captureOutput.alwaysDiscardsLateVideoFrames = YES; 
    /*We specify a minimum duration for each frame (play with this setting to avoid having too many frames waiting
     in the queue, because that can cause memory issues). It is the inverse of the maximum framerate.
     Here we set a minimum frame duration of 1/20 second, so a maximum framerate of 20 fps. We say that
     we are not able to process more than 20 frames per second.*/
    captureOutput.minFrameDuration = CMTimeMake(1, 20);

    /*We create a serial queue to handle the processing of our frames*/
    dispatch_queue_t queue;
    queue = dispatch_queue_create("cameraQueue", NULL);
    variableconnombrealeatorio= [[VideoCameraThread alloc] init];
    [captureOutput setSampleBufferDelegate:self queue:queue];


    dispatch_release(queue);
    // Set the video output to store frame in BGRA (It is supposed to be faster)
    NSString* key = (NSString*)kCVPixelBufferPixelFormatTypeKey; 
    NSNumber* value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA]; 
    NSDictionary* videoSettings = [NSDictionary dictionaryWithObject:value forKey:key]; 
    [captureOutput setVideoSettings:videoSettings]; 
    /*And we create a capture session*/
    AVCaptureSession * captureSession = [[AVCaptureSession alloc] init];
    captureSession.sessionPreset= AVCaptureSessionPresetMedium;
    /*We add input and output*/
    [captureSession addInput:captureInput];
    [captureSession addOutput:captureOutput];
    /*We start the capture*/
    [captureSession startRunning];


    return TRUE;
}
I also added the following class, but the buffer is empty:

"

#导入“VideoCameraThread.h”

CMSampleBufferRef bufferCamara

@VideoCameraThread的实现

  • (无效)captureOutput:(AVCaptureOutput*)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection*)连接 { bufferCamera=采样缓冲区

    } "


    • You are writing a C function, which has no concept of Objective-C classes, objects, or the
      self
      identifier. You need to modify your function to take a parameter that accepts the sampleBufferDelegate to use:

      bool VideoCamera_Init(id<AVCaptureVideoDataOutputSampleBufferDelegate> sampleBufferDelegate) {
          ...
          [captureOutput setSampleBufferDelegate:sampleBufferDelegate queue:queue];
          ...
      }
      
      Alternatively, you could write your library with an Objective-C style interface instead of a C style interface.


      There are also memory-management problems in this function. For example, you allocate an AVCaptureSession and assign it to a local variable. After this function returns, you have no way to retrieve that AVCaptureSession in order to release it.
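      To make the suggested signature concrete, the caller side might look like the sketch below (hypothetical usage; it assumes VideoCameraThread declares conformance to AVCaptureVideoDataOutputSampleBufferDelegate in its header, and that the session is parked in a static variable so it can be stopped and released later):

      /*Keep the session reachable after VideoCamera_Init returns,
       instead of letting the local variable go out of scope*/
      static AVCaptureSession *gCaptureSession = nil;

      bool VideoCamera_Init(id<AVCaptureVideoDataOutputSampleBufferDelegate> sampleBufferDelegate) {
          ...
          [captureOutput setSampleBufferDelegate:sampleBufferDelegate queue:queue];
          ...
          gCaptureSession = captureSession; /*so VideoCamera_Stop can release it*/
          return TRUE;
      }

      /*Caller, e.g. from the library's entry point:*/
      VideoCameraThread *delegate = [[VideoCameraThread alloc] init];
      VideoCamera_Init(delegate);

      A matching VideoCamera_Stop() could then call [gCaptureSession stopRunning] and release both the session and the delegate.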
