iOS: AVAssetWriterInputPixelBufferAdaptor appendPixelBuffer: EXC_BAD_ACCESS

Tags: ios, opengl-es-2.0, avfoundation, avassetwriter

For the last few days I've been stuck on a problem that is driving me crazy. It concerns the AVAssetWriterInputPixelBufferAdaptor appendPixelBuffer: method. No matter what I do, that line causes EXC_BAD_ACCESS. I know there are several posts about this problem, but none of them helped me.

Here is my situation: I am rendering into a texture with OpenGL ES, and I want to create a video from those textures. Here is my code:

Creating the FBO, the attached texture, and the pixel buffer to render into:

glGenFramebuffers(1, &_secondFrameBuffer);
glGenRenderbuffers(1, &_depthRenderBuffer);

CVPixelBufferPoolCreatePixelBuffer(NULL, [_videoWriter.pixelBufferInput pixelBufferPool], &_fboTexturePixelBuffer);
CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                             _textureCache,
                                             _fboTexturePixelBuffer,
                                             NULL,
                                             GL_TEXTURE_2D,
                                             GL_RGBA,
                                             (int)size.width,
                                             (int)size.height,
                                             GL_BGRA,
                                             GL_UNSIGNED_BYTE,
                                             0,
                                             &_fboTexture);

glBindTexture(CVOpenGLESTextureGetTarget(_fboTexture), CVOpenGLESTextureGetName(_fboTexture));
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);


glBindRenderbuffer(GL_RENDERBUFFER, _depthRenderBuffer);
glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT16, size.width, size.height);

glBindFramebuffer(GL_FRAMEBUFFER, _secondFrameBuffer);

glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D, CVOpenGLESTextureGetName(_fboTexture), 0);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_RENDERBUFFER, _depthRenderBuffer);

glBindFramebuffer(GL_FRAMEBUFFER, _secondFrameBuffer);
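A hedged aside on this setup, not something the post confirms: the adaptor's pixelBufferPool is NULL until startWriting and startSessionAtSourceTime: have run, so CVPixelBufferPoolCreatePixelBuffer can fail silently and leave _fboTexturePixelBuffer NULL, which later produces exactly this kind of EXC_BAD_ACCESS. A minimal sketch of the same creation steps with their CVReturn codes checked (reusing the names above; _videoWriter.pixelBufferInput is assumed to be the adaptor):

```objectivec
// Sketch: fail loudly at each step instead of ignoring the return codes.
CVPixelBufferPoolRef pool = [_videoWriter.pixelBufferInput pixelBufferPool];
if (pool == NULL) {
    // NULL until the writer session has started.
    NSLog(@"Pixel buffer pool is NULL - start the writer session first");
    return;
}

CVReturn err = CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, pool,
                                                  &_fboTexturePixelBuffer);
if (err != kCVReturnSuccess || _fboTexturePixelBuffer == NULL) {
    NSLog(@"CVPixelBufferPoolCreatePixelBuffer failed: %d", err);
    return;
}

err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                                   _textureCache,
                                                   _fboTexturePixelBuffer,
                                                   NULL,
                                                   GL_TEXTURE_2D,
                                                   GL_RGBA,
                                                   (int)size.width,
                                                   (int)size.height,
                                                   GL_BGRA,
                                                   GL_UNSIGNED_BYTE,
                                                   0,
                                                   &_fboTexture);
if (err != kCVReturnSuccess) {
    NSLog(@"CVOpenGLESTextureCacheCreateTextureFromImage failed: %d", err);
}
```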
AVAssetWriter creation:

    _dataQueue = dispatch_queue_create("data_queue", NULL);
    _assetWriter = [[AVAssetWriter alloc] initWithURL:outputURL fileType:AVFileTypeQuickTimeMovie error:nil];

    int width = [UIScreen mainScreen].applicationFrame.size.width * [UIScreen mainScreen].scale;
    int height = [UIScreen mainScreen].applicationFrame.size.height * [UIScreen mainScreen].scale;

    NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                   AVVideoCodecH264, AVVideoCodecKey,
                                   [NSNumber numberWithInt:width], AVVideoWidthKey,
                                   [NSNumber numberWithInt:height], AVVideoHeightKey,
                                   nil];

    _assetWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoSettings];
    _assetWriterInput.expectsMediaDataInRealTime = YES;

    NSDictionary *bufferAttributes = [NSDictionary dictionaryWithObjectsAndKeys:
                                      [NSNumber numberWithInt:kCVPixelFormatType_32BGRA], kCVPixelBufferPixelFormatTypeKey,
                                      [NSNumber numberWithBool:YES], kCVPixelBufferOpenGLESCompatibilityKey,
                                      nil];

    _assetWriterInputPixelBufferAdaptor = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:_assetWriterInput sourcePixelBufferAttributes:bufferAttributes];


    [_assetWriter addInput:_assetWriterInput];

    _presentationTime = kCMTimeZero;
    [_assetWriter startWriting];
    [_assetWriter startSessionAtSourceTime:_presentationTime];
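One more thing worth checking in this setup (a sketch, not confirmed as the cause): the code above passes error:nil and ignores the result of startWriting, so a writer that failed to start would go unnoticed until the first append crashes or fails. Surfacing those errors looks roughly like this, reusing the names above:

```objectivec
// Sketch: surface writer failures instead of discarding them.
NSError *error = nil;
_assetWriter = [[AVAssetWriter alloc] initWithURL:outputURL
                                         fileType:AVFileTypeQuickTimeMovie
                                            error:&error];
if (_assetWriter == nil) {
    NSLog(@"AVAssetWriter creation failed: %@", error);
    return;
}

// startWriting returns NO and populates assetWriter.error on failure.
if (![_assetWriter startWriting]) {
    NSLog(@"startWriting failed: %@", _assetWriter.error);
    return;
}
[_assetWriter startSessionAtSourceTime:kCMTimeZero];
```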
Then I try to append the pixel buffers:

- (void)appendPixelBuffer:(CVPixelBufferRef)pixelBuffer
{
    dispatch_async(_dataQueue, ^{
        if (pixelBuffer != NULL) {
            [_assetWriterInputPixelBufferAdaptor appendPixelBuffer:pixelBuffer withPresentationTime:_presentationTime];
            CMTime frameTime = CMTimeMake(1, 30);
            _presentationTime = CMTimeAdd(_presentationTime, frameTime);
        } else {
            NSLog(@"NULL PixelBuffer !");
        }
    });
}
Crash -> EXC_BAD_ACCESS
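For reference, appendPixelBuffer:withPresentationTime: returns a BOOL, and the input should report isReadyForMoreMediaData before each append; checking both often turns a crash hunt into a readable error message. A hedged sketch of a guarded append step, reusing the names from the question:

```objectivec
// Sketch: guard the append with readiness and return-value checks.
if (!_assetWriterInput.isReadyForMoreMediaData) {
    NSLog(@"Input not ready - dropping frame");
} else if (![_assetWriterInputPixelBufferAdaptor appendPixelBuffer:pixelBuffer
                                             withPresentationTime:_presentationTime]) {
    NSLog(@"appendPixelBuffer failed: %@", _assetWriter.error);
} else {
    _presentationTime = CMTimeAdd(_presentationTime, CMTimeMake(1, 30));
}
```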

Can anyone help me?

Thanks

I see the problem in your method - (void)appendPixelBuffer:(CVPixelBufferRef)pixelBuffer. If you are using ARC, make sure your pixelBuffer is not released before the async block runs on the queue. To keep the pixelBuffer alive, try this code:

- (void)appendPixelBuffer:(CVPixelBufferRef)pixelBuffer
{
    CVPixelBufferRetain(pixelBuffer);
    dispatch_async(_dataQueue, ^{
        if (pixelBuffer != NULL) {
            [_assetWriterInputPixelBufferAdaptor appendPixelBuffer:pixelBuffer withPresentationTime:_presentationTime];
            CMTime frameTime = CMTimeMake(1, 30);
            _presentationTime = CMTimeAdd(_presentationTime, frameTime);
            CVPixelBufferRelease(pixelBuffer);
        } else {
            NSLog(@"NULL PixelBuffer !");
        }
    });
}
GL_BGRA在哪里定义?