Objective-C: How to create a video from frames on the iPhone


I have done some research and have succeeded in grabbing frames from a video file played in MPMoviePlayerController.

I grab all of the frames with this code and save all of the images into an array:

    // arrImages is assumed to be an NSMutableArray instance variable;
    // one thumbnail is taken per second of playback.
    for (int i = 1; i <= moviePlayerController.duration; i++)
    {
        UIImage *img = [moviePlayerController thumbnailImageAtTime:i
                                                        timeOption:MPMovieTimeOptionNearestKeyFrame];
        if (img) // thumbnailImageAtTime: can return nil; don't insert nil into the array
            [arrImages addObject:img];
    }
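
Note that MPMoviePlayerController and thumbnailImageAtTime:timeOption: have long been deprecated. As a rough modern equivalent, here is a minimal sketch of the same per-second frame grab using AVFoundation's AVAssetImageGenerator; videoURL is an assumed NSURL pointing at the movie file:

    // Sketch only: grab one frame per second with AVAssetImageGenerator.
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:videoURL options:nil];
    AVAssetImageGenerator *generator = [AVAssetImageGenerator assetImageGeneratorWithAsset:asset];
    generator.appliesPreferredTrackTransform = YES; // respect the track's rotation

    NSMutableArray *arrImages = [NSMutableArray array];
    for (int i = 1; i <= (int)CMTimeGetSeconds(asset.duration); i++)
    {
        NSError *error = nil;
        CMTime actualTime;
        CGImageRef cgImage = [generator copyCGImageAtTime:CMTimeMake(i, 1)
                                               actualTime:&actualTime
                                                    error:&error];
        if (cgImage)
        {
            [arrImages addObject:[UIImage imageWithCGImage:cgImage]];
            CGImageRelease(cgImage);
        }
    }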

I am new to this topic, so please help me solve this problem.

You can refer to the following links; hopefully they will be of some help:


  • Did you get it working with this solution? If possible, could you post your solution?
  • Why convert back into a video at all? Just use the images…
  • Hi, is there any updated Swift version? All of these links seem quite old. We are trying to merge images and videos into one master video, where each image must appear in the master video as its own separate "clip". For example, given one image and one video: in the master video the image is shown for 3 seconds, and then the video plays. Any suggestions? Thanks. (See the AVMutableComposition sketch at the end of this post.)
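
The answer below stitches the captured frames back into a movie with AVAssetWriter. It assumes arrImages (the array filled in above) is an instance variable, and that the AVFoundation and CoreVideo frameworks are imported:
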
    // kRecordingFPS was undefined in the original; 30 fps is an assumed value.
    static const int32_t kRecordingFPS = 30;

    - (void) writeImagesAsMovie:(NSString *)path
    {
        NSError *error = nil;
        UIImage *first = [arrImages objectAtIndex:0];
        CGSize frameSize = first.size;
        AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:
                                      [NSURL fileURLWithPath:path] fileType:AVFileTypeQuickTimeMovie
                                                                  error:&error];
        NSParameterAssert(videoWriter);

        // Encode at the size of the source frames instead of a hard-coded 640x480,
        // so the pixel buffers created below match the writer's output settings.
        NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                       AVVideoCodecH264, AVVideoCodecKey,
                                       [NSNumber numberWithInt:(int)frameSize.width], AVVideoWidthKey,
                                       [NSNumber numberWithInt:(int)frameSize.height], AVVideoHeightKey,
                                       nil];
        // ARC assumed; the original's explicit -retain has been removed.
        AVAssetWriterInput *writerInput = [AVAssetWriterInput
                                           assetWriterInputWithMediaType:AVMediaTypeVideo
                                           outputSettings:videoSettings];

        AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor
                                                         assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
                                                         sourcePixelBufferAttributes:nil];

        NSParameterAssert(writerInput);
        NSParameterAssert([videoWriter canAddInput:writerInput]);
        [videoWriter addInput:writerInput];

        [videoWriter startWriting];
        [videoWriter startSessionAtSourceTime:kCMTimeZero];

        int frameCount = 0;
        CVPixelBufferRef buffer = NULL;
        for (UIImage *img in arrImages)
        {
            buffer = [self newPixelBufferFromCGImage:[img CGImage] andFrameSize:frameSize];

            // Block until the input can take more data; otherwise frames are silently dropped.
            while (!adaptor.assetWriterInput.readyForMoreMediaData)
                [NSThread sleepForTimeInterval:0.05];

            // Frame N is presented at N / kRecordingFPS seconds.
            CMTime frameTime = CMTimeMake(frameCount, kRecordingFPS);
            [adaptor appendPixelBuffer:buffer withPresentationTime:frameTime];

            if (buffer)
                CVBufferRelease(buffer);
            frameCount++;
        }

        [writerInput markAsFinished];
        [videoWriter finishWriting]; // deprecated; prefer finishWritingWithCompletionHandler: on iOS 6+
    }
    
    
    - (CVPixelBufferRef) newPixelBufferFromCGImage:(CGImageRef)image andFrameSize:(CGSize)frameSize
    {
        NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                                 [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
                                 [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
                                 nil];
        CVPixelBufferRef pxbuffer = NULL;
        CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, frameSize.width,
                                              frameSize.height, kCVPixelFormatType_32ARGB,
                                              (__bridge CFDictionaryRef) options,
                                              &pxbuffer);
        NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);

        CVPixelBufferLockBaseAddress(pxbuffer, 0);
        void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
        NSParameterAssert(pxdata != NULL);

        // Use the buffer's own bytes-per-row: Core Video may pad each row,
        // so 4 * width is not always correct.
        CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
        CGContextRef context = CGBitmapContextCreate(pxdata, frameSize.width,
                                                     frameSize.height, 8,
                                                     CVPixelBufferGetBytesPerRow(pxbuffer),
                                                     rgbColorSpace,
                                                     kCGImageAlphaNoneSkipFirst);
        NSParameterAssert(context);
        CGContextDrawImage(context, CGRectMake(0, 0, CGImageGetWidth(image),
                                               CGImageGetHeight(image)), image);
        CGColorSpaceRelease(rgbColorSpace);
        CGContextRelease(context);

        CVPixelBufferUnlockBaseAddress(pxbuffer, 0);

        // Caller owns the returned buffer and must CVBufferRelease() it.
        return pxbuffer;
    }
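
Regarding the comment above about merging an image and a video into one master video: one approach is to render the image into a short clip with writeImagesAsMovie: (appending the same frame for 3 seconds' worth of frames), then concatenate the two files with AVMutableComposition and export the result. A minimal sketch, still in Objective-C; the method name and URL parameters are assumptions, not an established API:

    // Sketch: play a pre-rendered 3-second image clip, then the video, as one file.
    - (void) mergeClipAtURL:(NSURL *)imageClipURL withVideoAtURL:(NSURL *)videoURL toURL:(NSURL *)outputURL
    {
        AVMutableComposition *composition = [AVMutableComposition composition];
        AVURLAsset *imageClip = [AVURLAsset URLAssetWithURL:imageClipURL options:nil];
        AVURLAsset *video = [AVURLAsset URLAssetWithURL:videoURL options:nil];
        NSError *error = nil;

        // Image clip first, then the video appended at the composition's current end.
        [composition insertTimeRange:CMTimeRangeMake(kCMTimeZero, imageClip.duration)
                             ofAsset:imageClip atTime:kCMTimeZero error:&error];
        [composition insertTimeRange:CMTimeRangeMake(kCMTimeZero, video.duration)
                             ofAsset:video atTime:composition.duration error:&error];

        AVAssetExportSession *export = [[AVAssetExportSession alloc]
                                        initWithAsset:composition
                                        presetName:AVAssetExportPresetHighestQuality];
        export.outputURL = outputURL;
        export.outputFileType = AVFileTypeQuickTimeMovie;
        [export exportAsynchronouslyWithCompletionHandler:^{
            if (export.status != AVAssetExportSessionStatusCompleted)
                NSLog(@"Merge failed: %@", export.error);
        }];
    }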