iPhone iOS 5 AVFoundation image to video


I am trying to create a video from a single image and save it to my photo library. I have googled for ages and cannot find a solution.

I have the following code:

    @autoreleasepool {
        NSString *path = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/movie2.mp4"];

        UIImage *img = [UIImage imageWithData:[[[self imageDataArrya] objectAtIndex:0] imageData]];
        [self writeImageAsMovie:img toPath:path size:CGSizeMake(640, 960) duration:10];

        UISaveVideoAtPathToSavedPhotosAlbum(path, self, @selector(video:didFinishSavingWithError:contextInfo:), nil);
    }
I call the method above on a background thread. Here is the code for `writeImageAsMovie`:

- (void)writeImageAsMovie:(UIImage *)image toPath:(NSString *)path size:(CGSize)size duration:(int)duration {
    NSError *error = nil;
    AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:[NSURL fileURLWithPath:path]
                                                           fileType:AVFileTypeQuickTimeMovie
                                                              error:&error];

    NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                   AVVideoCodecH264, AVVideoCodecKey,
                                   [NSNumber numberWithInt:size.width], AVVideoWidthKey,
                                   [NSNumber numberWithInt:size.height], AVVideoHeightKey,
                                   nil];
    [self setInput:[AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                      outputSettings:videoSettings]];

    AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor
                                                     assetWriterInputPixelBufferAdaptorWithAssetWriterInput:input
                                                     sourcePixelBufferAttributes:nil];

    [videoWriter addInput:input];

    [videoWriter startWriting];
    [videoWriter startSessionAtSourceTime:kCMTimeZero];

    CVPixelBufferRef buffer = [self pixelBufferFromCGImage:image.CGImage];
    [adaptor appendPixelBuffer:buffer withPresentationTime:kCMTimeZero];
    [adaptor appendPixelBuffer:buffer withPresentationTime:CMTimeMake(duration - 1, 2)];

    [input markAsFinished];
    [videoWriter endSessionAtSourceTime:CMTimeMake(duration, 2)];
    [videoWriter finishWriting];
}

Utility method to convert the image to a CVPixelBufferRef:

- (CVPixelBufferRef)pixelBufferFromCGImage:(CGImageRef)image {
    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
                             nil];
    CVPixelBufferRef pxbuffer = NULL;

    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault,
                                          self.view.frame.size.width,
                                          self.view.frame.size.height,
                                          kCVPixelFormatType_32ARGB,
                                          (__bridge CFDictionaryRef)options,
                                          &pxbuffer);

    CVPixelBufferLockBaseAddress(pxbuffer, 0);
    void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);

    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(pxdata, self.view.frame.size.width,
                                                 self.view.frame.size.height, 8,
                                                 4 * self.view.frame.size.width, rgbColorSpace,
                                                 kCGImageAlphaNoneSkipFirst);
    CGContextConcatCTM(context, CGAffineTransformMakeRotation(0));
    CGContextDrawImage(context, CGRectMake(0, 0, CGImageGetWidth(image),
                                           CGImageGetHeight(image)), image);
    CGColorSpaceRelease(rgbColorSpace);
    CGContextRelease(context);

    CVPixelBufferUnlockBaseAddress(pxbuffer, 0);

    return pxbuffer;
}
Now, if I run the code in the Simulator, it gives me an error saying the data is corrupted.

If I run it on my device, it saves a 2-second video to my photo library, but the video is entirely green and my image is nowhere in it.


Any help would be appreciated :)

I got it working - sorry I didn't see your reply until today. Here is what I used:

Create the temporary file:

    NSFileManager *fileManager = [NSFileManager defaultManager]; // declared here so the snippet is self-contained
    NSString *path = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/flipimator-tempfile.mp4"];

    // Overwrite it if it already exists.
    if ([fileManager fileExistsAtPath:path])
        [fileManager removeItemAtPath:path error:NULL];
Call the exportImages method to save the images to the temporary file:

[self exportImages:frames 
         asVideoToPath:path 
         withFrameSize:imageSize 
       framesPerSecond:fps];

Save the temporary file to the photo album:

    UISaveVideoAtPathToSavedPhotosAlbum(path, self, @selector(video:didFinishSavingWithError:contextInfo:), nil);

- (void)video:(NSString *)videoPath didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo {
    NSLog(@"Finished saving video with error: %@", error);
    UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Done"
                                                    message:@"Movie successfully exported."
                                                   delegate:nil
                                          cancelButtonTitle:@"OK"
                                          otherButtonTitles:nil];
    [alert show];
}

Code for the exportImages method:

- (void)exportImages:(NSArray *)imageArray
       asVideoToPath:(NSString *)path
       withFrameSize:(CGSize)imageSize
     framesPerSecond:(NSUInteger)fps {
    NSLog(@"Start building video from defined frames.");

    NSError *error = nil;

    AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:[NSURL fileURLWithPath:path]
                                                           fileType:AVFileTypeQuickTimeMovie
                                                              error:&error];
    NSParameterAssert(videoWriter);

    NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                   AVVideoCodecH264, AVVideoCodecKey,
                                   [NSNumber numberWithInt:imageSize.width], AVVideoWidthKey,
                                   [NSNumber numberWithInt:imageSize.height], AVVideoHeightKey,
                                   nil];

    AVAssetWriterInput *videoWriterInput = [AVAssetWriterInput
                                            assetWriterInputWithMediaType:AVMediaTypeVideo
                                            outputSettings:videoSettings];

    AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor
                                                     assetWriterInputPixelBufferAdaptorWithAssetWriterInput:videoWriterInput
                                                     sourcePixelBufferAttributes:nil];

    NSParameterAssert(videoWriterInput);
    NSParameterAssert([videoWriter canAddInput:videoWriterInput]);
    videoWriterInput.expectsMediaDataInRealTime = YES;
    [videoWriter addInput:videoWriterInput];

    // Start a session:
    [videoWriter startWriting];
    [videoWriter startSessionAtSourceTime:kCMTimeZero];

    CVPixelBufferRef buffer = NULL;
    int frameCount = 0;

    // Convert each UIImage to a CGImage and append it as a video frame.
    for (UIImage *img in imageArray) {
        buffer = [self pixelBufferFromCGImage:[img CGImage] andSize:imageSize];

        BOOL append_ok = NO;
        int j = 0;
        while (!append_ok && j < 30) {
            if (adaptor.assetWriterInput.readyForMoreMediaData) {
                NSLog(@"Processing video frame %d of %lu.", frameCount, (unsigned long)[imageArray count]);

                CMTime frameTime = CMTimeMake(frameCount, (int32_t)fps);
                append_ok = [adaptor appendPixelBuffer:buffer withPresentationTime:frameTime];
                if (!append_ok) {
                    NSError *error = videoWriter.error;
                    if (error != nil) {
                        NSLog(@"Unresolved error %@, %@.", error, [error userInfo]);
                    }
                }
            } else {
                printf("adaptor not ready %d, %d\n", frameCount, j);
                [NSThread sleepForTimeInterval:0.1];
            }
            j++;
        }
        if (!append_ok) {
            printf("error appending image %d after %d attempts\n", frameCount, j);
        }
        frameCount++;
    }

    // Finish the session:
    [videoWriterInput markAsFinished];
    [videoWriter finishWriting];
    NSLog(@"Write Ended");
}
Parameters of the method:

  • imageArray: an NSArray of UIImage
  • path: the temporary path to write to while processing (the temp file defined above)
  • imageSize: the size of the video in pixels (width and height)
  • fps: how many images should be shown per second in the video
Hope it helps. Sorry about the formatting - I'm still pretty new to StackOverflow.com.


Comments:

  • Since iOS 5 is still under NDA, you aren't allowed to talk about it anywhere except the Apple developer forums. - Oh... :/ Should I delete the question and ask it over there instead?
  • Did you ever find an answer? I'm struggling with something similar.
  • I'm trying your code, but for some reason my images come out slightly rotated (I think). Any suggestions? They look right on the simulator screen but end up rotated in the recorded video.
  • Hi, this way I get a distorted video image. Can you help me?