iOS AVAssetReader/AVAssetWriter: joining mp4 files at different resolutions


I am writing an iPad app in which I need to join mp4 files of different resolutions. To do so, I use an AVAssetReader to read the mp4 source files and an AVAssetWriter to write them into a single mp4 output file.

I first tried AVAssetExportSession, but the problem I ran into was that black frames appeared between the joined files.
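Roughly what I had tried, assuming the clips were first stitched into a single composition (a sketch rather than my exact code; composition and outputURL stand in for the joined asset and the destination):

AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPreset640x480];
exportSession.outputURL = outputURL;
exportSession.outputFileType = AVFileTypeMPEG4;
[exportSession exportAsynchronouslyWithCompletionHandler:^{
    if (exportSession.status == AVAssetExportSessionStatusCompleted)
    {
        // The export succeeds, but black frames appear at the joins
        // between clips of different resolutions.
    }
}];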

The problem I am facing now is that everything seems to work, but the AVAssetWriter's completion handler is never called.

Below is my selector. It takes a list of mp4 file URLs, a single output file URL, and a completion handler as input.

- (void)resizeAndJoinVideosAtURLs:(NSArray *)videoURLs toOutputURL:(NSURL *)outputURL withHandler:(void(^)(NSURL *fileURL))handler
{
    /*
     First step: create the writer and writer input
     */
    NSError *error = nil;
    self.videoAssetWriter = [[AVAssetWriter alloc] initWithURL:outputURL fileType:AVFileTypeMPEG4 error:&error];

    NSDictionary *videoSettings = @{AVVideoCodecKey: AVVideoCodecH264, AVVideoWidthKey: @640, AVVideoHeightKey: @480};

    AVAssetWriterInput* videoWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoSettings];
    videoWriterInput.expectsMediaDataInRealTime = NO;

    if([self.videoAssetWriter canAddInput:videoWriterInput])
    {
        [self.videoAssetWriter addInput:videoWriterInput];
        [self.videoAssetWriter startWriting];
        [self.videoAssetWriter startSessionAtSourceTime:kCMTimeZero];

        /*
         Second step: for each given video URL, create a reader and a reader input
         */

        for(NSURL *videoURL in videoURLs)
        {
            NSLog(@"Processing file: %@",videoURL);
            AVAsset *videoAsset = [[AVURLAsset alloc] initWithURL:videoURL options:nil];
            AVAssetReader *videoAssetReader = [[AVAssetReader alloc] initWithAsset:videoAsset error:&error];
            AVAssetTrack *videoAssetTrack = [videoAsset tracksWithMediaType:AVMediaTypeVideo].firstObject;
            NSDictionary *videoOptions = @{(id)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_32BGRA)};

            AVAssetReaderTrackOutput *videoAssetTrackOutput = [[AVAssetReaderTrackOutput alloc] initWithTrack:videoAssetTrack outputSettings:videoOptions];
            videoAssetTrackOutput.alwaysCopiesSampleData = NO;

            if([videoAssetReader canAddOutput:videoAssetTrackOutput])
            {
                [videoAssetReader addOutput:videoAssetTrackOutput];
                [videoAssetReader startReading];

                /*
                 Step three: copy the buffers from the reader to the writer
                 */
                while ([videoAssetReader status] == AVAssetReaderStatusReading)
                {
                    if(![videoWriterInput isReadyForMoreMediaData]) continue;

                    CMSampleBufferRef buffer = [videoAssetTrackOutput copyNextSampleBuffer];
                    if(buffer)
                    {
                        [videoWriterInput appendSampleBuffer:buffer];
                        CFRelease(buffer);
                    }
                }


            } else NSLog(@"ERROR: %@",error);
        }

       [videoWriterInput markAsFinished];

    } else NSLog(@"ERROR: %@",error);

    __weak ClipBuilder *weakself = self;
    [self.videoAssetWriter finishWritingWithCompletionHandler:^{
        handler(outputURL);
        weakself.videoAssetWriter = nil;
    }];
}
My output file exists, and the AVAssetWriter exists as well since it is a property, yet the completion handler is still never called. What could explain this?

Thanks for your help.


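Update: one way to narrow this down is to log the writer's status and error; AVAssetWriter reports AVAssetWriterStatusFailed together with an error once it has aborted. A minimal sketch against the code above:

NSLog(@"Writer status before finishing: %ld, error: %@", (long)self.videoAssetWriter.status, self.videoAssetWriter.error);

[self.videoAssetWriter finishWritingWithCompletionHandler:^{
    // AVAssetWriterStatusFailed here means the writer gave up earlier,
    // and the error property explains why.
    NSLog(@"Writer status after finishing: %ld, error: %@", (long)self.videoAssetWriter.status, self.videoAssetWriter.error);
    handler(outputURL);
}];

Note also that the while/continue loop above busy-waits on the calling thread; if isReadyForMoreMediaData never flips back to YES on that thread, the method never reaches markAsFinished or finishWritingWithCompletionHandler: at all. The requestMediaDataWhenReadyOnQueue: pattern used in the answer below avoids this.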

Here is the solution I finally implemented to join mp4 files at different resolutions, through a combination of AVAssetReader and AVAssetWriter:

- (void)reencodeComposition:(AVComposition *)composition toMP4File:(NSURL *)mp4FileURL withCompletionHandler:(void (^)(void))handler
{
    self.status = EncoderStatusEncoding;

    /*
     Create the asset writer to write the file on disk
     */

    NSError *error = nil;
    if([[NSFileManager defaultManager] fileExistsAtPath:mp4FileURL.path isDirectory:nil])
    {
        if(![[NSFileManager defaultManager] removeItemAtPath:mp4FileURL.path error:&error])
        {
            [self failWithError:error withCompletionHandler:handler];
            return;
        }
    }

    self.assetWriter = [[AVAssetWriter alloc] initWithURL:mp4FileURL fileType:AVFileTypeMPEG4 error:&error];

    if(self.assetWriter)
    {
        /*
         Get the audio and video track of the composition
         */
        AVAssetTrack *videoAssetTrack = [composition tracksWithMediaType:AVMediaTypeVideo].firstObject;
        AVAssetTrack *audioAssetTrack = [composition tracksWithMediaType:AVMediaTypeAudio].firstObject;

        NSDictionary *videoSettings = @{AVVideoCodecKey:AVVideoCodecH264, AVVideoWidthKey:@(self.imageWidth), AVVideoHeightKey:@(self.imageHeight)};

        /*
         Add an input to be able to write the video in the file
         */
        AVAssetWriterInput* videoWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoSettings];
        videoWriterInput.expectsMediaDataInRealTime = YES;

        if([self.assetWriter canAddInput:videoWriterInput])
        {
            [self.assetWriter addInput:videoWriterInput];

            /*
             Add an input to be able to write the audio in the file
             */
// Use this only if you know the format
//            CMFormatDescriptionRef audio_fmt_desc_ = nil;
//
//            AudioStreamBasicDescription audioFormat;
//            bzero(&audioFormat, sizeof(audioFormat));
//            audioFormat.mSampleRate = 44100;
//            audioFormat.mFormatID   = kAudioFormatMPEG4AAC;
//            audioFormat.mFramesPerPacket = 1024;
//            audioFormat.mChannelsPerFrame = 2;
//            int bytes_per_sample = sizeof(float);
//            audioFormat.mFormatFlags = kAudioFormatFlagIsFloat | kAudioFormatFlagIsPacked;
//            
//            audioFormat.mBitsPerChannel = bytes_per_sample * 8;
//            audioFormat.mBytesPerPacket = bytes_per_sample * 2;
//            audioFormat.mBytesPerFrame = bytes_per_sample * 2;
//            
//            CMAudioFormatDescriptionCreate(kCFAllocatorDefault,&audioFormat,0,NULL,0,NULL,NULL,&audio_fmt_desc_);
//            
//            AVAssetWriterInput* audioWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio outputSettings:nil sourceFormatHint:audio_fmt_desc_];

            // Otherwise, pass the source audio through, hinting the writer
            // with the track's own format description.
            AVAssetWriterInput* audioWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio outputSettings:nil sourceFormatHint:(__bridge CMAudioFormatDescriptionRef)audioAssetTrack.formatDescriptions.firstObject];

            audioWriterInput.expectsMediaDataInRealTime = YES;

            if([self.assetWriter canAddInput:audioWriterInput])
            {
                [self.assetWriter addInput:audioWriterInput];
                [self.assetWriter startWriting];
                [self.assetWriter startSessionAtSourceTime:kCMTimeZero];

                /*
                 Create the asset reader to read the mp4 files on the disk
                 */
                AVAssetReader *assetReader = [[AVAssetReader alloc] initWithAsset:composition error:&error];
                NSDictionary *videoOptions = @{(id)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_32BGRA)};

                /*
                 Add an output to be able to retrieve the video in the files
                 */
                AVAssetReaderTrackOutput *videoAssetTrackOutput = [[AVAssetReaderTrackOutput alloc] initWithTrack:videoAssetTrack outputSettings:videoOptions];
                videoAssetTrackOutput.alwaysCopiesSampleData = NO;

                if([assetReader canAddOutput:videoAssetTrackOutput])
                {
                    [assetReader addOutput:videoAssetTrackOutput];
                    /*
                     Add an output to be able to retrieve the audio in the files
                     */
                    AVAssetReaderTrackOutput *audioAssetTrackOutput = [[AVAssetReaderTrackOutput alloc] initWithTrack:audioAssetTrack outputSettings:nil];
                    audioAssetTrackOutput.alwaysCopiesSampleData = NO;

                    if([assetReader canAddOutput:audioAssetTrackOutput])
                    {
                        [assetReader addOutput:audioAssetTrackOutput];

                        [assetReader startReading];

                        /*
                         Read the mp4 files until the end and copy them in the output file
                         */
                        dispatch_group_t encodingGroup = dispatch_group_create();

                        dispatch_group_enter(encodingGroup);
                        [audioWriterInput requestMediaDataWhenReadyOnQueue:self.encodingQueue usingBlock:^{
                            while ([audioWriterInput isReadyForMoreMediaData])
                            {
                                CMSampleBufferRef nextSampleBuffer = [audioAssetTrackOutput copyNextSampleBuffer];

                                if (nextSampleBuffer)
                                {
                                    [audioWriterInput appendSampleBuffer:nextSampleBuffer];
                                    CFRelease(nextSampleBuffer);
                                }
                                else
                                {
                                    [audioWriterInput markAsFinished];
                                    dispatch_group_leave(encodingGroup);
                                    break;
                                }
                            }
                        }];

                        dispatch_group_enter(encodingGroup);
                        [videoWriterInput requestMediaDataWhenReadyOnQueue:self.encodingQueue usingBlock:^{
                            while ([videoWriterInput isReadyForMoreMediaData])
                            {
                                CMSampleBufferRef nextSampleBuffer = [videoAssetTrackOutput copyNextSampleBuffer];

                                if (nextSampleBuffer)
                                {
                                    [videoWriterInput appendSampleBuffer:nextSampleBuffer];
                                    CFRelease(nextSampleBuffer);
                                }
                                else
                                {
                                    [videoWriterInput markAsFinished];
                                    dispatch_group_leave(encodingGroup);
                                    break;
                                }
                            }
                        }];

                        dispatch_group_wait(encodingGroup, DISPATCH_TIME_FOREVER);

                    } else { [self failWithError:error withCompletionHandler:handler]; return; }
                } else { [self failWithError:error withCompletionHandler:handler]; return; }
            } else { [self failWithError:error withCompletionHandler:handler]; return; }
        } else { [self failWithError:error withCompletionHandler:handler]; return; }

        __weak Encoder *weakself = self;
        [self.assetWriter finishWritingWithCompletionHandler:^{
            weakself.status = EncoderStatusCompleted;
            handler();
            weakself.assetWriter = nil;
            weakself.encodingQueue = nil;
        }];
    }
    else [self failWithError:error withCompletionHandler:handler];
}


This implementation was tailored to my project, although in the end I did not need it.
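For reference, here is a hedged sketch of how the AVComposition passed to this method could be built from the source mp4 URLs (the helper name and the omitted error handling are illustrative, not taken from my project):

- (AVComposition *)compositionFromVideoURLs:(NSArray *)videoURLs
{
    AVMutableComposition *composition = [AVMutableComposition composition];
    AVMutableCompositionTrack *videoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    AVMutableCompositionTrack *audioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];

    CMTime cursor = kCMTimeZero;
    for (NSURL *videoURL in videoURLs)
    {
        AVURLAsset *asset = [AVURLAsset URLAssetWithURL:videoURL options:nil];
        CMTimeRange range = CMTimeRangeMake(kCMTimeZero, asset.duration);

        AVAssetTrack *video = [asset tracksWithMediaType:AVMediaTypeVideo].firstObject;
        AVAssetTrack *audio = [asset tracksWithMediaType:AVMediaTypeAudio].firstObject;

        // Append each clip back to back; the writer's videoSettings take
        // care of normalizing the resolution during re-encoding.
        [videoTrack insertTimeRange:range ofTrack:video atTime:cursor error:nil];
        if (audio) [audioTrack insertTimeRange:range ofTrack:audio atTime:cursor error:nil];

        cursor = CMTimeAdd(cursor, asset.duration);
    }
    return composition;
}

The result can then be handed to reencodeComposition:toMP4File:withCompletionHandler: as above.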


Hi, you really saved me a lot of time with this code. But I have a strange problem: the audio is only copied in at the very end, and not all of the audio samples make it into the output video. Can you guess why? Many thanks!

Unfortunately an issue like this can be caused by many things, in the code or in the files themselves. Without seeing them it is really hard to guess. I would suggest checking the audio format, the time ranges, and any durations used when creating the AVComposition.

Thanks for your reply. I am using your code, but the strange behavior is that [audioWriterInput isReadyForMoreMediaData] becomes NO after only 5 audio samples, far before the end of the audio track. Any ideas?

I finally got it working. The problem was that I had to process the video before the audio. In my case the other way around did not work, and the audio block ([audioWriterInput requestMediaDataWhenReadyOnQueue:self.encodingQueue usingBlock:^{) was called twice.

I am glad to hear it. Good luck with your project!
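For anyone hitting the same interleaving problem, a compact sketch of the ordering the commenter describes, reusing the names from the answer above (only the order in which the two requestMediaDataWhenReadyOnQueue: blocks are installed changes; the drain loops are the same):

// Install the video block first; in the commenter's case installing the
// audio block first caused it to be invoked twice and starved the writer.
dispatch_group_enter(encodingGroup);
[videoWriterInput requestMediaDataWhenReadyOnQueue:self.encodingQueue usingBlock:^{
    while ([videoWriterInput isReadyForMoreMediaData])
    {
        CMSampleBufferRef buffer = [videoAssetTrackOutput copyNextSampleBuffer];
        if (!buffer) { [videoWriterInput markAsFinished]; dispatch_group_leave(encodingGroup); break; }
        [videoWriterInput appendSampleBuffer:buffer];
        CFRelease(buffer);
    }
}];

dispatch_group_enter(encodingGroup);
[audioWriterInput requestMediaDataWhenReadyOnQueue:self.encodingQueue usingBlock:^{
    while ([audioWriterInput isReadyForMoreMediaData])
    {
        CMSampleBufferRef buffer = [audioAssetTrackOutput copyNextSampleBuffer];
        if (!buffer) { [audioWriterInput markAsFinished]; dispatch_group_leave(encodingGroup); break; }
        [audioWriterInput appendSampleBuffer:buffer];
        CFRelease(buffer);
    }
}];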