Cocoa Touch AVFoundation - AVAssetExportSession - "Operation Stopped" on the second export attempt


I'm creating a picture-in-picture video, and this feature has worked perfectly for about 1.5 years (as far as I know). Now, on iOS 11, it only works the first time it's called. When I call it for a second video (without force-closing the app first), I get the error message below.

I found a post here on Stack Overflow about this, but I'm already using the asset tracks correctly as that post describes.

I've pasted the exact method I'm using below. Any help would be greatly appreciated.

Error message:

Error: Error Domain=AVFoundationErrorDomain Code=-11841 "Operation Stopped" 
UserInfo={NSLocalizedFailureReason=The video could not be composed., 
NSLocalizedDescription=Operation Stopped, 
NSUnderlyingError=0x1c04521e0 
{Error Domain=NSOSStatusErrorDomain Code=-17390 "(null)"}}
The method:

- (void) composeVideo:(NSString*)videoPIP onVideo:(NSString*)videoBG
{
@try {
    NSError *e = nil;

    AVURLAsset *backAsset, *pipAsset;

    // Load our 2 movies using AVURLAsset
    pipAsset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:videoPIP] options:nil];
    backAsset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:videoBG] options:nil];

    if ([[NSFileManager defaultManager] fileExistsAtPath:videoPIP])
    {
        NSLog(@"PIP File Exists!");
    }
    else
    {
        NSLog(@"PIP File DOESN'T Exist!");
    }

    if ([[NSFileManager defaultManager] fileExistsAtPath:videoBG])
    {
        NSLog(@"BG File Exists!");
    }
    else
    {
        NSLog(@"BG File DOESN'T Exist!");
    }

    float scaleH = VIDEO_SIZE.height / [[[backAsset tracksWithMediaType:AVMediaTypeVideo ] objectAtIndex:0] naturalSize].width;
    float scaleW = VIDEO_SIZE.width / [[[backAsset tracksWithMediaType:AVMediaTypeVideo ] objectAtIndex:0] naturalSize].height;

    float scalePIP = (VIDEO_SIZE.width * 0.25) / [[[pipAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] naturalSize].width;

    // Create AVMutableComposition Object - this object will hold our multiple AVMutableCompositionTracks.
    AVMutableComposition* mixComposition = [[AVMutableComposition alloc] init];

    // Create the first AVMutableCompositionTrack by adding a new track to our AVMutableComposition.
    AVMutableCompositionTrack *firstTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];

    // Set the length of firstTrack equal to the duration of pipAsset, and insert pipAsset's video track at kCMTimeZero so the video plays from the start of the track.
    [firstTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, pipAsset.duration) ofTrack:[[pipAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:kCMTimeZero error:&e];
    if (e)
    {
        NSLog(@"Error0: %@",e);
        e = nil;
    }

    // Repeat the same process for the 2nd track and also start at kCMTimeZero so both tracks will play simultaneously.
    AVMutableCompositionTrack *secondTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    [secondTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, backAsset.duration) ofTrack:[[backAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:kCMTimeZero error:&e];

    if (e)
    {
        NSLog(@"Error1: %@",e);
        e = nil;
    }

    // We also need the audio track!
    AVMutableCompositionTrack *audioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    [audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, backAsset.duration) ofTrack:[[backAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:kCMTimeZero error:&e];
    if (e)
    {
        NSLog(@"Error2: %@",e);
        e = nil;
    }


    // Create an AVMutableVideoCompositionInstruction object - Contains the array of AVMutableVideoCompositionLayerInstruction objects.
    AVMutableVideoCompositionInstruction * MainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];

    // Set the time range to the duration of the longer asset.
    MainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, (pipAsset.duration.value > backAsset.duration.value) ? pipAsset.duration : backAsset.duration);

    // Create an AVMutableVideoCompositionLayerInstruction object - uses a CGAffineTransform to move and scale our first track so it is displayed in the top-left corner at a smaller size.
    AVMutableVideoCompositionLayerInstruction *FirstlayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:firstTrack];

    //CGAffineTransform Scale1 = CGAffineTransformMakeScale(0.3f,0.3f);
    CGAffineTransform Scale1 = CGAffineTransformMakeScale(scalePIP, scalePIP);

    // Top Left
    CGAffineTransform Move1 = CGAffineTransformMakeTranslation(3.0, 3.0);

    [FirstlayerInstruction setTransform:CGAffineTransformConcat(Scale1,Move1) atTime:kCMTimeZero];

    // Repeat for the second track.
    AVMutableVideoCompositionLayerInstruction *SecondlayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:secondTrack];

    CGAffineTransform Scale2 = CGAffineTransformMakeScale(scaleW, scaleH);
    CGAffineTransform rotateBy90Degrees = CGAffineTransformMakeRotation( M_PI_2);
    CGAffineTransform Move2 = CGAffineTransformMakeTranslation(0.0, ([[[backAsset tracksWithMediaType:AVMediaTypeVideo ] objectAtIndex:0] naturalSize].height) * -1);

    [SecondlayerInstruction setTransform:CGAffineTransformConcat(Move2, CGAffineTransformConcat(rotateBy90Degrees, Scale2)) atTime:kCMTimeZero];

    // Add the 2 created AVMutableVideoCompositionLayerInstruction objects to our AVMutableVideoCompositionInstruction.
    MainInstruction.layerInstructions = [NSArray arrayWithObjects:FirstlayerInstruction, SecondlayerInstruction, nil];

    // Create an AVMutableVideoComposition object.
    AVMutableVideoComposition *MainCompositionInst = [AVMutableVideoComposition videoComposition];
    MainCompositionInst.instructions = [NSArray arrayWithObject:MainInstruction];
    MainCompositionInst.frameDuration = CMTimeMake(1, 30);


    // Set the render size.
    //        MainCompositionInst.renderSize = [[UIScreen mainScreen] bounds].size;
    MainCompositionInst.renderSize = VIDEO_SIZE;


    NSString  *fileName = [NSString stringWithFormat:@"%@%@", NSTemporaryDirectory(), @"fullreaction.MP4"];

    // Make sure the video doesn't exist.
    if ([[NSFileManager defaultManager] fileExistsAtPath:fileName])
    {
        [[NSFileManager defaultManager] removeItemAtPath:fileName error:nil];
    }

    // Now we need to save the video.
    NSURL *url = [NSURL fileURLWithPath:fileName];
    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition
                                                                      presetName:QUALITY];
    exporter.videoComposition = MainCompositionInst;
    exporter.outputURL=url;
    exporter.outputFileType = AVFileTypeMPEG4;


    [exporter exportAsynchronouslyWithCompletionHandler:
     ^(void )
     {
         NSLog(@"File Saved as %@!", fileName);
         NSLog(@"Error: %@", exporter.error);
          [self performSelectorOnMainThread:@selector(runProcessingComplete) withObject:nil waitUntilDone:NO];
     }];
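    // Note: the handler above logs "File Saved" even when the export fails.
    // Checking exporter.status (AVAssetExportSessionStatusCompleted vs.
    // AVAssetExportSessionStatusFailed) before reporting success would make
    // a failure on the second run easier to spot.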

}
@catch (NSException *ex) {
    UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Error 3" message:[NSString stringWithFormat:@"%@",ex]
                                                   delegate:self cancelButtonTitle:@"OK" otherButtonTitles: nil];
    [alert show];
}


}
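
For anyone debugging errors like this one: AVFoundation can validate a video composition against its asset before exporting, which helps narrow down -11841 / "The video could not be composed." failures. A minimal sketch, assuming it runs right before the exporter is created above; passing nil as the validation delegate skips the per-problem callbacks:

// Hypothetical pre-export sanity check for the composition built above.
CMTimeRange fullRange = CMTimeRangeMake(kCMTimeZero, mixComposition.duration);
BOOL valid = [MainCompositionInst isValidForAsset:mixComposition
                                        timeRange:fullRange
                               validationDelegate:nil];
if (!valid)
{
    NSLog(@"Video composition failed validation - check the instruction time ranges.");
}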

CAUSE: It turned out the time range of the MainInstruction was incorrect.

You can't compare CMTime objects using their value fields; the values are only comparable when the timescales match. You must use CMTIME_COMPARE_INLINE instead.

To fix it, replace this line:

MainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, (pipAsset.duration.value > backAsset.duration.value) ? pipAsset.duration : backAsset.duration);
With this line:

MainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, CMTIME_COMPARE_INLINE(pipAsset.duration, >, backAsset.duration) ? pipAsset.duration : backAsset.duration);
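
To see why the raw value comparison fails, note that a CMTime represents value / timescale seconds, and two assets' durations can easily carry different timescales. A small standalone sketch with made-up times:

#import <CoreMedia/CoreMedia.h>

CMTime a = CMTimeMake(3, 1); // 3/1 = 3 seconds
CMTime b = CMTimeMake(4, 2); // 4/2 = 2 seconds

BOOL byValue = (a.value > b.value);            // NO  - 3 > 4 is false: wrong answer
BOOL byMacro = CMTIME_COMPARE_INLINE(a, >, b); // YES - 3s really is longer than 2s
// Equivalent: CMTimeCompare(a, b) > 0

So when pipAsset and backAsset report durations with different timescales, the value comparison can pick the shorter asset, leaving MainInstruction.timeRange shorter than the composition, which is presumably what made the composition invalid here.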

I spoke with Apple Developer Support, and after a lot of back and forth they suspect this is an iOS bug. I was asked to file a bug report with Apple. They're hoping engineering can provide a solution, and I'll keep this updated as I get answers.

Any update? I'm really stuck on this!

Sorry, yes: I do have a solution to my problem, and the Apple bug team found the root cause for me. I'll post it now.