iOS: How to record video with an overlay view

Tags: ios, overlay, avfoundation, video-processing, camera-overlay

Hi, I am trying to record a video with an overlay.

I wrote:

-(void)addOvelayViewToVideo:(NSURL *)videoURL
to add an overlay view on top of the recorded video, but it does not work.

I wrote the code that records the video with an AVCaptureSession in viewDidLoad:

//In ViewDidLoad
//CONFIGURE DISPLAY OUTPUT
self.previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
[self.previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
self.previewLayer.frame = self.view.bounds; // use bounds: the sublayer's frame is in the view's own coordinate space
[self.view.layer addSublayer:self.previewLayer];
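
For completeness, here is a minimal sketch of the capture-session configuration that the snippet above assumes has already happened earlier in viewDidLoad; the self.movieFileOutput property name and the omitted error handling are assumptions, not part of the original code:

// Sketch (assumptions: self.captureSession and self.movieFileOutput are strong properties).
self.captureSession = [[AVCaptureSession alloc] init];
self.captureSession.sessionPreset = AVCaptureSessionPresetHigh;

AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *inputError = nil;
AVCaptureDeviceInput *videoInput = [AVCaptureDeviceInput deviceInputWithDevice:camera error:&inputError];
if (videoInput && [self.captureSession canAddInput:videoInput]) {
    [self.captureSession addInput:videoInput];
}

self.movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
if ([self.captureSession canAddOutput:self.movieFileOutput]) {
    [self.captureSession addOutput:self.movieFileOutput];
}

[self.captureSession startRunning];
// Recording itself is started later, e.g. from a button action:
// [self.movieFileOutput startRecordingToOutputFileURL:outputURL recordingDelegate:self];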


-(void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error
{
    // Recording may still have finished successfully even when an error is reported,
    // so default to YES and only trust the user-info flag if it is present.
    BOOL isSuccess = YES;
    if(error.code != noErr)
    {
        id value = [[error userInfo] objectForKey:AVErrorRecordingSuccessfullyFinishedKey];
        if(value)
        {
            isSuccess = [value boolValue];
        }
    }

    if(isSuccess)
    {
        ALAssetsLibrary *assetsLibrary = [[ALAssetsLibrary alloc] init];
        if([assetsLibrary videoAtPathIsCompatibleWithSavedPhotosAlbum:outputFileURL])
        {
            [self addOvelayViewToVideo:outputFileURL];
        }
        else
        {
            NSLog(@"Video is not compatible with the Saved Photos album.");
        }
    }
}


-(void)addOvelayViewToVideo:(NSURL *)videoURL
{
    AVAsset *asset = [AVAsset assetWithURL:videoURL];
    AVMutableComposition *composition = [[AVMutableComposition alloc] init];

    AVMutableCompositionTrack *compositionTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    [compositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:[[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:kCMTimeZero error:nil];
    AVMutableVideoCompositionInstruction *compositionInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    compositionInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration);
    AVMutableVideoCompositionLayerInstruction *videoLayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:compositionTrack];
    AVAssetTrack *assetTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

    UIImageOrientation videoAssetOrientation_  = UIImageOrientationUp;
    BOOL isVideoAssetPortrait_  = NO;
    CGAffineTransform videoTransform = assetTrack.preferredTransform;
    if (videoTransform.a == 0 && videoTransform.b == 1.0 && videoTransform.c == -1.0 && videoTransform.d == 0) {
        videoAssetOrientation_ = UIImageOrientationRight;
        isVideoAssetPortrait_ = YES;
    }
    if (videoTransform.a == 0 && videoTransform.b == -1.0 && videoTransform.c == 1.0 && videoTransform.d == 0) {
        videoAssetOrientation_ =  UIImageOrientationLeft;
        isVideoAssetPortrait_ = YES;
    }
    if (videoTransform.a == 1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == 1.0) {
        videoAssetOrientation_ =  UIImageOrientationUp;
    }
    if (videoTransform.a == -1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == -1.0) {
        videoAssetOrientation_ = UIImageOrientationDown;
    }
    [videoLayerInstruction setTransform:assetTrack.preferredTransform atTime:kCMTimeZero];
    [videoLayerInstruction setOpacity:0.0 atTime:asset.duration];



    compositionInstruction.layerInstructions = [NSArray arrayWithObject:videoLayerInstruction];
    AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];

    // Swap width and height only when the recorded track is portrait, using the
    // orientation detected above; otherwise keep the track's natural size.
    CGSize naturalSize;
    if (isVideoAssetPortrait_) {
        naturalSize = CGSizeMake(assetTrack.naturalSize.height, assetTrack.naturalSize.width);
    } else {
        naturalSize = assetTrack.naturalSize;
    }
    videoComposition.renderSize = naturalSize;
    videoComposition.instructions = [NSArray arrayWithObject:compositionInstruction];
    videoComposition.frameDuration = CMTimeMake(1, 30);


    CALayer *overlayLayer = [CALayer layer];
    UIImage *overlayImage = [UIImage imageNamed:@"sampleHUD"];
    [overlayLayer setContents:(id)[overlayImage CGImage]];
    overlayLayer.frame = CGRectMake(0, 0, naturalSize.width, naturalSize.height);
    [overlayLayer setMasksToBounds:YES];

    CALayer *parentLayer = [CALayer layer];
    CALayer *videoLayer = [CALayer layer];
    parentLayer.frame = CGRectMake(0, 0, naturalSize.width, naturalSize.height);
    videoLayer.frame = CGRectMake(0, 0, naturalSize.width, naturalSize.height);
    [parentLayer addSublayer:videoLayer];
    [parentLayer addSublayer:overlayLayer];

    videoComposition.animationTool = [AVVideoCompositionCoreAnimationTool videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:parentLayer];
    NSLog(@"renderSize:%f,%f", videoComposition.renderSize.width, videoComposition.renderSize.height);

    // Export to a separate file: AVAssetExportSession fails when its outputURL already
    // exists, so it cannot write back over the recorded source file (videoURL).
    NSString *exportPath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"overlayVideo.mov"];
    NSURL *exportURL = [NSURL fileURLWithPath:exportPath];
    [[NSFileManager defaultManager] removeItemAtURL:exportURL error:nil];

    AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetHighestQuality];
    exportSession.outputURL = exportURL;
    exportSession.outputFileType = AVFileTypeQuickTimeMovie;
    exportSession.shouldOptimizeForNetworkUse = YES;
    exportSession.videoComposition = videoComposition;
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        dispatch_async(dispatch_get_main_queue(), ^{
            //save the video in photos album
        });
    }];

}
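
The export completion handler above still only contains a placeholder comment. A minimal sketch of that step, assuming it replaces the placeholder inside the dispatch_async block and reusing the ALAssetsLibrary API that already appears in the recording delegate:

// Sketch (assumption): goes where the "save the video in photos album" comment is.
if (exportSession.status == AVAssetExportSessionStatusCompleted) {
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
    [library writeVideoAtPathToSavedPhotosAlbum:exportSession.outputURL
                                completionBlock:^(NSURL *assetURL, NSError *saveError) {
        if (saveError) {
            NSLog(@"Saving to the Saved Photos album failed: %@", saveError);
        } else {
            NSLog(@"Saved overlaid video: %@", assetURL);
        }
    }];
} else {
    NSLog(@"Export failed: %@", exportSession.error);
}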
I still don't know what is going wrong here and need some guidance on this.

Can I add the overlay while the video is being recorded?


Any help would be greatly appreciated.

Comments:

I haven't read your complete code; the reason should be obvious. Have you tried [yourVideoPreviewLayer insertSublayer:yourOverlayImageView.layer above:yourVideoPreviewLayer]? Also refer to this link.

Try this code and let me know its working status. Is it undefined? Also declare the dispatch queue and the movie-writing queue: AVAssetWriterInput *assetWriterVideoIn; AVAssetWriter *assetWriter;

Is there any other way by using
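
Regarding showing the overlay while recording, here is a minimal sketch of the commenter's suggestion, assuming an overlay image view named overlayImageView (a placeholder name) and the sampleHUD image used above. This only makes the overlay visible on screen during recording; AVCaptureMovieFileOutput records the camera frames alone, so the overlay still has to be composited into the file afterwards (or the frames written with AVAssetWriter, as the comment suggests):

// Sketch (assumption): add the overlay above the preview layer created in viewDidLoad.
UIImageView *overlayImageView = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"sampleHUD"]];
overlayImageView.frame = self.view.bounds;
overlayImageView.contentMode = UIViewContentModeScaleAspectFill;
[self.view addSubview:overlayImageView]; // the view's layer now sits above self.previewLayer

// Or, working with layers directly, as suggested in the comments:
// [self.view.layer insertSublayer:overlayImageView.layer above:self.previewLayer];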