
iOS: Overlaying two videos with AVFoundation


I'm trying to overlay two videos, with the foreground video being somewhat alpha transparent. I've also been following this question.

Whenever I pass in two copies of the same video it doesn't crash; however, when I try feeding it two different videos I get this error:

VideoMaskingUtils.exportVideo Error: Optional(Error Domain=AVFoundationErrorDomain Code=-11841 "Operation Stopped" UserInfo={NSLocalizedDescription=Operation Stopped, NSLocalizedFailureReason=The video could not be composed.})
VideoMaskingUtils.exportVideo Description: <AVAssetExportSession: 0x1556be30, asset = <AVMutableComposition: 0x15567f10 tracks = (
"<AVMutableCompositionTrack: 0x15658030 trackID = 1, mediaType = vide, editCount = 1>",
"<AVMutableCompositionTrack: 0x1556e250 trackID = 2, mediaType = vide, editCount = 1>"
)>, presetName = AVAssetExportPresetHighestQuality, outputFileType = public.mpeg-4
Error Domain=AVFoundationErrorDomain Code=-11841 "Operation Stopped" UserInfo={NSLocalizedDescription=Operation Stopped, NSLocalizedFailureReason=The video could not be composed.}
Here is my exportCompositedVideo function:

private class func exportCompositedVideo(compiledVideo: AVMutableComposition, toURL outputUrl: NSURL, withVideoComposition videoComposition: AVMutableVideoComposition) {
    guard let exporter = AVAssetExportSession(asset: compiledVideo, presetName: AVAssetExportPresetHighestQuality) else { return }
    exporter.outputURL = outputUrl
    exporter.videoComposition = videoComposition
    exporter.outputFileType = AVFileTypeQuickTimeMovie
    exporter.shouldOptimizeForNetworkUse = true
    exporter.exportAsynchronouslyWithCompletionHandler({
        switch exporter.status {
        case .Completed:
            // we can be confident that there is a URL because
            // we got this far. Otherwise it would've failed.
            UISaveVideoAtPathToSavedPhotosAlbum(exporter.outputURL!.path!, nil, nil, nil)
            print("VideoMaskingUtils.exportVideo SUCCESS!")
            if exporter.error != nil {
                print("VideoMaskingUtils.exportVideo Error: \(exporter.error)")
                print("VideoMaskingUtils.exportVideo Description: \(exporter.description)")
            }

            NSNotificationCenter.defaultCenter().postNotificationName("videoExportDone", object: exporter.error)
            break

        case .Exporting:
            let progress = exporter.progress
            print("VideoMaskingUtils.exportVideo \(progress)")

            NSNotificationCenter.defaultCenter().postNotificationName("videoExportProgress", object: progress)
            break

        case .Failed:
            print("VideoMaskingUtils.exportVideo Error: \(exporter.error)")
            print("VideoMaskingUtils.exportVideo Description: \(exporter.description)")

            NSNotificationCenter.defaultCenter().postNotificationName("videoExportDone", object: exporter.error)
            break

        default: break
        }
    })
}

Your min should be max.

Replace this line:

instruction.timeRange = CMTimeRangeMake(kCMTimeZero, min(firstAsset.duration, secondAsset.duration))
with this line and it will work:

instruction.timeRange = CMTimeRangeMake(kCMTimeZero, max(firstAsset.duration, secondAsset.duration))
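For context, here is a minimal sketch of the surrounding composition setup in the same Swift 2 style as the question. The names firstAsset, secondAsset, and instruction match the snippets above; firstTrack and secondTrack are assumed to be the two AVMutableCompositionTracks already inserted into the composition, everything else is an assumption on my part:

```swift
// Sketch only (Swift 2 era API). The key point: the instruction must cover
// the FULL duration of the composition, i.e. the LONGER of the two assets.
// A shorter time range leaves uncovered frames, which is a classic cause of
// AVFoundationErrorDomain -11841 ("The video could not be composed").
let instruction = AVMutableVideoCompositionInstruction()
instruction.timeRange = CMTimeRangeMake(kCMTimeZero, max(firstAsset.duration, secondAsset.duration))

// One layer instruction per composition track; order determines stacking.
let foregroundLayer = AVMutableVideoCompositionLayerInstruction(assetTrack: firstTrack)
let backgroundLayer = AVMutableVideoCompositionLayerInstruction(assetTrack: secondTrack)
instruction.layerInstructions = [foregroundLayer, backgroundLayer]

let videoComposition = AVMutableVideoComposition()
videoComposition.instructions = [instruction]
videoComposition.frameDuration = CMTimeMake(1, 30)   // 30 fps
videoComposition.renderSize = firstTrack.naturalSize
```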

Similar to this. See the link in the question above: the solution is to use an encoding method that supports an alpha channel, as described in that link. By default, iOS can't do this with H.264.
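If re-encoding the foreground with an alpha-capable codec isn't an option, a common workaround (my suggestion, not from the linked answer) is to fake the transparency with a fixed opacity on the foreground's layer instruction, since an H.264 track itself carries no alpha:

```swift
// Hedged sketch: blend the foreground over the background at partial opacity
// instead of relying on a true alpha channel. foregroundTrack, backgroundTrack,
// and instruction are assumed to come from the composition setup above.
let foregroundLayer = AVMutableVideoCompositionLayerInstruction(assetTrack: foregroundTrack)
foregroundLayer.setOpacity(0.5, atTime: kCMTimeZero) // 50% blend for the whole clip
let backgroundLayer = AVMutableVideoCompositionLayerInstruction(assetTrack: backgroundTrack)
instruction.layerInstructions = [foregroundLayer, backgroundLayer]
```

This gives a uniform translucency rather than per-pixel alpha, so it only approximates the overlay effect the question is after.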