iPhone: Merging an array of videos in Swift

I am trying to merge the videos in an array of AVAssets, but only the first and last videos come through; the videos in between show up as black. Here is the code I am using:

    func mergeVideoArray() {
        let mixComposition = AVMutableComposition()
        for videoAsset in videoURLArray {
            // Add a new video track for this asset and insert its full duration
            // at the current end of the composition.
            let videoTrack =
                mixComposition.addMutableTrack(withMediaType: AVMediaType.video,
                                               preferredTrackID: Int32(kCMPersistentTrackID_Invalid))
            do {
                try videoTrack?.insertTimeRange(CMTimeRangeMake(kCMTimeZero, videoAsset.duration),
                                                of: videoAsset.tracks(withMediaType: AVMediaType.video).first!,
                                                at: totalTime)
                videoSize = (videoTrack?.naturalSize)!
            } catch let error as NSError {
                print("error: \(error)")
            }

            // Add the matching audio track, if the asset has one.
            let trackArray = videoAsset.tracks(withMediaType: .audio)
            if trackArray.count > 0 {
                let audioTrack =
                    mixComposition.addMutableTrack(withMediaType: AVMediaType.audio,
                                                   preferredTrackID: Int32(kCMPersistentTrackID_Invalid))
                do {
                    try audioTrack?.insertTimeRange(CMTimeRangeMake(kCMTimeZero, videoAsset.duration),
                                                    of: videoAsset.tracks(withMediaType: AVMediaType.audio).first!,
                                                    at: audioTime)
                    audioTime = audioTime + videoAsset.duration
                } catch {
                    print("error inserting audio: \(error)")
                }
            }
            totalTime = totalTime + videoAsset.duration

            // Layer instruction intended to hide this track after its clip has played.
            let videoInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: videoTrack!)
            if videoAsset != videoURLArray.last {
                videoInstruction.setOpacity(0.0, at: videoAsset.duration)
            }
            layerInstructionsArray.append(videoInstruction)
        }

        let mainInstruction = AVMutableVideoCompositionInstruction()
        mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, totalTime)
        mainInstruction.layerInstructions = layerInstructionsArray

        let mainComposition = AVMutableVideoComposition()
        mainComposition.instructions = [mainInstruction]
        mainComposition.frameDuration = CMTimeMake(1, 30)
        mainComposition.renderSize = CGSize(width: videoSize.width, height: videoSize.height)

        // Export the composition and play the result.
        let url = "merge_video".outputURL
        let exporter = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPresetHighestQuality)
        exporter!.outputURL = url
        exporter!.outputFileType = AVFileType.mov
        exporter!.shouldOptimizeForNetworkUse = false
        exporter!.videoComposition = mainComposition

        exporter!.exportAsynchronously {
            let video = AVAsset(url: url)
            let playerItem = AVPlayerItem(asset: video)
            let player = AVPlayer(playerItem: playerItem)
            self.playerViewController.player = player

            self.present(self.playerViewController, animated: true) {
                self.playerViewController.player!.play()
            }
        }
    }
Please help me fix this. Thanks in advance.


Note: I can create a video from the array, but only the clips at the first and last indexes are shown; for the remaining values, only a blank screen appears.

I just solved my problem; only one line of the code needed updating. Take a look:

    if videoAsset != videoURLArray.last{
        videoInstruction.setOpacity(0.0, at: totalTime)
    }

Note: the only change is the at: time. For each element of the array, the clip's layer is now hidden at totalTime, its cumulative end time in the composition, rather than at the clip's own duration.
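
For context, this is how the fixed line sits inside the loop from the question, reusing the same properties (totalTime, videoURLArray, layerInstructionsArray). Because totalTime has already been advanced past the current clip before the instruction is built, the opacity switches off exactly where the next clip begins:

    totalTime = totalTime + videoAsset.duration  // totalTime is now the end of this clip

    // Hide this clip's layer at the point where it ends in the composition,
    // so the next clip's track becomes visible instead of a black frame.
    let videoInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: videoTrack!)
    if videoAsset != videoURLArray.last {
        videoInstruction.setOpacity(0.0, at: totalTime)
    }
    layerInstructionsArray.append(videoInstruction)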

I tried your solution but it doesn't work: the second video's duration is added to the output, but its content is not. I have the same black-screen problem; if you have a solution, could you please share it with me?
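
Not an answer from this thread, but a sketch of an approach that is often used when opacity-based layer instructions keep producing black frames: append every asset into a single video track and a single audio track, so no AVMutableVideoComposition is needed at all. The function name mergeIntoSingleTrack is made up for illustration, and it assumes the same array of AVAssets as above:

    // A sketch, not from the thread: merge by appending each asset into ONE shared
    // video track and ONE shared audio track, with no layer instructions at all.
    func mergeIntoSingleTrack(_ assets: [AVAsset]) -> AVMutableComposition {
        let composition = AVMutableComposition()
        let videoTrack = composition.addMutableTrack(withMediaType: AVMediaType.video,
                                                     preferredTrackID: kCMPersistentTrackID_Invalid)
        let audioTrack = composition.addMutableTrack(withMediaType: AVMediaType.audio,
                                                     preferredTrackID: kCMPersistentTrackID_Invalid)
        var cursor = kCMTimeZero
        for asset in assets {
            let range = CMTimeRangeMake(kCMTimeZero, asset.duration)
            // Insert each clip's media at the current end of the shared tracks.
            if let sourceVideo = asset.tracks(withMediaType: AVMediaType.video).first {
                try? videoTrack?.insertTimeRange(range, of: sourceVideo, at: cursor)
            }
            if let sourceAudio = asset.tracks(withMediaType: AVMediaType.audio).first {
                try? audioTrack?.insertTimeRange(range, of: sourceAudio, at: cursor)
            }
            cursor = cursor + asset.duration
        }
        return composition
    }

Because every clip lands in the same track, each one is visible for exactly its own time range and no opacity handoff is needed; the trade-off is that clips with different orientations or sizes are not corrected, which would still require a video composition with transforms.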