iOS AVFoundation - merging videos, but only the last video plays
Tags: ios, swift, avfoundation, avasset, avmutablecomposition

I have an `[AVAsset]()` array. Whenever I record several videos of different durations, the code below merges them into one video spanning the combined duration, but it only plays the last video of the loop.

For example: video 1 is one minute long and shows a dog walking, video 2 is one minute long and shows a bird flying, and video 3 is one minute long and shows a horse running. The merged video plays for three minutes, but it only shows the horse running, three times in a row, one minute each.

Where am I going wrong?
var movieFileOutput = AVCaptureMovieFileOutput()
var arrayVideos = [AVAsset]()
var videoFileUrl: URL?

// Button to record video
@objc func recordButtonTapped() {
    // Stop recording
    if movieFileOutput.isRecording {
        movieFileOutput.stopRecording()
        print("Stop Recording")
    } else {
        // Start recording
        movieFileOutput.connection(with: AVMediaType.video)?.videoOrientation = videoOrientation()
        movieFileOutput.maxRecordedDuration = maxRecordDuration()
        videoFileUrl = URL(fileURLWithPath: videoFileLocation())
        if let videoFileUrlFromCamera = videoFileUrl {
            movieFileOutput.startRecording(to: videoFileUrlFromCamera, recordingDelegate: self)
        }
    }
}

func videoFileLocation() -> String {
    return NSTemporaryDirectory().appending("videoFile.mov")
}

// Button to save the merged video
@objc func saveButtonTapped() {
    mergeVids()
}

// Function to merge and save videos
func mergeVids() {
    let mixComposition = AVMutableComposition()
    let compositionVideoTrack = mixComposition.addMutableTrack(withMediaType: .video,
                                                               preferredTrackID: Int32(kCMPersistentTrackID_Invalid))
    compositionVideoTrack?.preferredTransform = CGAffineTransform(rotationAngle: .pi / 2)
    let soundtrackTrack = mixComposition.addMutableTrack(withMediaType: .audio,
                                                         preferredTrackID: Int32(kCMPersistentTrackID_Invalid))
    var insertTime = CMTime.zero
    for videoAsset in arrayVideos {
        do {
            try compositionVideoTrack?.insertTimeRange(CMTimeRangeMake(start: CMTime.zero,
                                                                       duration: videoAsset.duration),
                                                       of: videoAsset.tracks(withMediaType: .video)[0],
                                                       at: insertTime)
            try soundtrackTrack?.insertTimeRange(CMTimeRangeMake(start: CMTime.zero,
                                                                 duration: videoAsset.duration),
                                                 of: videoAsset.tracks(withMediaType: .audio)[0],
                                                 at: insertTime)
            insertTime = CMTimeAdd(insertTime, videoAsset.duration)
        } catch let error as NSError {
            print("\(error.localizedDescription)")
        }
    }

    let outputFileURL = URL(fileURLWithPath: NSTemporaryDirectory() + "merge.mp4")
    let path = outputFileURL.path
    if FileManager.default.fileExists(atPath: path) {
        try! FileManager.default.removeItem(atPath: path)
    }

    let exporter = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPresetHighestQuality)
    exporter!.outputURL = outputFileURL
    exporter!.outputFileType = AVFileType.mp4
    exporter!.shouldOptimizeForNetworkUse = true
    exporter!.exportAsynchronously { [weak self] in
        let cameraVideoURL = exporter!.outputURL!
        PHPhotoLibrary.shared().performChanges({
            PHAssetChangeRequest.creationRequestForAssetFromVideo(atFileURL: cameraVideoURL)
        }) { (saved, error) in
            if let error = error { return }
            if !saved { return }
            // URL is saved
            self?.videoFileUrl = nil
            self?.arrayVideos.removeAll()
        }
    }
}

// AVCaptureFileOutputRecordingDelegate
func fileOutput(_ output: AVCaptureFileOutput, didStartRecordingTo fileURL: URL, from connections: [AVCaptureConnection]) {
    print("+++++++++++++++Started")
    print("*****Started recording: \(fileURL)\n")
}

func fileOutput(_ output: AVCaptureFileOutput, didFinishRecordingTo outputFileURL: URL, from connections: [AVCaptureConnection], error: Error?) {
    if error == nil {
        let asset = AVAsset(url: outputFileURL)
        arrayVideos.append(asset)
        print(arrayVideos.count)
    } else {
        print("Error recording movie: \(error!.localizedDescription)")
    }

    func cleanUp() {
        let path = outputFileURL.path
        if FileManager.default.fileExists(atPath: path) {
            do {
                try FileManager.default.removeItem(atPath: path)
            } catch {
                print("Could not remove file at url: \(outputFileURL)")
            }
        }
    }
}

func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
    print("++++++Frame Drop: \(connection.description)")
}
Thanks to @alxlives for testing the merge function and pointing out that, since it worked fine on his machine, the problem had to be elsewhere. The problem was here:
func videoFileLocation() -> String {
    return NSTemporaryDirectory().appending("videoFile.mov")
}
In the `recordButtonTapped` handler shown above, this produces the same "videoFile.mov" path every time a new recording starts:
videoFileUrl = URL(fileURLWithPath: videoFileLocation()) // <<< it gets called here every time a new video runs
if let videoFileUrlFromCamera = videoFileUrl {
    movieFileOutput.startRecording(to: videoFileUrlFromCamera, recordingDelegate: self)
}
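Because every recording targets the same path, each new clip overwrites the previous file, and every `AVAsset` in the array ends up pointing at the same, last-written movie. A minimal Foundation-only sketch of the effect (plain strings stand in for the `.mov` data, a hypothetical simplification for illustration):

```swift
import Foundation

// Stand-in for the original videoFileLocation(): always the same path.
func fixedFileLocation() -> String {
    return NSTemporaryDirectory().appending("videoFile.mov")
}

// "Record" three clips: every write goes to the identical path,
// so each one silently replaces the previous contents.
let clips = ["dog walking", "bird flying", "horse running"]
var recordedURLs = [URL]()
for clip in clips {
    let url = URL(fileURLWithPath: fixedFileLocation())
    try! clip.data(using: .utf8)!.write(to: url)  // overwrites the last clip
    recordedURLs.append(url)                      // three URLs, one file
}

// Every stored URL now resolves to the final clip only.
for url in recordedURLs {
    let contents = String(data: try! Data(contentsOf: url), encoding: .utf8)!
    print(contents)  // "horse running", three times
}
```

This also matches the debugger observation from the comments: the array really does hold three distinct `AVAsset` objects, but all three reference the same underlying file.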
I just tested your code and it merges the three videos correctly; I tried it on iOS 13 with local 720p .mp4 files. For testing, could you debug the videos array to check whether the same video is being added three times?

I just tried that: with a breakpoint placed right before the loop starts, the debugger shows 3 objects with 3 different memory addresses in the array, so that can't be it. I wonder whether the `didFinishRecordingTo outputFileURL` callback is causing the problem? All I do is take the `outputFileURL`, wrap it in an asset, and append that asset to the array: `let asset = AVAsset(url: outputFileURL); arrayVideos.append(asset)`. Strange that it works for you but not for me. Thanks for the help, it was a really good suggestion :)

Could you edit the question to include the recording and add-to-array functions? Apparently the `mergeVids()` function is fine.

Sure, give me a few minutes to get the code up. @alxlives I've added the code, please take a look when you get a chance.

The fix was to give each recording a unique file name:
func videoFileLocation() -> String {
    let uuid = UUID().uuidString
    return NSTemporaryDirectory().appending("videoFile_\(uuid).mov")
}
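With the UUID in the name, every call returns a fresh path, so each recording lands in its own file and the merge loop sees three different clips. A quick Foundation-only sanity check of that property:

```swift
import Foundation

// Same fix as above: a unique destination per recording.
func videoFileLocation() -> String {
    let uuid = UUID().uuidString
    return NSTemporaryDirectory().appending("videoFile_\(uuid).mov")
}

// Three "recordings" now get three distinct destinations.
let paths = (0..<3).map { _ in videoFileLocation() }
print(paths)
assert(Set(paths).count == 3)  // no path is ever reused
```

Note that with unique names the temp directory accumulates one file per recording, so it is worth actually calling something like the `cleanUp()` helper from the question (currently defined but never invoked) once the merged video has been exported.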