How to create an AudioSampleBuffer for CMSampleBufferGetFormatDescription in iOS Swift


I have been working on video compression in iOS Swift, following an answer from SO. It works until I change the file format in this line of code to .mp4:

    let videoWriter = try! AVAssetWriter(outputURL: outputURL as URL, fileType: AVFileType.mov)
I need the output in .mp4 format for a reason. But when I make that change, the app crashes and gives me this error:

2020-04-27 18:20:52.573614+0500 BrightCaster[7847:1513728] *** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '*** -[AVAssetWriter addInput:] In order to perform passthrough to file type public.mpeg-4, please provide a format hint in the AVAssetWriterInput initializer'
*** First throw call stack:
(0x1b331d5f0 0x1b303fbcc 0x1bd53b2b0 0x102383c0c 0x102382164 0x1021897cc 0x1b6ca73bc 0x1b6caba7c 0x1b6daec94 0x1b7835080 0x1b7834d30 0x1e9d077b4 0x1b786a764 0x1b783eb68 0x1b783f070 0x1e9d468f4 0x1b783f1c0 0x1e9d468f4 0x1b9e21d9c 0x105173730 0x105181710 0x1b329b748 0x1b329661c 0x1b3295c34 0x1bd3df38c 0x1b73c822c 0x10230f8a0 0x1b311d800)
libc++abi.dylib: terminating with uncaught exception of type NSException
So I kept searching and found a question related to mine. But now the problem is that when I try to add its code to my function, it gives me the error that anAudioSampleBuffer is not defined. Since I am completely new to the audio/video domain, I can't understand why, or how to fix it. Below is the snippet from that answer that I added to my function:

    //setup audio writer
    //let formatDesc = CMSampleBufferGetFormatDescription(anAudioSampleBuffer)
    //let audioWriterInput = AVAssetWriterInput(mediaType: AVMediaType.audio, outputSettings: nil, sourceFormatHint: formatDesc)
    let audioWriterInput = AVAssetWriterInput(mediaType: AVMediaType.audio, outputSettings: nil)
    audioWriterInput.expectsMediaDataInRealTime = false
    videoWriter.add(audioWriterInput)
The commented-out part does not work. Any help would be appreciated.

The whole conversion function is shown below:

func convertVideoToLowQuailtyWithInputURL(inputURL: URL, outputURL: URL, completion: @escaping (Bool , _ url: String) -> Void) {

    let videoAsset = AVURLAsset(url: inputURL as URL, options: nil)
    let videoTrack = videoAsset.tracks(withMediaType: AVMediaType.video)[0]
    let videoSize = videoTrack.naturalSize
    let videoWriterCompressionSettings = [
        AVVideoAverageBitRateKey : Int(125000)
    ]

    let videoWriterSettings:[String : AnyObject] = [
        AVVideoCodecKey : AVVideoCodecH264 as AnyObject,
        AVVideoCompressionPropertiesKey : videoWriterCompressionSettings as AnyObject,
        AVVideoWidthKey : Int(videoSize.width) as AnyObject,
        AVVideoHeightKey : Int(videoSize.height) as AnyObject
    ]

    let videoWriterInput = AVAssetWriterInput(mediaType: AVMediaType.video, outputSettings: videoWriterSettings)
    videoWriterInput.expectsMediaDataInRealTime = true
    videoWriterInput.transform = videoTrack.preferredTransform
    let videoWriter = try! AVAssetWriter(outputURL: outputURL as URL, fileType: AVFileType.mov) // for now it is converting to .mov, I think
    videoWriter.add(videoWriterInput)



    //setup video reader
    let videoReaderSettings:[String : AnyObject] = [
        kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange) as AnyObject
    ]

    let videoReaderOutput = AVAssetReaderTrackOutput(track: videoTrack, outputSettings: videoReaderSettings)
    var videoReader: AVAssetReader!

    do{

        videoReader = try AVAssetReader(asset: videoAsset)
    }
    catch {

        print("video reader error: \(error)")
        completion(false, "")
        return
    }
    videoReader.add(videoReaderOutput)


    //setup audio writer
    //let formatDesc = CMSampleBufferGetFormatDescription(anAudioSampleBuffer) // this gives me an error here about anAudioSampleBuffer not being defined, which I don't understand
    //let audioWriterInput = AVAssetWriterInput(mediaType: AVMediaType.audio, outputSettings: nil, sourceFormatHint: formatDesc)
    let audioWriterInput = AVAssetWriterInput(mediaType: AVMediaType.audio, outputSettings: nil)
    audioWriterInput.expectsMediaDataInRealTime = false
    videoWriter.add(audioWriterInput)
    //setup audio reader
    let audioTrack = videoAsset.tracks(withMediaType: AVMediaType.audio)[0]
    let audioReaderOutput = AVAssetReaderTrackOutput(track: audioTrack, outputSettings: nil)
    let audioReader = try! AVAssetReader(asset: videoAsset)
    audioReader.add(audioReaderOutput)
    videoWriter.startWriting()



    //start writing from video reader
    videoReader.startReading()
    videoWriter.startSession(atSourceTime: CMTime.zero)
    let processingQueue = DispatchQueue(label: "processingQueue1")
    videoWriterInput.requestMediaDataWhenReady(on: processingQueue, using: {() -> Void in
        while videoWriterInput.isReadyForMoreMediaData {
            let sampleBuffer:CMSampleBuffer? = videoReaderOutput.copyNextSampleBuffer();
            if videoReader.status == .reading && sampleBuffer != nil {
                videoWriterInput.append(sampleBuffer!)
            }
            else {
                videoWriterInput.markAsFinished()
                if videoReader.status == .completed {
                    //start writing from audio reader
                    audioReader.startReading()
                    videoWriter.startSession(atSourceTime: CMTime.zero)
                    let processingQueue = DispatchQueue(label: "processingQueue2")
                    audioWriterInput.requestMediaDataWhenReady(on: processingQueue, using: {() -> Void in
                        while audioWriterInput.isReadyForMoreMediaData {
                            let sampleBuffer:CMSampleBuffer? = audioReaderOutput.copyNextSampleBuffer()
                            if audioReader.status == .reading && sampleBuffer != nil {
                                audioWriterInput.append(sampleBuffer!)
                            }
                            else {
                                audioWriterInput.markAsFinished()
                                if audioReader.status == .completed {
                                    videoWriter.finishWriting(completionHandler: {() -> Void in
                                        completion(true, "\(videoWriter.outputURL)")
                                    })
                                }
                            }
                        }
                    })
                }
            }
        }
    })
}
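As an aside, the function above has a subtle error-handling bug: in the catch block, completion(false, "") runs but execution then continues into videoReader.add(...) with a nil reader, and the try! calls will crash outright on any failure. A minimal sketch of a safer setup (the function name and the decision to return an optional tuple are my own, not from the original code):

```swift
import AVFoundation

// Sketch: fail fast instead of force-unwrapping. On any setup error the
// function returns nil, and the caller can invoke its completion(false, "")
// and stop, rather than continuing with a nil reader.
func makeReaderAndWriter(inputURL: URL, outputURL: URL) -> (AVAssetReader, AVAssetWriter)? {
    let asset = AVURLAsset(url: inputURL)
    do {
        let reader = try AVAssetReader(asset: asset)
        let writer = try AVAssetWriter(outputURL: outputURL, fileType: .mp4)
        return (reader, writer)
    } catch {
        print("reader/writer setup error: \(error)")
        return nil
    }
}
```

This keeps the happy path identical while making every failure exit the function instead of crashing later.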

You can output mp4, passing the audio through (no transcoding), by providing a format hint like this:

let audioTrack = videoAsset.tracks(withMediaType: AVMediaType.audio)[0]
let audioWriterInput = AVAssetWriterInput(mediaType: AVMediaType.audio, outputSettings: nil, sourceFormatHint: audioTrack.formatDescriptions[0] as! CMFormatDescription)
Note the new place where audioTrack is defined.

I imagine Apple's .mov and .mp4 implementations both need to know the compressed audio format to write the file, but I guess .mov can infer that information after initialization, while .mp4 cannot. Maybe it's yet another one of those surprises.


In your case, I could see that restructuring the code to get the audio format from the first sample buffer would be annoying, but as I recalled, the format can be obtained from the input audio track instead.
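Putting the pieces together, the writer setup could look something like the sketch below. The helper function and the canAdd checks are my own additions, not part of the original answer; the key line is the sourceFormatHint taken from the source track's format description, which is what the .mp4 passthrough path requires:

```swift
import AVFoundation

// Sketch: corrected writer setup for an .mp4 output.
// `asset` and `outputURL` are assumed to come from the caller.
func makeWriter(for asset: AVURLAsset, outputURL: URL) throws -> AVAssetWriter {
    let writer = try AVAssetWriter(outputURL: outputURL, fileType: .mp4)

    // Passthrough audio: outputSettings is nil, so for .mp4 the input
    // needs a format hint. The hint comes from the source audio track.
    if let audioTrack = asset.tracks(withMediaType: .audio).first,
       let formatDesc = audioTrack.formatDescriptions.first {
        let audioInput = AVAssetWriterInput(mediaType: .audio,
                                            outputSettings: nil,
                                            sourceFormatHint: (formatDesc as! CMFormatDescription))
        audioInput.expectsMediaDataInRealTime = false
        if writer.canAdd(audioInput) { writer.add(audioInput) }
    }
    return writer
}
```

With the hint in place, addInput: no longer throws the NSInvalidArgumentException from the question, because the writer knows the compressed audio format up front.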

Can you provide more information about the crash? Does it happen when creating the AVAssetWriter, or at some later point? Does the crash log any error message?

Hi, I have edited the question to include the crash error.

You have configured the AVAssetWriterInput to accept already-compressed input. That is a little unusual. Is that what you want? If not, you need to set the AVAssetWriterInput output settings (as shown in the answer below). Can you show your entire AVAssetWriter setup code?

Yes, I want to compress it; that is why I am using this code. I thought an .mp4 is lighter than a .mov file, correct me if I am wrong.

You need to show how you configure the AVAssetWriter and the AVAssetWriterInput. I don't think you will see a significant difference between mp4 and mov sizes.

Thank you very much, I replaced my code with yours and it no longer crashes. Could you elaborate on what is happening here? I am trying to understand it. Thanks again.

OK, I get it now. Thank you so much, you have helped me a lot.