iOS: Convert CMSampleBuffer to AVAudioPCMBuffer to get live audio


I'm trying to read frequency values from the
CMSampleBuffer
returned by
captureOutput
of
AVCaptureAudioDataOutputSampleBufferDelegate
.

The idea is to create an
AVAudioPCMBuffer
so that I can read its
floatChannelData
. But I don't know how to pass the sample buffer's data into it.

I figured I could create it like this:

public func captureOutput(_ output: AVCaptureOutput,
                          didOutput sampleBuffer: CMSampleBuffer,
                          from connection: AVCaptureConnection) {
    guard let blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer) else {
        return
    }
    let length = CMBlockBufferGetDataLength(blockBuffer)
    let audioFormat = AVAudioFormat(commonFormat: .pcmFormatFloat32, sampleRate: 44100, channels: 1, interleaved: false)
    let pcmBuffer = AVAudioPCMBuffer(pcmFormat: audioFormat!, frameCapacity: AVAudioFrameCount(length))
    pcmBuffer?.frameLength = pcmBuffer!.frameCapacity

But how can I fill it with the sample buffer's data?

The following should help:

var asbd = CMSampleBufferGetFormatDescription(sampleBuffer)!.audioStreamBasicDescription!
var audioBufferList = AudioBufferList()
var blockBuffer: CMBlockBuffer?

// Have Core Media fill the AudioBufferList and retain the backing block buffer.
CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(
    sampleBuffer,
    bufferListSizeNeededOut: nil,
    bufferListOut: &audioBufferList,
    bufferListSize: MemoryLayout<AudioBufferList>.size,
    blockBufferAllocator: nil,
    blockBufferMemoryAllocator: nil,
    flags: 0,
    blockBufferOut: &blockBuffer
)

// Point the PCM buffer's mutable audio buffer list at the sample buffer's data.
let mBuffers = audioBufferList.mBuffers
let frameLength = AVAudioFrameCount(Int(mBuffers.mDataByteSize) / MemoryLayout<Float>.size)
let pcmBuffer = AVAudioPCMBuffer(pcmFormat: AVAudioFormat(streamDescription: &asbd)!, frameCapacity: frameLength)!
pcmBuffer.frameLength = frameLength
pcmBuffer.mutableAudioBufferList.pointee.mBuffers = mBuffers
pcmBuffer.mutableAudioBufferList.pointee.mNumberBuffers = 1
This seems to create a valid AVAudioPCMBuffer inside the capture session. But the frame length is wrong for my current use case, so some further buffering is needed.
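Once the AVAudioPCMBuffer exists, its floatChannelData can be read directly. As a minimal sketch (the rmsLevel helper is my own illustration, not part of the original answer, and it assumes a non-interleaved float32 buffer like the one produced above):

import AVFoundation

// Hypothetical helper: compute the RMS level of the first channel
// of a non-interleaved float32 PCM buffer.
func rmsLevel(of pcmBuffer: AVAudioPCMBuffer) -> Float {
    guard let channelData = pcmBuffer.floatChannelData,
          pcmBuffer.frameLength > 0 else { return 0 }
    // channelData[0] points at the first channel's samples.
    let samples = UnsafeBufferPointer(start: channelData[0],
                                      count: Int(pcmBuffer.frameLength))
    let sumOfSquares = samples.reduce(Float(0)) { $0 + $1 * $1 }
    return sqrt(sumOfSquares / Float(pcmBuffer.frameLength))
}

For actual frequency values you would still need to run these samples through an FFT (e.g. via Accelerate/vDSP), but this shows that the converted buffer's channel data is readable as plain floats.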