iOS: How to play AVAudioPCMBuffer audio converted from NSData


I receive 16-bit mono PCM audio data from UDP packets, like this:

- (void)udpSocket:(GCDAsyncUdpSocket *)sock didReceiveData:(NSData *)data
                                               fromAddress:(NSData *)address
                                         withFilterContext:(id)filterContext
{
...
}
I convert this data to a PCM buffer by calling a Swift function, like this:

func toPCMBuffer(data: NSData) -> AVAudioPCMBuffer {
    let audioFormat = AVAudioFormat(commonFormat: AVAudioCommonFormat.PCMFormatFloat32, sampleRate: 8000, channels: 1, interleaved: false)  // given NSData audio format
    var PCMBuffer = AVAudioPCMBuffer(PCMFormat: audioFormat, frameCapacity:1024*10)
    PCMBuffer.frameLength = PCMBuffer.frameCapacity

    let channels = UnsafeBufferPointer(start: PCMBuffer.floatChannelData, count: Int(PCMBuffer.format.channelCount))

    data.getBytes(UnsafeMutablePointer<Void>(channels[0]) , length: data.length)

    return PCMBuffer
}
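A likely cause of the silence: the UDP packets carry 16-bit integer samples, but the bytes are copied verbatim into a `.pcmFormatFloat32` buffer, so the player interprets them as garbage. Below is a minimal sketch of an explicit Int16-to-Float32 conversion using only Foundation; the helper name `int16SamplesToFloat` is made up here, and it assumes the packets are little-endian signed 16-bit mono:

```swift
import Foundation

// Sketch: convert little-endian signed 16-bit PCM bytes into normalized
// Float32 samples (the layout .pcmFormatFloat32 expects). The name
// int16SamplesToFloat is illustrative, not from any framework.
func int16SamplesToFloat(_ data: Data) -> [Float] {
    let sampleCount = data.count / MemoryLayout<Int16>.size
    var samples = [Float](repeating: 0, count: sampleCount)
    data.withUnsafeBytes { (raw: UnsafeRawBufferPointer) in
        for i in 0..<sampleCount {
            // loadUnaligned avoids alignment traps on arbitrary Data offsets.
            let s = Int16(littleEndian: raw.loadUnaligned(fromByteOffset: i * 2,
                                                          as: Int16.self))
            // Map Int16.min ... Int16.max onto -1.0 ... just under 1.0.
            samples[i] = Float(s) / 32768.0
        }
    }
    return samples
}
```

The resulting floats can then be copied into `floatChannelData![0]`, and `frameLength` should be set to the actual sample count of the packet rather than to `frameCapacity`.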
The data is converted to a PCM buffer and I can see its length in the log. But when I try to play the buffer, I can't hear any sound. Here is the receiving code:

func toPCMBuffer(data: NSData) -> AVAudioPCMBuffer {
        let audioFormat = AVAudioFormat(commonFormat: AVAudioCommonFormat.PCMFormatFloat32, sampleRate: 8000, channels: 1, interleaved: false)  // given NSData audio format
        var PCMBuffer = AVAudioPCMBuffer(PCMFormat: audioFormat, frameCapacity:1024*10)
        PCMBuffer.frameLength = PCMBuffer.frameCapacity

        let channels = UnsafeBufferPointer(start: PCMBuffer.floatChannelData, count: Int(PCMBuffer.format.channelCount))

        data.getBytes(UnsafeMutablePointer<Void>(channels[0]) , length: data.length)
        var mainMixer = audioEngine.mainMixerNode
        audioEngine.attachNode(audioFilePlayer)
        audioEngine.connect(audioFilePlayer, to:mainMixer, format: PCMBuffer.format)
        audioEngine.startAndReturnError(nil)

        audioFilePlayer.play()
        audioFilePlayer.scheduleBuffer(PCMBuffer, atTime: nil, options: nil, completionHandler: nil)
        return PCMBuffer
    }
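The order of operations in the function above is also suspect: the engine is attached, connected, and started again on every incoming packet, and the error from `startAndReturnError` is discarded. A sketch in current Swift (the class name `PCMStreamPlayer` and method `schedule` are invented for illustration) that does one-time setup and then only schedules each incoming buffer:

```swift
import AVFoundation

// Sketch (untested): configure the engine once, then schedule each
// converted buffer on the player node as packets arrive. The property
// names mirror audioEngine/audioFilePlayer from the question.
final class PCMStreamPlayer {
    private let audioEngine = AVAudioEngine()
    private let audioFilePlayer = AVAudioPlayerNode()
    private let format = AVAudioFormat(commonFormat: .pcmFormatFloat32,
                                       sampleRate: 8000,
                                       channels: 1,
                                       interleaved: false)!

    init() throws {
        audioEngine.attach(audioFilePlayer)
        audioEngine.connect(audioFilePlayer,
                            to: audioEngine.mainMixerNode,
                            format: format)
        try audioEngine.start()   // surface the error instead of passing nil
        audioFilePlayer.play()
    }

    // Call once per received UDP packet, after converting NSData to a buffer.
    func schedule(_ buffer: AVAudioPCMBuffer) {
        audioFilePlayer.scheduleBuffer(buffer, completionHandler: nil)
    }
}
```

AVAudioPlayerNode queues buffers scheduled either before or after `play()`, so the key point is that attach/connect/start happen exactly once while the engine keeps running.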

In the end I used an Objective-C function; the data converts fine:

-(AudioBufferList *) getBufferListFromData: (NSData *) data
{
    if (data.length > 0)
    {
        NSUInteger len = [data length];
        //NSData *d2 = [data subdataWithRange:NSMakeRange(4, 1028)];
        //I guess you can use Byte*, void* or Float32*. I am not sure if that makes any difference.
        Byte* byteData = (Byte*) malloc (len);
        memcpy (byteData, [data bytes], len);
        if (byteData)
        {
            AudioBufferList * theDataBuffer =(AudioBufferList*)malloc(sizeof(AudioBufferList) * 1);
            theDataBuffer->mNumberBuffers = 1;
            theDataBuffer->mBuffers[0].mDataByteSize =(UInt32) len;
            theDataBuffer->mBuffers[0].mNumberChannels = 1;
            theDataBuffer->mBuffers[0].mData = byteData;
            // Hand the AudioBufferList back to the caller, who is
            // responsible for eventually freeing both byteData and
            // theDataBuffer.
            return theDataBuffer;
        }
    }
    return NULL;
}
}

Hi, did you ever find a solution to this problem? Yes, I did. I stream 16-bit PCM audio from Android and play it on an iOS device using an AudioBufferList; I'll post the answer within a day. The sound lags a little, hopefully you can solve that :) Hey, can you post the solution? Thank you. I posted it here. How did you schedule playback with your
`audioFilePlayer`
? I play the audio through an AudioComponentInstance.