iOS: sending audio over UDP with CocoaAsyncSocket in Swift


I'm having trouble sending audio over UDP with CocoaAsyncSocket in Swift.

First, I run the command below so that VLC starts listening on UDP port 4444:

vlc --demux=rawaud --rawaud-channels=1 --rawaud-samplerate=48000 udp://@:4444
After that, I run the app on an iPad 2 and tap Connect:

import UIKit
import CocoaAsyncSocket
import AVFoundation
class ViewController: UIViewController , GCDAsyncUdpSocketDelegate {

var avAudioEngine : AVAudioEngine?

@IBAction func btnAction(sender: UIButton) {

    // Set up the audio engine and grab the hardware input node (microphone).
    avAudioEngine = AVAudioEngine()
    let input = avAudioEngine?.inputNode

    // UDP socket: bind a local port, "connect" to the VLC listener and start receiving.
    let socket = GCDAsyncUdpSocket(delegate: self, delegateQueue: dispatch_get_main_queue())
    do {
        try socket.bindToPort(4445)
        try socket.connectToHost("192.168.0.137", onPort: 4444)
        try socket.beginReceiving()

        // Tap the input node and send every captured PCM buffer over UDP.
        input?.installTapOnBus(0, bufferSize: 2048, format: input?.inputFormatForBus(0), block: { (buffer: AVAudioPCMBuffer, timeE: AVAudioTime) -> Void in
            socket.sendData(self.toNSData(buffer), withTimeout: 0, tag: 0)
        })

        avAudioEngine?.prepare()
        try avAudioEngine?.start()

        // With nc -l -u -p 4444 and the block below uncommented, I do receive "someText".
        //socket.sendData("someText\n".dataUsingEncoding(NSUTF8StringEncoding), withTimeout: 0, tag: 0)
        //socket.close()

    }

    catch{
        print("err")
    }

}

    // socket.sendData only accepts NSData, so I think we have to convert the buffer to NSData:

func toNSData(PCMBuffer: AVAudioPCMBuffer) -> NSData {
    let channelCount = 1  // the given PCMBuffer's channel count is 1
    let channels = UnsafeBufferPointer(start: PCMBuffer.floatChannelData, count: channelCount)
    let ch0Data = NSData(bytes: channels[0], length: Int(PCMBuffer.frameCapacity * PCMBuffer.format.streamDescription.memory.mBytesPerFrame))
    return ch0Data
}
}
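As a side note on that conversion: frameCapacity is the number of frames the buffer was allocated for, while frameLength is the number of frames the tap actually filled, so sizing the NSData from frameCapacity can include unwritten samples. A minimal sketch of the same conversion based on frameLength instead (an illustration with a hypothetical name, assuming mono Float32 input as in the code above, not the original author's code):

func pcmBufferToNSData(buffer: AVAudioPCMBuffer) -> NSData {
    // Bytes per frame for the tap's format (4 bytes for mono 32-bit float).
    let bytesPerFrame = Int(buffer.format.streamDescription.memory.mBytesPerFrame)
    // Copy only the frames that were actually written, not the full capacity.
    let byteCount = Int(buffer.frameLength) * bytesPerFrame
    // floatChannelData points at one sample pointer per channel; take channel 0.
    let channels = UnsafeBufferPointer(start: buffer.floatChannelData, count: 1)
    return NSData(bytes: channels[0], length: byteCount)
}

It is also worth keeping in mind that the tap delivers 32-bit float samples at the hardware sample rate, which may differ from the mono 48000 Hz raw stream the VLC command above is configured for; if the sample format or rate does not match on both ends, the received audio will not play back correctly even when the packets arrive.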

Any ideas?

Can you show your toNSData(buffer) method? Does that buffer contain 1-channel or 2-channel audio? I've written this before, scroll down :)

@AliKörabbaslu did you ever solve this? I'm working on a similar solution and I'm wondering whether I should use this framework.

@nullforlife unfortunately it didn't work. I'm writing my own framework now and I'll put it on GitHub when it's finished.

@AliKörabbaslu I see. Glad to hear you're building your own framework. Are you digging into the CFNetwork and CFSocket APIs?
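Regarding the channel-count question in the comments, one way to see what the tap is really delivering is to log the buffer's format from inside the tap block; a small sketch in the same Swift 2 style as the question (illustration only; the property names come from AVAudioFormat and AVAudioPCMBuffer):

input?.installTapOnBus(0, bufferSize: 2048, format: input?.inputFormatForBus(0), block: { (buffer: AVAudioPCMBuffer, time: AVAudioTime) -> Void in
    // Log what the hardware input actually produces: channel count, sample rate,
    // and how many of the allocated frames were filled for this callback.
    print("channels: \(buffer.format.channelCount), " +
          "sampleRate: \(buffer.format.sampleRate), " +
          "frameLength: \(buffer.frameLength), " +
          "frameCapacity: \(buffer.frameCapacity)")
})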