iOS – details on using AVAudioEngine

Background: I found an Apple WWDC session called "AVAudioEngine in Practice" and I am trying to make something similar to the last demo at 43:35. I'm using SpriteKit instead of SceneKit, but the principle is the same: I want to spawn spheres, throw them around, and when they collide the engine plays a sound unique to each sphere.

Problems:

  • I want a unique AVAudioPlayerNode attached to each SpriteKitNode so that I can play a different sound for each sphere. Right now, if I create two spheres and set a different pitch for each of their AVAudioPlayerNodes, only the most recently created AVAudioPlayerNode seems to play, even when the original sphere collides. During the demo he mentions "I'm tying a player, a dedicated player, to each ball". How would I go about doing this?

  • There are audio clicks/artifacts every time a new collision happens. I'm assuming this has to do with the AVAudioPlayerNodeBufferOptions, and/or the fact that I'm trying to create, schedule and consume buffers very quickly on each contact, which is not the most efficient approach. What would be a good way to handle this?
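One way to approach the first problem is to keep a registry that maps each sphere node to its own dedicated player, so the contact handler can look up the right player instead of always using the most recently created one. This is only a sketch: the `players` dictionary, `registerSphere`, and `playerForBody` are my own hypothetical names, and `audio.createPlayer(_:)` is assumed to attach and connect the player to the engine, as in the question's `Sphere` class.

```swift
// Hypothetical sketch: one dedicated AVAudioPlayerNode per ball,
// keyed by the SKNode so a collision can find "its" player.
var players = [SKNode: AVAudioPlayerNode]()

func registerSphere(sphere: SKSpriteNode, pitch: Float) {
    let player = audio.createPlayer(pitch)   // assumed to attach/connect to the engine
    players[sphere] = player                 // dedicated player for this ball
}

func playerForBody(body: SKPhysicsBody) -> AVAudioPlayerNode? {
    guard let node = body.node else { return nil }
    return players[node]                     // look up that ball's player
}
```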

Code: As mentioned in the video, "...and for every ball that is born into this world, a new player node is also created". I have a separate class for the spheres, with a method that returns a SKSpriteNode and creates an AVAudioPlayerNode every time it is called:

class Sphere {

    var sphere: SKSpriteNode = SKSpriteNode(color: UIColor(), size: CGSize())
    var sphereScale: CGFloat = CGFloat(0.01)
    var spherePlayer = AVAudioPlayerNode()
    let audio = Audio()
    let sphereCollision: UInt32 = 0x1 << 0

    func createSphere(position: CGPoint, pitch: Float) -> SKSpriteNode {

        let texture = SKTexture(imageNamed: "Slice")
        let collisionTexture = SKTexture(imageNamed: "Collision")

        // Define the node

        sphere = SKSpriteNode(texture: texture, size: texture.size())

        sphere.position = position
        sphere.name = "sphere"
        sphere.physicsBody = SKPhysicsBody(texture: collisionTexture, size: sphere.size)
        sphere.physicsBody?.dynamic = true
        sphere.physicsBody?.mass = 0
        sphere.physicsBody?.restitution = 0.5
        sphere.physicsBody?.usesPreciseCollisionDetection = true
        sphere.physicsBody?.categoryBitMask = sphereCollision
        sphere.physicsBody?.contactTestBitMask = sphereCollision
        sphere.zPosition = 1

        // Create AudioPlayerNode

        spherePlayer = audio.createPlayer(pitch)

        return sphere
    }
}
In my GameScene class I then test for collisions, schedule a buffer and play the AVAudioPlayerNode whenever contact occurs:

func didBeginContact(contact: SKPhysicsContact) {

    let firstBody: SKPhysicsBody = contact.bodyA

    if (firstBody.categoryBitMask & sphere.sphereCollision != 0) {
        let buffer1 = audio.createBuffer("PianoC1", type: "wav")
        sphere.spherePlayer.scheduleBuffer(buffer1, atTime: nil, options: AVAudioPlayerNodeBufferOptions.Interrupts, completionHandler: nil)
        sphere.spherePlayer.play()
    }
}
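For the clicking, one thing worth trying (a sketch only, not tested) is to load the buffer once up front rather than creating it on every contact, and to drop the `.Interrupts` option: interrupting restarts the buffer mid-waveform, which can produce a click, whereas scheduling without it lets the previous hit ring out. The `players` registry below is a hypothetical per-sphere lookup, not something from the original code:

```swift
// Sketch: cache the buffer once instead of rebuilding it on every contact.
let pianoBuffer = audio.createBuffer("PianoC1", type: "wav")

func didBeginContact(contact: SKPhysicsContact) {
    // Hypothetical registry mapping each sphere node to its dedicated player.
    guard let node = contact.bodyA.node, player = players[node] else { return }
    // No .Interrupts: the new hit is queued rather than cutting off the old
    // one mid-waveform, which is one common source of clicks.
    player.scheduleBuffer(pianoBuffer, atTime: nil, options: [], completionHandler: nil)
    if !player.playing { player.play() }
}
```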

I'm new to Swift and only have basic programming knowledge, so any suggestion/criticism is welcome.

I've been developing with AVAudioEngine in SceneKit, attempting other things, but this will be what you are after:

It explains the process:

1. Instantiate your own subclass of AVAudioEngine
2. A method to load PCMBuffers for each AVAudioPlayer
3. Change the environment node's parameters to accommodate the reverb for a large number of pinball objects

EDIT: converted, tested and added a few features:

1. Create a subclass of AVAudioEngine, name it AudioLayerEngine for example. This is to gain access to the AVAudioUnit effects such as distortion, delay, pitch and many of the other effects available as AudioUnits.
2. Initialise by setting up some configurations for the audio engine, such as the rendering algorithm, and expose the AVAudioEnvironmentNode to play with the 3D positions of your SCNNode or SKNode objects if you are in 2D but want 3D effects.
3. Create some helper methods to load presets for each AudioUnit effect you want.
4. Create a helper method to create an audio player, then add it to whatever node you want, as many times as you want, since that SCNNode accepts an .audioPlayers method which returns [AVAudioPlayer] or [SCNAudioPlayer].
5. Start playing.

I've pasted the entire class for reference so you can structure it as you wish, but keep in mind that if you couple this with SceneKit or SpriteKit, you use this audio engine to manage all your sounds instead of SceneKit's internal AVAudioEngine. This means you instantiate it in your gameView during the awakeFromNib method.

import Foundation
import SceneKit
import AVFoundation

class AudioLayerEngine: AVAudioEngine {
    var engine: AVAudioEngine!
    var environment: AVAudioEnvironmentNode!
    var outputBuffer: AVAudioPCMBuffer!
    var voicePlayer: AVAudioPlayerNode!
    var multiChannelEnabled: Bool!
    // audio effects
    let delay = AVAudioUnitDelay()
    let distortion = AVAudioUnitDistortion()
    let reverb = AVAudioUnitReverb()

    override init() {
        super.init()
        engine = AVAudioEngine()
        environment = AVAudioEnvironmentNode()

        engine.attachNode(self.environment)
        voicePlayer = AVAudioPlayerNode()
        engine.attachNode(voicePlayer)
        voicePlayer.volume = 1.0
        outputBuffer = loadVoice()
        wireEngine()
        startEngine()
        voicePlayer.scheduleBuffer(self.outputBuffer, completionHandler: nil)
        voicePlayer.play()
    }

    func startEngine() {
        do {
            try engine.start()
        } catch {
            print("error starting engine")
        }
    }

    func loadVoice() -> AVAudioPCMBuffer {
        let URL = NSURL(fileURLWithPath: NSBundle.mainBundle().pathForResource("art.scnassets/sounds/interface/test", ofType: "aiff")!)
        do {
            let soundFile = try AVAudioFile(forReading: URL, commonFormat: AVAudioCommonFormat.PCMFormatFloat32, interleaved: false)
            outputBuffer = AVAudioPCMBuffer(PCMFormat: soundFile.processingFormat, frameCapacity: AVAudioFrameCount(soundFile.length))
            do {
                try soundFile.readIntoBuffer(outputBuffer)
            } catch {
                print("something went wrong loading the buffer into the sound file")
            }
            print("returning buffer")
            return outputBuffer
        } catch {
            print("could not open sound file")
        }
        return outputBuffer
    }

    func wireEngine() {
        loadDistortionPreset(AVAudioUnitDistortionPreset.MultiCellphoneConcert)
        engine.attachNode(distortion)
        engine.attachNode(delay)
        engine.connect(voicePlayer, to: distortion, format: self.outputBuffer.format)
        engine.connect(distortion, to: delay, format: self.outputBuffer.format)
        engine.connect(delay, to: environment, format: self.outputBuffer.format)
        engine.connect(environment, to: engine.outputNode, format: constructOutputFormatForEnvironment())
    }

    func constructOutputFormatForEnvironment() -> AVAudioFormat {
        let outputChannelCount = self.engine.outputNode.outputFormatForBus(1).channelCount
        let hardwareSampleRate = self.engine.outputNode.outputFormatForBus(1).sampleRate
        let environmentOutputConnectionFormat = AVAudioFormat(standardFormatWithSampleRate: hardwareSampleRate, channels: outputChannelCount)
        multiChannelEnabled = false
        return environmentOutputConnectionFormat
    }

    func loadDistortionPreset(preset: AVAudioUnitDistortionPreset) {
        distortion.loadFactoryPreset(preset)
    }

    func createPlayer(node: SCNNode) {
        let player = AVAudioPlayerNode()
        distortion.loadFactoryPreset(AVAudioUnitDistortionPreset.SpeechCosmicInterference)
        engine.attachNode(player)
        engine.attachNode(distortion)
        engine.connect(player, to: distortion, format: outputBuffer.format)
        engine.connect(distortion, to: environment, format: constructOutputFormatForEnvironment())
        player.renderingAlgorithm = AVAudio3DMixingRenderingAlgorithm.HRTF
        player.reverbBlend = 0.3
    }
}
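To tie this back to the note above about awakeFromNib, a minimal usage sketch might look like the following; the `GameView` class and property names here are my own assumptions, not part of the answer's tested code:

```swift
// Hypothetical wiring: instantiate the engine once in the game view,
// then create a player for each ball node that enters the world.
class GameView: SCNView {
    var audioEngine: AudioLayerEngine!

    override func awakeFromNib() {
        super.awakeFromNib()
        audioEngine = AudioLayerEngine()   // one engine manages all sounds
        // later, for each ball that is spawned:
        // audioEngine.createPlayer(ballNode)
    }
}
```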

While this link may answer the question, it is better to include the essential parts of the answer here and provide the link for reference. Link-only answers can become invalid if the linked page changes.

@Beaunuvelle I edited the answer with the tested full code and one extra feature.