iOS: Using Sound Effects in AVAudioEngine
Background: In the list of session videos from Apple's recent WWDC, I watched one titled "AVAudioEngine in Practice" that applies sound effects to audio. Afterwards, I successfully changed the pitch of my audio with the following code:
// The audio engine is initialized in viewDidLoad()
audioEngine = AVAudioEngine()

// The following action is called on tapping a button
@IBAction func chipmunkPlayback(sender: UIButton) {
    let pitchPlayer = AVAudioPlayerNode()
    let timePitch = AVAudioUnitTimePitch()
    timePitch.pitch = 1000

    // Attach the nodes, then wire player -> pitch effect -> output
    audioEngine.attachNode(pitchPlayer)
    audioEngine.attachNode(timePitch)
    audioEngine.connect(pitchPlayer, to: timePitch, format: myAudioFile.processingFormat)
    audioEngine.connect(timePitch, to: audioEngine.outputNode, format: myAudioFile.processingFormat)

    pitchPlayer.scheduleFile(myAudioFile, atTime: nil, completionHandler: nil)

    var er: NSError?
    audioEngine.startAndReturnError(&er)
    pitchPlayer.play()
}
func playAudioWithVariablePith(pitch: Float) {
    audioPlayer.stop()
    audioEngine.stop()
    audioEngine.reset()

    let audioPlayerNode = AVAudioPlayerNode()
    audioEngine.attachNode(audioPlayerNode)

    let changePitchEffect = AVAudioUnitTimePitch()
    changePitchEffect.pitch = pitch
    audioEngine.attachNode(changePitchEffect)

    audioEngine.connect(audioPlayerNode, to: changePitchEffect, format: nil)
    audioEngine.connect(changePitchEffect, to: audioEngine.outputNode, format: nil)

    audioPlayerNode.scheduleFile(audioFile, atTime: nil, completionHandler: nil)
    try! audioEngine.start()
    audioPlayerNode.play()
}
As far as I understand, I use the AVAudioEngine to connect the AVAudioPlayerNode to the audio effect, and then connect the effect to the output.
I am now curious about adding multiple sound effects to the audio, for example pitch shifting plus reverb. How would I add more than one sound effect to the audio?
Also, would it make sense to attach and connect the nodes in viewDidLoad instead of doing it in the IBAction?
To apply multiple effects, attach each effect node to the engine and connect the nodes in series, so the signal flows through each effect in turn:

engine.connect(playerNode, to: reverbNode, format: format)
engine.connect(reverbNode, to: distortionNode, format: format)
engine.connect(distortionNode, to: delayNode, format: format)
engine.connect(delayNode, to: mixer, format: format)
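A more complete sketch of that chaining idea, combining a pitch shift with reverb. Note this uses the current Swift API names (attach, scheduleFile(_:at:), mainMixerNode) rather than the older attachNode/atTime: names in the post; the class name, the preset, and the numeric settings are illustrative assumptions, not from the original question:

```swift
import AVFoundation

class EffectsPlayer {
    // One engine and its nodes, created once and reused across playbacks.
    let engine = AVAudioEngine()
    let player = AVAudioPlayerNode()
    let pitch = AVAudioUnitTimePitch()
    let reverb = AVAudioUnitReverb()

    func play(file: AVAudioFile) throws {
        pitch.pitch = 1000              // +1000 cents, chipmunk-style
        reverb.loadFactoryPreset(.cathedral)
        reverb.wetDryMix = 50           // 50% wet, 50% dry

        // Every node must be attached to the engine before it is connected.
        [player, pitch, reverb].forEach { engine.attach($0) }

        // Chain the nodes in series: player -> pitch -> reverb -> main mixer.
        let format = file.processingFormat
        engine.connect(player, to: pitch, format: format)
        engine.connect(pitch, to: reverb, format: format)
        engine.connect(reverb, to: engine.mainMixerNode, format: format)

        player.scheduleFile(file, at: nil, completionHandler: nil)
        try engine.start()
        player.play()
    }
}
```

On the viewDidLoad question: attaching and connecting the nodes once (as in the class above) and only rescheduling files per tap avoids rebuilding the graph on every button press, which is generally the cheaper and cleaner approach.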
Comments:
Everything seems to be connected. Why is that code recommended? In the example the OP gave, multiple nodes are not connected.
"I am now curious about adding multiple sound effects", so I gave the answer to that.
Thanks for downvoting my correct answer.
Sorry, I missed that.
How do I set the current time of the audio engine or the player node?
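On that last follow-up: AVAudioPlayerNode has no seekable currentTime property. One common approach, shown here as a sketch (the helper function and its parameter names are my own, not from the thread), is to stop the player and reschedule a segment of the file starting at the desired frame:

```swift
import AVFoundation

// Hypothetical helper: "seek" by rescheduling the remainder of the file.
func seek(player: AVAudioPlayerNode, in file: AVAudioFile, to seconds: Double) {
    let sampleRate = file.processingFormat.sampleRate
    let startFrame = AVAudioFramePosition(seconds * sampleRate)
    let remaining = AVAudioFrameCount(max(0, file.length - startFrame))
    guard remaining > 0 else { return }    // requested time is past the end

    player.stop()                          // drops anything already scheduled
    player.scheduleSegment(file,
                           startingFrame: startFrame,
                           frameCount: remaining,
                           at: nil,
                           completionHandler: nil)
    player.play()
}
```

For reading the current playback time, the usual route is player.lastRenderTime combined with player.playerTime(forNodeTime:), which yields a sample time you can divide by the sample rate.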