According to the Apple docs, you should be able to connect audio nodes to an AVAudioEngine instance at runtime, but I'm getting a crash when I try to do so. Specifically, the crash happens when connecting an instance of AVAudioUnitTimePitch or AVAudioUnitVarispeed to an AVAudioEngine that has manual rendering mode enabled. The error message I get is:
Terminating app due to uncaught exception 'com.apple.coreaudio.avfaudio', reason:
'player started when in a disconnected state'
In my code, first, I configure the audio engine:
let engine = AVAudioEngine()
let format = AVAudioFormat(standardFormatWithSampleRate: 48000, channels: 2)!
try! engine.enableManualRenderingMode(.offline, format: format, maximumFrameCount: 1024)
try! engine.start()
Then, I try to attach the player to the engine:
let player = AVAudioPlayerNode()
configureEngine(player: player, useVarispeed: true)
player.play() // this is the line that causes the crash
Finally, this is the function I use to configure the engine nodes graph:
func configureEngine(player: AVAudioPlayerNode, useVarispeed: Bool) {
    engine.attach(player)

    guard useVarispeed else {
        engine.connect(player, to: engine.mainMixerNode, format: format)
        return
    }

    let varispeed = AVAudioUnitVarispeed()
    engine.attach(varispeed)
    engine.connect(player, to: varispeed, format: format)
    engine.connect(varispeed, to: engine.mainMixerNode, format: format)
}
If I pass false for the useVarispeed parameter, the crash goes away. What is even more interesting: if I attach a dummy player node before starting the engine, the crash goes away too 🤷‍♂️
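For reference, the dummy-node workaround mentioned above looks roughly like this (the dummy player is purely my own hack, nothing suggested by the docs):

```swift
import AVFoundation

let engine = AVAudioEngine()
let format = AVAudioFormat(standardFormatWithSampleRate: 48000, channels: 2)!
try! engine.enableManualRenderingMode(.offline, format: format, maximumFrameCount: 1024)

// Workaround: attach and connect a dummy player *before* starting the engine.
let dummy = AVAudioPlayerNode()
engine.attach(dummy)
engine.connect(dummy, to: engine.mainMixerNode, format: format)

try! engine.start()

// With the dummy node in place, wiring a real player through a varispeed
// node after start no longer triggers the "disconnected state" crash.
let player = AVAudioPlayerNode()
let varispeed = AVAudioUnitVarispeed()
engine.attach(player)
engine.attach(varispeed)
engine.connect(player, to: varispeed, format: format)
engine.connect(varispeed, to: engine.mainMixerNode, format: format)
player.play()
```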
Could anyone please clarify what's going on here? Is this a bug, or a limitation of the framework that I'm not aware of?
Here's a simple project demonstrating the problem: https://github.com/rlaguilar/AVAudioEngineBug
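In case it's useful context, here's roughly how I pull rendered audio from the engine once it's configured (a minimal sketch with no players attached; the total frame count is an arbitrary example value):

```swift
import AVFoundation

// Minimal engine in offline manual rendering mode, just to show how
// renderOffline(_:to:) is driven. With no inputs attached, it renders silence.
let engine = AVAudioEngine()
let format = AVAudioFormat(standardFormatWithSampleRate: 48000, channels: 2)!
try! engine.enableManualRenderingMode(.offline, format: format, maximumFrameCount: 1024)
try! engine.start()

let buffer = AVAudioPCMBuffer(pcmFormat: engine.manualRenderingFormat,
                              frameCapacity: engine.manualRenderingMaximumFrameCount)!
var framesToRender: AVAudioFrameCount = 48_000 // one second at 48 kHz, for example

while framesToRender > 0 {
    let count = min(framesToRender, buffer.frameCapacity)
    switch try! engine.renderOffline(count, to: buffer) {
    case .success:
        framesToRender -= buffer.frameLength // buffer now holds rendered audio
    case .insufficientDataFromInputNode, .cannotDoInCurrentContext:
        continue // no input available yet; retry
    case .error:
        fatalError("Manual rendering failed")
    @unknown default:
        break
    }
}
```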
I'm trying to use the sample code associated with the talk Author fragmented MPEG-4 content with AVAssetWriter, which can be found here.
It works well when I run it on macOS, but after adapting it to run on iOS (basically moving the code from the main file into a view controller), it doesn't work. The problem is that the function:
assetWriter(_:didOutputSegmentData:segmentType:segmentReport:)
is never called for the last segment.
On macOS, the last segment is reported after calling AVAssetWriter.finishWriting(completionHandler:), but before the completionHandler block is invoked. On iOS, nothing happens at that point.
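For reference, my adaptation sets up the delegate-based writer roughly like this sketch (the 6-second segment interval is an example value of my own, not necessarily what the sample uses):

```swift
import AVFoundation
import UniformTypeIdentifiers

// Rough sketch of a delegate-based fragmented-MP4 writer setup.
final class SegmentWriter: NSObject, AVAssetWriterDelegate {
    let writer: AVAssetWriter

    override init() {
        writer = AVAssetWriter(contentType: .mpeg4Movie)
        super.init()
        writer.outputFileTypeProfile = .mpeg4AppleHLS
        writer.preferredOutputSegmentInterval = CMTime(value: 6, timescale: 1)
        writer.initialSegmentStartTime = .zero
        writer.delegate = self
    }

    func assetWriter(_ writer: AVAssetWriter,
                     didOutputSegmentData segmentData: Data,
                     segmentType: AVAssetSegmentType,
                     segmentReport: AVAssetSegmentReport?) {
        // On macOS this is also called for the final segment during
        // finishWriting(completionHandler:); on iOS that last call never happens.
        print("Got segment of \(segmentData.count) bytes, type \(segmentType.rawValue)")
    }

    func finish() {
        writer.finishWriting {
            // By this point on iOS, the last segment has not been delivered.
        }
    }
}
```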
Is there anything I could do from my side to fix this problem?
Thanks in advance!