I have an AVMutableAudioMix and use an MTAudioProcessingTap to process the audio data. But after I pass the buffer to AVAudioEngine and render it with renderOffline, the audio has no effects at all... How can I do this? Any ideas?
Here is the code for the MTAudioProcessingTapProcessCallback:
var callback = MTAudioProcessingTapCallbacks(
    version: kMTAudioProcessingTapCallbacksVersion_0,
    clientInfo: UnsafeMutableRawPointer(Unmanaged.passUnretained(self.engine).toOpaque()),
    init: tapInit,
    finalize: tapFinalize,
    prepare: tapPrepare,
    unprepare: tapUnprepare
) { tap, numberFrames, flags, bufferListInOut, numberFramesOut, flagsOut in
    guard MTAudioProcessingTapGetSourceAudio(tap, numberFrames, bufferListInOut, flagsOut, nil, numberFramesOut) == noErr else {
        preconditionFailure()
    }
    let storage = MTAudioProcessingTapGetStorage(tap)
    let engine = Unmanaged<Engine>.fromOpaque(storage).takeUnretainedValue()
    // render the audio with effect
    engine.render(bufferPtr: bufferListInOut, numberOfFrames: numberFrames)
}
And here is the Engine code
class Engine {
    let engine = AVAudioEngine()
    let player = AVAudioPlayerNode()
    let pitchEffect = AVAudioUnitTimePitch()
    let reverbEffect = AVAudioUnitReverb()
    let rateEffect = AVAudioUnitVarispeed()
    let volumeEffect = AVAudioUnitEQ()
    let format = AVAudioFormat(commonFormat: .pcmFormatFloat32, sampleRate: 44100, channels: 2, interleaved: false)!

    init() {
        engine.attach(player)
        engine.attach(pitchEffect)
        engine.attach(reverbEffect)
        engine.attach(rateEffect)
        engine.attach(volumeEffect)
        engine.connect(player, to: pitchEffect, format: format)
        engine.connect(pitchEffect, to: reverbEffect, format: format)
        engine.connect(reverbEffect, to: rateEffect, format: format)
        engine.connect(rateEffect, to: volumeEffect, format: format)
        engine.connect(volumeEffect, to: engine.mainMixerNode, format: format)
        try! engine.enableManualRenderingMode(.offline, format: format, maximumFrameCount: 4096)
        reverbEffect.loadFactoryPreset(.largeRoom2)
        reverbEffect.wetDryMix = 100
        pitchEffect.pitch = 2100
        try! engine.start()
        player.play()
    }

    func render(bufferPtr: UnsafeMutablePointer<AudioBufferList>, numberOfFrames: CMItemCount) {
        let buffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: 4096)!
        buffer.frameLength = AVAudioFrameCount(numberOfFrames)
        buffer.mutableAudioBufferList.pointee = bufferPtr.pointee
        self.player.scheduleBuffer(buffer) {
            try! self.engine.renderOffline(AVAudioFrameCount(numberOfFrames), to: buffer)
        }
    }
}
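For reference, here is a minimal sketch of one way the render body could move samples through the offline engine: copy the tap's samples into a separate input buffer, pull a processed output buffer synchronously with renderOffline, and copy the result back into bufferListInOut so the audio mix plays the processed audio. This is only a sketch under the assumption that the tap delivers non-interleaved Float32 stereo matching format; the renderTapBuffer name is hypothetical.

func renderTapBuffer(bufferPtr: UnsafeMutablePointer<AudioBufferList>, numberOfFrames: CMItemCount) {
    let frames = AVAudioFrameCount(numberOfFrames)
    guard let input = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: frames),
          let output = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: frames) else { return }
    input.frameLength = frames

    // Copy the tap's samples into the input buffer (assumes matching formats).
    let src = UnsafeMutableAudioBufferListPointer(bufferPtr)
    let dst = UnsafeMutableAudioBufferListPointer(input.mutableAudioBufferList)
    for i in 0..<min(src.count, dst.count) {
        memcpy(dst[i].mData, src[i].mData, Int(src[i].mDataByteSize))
    }

    // Schedule the copy and pull the processed result synchronously.
    player.scheduleBuffer(input, completionHandler: nil)
    guard (try? engine.renderOffline(frames, to: output)) == .success else { return }

    // Write the processed samples back into the tap's buffer list.
    let rendered = UnsafeMutableAudioBufferListPointer(output.mutableAudioBufferList)
    for i in 0..<min(src.count, rendered.count) {
        memcpy(src[i].mData, rendered[i].mData, Int(src[i].mDataByteSize))
    }
}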
The GarageBand app can import both MIDI and recorded audio files and play them together in a single player.
My app needs the same feature, but I don't know how to implement it.
I have tried AVAudioSequencer, but it can only load and play MIDI files.
I have tried AVPlayer and AVPlayerItem, but they don't seem to be able to load MIDI files.
So how can I combine a MIDI file and an audio file into a single AVPlayerItem, or anything else, to play?
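One direction that might work is to skip AVPlayer and drive both sources from a single AVAudioEngine: an AVAudioUnitSampler played by an AVAudioSequencer for the MIDI file, plus an AVAudioPlayerNode for the recorded audio. This is only a sketch; midiURL, audioURL, and soundFontURL are hypothetical placeholders, and a SoundFont/DLS bank is assumed for the sampler.

import AVFoundation
import AudioToolbox

func playMIDIAndAudioTogether(midiURL: URL, audioURL: URL, soundFontURL: URL) throws {
    // In a real app, keep the engine, sequencer, and player alive as stored properties.
    let engine = AVAudioEngine()
    let sampler = AVAudioUnitSampler()   // renders the MIDI notes
    let player = AVAudioPlayerNode()     // plays the recorded audio file

    engine.attach(sampler)
    engine.attach(player)
    engine.connect(sampler, to: engine.mainMixerNode, format: nil)
    engine.connect(player, to: engine.mainMixerNode, format: nil)

    try sampler.loadSoundBankInstrument(at: soundFontURL,
                                        program: 0,
                                        bankMSB: UInt8(kAUSampler_DefaultMelodicBankMSB),
                                        bankLSB: UInt8(kAUSampler_DefaultBankLSB))

    // The sequencer plays the MIDI file through the nodes of this same engine.
    let sequencer = AVAudioSequencer(audioEngine: engine)
    try sequencer.load(from: midiURL, options: [])

    let audioFile = try AVAudioFile(forReading: audioURL)

    try engine.start()
    player.scheduleFile(audioFile, at: nil, completionHandler: nil)

    // Start both sources together so the MIDI and the audio stay in sync.
    player.play()
    sequencer.prepareToPlay()
    try sequencer.start()
}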
This is the crash log from Firebase.
Fatal Exception: NSInvalidArgumentException
*** -[AVAssetWriter addInput:] Format ID 'lpcm' is not compatible with file type com.apple.m4a-audio
But I can't reproduce the crash ...
This is the demo code.
Does anyone know where the problem is?
let normalOutputSettings: [String: Any] = [
    AVFormatIDKey: kAudioFormatLinearPCM,
    AVSampleRateKey: 44100,
    AVNumberOfChannelsKey: 2,
    AVLinearPCMBitDepthKey: 16,
    AVLinearPCMIsNonInterleaved: false,
    AVLinearPCMIsFloatKey: false,
    AVLinearPCMIsBigEndianKey: false
]
let writerInput = AVAssetWriterInput(mediaType: .audio, outputSettings: normalOutputSettings)
let outputURL = URL(fileURLWithPath: NSTemporaryDirectory() + UUID().uuidString + ".m4a")
// The crash log shows the writer is created with the m4a file type (com.apple.m4a-audio).
self.writer = try! AVAssetWriter(outputURL: outputURL, fileType: .m4a)
writer?.add(writerInput)
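The exception message itself names the mismatch: linear PCM ('lpcm') output settings are not accepted for the com.apple.m4a-audio (AVFileType.m4a) container. Below is a minimal sketch of settings that .m4a does accept; the aacOutputSettings name and the bit rate are illustrative.

let aacOutputSettings: [String: Any] = [
    AVFormatIDKey: kAudioFormatMPEG4AAC,
    AVSampleRateKey: 44100,
    AVNumberOfChannelsKey: 2,
    AVEncoderBitRateKey: 128_000
]
let writerInput = AVAssetWriterInput(mediaType: .audio, outputSettings: aacOutputSettings)
// Alternatively, keep the linear PCM settings and write to a container that
// supports them, such as AVFileType.wav or AVFileType.caf.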
On iOS 18 there is a colorful wave around the edge of the screen when using Siri, and I want to implement a similar effect in my app. I have tried many approaches without success. Any ideas?
I'm using AVFoundation to build a multi-track editor app that can insert multiple tracks and clips, including scaling some clips to change their playback speed (I'm also not sure whether AVFoundation is the best choice for this). After scaling a clip with the scaleTimeRange API, there is a short burst of noise during playback. Sometimes playback of the AVMutableComposition through AVPlayer with an AVPlayerItem is fine, yet after exporting with AVAssetReader the resulting file still contains short bursts of noise... I'm not sure why.
Here is an example project, which you can build and run directly: https://github.com/luckysmg/daily_images/raw/refs/heads/main/TestDemo.zip
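For reference, here is a minimal sketch of the kind of scaleTimeRange edit described above, assuming asset is the source AVAsset and the clip should play at half speed; all names are illustrative.

import AVFoundation

func makeScaledComposition(from asset: AVAsset) throws -> AVMutableComposition {
    let composition = AVMutableComposition()
    guard let track = composition.addMutableTrack(withMediaType: .audio,
                                                  preferredTrackID: kCMPersistentTrackID_Invalid),
          let sourceTrack = asset.tracks(withMediaType: .audio).first else {
        return composition
    }

    // Insert the whole clip, then stretch it to twice its duration (half speed).
    let range = CMTimeRange(start: .zero, duration: asset.duration)
    try track.insertTimeRange(range, of: sourceTrack, at: .zero)
    track.scaleTimeRange(range, toDuration: CMTimeMultiply(asset.duration, multiplier: 2))
    return composition
}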