I have a fairly trivial request: I need seekable sound playback. The problem is that I don't have a local sound file; instead, I get a raw pointer to the sound data (along with other parameters) from my internal native lib.
Here is the method I use to convert the UnsafeRawPointer to an AVAudioPCMBuffer:
...
var byteCount: Int32 = 0
var buffer: UnsafeMutableRawPointer?
defer {
    buffer?.deallocate()
    buffer = nil
}
if (audioReader?.getAudioByteData(byteCount: &byteCount, data: &buffer) ?? false), buffer != nil {
    let audioFormat = AVAudioFormat(standardFormatWithSampleRate: Double(audioSampleRate),
                                    channels: AVAudioChannelCount(audioChannels))!
    // byteCount is in bytes; each Int16 sample is 2 bytes, so the sample count is byteCount / 2
    let sampleCount = Int(byteCount) / MemoryLayout<Int16>.size
    if let pcmBuf = AVAudioPCMBuffer(pcmFormat: audioFormat, frameCapacity: AVAudioFrameCount(sampleCount)) {
        let monoChannel = pcmBuf.floatChannelData![0]
        pcmFloatData = [Float](repeating: 0.0, count: sampleCount)
        //>>> Convert UnsafeMutableRawPointer to [Int16] array
        let int16Ptr: UnsafeMutablePointer<Int16> = buffer!.bindMemory(to: Int16.self, capacity: sampleCount)
        let int16Buffer: UnsafeBufferPointer<Int16> = UnsafeBufferPointer(start: int16Ptr, count: sampleCount)
        let int16Arr: [Int16] = Array(int16Buffer)
        //<<<
        // Int16 ranges from -32768 to 32767 -- convert and scale to Float values in [-1.0, 1.0]
        var scale = Float(Int16.max) + 1.0
        vDSP_vflt16(int16Arr, 1, &pcmFloatData[0], 1, vDSP_Length(int16Arr.count)) // Int16 to Float
        vDSP_vsdiv(pcmFloatData, 1, &scale, &pcmFloatData[0], 1, vDSP_Length(int16Arr.count)) // divide by scale
        memcpy(monoChannel, pcmFloatData, MemoryLayout<Float>.size * int16Arr.count)
        pcmBuf.frameLength = AVAudioFrameCount(int16Arr.count)
        usagePlayer.setupAudioEngine(with: audioFormat)
        audioClip = pcmBuf
    }
}
...
So, at the end of the method, you can see the line audioClip = pcmBuf, where the prepared pcmBuf is assigned to a local variable.
Then all I need to do is start it like this:
...
/*player is AVAudioPlayerNode*/
player.scheduleBuffer(audioClip, at: nil, options: .loops)
player.play()
...
And that is it, now I can hear the sound. But let's say I need to seek forward by 10 seconds. To do this, I would stop() the player node and schedule a new AVAudioPCMBuffer, this time starting at a 10-second offset.
The problem is that there is no offset parameter, neither on the player node side nor on AVAudioPCMBuffer.
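To illustrate what I mean, the only workaround I can imagine is manually copying the tail of the prepared buffer into a fresh AVAudioPCMBuffer and scheduling that instead. Here is a rough sketch of that idea; `makeOffsetBuffer` is a hypothetical helper of mine, and it assumes a non-interleaved Float32 format like the one produced above:

```swift
import AVFoundation

// Hypothetical helper: copy everything after `offsetSeconds` from `sourceBuf`
// into a new buffer, which can then be scheduled on the player node.
// Assumes a non-interleaved (planar) Float32 PCM format.
func makeOffsetBuffer(from sourceBuf: AVAudioPCMBuffer,
                      offsetSeconds: Double) -> AVAudioPCMBuffer? {
    let sampleRate = sourceBuf.format.sampleRate
    let offsetFrames = AVAudioFrameCount(offsetSeconds * sampleRate)
    guard offsetFrames < sourceBuf.frameLength else { return nil }

    let remaining = sourceBuf.frameLength - offsetFrames
    guard let outBuf = AVAudioPCMBuffer(pcmFormat: sourceBuf.format,
                                        frameCapacity: remaining) else { return nil }

    // floatChannelData points at one contiguous plane of samples per channel
    for ch in 0..<Int(sourceBuf.format.channelCount) {
        let src = sourceBuf.floatChannelData![ch].advanced(by: Int(offsetFrames))
        memcpy(outBuf.floatChannelData![ch], src,
               Int(remaining) * MemoryLayout<Float>.size)
    }
    outBuf.frameLength = remaining
    return outBuf
}
```

This would work, but it duplicates the sample data on every seek, which is why I was hoping for a built-in offset parameter.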
For example, if I were working with a file (instead of a buffer), I could use this method:
...
player.scheduleSegment(
    file,
    startingFrame: seekFrame,
    frameCount: frameCount,
    at: nil
)
...
There, at least, you have the startingFrame: seekFrame and frameCount: frameCount parameters.
But in my case I don't use a file, I use a buffer, and there are no such parameters in the buffer-based API.
It looks like I can't implement seek logic at all if I use AVAudioPCMBuffer.
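(For reference, computing seekFrame from a time in seconds is just sample-rate arithmetic; the values below are hypothetical:)

```swift
// Hypothetical values: 44.1 kHz audio, seeking 10 seconds forward.
let sampleRate = 44_100.0
let seekSeconds = 10.0

// AVAudioFramePosition is a typealias for Int64; frames = seconds * sample rate.
let seekFrame = Int64(seekSeconds * sampleRate)
print(seekFrame) // prints 441000
```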
What am I doing wrong?