Dive into the technical aspects of audio on your device, including codecs, format support, and customization options.

Audio

Posts under Audio subtopic

Post | Replies | Boosts | Views | Activity

AVPlayerItem.externalMetadata not available
According to the documentation (https://developer.apple.com/documentation/avfoundation/avplayeritem/externalmetadata), AVPlayerItem should have an externalMetadata property. However, it does not appear to be visible to my app. When I try to use it, I get:

Value of type 'AVPlayerItem' has no member 'externalMetadata'

The documentation states iOS 12.2+, and I am building with a minimum deployment target of iOS 18. Code snippet:

import Foundation
import AVFoundation

/// ... in function ...
// create metadata as described in https://developer.apple.com/videos/play/wwdc2022/110338
var title = AVMutableMetadataItem()
title.identifier = .commonIdentifierAlbumName
title.value = "My Title" as NSString?
title.extendedLanguageTag = "und"

var playerItem = await AVPlayerItem(asset: composition)
playerItem.externalMetadata = [ title ]
Replies: 0 · Boosts: 0 · Views: 87 · Activity: Apr ’25
Unable to match music with ShazamKit for Android
Hello, I can successfully match music with ShazamKit on Apple platforms using SwiftUI (a simple app that lets the user load an audio file and extracts the corresponding match), but I am unable to match music with ShazamKit on Android. I am trying to build the same simple app, but I cannot match anything: I get MATCH_ATTEMPT_FAILED every time I try. I don't know what I am doing wrong; the Shazam part of the Kotlin Android code is in this method:

suspend fun processAudioFileInBackground(
    filePath: String,
    developerTokenProvider: DeveloperTokenProvider
) = withContext(Dispatchers.IO) {
    val bufferSize = 1024 * 1024
    val audioFile = FileInputStream(filePath)
    val byteBuffer = ByteBuffer.allocate(bufferSize)
    byteBuffer.order(ByteOrder.LITTLE_ENDIAN)
    var bytesRead: Int
    while (audioFile.read(byteBuffer.array()).also { bytesRead = it } != -1) {
        val signatureGenerator = (ShazamKit.createSignatureGenerator(AudioSampleRateInHz.SAMPLE_RATE_44100) as ShazamKitResult.Success).data
        signatureGenerator.append(byteBuffer.array(), bytesRead, System.currentTimeMillis())
        val signature = signatureGenerator.generateSignature()
        println("Signature: ${signature.durationInMs}")
        val catalog = ShazamKit.createShazamCatalog(developerTokenProvider, Locale.ENGLISH)
        val session = (ShazamKit.createSession(catalog) as ShazamKitResult.Success).data
        val matchResult = session.match(signature)
        println("MatchResult : $matchResult")
        setMatchResult(matchResult)
        byteBuffer.clear()
    }
    audioFile.close()
}

I noticed that changing the Locale in the catalog creation gives a different result: I get NoMatch without an exception. Can you please help me with this? Do I need to create a custom catalog?
Replies: 0 · Boosts: 0 · Views: 122 · Activity: May ’25
Can a Location-Based Audio AR Experience Run in the Background on iOS?
Hi everyone! I’ve developed a location-based audio AR app in Unity with FMOD & Resonance Audio and AirPods Pro head tracking to create a ubiquitous augmented-soundscape experience. Think of it as an audio version of Pokémon Go, but with a more precise location requirement so that spatial audio is placed correctly. I want this experience to run in the background on iOS, but from what I’ve gathered, Unity doesn’t support this well, so I’m considering developing a Swift version instead. Since this is primarily for research purposes, privacy concerns are not a major issue in my case. However, I’ve come across some potential challenges (see the sketch after this entry):

Real-time precise location updates: can iOS provide fully instantaneous, high-accuracy location updates in the background?
Continuous real-time data processing: can an app continuously process spatial audio, head tracking, and location data while running in the background?

I’m not sure whether newer iOS versions have improved in these areas or whether there are workarounds to achieve this. Would this kind of experience be feasible to run in the background on iOS? Any insights or pointers would be greatly appreciated! I’m very new to iOS development, so apologies if this is a basic question. Thanks in advance!
Replies: 0 · Boosts: 0 · Views: 102 · Activity: Apr ’25
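On the background-location question above: Core Location can keep delivering high-accuracy fixes in the background if the app has the location background mode and "Always" authorization, though delivery is not guaranteed to be as frequent as in the foreground, and the audio and head-tracking side has its own constraints. A minimal sketch covering only the location part; BackgroundLocationProvider is a hypothetical name, and the Info.plist usage strings and background-mode capability are assumed rather than shown.

import CoreLocation

/// Minimal sketch: continuous location updates that keep running in the background.
/// Assumes the target has the "location" background mode enabled and the
/// "Always" location usage descriptions in Info.plist.
final class BackgroundLocationProvider: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    func start() {
        manager.delegate = self
        manager.desiredAccuracy = kCLLocationAccuracyBestForNavigation
        manager.requestAlwaysAuthorization()           // background updates need "Always"
        manager.allowsBackgroundLocationUpdates = true // keep delivering fixes when backgrounded
        manager.pausesLocationUpdatesAutomatically = false
        manager.startUpdatingLocation()
    }

    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        guard let latest = locations.last else { return }
        // Feed the fix into the spatial-audio positioning logic here.
        print("Accuracy: \(latest.horizontalAccuracy) m at \(latest.coordinate)")
    }
}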
Handling AVAudioEngine Configuration Change
Hi all, I have been quite stumped by this behavior for a little while now, so I thought it best to share here and see if someone more experienced with AVAudioEngine / AVAudioSession can weigh in. Right now I have an AVAudioEngine that I am using for voice chat, handing it buffers to play. This works perfectly until route changes start to occur, which cause the AVAudioEngine to reset itself, which in turn stops all players attached to the engine. Once an AVAudioPlayerNode gets stopped because of this (but also at any other time), all samples that were scheduled to be played get purged. Where this becomes confusing for me is that the completion handler gets called every time, regardless of whether the sound was actually played. Is there a reliable way to know whether a sample needs to be rescheduled after a player has been reset? I am not quite sure what my observer of AVAudioEngineConfigurationChange needs to do, as this engine only handles output; all input goes through a separate engine for simplicity. Currently I am storing a queue of samples as they get sent to the AVAudioPlayerNode for playback, and in the completion handler I check whether the player isPlaying or not. If it is playing, I assume the sound actually was played; if not, I leave it in the queue and assume that an observer of the route change or configuration change will notice there are samples in the queue and reschedule them (see the sketch after this entry). Thanks for any feedback!
Replies: 3 · Boosts: 0 · Views: 816 · Activity: Oct ’25
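A minimal sketch of one way to react to the notification discussed above, for an output-only engine: reconnect, restart, and reschedule whatever your own bookkeeping says was never rendered. This is my illustration rather than code from the thread; PlaybackEngine, pendingBuffers, and handleConfigurationChange are hypothetical names, and the queue-management details are left out.

import AVFoundation

final class PlaybackEngine {
    private let engine = AVAudioEngine()
    private let player = AVAudioPlayerNode()
    private var pendingBuffers: [AVAudioPCMBuffer] = []   // app-side bookkeeping, not an AVFoundation API
    private var configObserver: NSObjectProtocol?

    init() {
        engine.attach(player)
        engine.connect(player, to: engine.mainMixerNode, format: nil)

        // The engine posts this after a route change has forced it to reset.
        configObserver = NotificationCenter.default.addObserver(
            forName: .AVAudioEngineConfigurationChange,
            object: engine,
            queue: .main
        ) { [weak self] _ in
            self?.handleConfigurationChange()
        }
    }

    private func handleConfigurationChange() {
        // After the reset the engine is stopped and player nodes have dropped
        // their scheduled buffers, so re-establish the graph and restart.
        engine.connect(player, to: engine.mainMixerNode, format: nil)
        do {
            try engine.start()
            player.play()
            // Reschedule anything the queue says was never actually played.
            for buffer in pendingBuffers {
                player.scheduleBuffer(buffer, completionHandler: nil)
            }
        } catch {
            print("Failed to restart engine after configuration change: \(error)")
        }
    }
}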
MusicKit Web Playback States
In MusicKit on the web, the playback states are provided as numbers. For example, the playbackStateDidChange event listener returns: {oldState: 2, state: 3, item: ...} when the state changes from playing (2) to paused (3). Those two are easy to guess, but I'm having a hard time with some of the others: completed, ended, loading, none, paused, playing, seeking, stalled, stopped, waiting. I cannot find a mapping of states to numbers documented anywhere. I got the state names above from an enum in a d.ts file that is often incorrect or incomplete. Can someone point me to the docs or provide a mapping? Thanks.
Replies: 2 · Boosts: 0 · Views: 444 · Activity: Feb ’25
Why does AVAudioRecorder show 8 kHz when iPhone hardware is 48 kHz?
Hi everyone, I’m testing audio recording on an iPhone 15 Plus using AVFoundation. Here’s a simplified version of my setup:

let settings: [String: Any] = [
    AVFormatIDKey: Int(kAudioFormatLinearPCM),
    AVSampleRateKey: 8000,
    AVNumberOfChannelsKey: 1,
    AVLinearPCMBitDepthKey: 16,
    AVLinearPCMIsFloatKey: false
]
audioRecorder = try AVAudioRecorder(url: fileURL, settings: settings)
audioRecorder?.record()

When I check the recorded file’s sample rate, it logs:

Actual sample rate: 8000.0

However, when I inspect the hardware sample rate:

try session.setCategory(.playAndRecord, mode: .default)
try session.setActive(true)
print("Hardware sample rate:", session.sampleRate)

I consistently get:

Hardware sample rate: 48000.0

My questions are (see the sketch after this entry):
Is the iPhone mic actually capturing at 8 kHz, or is it recording at 48 kHz and then downsampling to 8 kHz internally?
Is there any way to force the hardware to record natively at 8 kHz?
If not, what’s the recommended approach for telephony-quality audio (true 8 kHz) on iOS devices?

Thanks in advance for your guidance!
Replies: 1 · Boosts: 0 · Views: 248 · Activity: Sep ’25
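On the questions above: the recorder converts to the requested 8 kHz when it writes the file, while the hardware route typically stays at 48 kHz. You can ask the session for a lower rate, but it is only a preference and the hardware may not honor it. A minimal sketch (my illustration; configureTelephonyRateSession is a hypothetical name):

import AVFoundation

func configureTelephonyRateSession() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord, mode: .default)

    // This is a request, not a guarantee: the system picks the closest rate
    // the current route supports, so check session.sampleRate afterwards.
    try session.setPreferredSampleRate(8_000)
    try session.setActive(true)

    print("Requested 8000 Hz, got \(session.sampleRate) Hz")
    // If the hardware stays at 48 kHz, record at the hardware rate and
    // downsample to 8 kHz yourself (e.g. with AVAudioConverter), or let
    // AVAudioRecorder's settings perform the conversion as in the post above.
}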
The files generated using AVAudioRecorder have a constant size of only 4 KB
Hello. My app uses AVAudioRecorder to generate recording files, which are sometimes only 4 KB in size. Most users generate audio files normally; only a few users experience this occasionally. After uninstalling and reinstalling the app it works normally, but the problem reappears after a period of time. I have confirmed that the problematic audio files are always the same size and cannot be played. I added the audioRecorderDidFinishRecording delegate method, which indicates that the recording completed normally. The user also reports that recording appears normal, but there is a problem with the generated file. How should I handle this issue? (See the sketch after this entry.) I look forward to your reply.

- (void)startRecordWithOrderID:(NSString *)orderID {
    AVAudioSession *audioSession = [AVAudioSession sharedInstance];
    [audioSession setCategory:AVAudioSessionCategoryRecord error:nil];
    [audioSession setActive:YES error:nil];

    NSMutableDictionary *settings = [[NSMutableDictionary alloc] init];
    [settings setObject:[NSNumber numberWithFloat:8000.0] forKey:AVSampleRateKey];
    [settings setObject:[NSNumber numberWithInt:kAudioFormatLinearPCM] forKey:AVFormatIDKey];
    [settings setObject:[NSNumber numberWithInt:16] forKey:AVLinearPCMBitDepthKey];
    [settings setObject:[NSNumber numberWithInt:1] forKey:AVNumberOfChannelsKey];
    [settings setObject:[NSNumber numberWithBool:NO] forKey:AVLinearPCMIsBigEndianKey];
    [settings setObject:[NSNumber numberWithBool:NO] forKey:AVLinearPCMIsFloatKey];

    NSString *path = [WDUtility createDirInDocument:@"audios" withOrderID:orderID withPathExtension:@"wav"];
    NSURL *tmpFile = [NSURL fileURLWithPath:path];

    recorder = [[AVAudioRecorder alloc] initWithURL:tmpFile settings:settings error:nil];
    [recorder setDelegate:self];
    [recorder prepareToRecord];
    [recorder record];
}
Replies: 0 · Boosts: 0 · Views: 171 · Activity: Jul ’25
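A note on the post above: at 8 kHz, 16-bit mono PCM is about 16 KB per second, so a fixed 4 KB file means almost no audio data was ever written, which usually points to recording failing to start (session activation, microphone permission, or record() returning NO) rather than to the settings themselves. A minimal Swift sketch, my own illustration rather than a fix confirmed by the thread, that surfaces those failures instead of passing error:nil; startRecording is a hypothetical helper.

import AVFoundation

func startRecording(to fileURL: URL, delegate: AVAudioRecorderDelegate) throws -> AVAudioRecorder {
    let session = AVAudioSession.sharedInstance()

    // Surface activation problems instead of discarding them (the Objective-C
    // version passes error:nil, so a failure here would go unnoticed).
    try session.setCategory(.record)
    try session.setActive(true)

    guard session.recordPermission == .granted else {
        throw NSError(domain: "Recording", code: 1,
                      userInfo: [NSLocalizedDescriptionKey: "Microphone permission not granted"])
    }

    let settings: [String: Any] = [
        AVFormatIDKey: kAudioFormatLinearPCM,
        AVSampleRateKey: 8_000.0,
        AVNumberOfChannelsKey: 1,
        AVLinearPCMBitDepthKey: 16,
        AVLinearPCMIsFloatKey: false,
        AVLinearPCMIsBigEndianKey: false
    ]

    let recorder = try AVAudioRecorder(url: fileURL, settings: settings)
    recorder.delegate = delegate

    // record() returns false when recording could not start; treat that as an error.
    guard recorder.record() else {
        throw NSError(domain: "Recording", code: 2,
                      userInfo: [NSLocalizedDescriptionKey: "AVAudioRecorder.record() returned false"])
    }
    return recorder
}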
AirPods Pro 3 consistently disconnecting from Apple Watch Ultra 3
I have both Apple devices; my AirPods Pro 3 are up to date and the Ultra 3 is on watchOS 26.1, the latest public beta. Each morning when I open my mindfulness app and start a meditation, or listen to Apple Music on my watch with the AirPods Pro 3, it plays for a few seconds and then disconnects. The Bluetooth settings on my watch say the AirPods are connected to the watch. I have also removed the tick for connecting automatically to iPhone in the AirPods settings on my iPhone. To fix this I invariably turn my Apple Watch Ultra 3 off and on again; then the connection becomes stable. I am not sure why I have to do this each morning, and it is frustrating that the fix does not last. Is there something wrong with my AirPods? Has anyone encountered this before?
Replies: 1 · Boosts: 0 · Views: 621 · Activity: Oct ’25
Switching default input/output channels using Core Audio
I wrote a Swift macOS app to control a PCI audio device. The code switches between the default output and input channels. As soon as I launch the Audio MIDI Setup utility, channel switching stops working. The driver properties allow switching, but the system doesn't respond. I have to delete the contents of /Library/Preferences/Audio and reset Core Audio. What am I missing? (See the read-back sketch after this entry.)

func setDefaultChannelsOutput() {
    guard let deviceID = getDeviceIDByName(deviceName: "PCI-424") else { return }

    let selectedIndex = DefaultChannelsOutput.indexOfSelectedItem
    if selectedIndex < 0 || selectedIndex >= 24 { return }

    let channel1 = UInt32(selectedIndex * 2 + 1)
    let channel2 = UInt32(selectedIndex * 2 + 2)
    var channels: [UInt32] = [channel1, channel2]

    var propertyAddress = AudioObjectPropertyAddress(
        mSelector: kAudioDevicePropertyPreferredChannelsForStereo,
        mScope: kAudioDevicePropertyScopeOutput,
        mElement: kAudioObjectPropertyElementWildcard
    )

    let dataSize = UInt32(MemoryLayout<UInt32>.size * channels.count)
    let status = AudioObjectSetPropertyData(deviceID, &propertyAddress, 0, nil, dataSize, &channels)
    if status != noErr {
        print("Error setting default output channels: \(status)")
    }
}
Replies: 0 · Boosts: 0 · Views: 285 · Activity: Dec ’25
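A small companion sketch for the post above (my own; currentPreferredStereoPair is a hypothetical helper): reading the preferred stereo pair back after setting it helps distinguish "the property was never written" from "another client, such as Audio MIDI Setup, changed it afterwards".

import CoreAudio

func currentPreferredStereoPair(for deviceID: AudioObjectID) -> (left: UInt32, right: UInt32)? {
    var address = AudioObjectPropertyAddress(
        mSelector: kAudioDevicePropertyPreferredChannelsForStereo,
        mScope: kAudioDevicePropertyScopeOutput,
        mElement: kAudioObjectPropertyElementMain
    )
    var channels: [UInt32] = [0, 0]
    var dataSize = UInt32(MemoryLayout<UInt32>.size * channels.count)

    // Read the pair the HAL currently reports for this device.
    let status = AudioObjectGetPropertyData(deviceID, &address, 0, nil, &dataSize, &channels)
    guard status == noErr else {
        print("Error reading preferred stereo pair: \(status)")
        return nil
    }
    return (channels[0], channels[1])
}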
scaleTimeRange causes noise in the audio
I'm using AVFoundation to build a multi-track editor app that can insert multiple tracks and clips, including scaling a clip to change its speed (I'm also not sure whether AVFoundation is the best choice for this). After scaling with the scaleTimeRange API, there are short bursts of noise during playback. Sometimes playback of the AVMutableComposition is fine through AVPlayer with an AVPlayerItem, but after exporting with AVAssetReader the result file contains short noise bursts. I'm not sure why. Here is an example project, which builds and runs directly (see also the sketch after this entry): https://github.com/luckysmg/daily_images/raw/refs/heads/main/TestDemo.zip
Replies: 0 · Boosts: 0 · Views: 133 · Activity: Jul ’25
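One thing worth checking for the post above (my suggestion, not something confirmed in the thread): when a clip's time range is scaled, the time-pitch algorithm used for the rate change has a large effect on audio quality, and it can be set for playback and for export. A minimal sketch assuming an existing AVMutableComposition named composition; makePlayerItem and makeExportSession are hypothetical helpers.

import AVFoundation

func makePlayerItem(for composition: AVMutableComposition) -> AVPlayerItem {
    let item = AVPlayerItem(asset: composition)
    // Pick an explicit time-pitch algorithm for scaled (speed-changed) audio;
    // .spectral favors quality, .timeDomain favors lower CPU use.
    item.audioTimePitchAlgorithm = .spectral
    return item
}

func makeExportSession(for composition: AVMutableComposition) -> AVAssetExportSession? {
    guard let export = AVAssetExportSession(asset: composition,
                                            presetName: AVAssetExportPresetAppleM4A) else {
        return nil
    }
    // Use the same algorithm on export so the rendered file matches playback.
    export.audioTimePitchAlgorithm = .spectral
    return export
}

The post exports with AVAssetReader rather than AVAssetExportSession; AVAssetReaderAudioMixOutput exposes a corresponding audioTimePitchAlgorithm property for that path.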
Logic Pro: discovering upstream channel latency
Hello everyone, I've written an Audio Unit plug-in that needs to be aware of any upstream latency caused by heavy plug-ins placed before it on the channel. Is there any way to query this? I know that Logic applies PDC (plug-in delay compensation) at the channel's output (the summing point), but I need to know the accumulated latency at the point where the audio enters my plug-in. Thanks!
Replies: 0 · Boosts: 0 · Views: 340 · Activity: Jan ’26
Spatial Audio on iOS 18 doesn't work as intended
I'm facing a problem while trying to achieve spatial audio effects in my iOS 18 app. I have tried several approaches to get good 3D audio, but the effect never felt good enough, or it didn't work at all. What mostly troubles me is that the AirPods I have don't recognize my app as one having spatial audio (in the audio settings it shows "Spatial Audio Not Playing"), so I guess my app doesn't use its spatial audio potential.

The first approach uses AVAudioEnvironmentNode with AVAudioEngine. Changing the position of the player, as well as changing the listener's, doesn't seem to change anything in how the audio plays. Here's how I initialize AVAudioEngine:

import Foundation
import AVFoundation

class AudioManager: ObservableObject {
    // important class variables
    var audioEngine: AVAudioEngine!
    var environmentNode: AVAudioEnvironmentNode!
    var playerNode: AVAudioPlayerNode!
    var audioFile: AVAudioFile?
    ...

    // Sound set up
    func setupAudio() {
        do {
            let session = AVAudioSession.sharedInstance()
            try session.setCategory(.playback, mode: .default, options: [])
            try session.setActive(true)
        } catch {
            print("Failed to configure AVAudioSession: \(error.localizedDescription)")
        }

        audioEngine = AVAudioEngine()
        environmentNode = AVAudioEnvironmentNode()
        playerNode = AVAudioPlayerNode()

        audioEngine.attach(environmentNode)
        audioEngine.attach(playerNode)
        audioEngine.connect(playerNode, to: environmentNode, format: nil)
        audioEngine.connect(environmentNode, to: audioEngine.mainMixerNode, format: nil)

        environmentNode.listenerPosition = AVAudio3DPoint(x: 0, y: 0, z: 0)
        environmentNode.listenerAngularOrientation = AVAudio3DAngularOrientation(yaw: 0, pitch: 0, roll: 0)
        environmentNode.distanceAttenuationParameters.referenceDistance = 1.0
        environmentNode.distanceAttenuationParameters.maximumDistance = 100.0
        environmentNode.distanceAttenuationParameters.rolloffFactor = 2.0

        // example.mp3 is a mono sound
        guard let audioURL = Bundle.main.url(forResource: "example", withExtension: "mp3") else {
            print("Audio file not found")
            return
        }
        do {
            audioFile = try AVAudioFile(forReading: audioURL)
        } catch {
            print("Failed to load audio file: \(error)")
        }
    }
    ...

    // Playing sound
    func playSpatialAudio(pan: Float) {
        guard let audioFile = audioFile else { return }
        // left side
        playerNode.position = AVAudio3DPoint(x: pan, y: 0, z: 0)
        playerNode.scheduleFile(audioFile, at: nil, completionHandler: nil)
        do {
            try audioEngine.start()
            playerNode.play()
        } catch {
            print("Failed to start audio engine: \(error)")
        }
        ...
    }

The second, more complex approach using PHASE did better. I've made an example app that lets the player move an audio source in 3D space. I added reverb and sliders changing the audio position up to 10 meters in each direction from the listener, but the audio only really seems to change left to right (the x axis). Again, I think it might be because the app is not recognized as spatial.

// Crucial class variables:
class PHASEAudioController: ObservableObject {
    private var soundSourcePosition: simd_float4x4 = matrix_identity_float4x4
    private var audioAsset: PHASESoundAsset!
    private let phaseEngine: PHASEEngine
    private let params = PHASEMixerParameters()
    private var soundSource: PHASESource
    private var phaseListener: PHASEListener!
    private var soundEventAsset: PHASESoundEventNodeAsset?

    // Initialization of PHASE
    init() {
        do {
            let session = AVAudioSession.sharedInstance()
            try session.setCategory(.playback, mode: .default, options: [])
            try session.setActive(true)
        } catch {
            print("Failed to configure AVAudioSession: \(error.localizedDescription)")
        }

        // Init PHASE Engine
        phaseEngine = PHASEEngine(updateMode: .automatic)
        phaseEngine.defaultReverbPreset = .mediumHall
        phaseEngine.outputSpatializationMode = .automatic // nothing helps

        // Set listener position to (0,0,0) in world space
        let origin: simd_float4x4 = matrix_identity_float4x4
        phaseListener = PHASEListener(engine: phaseEngine)
        phaseListener.transform = origin
        phaseListener.automaticHeadTrackingFlags = .orientation
        try! self.phaseEngine.rootObject.addChild(self.phaseListener)

        do {
            try self.phaseEngine.start()
        } catch {
            print("Could not start PHASE engine")
        }

        audioAsset = loadAudioAsset()

        // Create sound source
        // Sphere
        soundSourcePosition.translate(z: 3.0)
        let sphere = MDLMesh.newEllipsoid(withRadii: vector_float3(0.1, 0.1, 0.1), radialSegments: 14, verticalSegments: 14, geometryType: MDLGeometryType.triangles, inwardNormals: false, hemisphere: false, allocator: nil)
        let shape = PHASEShape(engine: phaseEngine, mesh: sphere)
        soundSource = PHASESource(engine: phaseEngine, shapes: [shape])
        soundSource.transform = soundSourcePosition
        print(soundSourcePosition)
        do {
            try phaseEngine.rootObject.addChild(soundSource)
        } catch {
            print("Failed to add a child object to the scene.")
        }

        let simpleModel = PHASEGeometricSpreadingDistanceModelParameters()
        simpleModel.rolloffFactor = rolloffFactor
        soundPipeline.distanceModelParameters = simpleModel

        let samplerNode = PHASESamplerNodeDefinition(
            soundAssetIdentifier: audioAsset.identifier,
            mixerDefinition: soundPipeline,
            identifier: audioAsset.identifier + "_SamplerNode")
        samplerNode.playbackMode = .looping
        do {
            soundEventAsset = try phaseEngine.assetRegistry.registerSoundEventAsset(
                rootNode: samplerNode,
                identifier: audioAsset.identifier + "_SoundEventAsset")
        } catch {
            print("Failed to register a sound event asset.")
            soundEventAsset = nil
        }
    }

    // Playing sound
    func playSound() {
        // Fire a new sound event with the currently set properties
        guard let soundEventAsset else { return }
        params.addSpatialMixerParameters(
            identifier: soundPipeline.identifier,
            source: soundSource,
            listener: phaseListener)
        let soundEvent = try! PHASESoundEvent(engine: phaseEngine,
                                              assetIdentifier: soundEventAsset.identifier,
                                              mixerParameters: params)
        soundEvent.start(completion: nil)
    }
    ...
}

Also worth mentioning: I only have a personal team account. (A sketch related to the AVAudioEngine approach follows this entry.)
Replies: 4 · Boosts: 0 · Views: 1k · Activity: Nov ’25
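For the AVAudioEngine approach above, one detail that is easy to miss (my note, not from the thread): AVAudioEnvironmentNode spatializes inputs that are connected with a mono format, and each player node's rendering algorithm can be chosen explicitly. A minimal sketch under those assumptions; connectForSpatialization is a hypothetical helper.

import AVFoundation

func connectForSpatialization(engine: AVAudioEngine,
                              player: AVAudioPlayerNode,
                              environment: AVAudioEnvironmentNode,
                              file: AVAudioFile) {
    // Spatialization applies to mono inputs: connect with an explicit
    // single-channel format at the file's sample rate.
    let monoFormat = AVAudioFormat(standardFormatWithSampleRate: file.processingFormat.sampleRate,
                                   channels: 1)
    engine.connect(player, to: environment, format: monoFormat)
    engine.connect(environment, to: engine.mainMixerNode, format: nil)

    // Ask for an HRTF-based renderer so positional changes are audible on headphones.
    player.renderingAlgorithm = .HRTFHQ

    // Example source position: to the listener's right and slightly in front.
    player.position = AVAudio3DPoint(x: 2, y: 0, z: -1)
}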
How to disable the built-in speakers and microphone on a Mac
I need to implement a solution, through an API or a custom driver, that completely blocks the built-in speakers and microphone of the Mac, because I need other apps to use specified external devices for audio input and output. Is there a way to achieve this? What I mean is that it should not be possible to choose the built-in microphone and speakers even in System Preferences; only my external device should be usable. (See the sketch after this entry.)
Replies: 0 · Boosts: 0 · Views: 123 · Activity: Apr ’25
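I'm not aware of a public API that hides the built-in devices entirely; what an app can do (sketched below, my own illustration and not a full answer to the post) is set the system default input and output devices to the external interface, which steers apps that follow the system default. setSystemDefault is a hypothetical helper, and externalDeviceID is a placeholder you would obtain by enumerating kAudioHardwarePropertyDevices.

import CoreAudio

/// Make the given device the system-wide default output (and optionally input).
func setSystemDefault(device externalDeviceID: AudioObjectID, alsoInput: Bool) {
    var deviceID = externalDeviceID
    let size = UInt32(MemoryLayout<AudioObjectID>.size)

    var outputAddress = AudioObjectPropertyAddress(
        mSelector: kAudioHardwarePropertyDefaultOutputDevice,
        mScope: kAudioObjectPropertyScopeGlobal,
        mElement: kAudioObjectPropertyElementMain
    )
    let outStatus = AudioObjectSetPropertyData(AudioObjectID(kAudioObjectSystemObject),
                                               &outputAddress, 0, nil, size, &deviceID)
    if outStatus != noErr { print("Failed to set default output: \(outStatus)") }

    if alsoInput {
        var inputAddress = outputAddress
        inputAddress.mSelector = kAudioHardwarePropertyDefaultInputDevice
        let inStatus = AudioObjectSetPropertyData(AudioObjectID(kAudioObjectSystemObject),
                                                  &inputAddress, 0, nil, size, &deviceID)
        if inStatus != noErr { print("Failed to set default input: \(inStatus)") }
    }
}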
Accessing the transport state in Logic Pro
Hi, I need to read whether the transport is playing or stopped, but my current method, which works for VST, does not work for AU. Is there a Logic Pro resource available for developers anywhere?

if (auto* playHead = processor->getPlayHead()) {
    juce::AudioPlayHead::CurrentPositionInfo posInfo;
    if (playHead->getCurrentPosition(posInfo)) {
        bool isCurrentlyPlaying = posInfo.isPlaying;
        if (isCurrentlyPlaying != wasTransportPlaying) {
            if (isCurrentlyPlaying) {
                wasTransportPlaying = isCurrentlyPlaying;
                startAllTimers();
            } else {
                wasTransportPlaying = isCurrentlyPlaying;
                stopAllTimers();
            }
        }
    }
}

Thanks :)
Replies: 0 · Boosts: 0 · Views: 289 · Activity: Mar ’25