Dive into the technical aspects of audio on your device, including codecs, format support, and customization options.


Posts under Audio subtopic


Audio Volume increases by itself
I am using an AVAudioPlayer to play a "tick" sound once per second in a SwiftUI app. When running the app on an iPhone 16 (iOS 18.2.1), the tick sound increases in volume after a few seconds. This does not happen in the simulator, nor on an iPhone SE 2020 (iOS 18.1.1).
Replies: 2 · Boosts: 0 · Views: 277 · Jan ’25
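For reference, a minimal sketch of the setup this post describes: an AVAudioPlayer retriggered once per second from a Timer. The asset name and class are illustrative, not taken from the post.

    import AVFoundation

    final class TickPlayer {
        private var player: AVAudioPlayer?
        private var timer: Timer?

        func start() throws {
            // "tick.wav" is a hypothetical bundled asset
            guard let url = Bundle.main.url(forResource: "tick", withExtension: "wav") else { return }
            player = try AVAudioPlayer(contentsOf: url)
            player?.prepareToPlay()
            // Retrigger the sound once per second, as in the post
            timer = Timer.scheduledTimer(withTimeInterval: 1.0, repeats: true) { [weak self] _ in
                self?.player?.currentTime = 0
                self?.player?.play()
            }
        }

        func stop() {
            timer?.invalidate()
            player?.stop()
        }
    }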
Help Needed: How to Make iOS Timer More Stable?
I’m currently developing an iOS metronome app using DispatchSourceTimer as the timer. The interval is set very small, around 50 milliseconds, and I’m using CFAbsoluteTimeGetCurrent to calculate the elapsed time so the beat plays within a ±0.003-second margin. The problem is that once the app goes to the background, the timing becomes unstable: it slows down noticeably, then recovers after 1–2 seconds. When coming back to the foreground, it suddenly speeds up, and again it takes 1–2 seconds to return to normal. It feels like the app is randomly “powering off” and then “overclocking”, which is super frustrating. I’ve noticed that some metronome apps in the App Store have similar issues, but there’s one called “Professional Metronome” that’s rock solid with no such problems. What kind of magic are they using? Any experts out there who can help? Thanks in advance! P.S. I’ve already enabled background audio permissions. The professional metronome that has no issues: https://link.zhihu.com/?target=https%3A//apps.apple.com/cn/app/pro-metronome-%25E4%25B8%2593%25E4%25B8%259A%25E8%258A%2582%25E6%258B%258D%25E5%2599%25A8/id477960671
Replies: 2 · Boosts: 0 · Views: 553 · Jan ’25
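Since the post does not share code, here is a hedged sketch of the kind of drift-compensated timer it describes: a strict 50 ms DispatchSourceTimer that schedules beats against CFAbsoluteTimeGetCurrent rather than counting timer fires. The playTick call and the 120 BPM interval are illustrative.

    import Foundation

    let queue = DispatchQueue(label: "metronome", qos: .userInteractive)
    let timer = DispatchSource.makeTimerSource(flags: .strict, queue: queue)

    let beatInterval: CFAbsoluteTime = 0.5   // 120 BPM, illustrative
    var nextBeat = CFAbsoluteTimeGetCurrent() + beatInterval

    timer.schedule(deadline: .now(), repeating: .milliseconds(50), leeway: .milliseconds(1))
    timer.setEventHandler {
        let now = CFAbsoluteTimeGetCurrent()
        if now >= nextBeat - 0.003 {         // the ±3 ms margin from the post
            // playTick()                    // hypothetical: trigger the click here
            nextBeat += beatInterval
        }
    }
    timer.resume()

Note that background throttling of timers and queues is exactly what the post observes; scheduling audio ahead of time (e.g. AVAudioPlayerNode.scheduleBuffer with explicit AVAudioTime stamps) is generally more stable than firing sounds from any timer.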
AVSpeechUtterance Mandarin voice output replaced by Siri language setting after upgrading to iOS 18
Hi, Apple engineers, hoping you can reply to this one. We're developing a text-to-speech app. Everything went well until iOS was upgraded to 18. AVSpeechSynthesisVoice(language: "zh-CN") runs well under iOS 16 and iOS 17; it speaks Mandarin correctly. In iOS 18, we noticed that Siri's language setting interferes with AVSpeechSynthesisVoice: it plays Cantonese instead of Mandarin. The Siri language settings that affect AVSpeechSynthesisVoice: Chinese (Cantonese - China mainland), Chinese (Cantonese - Hong Kong)
Replies: 3 · Boosts: 3 · Views: 823 · Jan ’25
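A hedged workaround sketch (not confirmed as a fix): enumerate the installed voices and pin a specific Mandarin voice object, rather than relying only on the language-code initializer that Siri's setting appears to override.

    import AVFoundation

    // Prefer an explicit zh-CN voice from the installed set
    let mandarin = AVSpeechSynthesisVoice.speechVoices().first { $0.language == "zh-CN" }

    let utterance = AVSpeechUtterance(string: "你好，世界")
    utterance.voice = mandarin ?? AVSpeechSynthesisVoice(language: "zh-CN")

    // Keep a strong reference to the synthesizer while it speaks
    let synthesizer = AVSpeechSynthesizer()
    synthesizer.speak(utterance)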
Can Logic Pro load an Audio Unit v3 in-process?
After investing more than a week into getting a bunch of audio unit projects converted into app + appex + framework, they are all now correctly loaded in-process in the demo host app that is part of Xcode's template. However, Logic Pro adamantly refuses to load them in-process. Does Logic Pro simply never do that, or is there some hint or configuration my plugins need to provide to enable it? If it is unsupported, will it be supported in some future version of Logic? The entire point of investing that week was performance, which is moot if it is impossible to test the impact of loading in-process in a real-world usage scenario.
Replies: 1 · Boosts: 0 · Views: 660 · Jan ’25
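For context, this is how a host can at least request in-process loading when instantiating an Audio Unit on macOS; whether Logic Pro honors the request is precisely the open question. The fourCC values are illustrative placeholders.

    import AVFoundation
    import AudioToolbox

    let desc = AudioComponentDescription(componentType: kAudioUnitType_Effect,
                                         componentSubType: 0x64656d6f,      // 'demo', illustrative
                                         componentManufacturer: 0x64656d6f,
                                         componentFlags: 0,
                                         componentFlagsMask: 0)

    // .loadInProcess is macOS-only and also requires the extension's code
    // to live in a separately loadable framework, as the post describes.
    AVAudioUnit.instantiate(with: desc, options: .loadInProcess) { unit, error in
        print(unit ?? error ?? "unknown result")
    }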
AUv3: frequent recent "Failed to find component with type..." issues
I've been generating new Audio Unit Extension apps with Xcode 16 (and newer), and although they generally work initially, it is easy (though I'm not sure how to do it reliably) to get the app into a state where it can no longer instantiate the audio unit. Generally the call to AVAudioUnit.findComponent fails and SimplePlayEngine hits the fatalError("Failed to find component with type..."). In the most recent project, merely adding files to the extension (without making any use of them) caused it to go off the rails. If I "Archive" the app+plugin, there is no audio unit extension in the bundle. If I switch to the audio unit extension scheme and build it, it's fine. If I look at the build folder in Library/Developer/Xcode/project_folder, the extension_name.appex is there. Any ideas? If I can coax an unmodified audio unit extension project into exhibiting this behavior, I'll attach it here; right now what I have contains code I don't want to share.
Replies: 4 · Boosts: 1 · Views: 703 · Jan ’25
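For reference, a sketch of the component lookup that the template's SimplePlayEngine performs before hitting the fatalError quoted above (exact template code varies by Xcode version; the fourCCs are illustrative):

    import AVFoundation
    import AudioToolbox

    let desc = AudioComponentDescription(componentType: kAudioUnitType_MusicDevice,
                                         componentSubType: 0x64656d6f,
                                         componentManufacturer: 0x64656d6f,
                                         componentFlags: 0,
                                         componentFlagsMask: 0)

    // Returns an empty array when the system hasn't registered the appex,
    // which matches the symptom described in the post.
    let matches = AVAudioUnitComponentManager.shared().components(matching: desc)
    guard let component = matches.first else {
        fatalError("Failed to find component with type...")
    }
    print(component.name)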
Logic Pro loads AUv3 when compiled in Swift 5 but not Swift 6
I have spent a long time refactoring lots of older Swift code to compile without error in Swift 6. The app is a v3 audio unit host and audio unit. Having installed Sonoma and Xcode 16, I compile the code using Swift 6, and it compiles and runs without any warnings or errors. My host will load my AU no problem. Logic Pro is still the ONLY audio unit host that will load native Mac v3 audio units, so I like to test my code using Logic. In Sonoma with Xcode 16, my AU passes the most stringent auval tests, both in Terminal and in Logic Pro. If I compile the AU source in Swift 5, Logic will see the AU, load it, and run it without problems. But when I compile the AU in Swift 6, Logic sees the AU, scans it, and verifies that it passes the tests, but will not load the AU. In Xcode I see a log message that a "helper application failed to run", but the debugger never connects to the AU, and I don't think Logic even gets as far as instantiating it. So what is causing this? I'm stumped. Developing AUv3 is a brain-aching maze of undocumented hurdles, and I'm hoping someone might have found a solution for this one. Meanwhile, I guess my only option is to continue using the Swift 5 compiler. (Appending a little note just to mention that all the DSP code is written in C/C++; Swift is used mainly for the user interface and also does some offline thready work.)
Replies: 1 · Boosts: 0 · Views: 511 · Jan ’25
GarageBand displays error 100001 when loading some AU plugins
I recently got some plugins from Universal Audio and have licensed them properly through both UA and the iLok manager. Whenever I try to load the plugins (specifically from UA) in GarageBand, it first reports that "NSCreateObjectFileImageFromMemory-p47UEwps" cannot be opened because the developer cannot be verified. After clicking either 'Show in Finder' or 'OK', it opens the plugin without its GUI, showing that it is not licensed (even though it is). It also displays error code 100001. I have tried only some basic troubleshooting, like restarting the DAW and my computer and reinstalling/relicensing the software. I don't know if the macOS version has anything to do with it, but for some reason I just can't get it to work.
Replies: 1 · Boosts: 0 · Views: 384 · Jan ’25
Ambisonic B-Format Playback Issues on Vision Pro
I'm trying to implement Ambisonic B-Format audio playback on Vision Pro with head tracking. So far, audio plays, head tracking works, and the sound appears to be stereo. The problem is that it is not proper binaural playback when compared to playing back the audio file in a DAW. Has anyone successfully implemented B-Format playback on Vision Pro? Any suggestions on my current implementation:

    func playAmbiAudioForum() async {
        do {
            try AVAudioSession.sharedInstance().setCategory(.playback)
            try AVAudioSession.sharedInstance().setActive(true)

            // Audio file loading/preparation
            guard let testFileURL = Bundle.main.url(forResource: "audiofile", withExtension: "wav") else {
                print("Test file not found")
                return
            }
            let audioFile = try AVAudioFile(forReading: testFileURL)
            let audioFileFormat = audioFile.fileFormat

            // Create AVAudioFormat with Ambisonic B-Format channel layout
            guard let layout = AVAudioChannelLayout(layoutTag: kAudioChannelLayoutTag_Ambisonic_B_Format) else {
                print("layout failed")
                return
            }
            let format = AVAudioFormat(
                commonFormat: audioFile.processingFormat.commonFormat,
                sampleRate: audioFile.fileFormat.sampleRate,
                interleaved: false,
                channelLayout: layout
            )

            // Read the audio file into a buffer
            guard let buffer = AVAudioPCMBuffer(pcmFormat: format, frameCapacity: UInt32(audioFile.length)) else {
                print("buffer failed")
                return
            }
            try audioFile.read(into: buffer)

            playerNode.renderingAlgorithm = .HRTF

            // Connect nodes
            audioEngine.attach(playerNode)
            audioEngine.connect(playerNode, to: audioEngine.outputNode, format: format)
            audioEngine.prepare()

            playerNode.scheduleBuffer(buffer, at: nil) {
                print("File finished playing")
            }
            try audioEngine.start()
            playerNode.play()
        } catch {
            print("Setup error:", error)
        }
    }
Replies: 0 · Boosts: 0 · Views: 478 · Jan ’25
CarPlay CPNowPlayingTemplate shows wrong command buttons
I have a CarPlay implementation and I want to show previous/next track buttons on the player UI:

    MPRemoteCommandCenter.shared().seekForwardCommand.isEnabled = false
    MPRemoteCommandCenter.shared().seekBackwardCommand.isEnabled = false
    MPRemoteCommandCenter.shared().previousTrackCommand.isEnabled = true
    MPRemoteCommandCenter.shared().nextTrackCommand.isEnabled = true

It works correctly in the CarPlay simulator, but in some cars only the seek buttons are shown. I have to suppose it is a problem on the car side, but I would ask your opinion; maybe there is some piece I'm missing.
Replies: 3 · Boosts: 0 · Views: 490 · Jan ’25
AVAssetWriterInput appendSampleBuffer failed with error -12780
I tried adding watermarks to the recorded video. Appending sample buffers using AVAssetWriterInput's append method fails, and when I inspect the AVAssetWriter's error property, I get the following:

    Error Domain=AVFoundationErrorDomain Code=-11800 "This operation cannot be completed"
    UserInfo={NSLocalizedFailureReason=An unknown error occurred (-12780),
    NSLocalizedDescription=This operation cannot be completed,
    NSUnderlyingError=0x302399a70 {Error Domain=NSOSStatusErrorDomain Code=-12780 "(null)"}}

As far as I can tell, -11800 indicates AVErrorUnknown; however, I have not been able to find information about the -12780 error code, which as far as I can tell is undocumented. Thanks! Here is the code
Replies: 3 · Boosts: 0 · Views: 707 · Jan ’25
changePlaybackRateCommand does not work on iOS.
Hi. I work on an audio app for iOS which successfully uses the MPRemoteCommandCenter for commands like next, back, skip forward, skip backward, etc. I am trying to implement playback rate controls in my app (so that users can change the playback speed of audio to 0.5x or 2x, for example). While the above commands work, the changePlaybackRateCommand does not seem to. I have enabled the command, given it a target/handler, and set supported rates. With the other commands, this changed the UI on the lock screen, in the command center, etc., by adding the control for the command (a next button for the next command, for example). However, it does not seem to do anything for the playback rate command. I can implement my own "rate button" UI and rate-change handling, but I'm wondering if this is a known bug within Apple? Looking online, it seems other people face the same issue and haven't been able to get this command to work. Why is this API provided if it doesn't seem to do anything? Is there something I'm missing? Kind regards.
Replies: 0 · Boosts: 0 · Views: 340 · Jan ’25
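For comparison, a minimal sketch of the configuration the post describes (the player reference is hypothetical):

    import MediaPlayer

    let command = MPRemoteCommandCenter.shared().changePlaybackRateCommand
    command.isEnabled = true
    command.supportedPlaybackRates = [0.5, 1.0, 1.5, 2.0]
    command.addTarget { event in
        guard let rateEvent = event as? MPChangePlaybackRateCommandEvent else {
            return .commandFailed
        }
        // player.rate = rateEvent.playbackRate   // hypothetical player
        var info = MPNowPlayingInfoCenter.default().nowPlayingInfo ?? [:]
        info[MPNowPlayingInfoPropertyPlaybackRate] = rateEvent.playbackRate
        MPNowPlayingInfoCenter.default().nowPlayingInfo = info
        return .success
    }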
CarPlay rate button always displays "0x".
Hi. I am working on an audio app for iOS. I have added the CPNowPlayingPlaybackRateButton to my CPNowPlayingTemplate. When the button is clicked, my handler changes the rate on the AVPlayer and updates the MPNowPlayingInfoCenter to the new rate, for example 2.0. Throughout, the CarPlay button always displays "0x". I am wondering how to get this UI to accurately reflect the playback rate the user has selected, as always displaying 0x is a poor user experience. You may suggest MPChangePlaybackRateCommand is relevant here, but I have not been able to get that to work either, and judging by posts online, not many other people have. I have made a post about that here: https://developer.apple.com/forums/thread/773099 Is this a known Apple bug? Is there a way to get the UI to accurately reflect the playback rate of my audio? Kind regards.
Replies: 0 · Boosts: 0 · Views: 324 · Jan ’25
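A hedged sketch of the now-playing update the post describes, assuming (not confirmed) that the CarPlay button reads its label from the now-playing info dictionary:

    import MediaPlayer

    func updateNowPlayingRate(to rate: Float) {
        var info = MPNowPlayingInfoCenter.default().nowPlayingInfo ?? [:]
        info[MPNowPlayingInfoPropertyPlaybackRate] = rate
        info[MPNowPlayingInfoPropertyDefaultPlaybackRate] = 1.0
        MPNowPlayingInfoCenter.default().nowPlayingInfo = info
    }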
Changing playback rate of AVQueuePlayer not working for some AirPlay devices.
Hi. I am working on an audio app for iOS. I have implemented UI and handling which allow the user to change the playback rate of audio. When the user selects a different rate, I update the rate property on my AVQueuePlayer. This works well on device. When I use AirPlay, it works for some devices and not for others: some devices won't change playback rate and will always play at 1x speed. Is this possibly a limitation of those third-party devices? Or is there something I'm missing or should check? I would love to get playback rate changes working across all AirPlay devices with our app. Kind regards.
Replies: 0 · Boosts: 0 · Views: 424 · Jan ’25
ShazamKit with AirPods
Hi guys, I'm using ShazamKit in my iOS app and successfully capturing the currently playing track details when using the device's (iPhone) built-in mic. When I test with AirPods, though, my app cannot both send the output through the AirPods and capture that same output with the AirPods mic for ShazamKit recognition. I believe this must be possible, because the ShazamKit widget on iOS can do this. Is it restricted in some way for third-party apps? If not, I'd appreciate some guidance on how to achieve this in Swift code. Thanks in advance.
Replies: 1 · Boosts: 1 · Views: 568 · Feb ’25
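For reference, a sketch of the recognition plumbing only; it does not answer the open question of capturing the device's own output through the AirPods mic:

    import AVFoundation
    import ShazamKit

    let session = SHSession()
    // session.delegate = ...  // assign an SHSessionDelegate to receive matches

    let engine = AVAudioEngine()
    let input = engine.inputNode
    let format = input.outputFormat(forBus: 0)

    // Stream mic buffers into ShazamKit for matching
    input.installTap(onBus: 0, bufferSize: 4096, format: format) { buffer, time in
        session.matchStreamingBuffer(buffer, at: time)
    }

    try engine.start()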
AVAudioEngine Stop Method
Hi all! I have been experiencing some issues when using AVAudioEngine to play audio and record input while doing a voice chat (through the PTT interface). I noticed that if I connect any players to the audio graph, or call start, the audio session becomes active (this is on iOS). I don't see anything in the docs or the header files in AVFoundation, but is it possible that calling the stop method on an engine deactivates the audio session too? In a normal app this behavior seems logical, but when using PTT, all activation and deactivation of the audio session must go through the framework and its delegate methods. The issue I am debugging is that when the engine with the input node tapped gets stopped, and there is a gap before the server replies with inbound audio to be played, something seems to get the hardware/audio session into a jammed state. Thanks for any feedback and/or confirmation on this behavior!
Replies: 2 · Boosts: 0 · Views: 636 · Feb ’25
PTTFramework w/ AVAudioSession
Hi all, I have spent a lot of time reading the tech note and watching the WWDC video that introduce the PTTFramework on iOS. I currently have a custom setup where I am using AVAudioEngine to schedule and play buffers that are being streamed through a call. I am looking to use the PTTFramework to let a user trigger this push-to-talk behavior from the lock screen and the various places in the system UI it provides. However, I am unsure what the correct behavior is regarding handling of the audio session.

Right now I am using .playback when there is no active voice transmission, so that devices such as AirPods can be in A2DP mode where applicable, and transitioning to the .playAndRecord category only when the mic input should become active. Following this change, my AVAudioEngine manager manually activates and deactivates the audio session when the engine is either playing/recording or idle. The documentation states that you should not attempt to activate or deactivate your audio session directly, but should allow the framework to handle it. Does that mean I need to either call the request-to-transmit delegate function or set an active participant on the channel manager first, and then wait for the didBecomeActive delegate method to trigger before I actually attempt to play or record any audio? (I am using fullDuplex mode currently.) I noticed that that delegate method will only trigger if the audio session wasn't active before doing one of the above (setting an active participant, requesting transmit).

Lastly, the PTTFramework documentation also mentions that we get support for PTT devices, and I notice the didBeginTransmittingFrom property has a handsfreeButton case. Is there any documentation or resources for what is actually supported out of the box for this? I am currently handling a lot of the push-to-talk over Bluetooth LE, and wanted to make sure there wasn't overlap with what the system provides. Thank you!
Replies: 2 · Boosts: 0 · Views: 594 · Feb ’25
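A hedged sketch of the sequencing the post asks about, under the assumption that the framework should own session activation (the manager and channel UUID come from elsewhere in the app):

    import PushToTalk

    func beginTalking(manager: PTChannelManager, channelUUID: UUID) {
        // Ask the framework to begin transmitting; do not touch AVAudioSession here.
        manager.requestBeginTransmitting(channelUUID: channelUUID)
        // Start playing/recording with AVAudioEngine only after the framework
        // activates the session, i.e. in the delegate's channelManager(_:didActivate:).
    }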
MPRemoteCommandCenter not updating play/pause button to proper state on iOS
So I'm using AVAudioEngine. When playing audio, I become the 'now playing' app using the MPNowPlayingInfoCenter/MPRemoteCommandCenter APIs. When configuring MPRemoteCommandCenter, I add a play/pause command target via -addTargetWithHandler on the togglePlayPauseCommand property.

Now I also have a play/pause button in my app's UI. When I pause playback from my app's UI (which means I'm the active app, in the foreground), I pause the AVAudioPlayerNode I'm using with AVAudioEngine. I do not stop, reset, etc. the AVAudioEngine; I only pause the player node. My thought process is that the user just pressed pause and is very likely to hit play to resume playback in the near future, because my app is in the foreground. If my app moves to the background and I receive a memory warning, I presume it would make sense to tear down the engine or pause it. Perhaps I'm wrong about this?

When I initially hit the play button from my app's UI, I also activate my AVAudioSession. I do this in a high-priority NSOperation, since the documentation warns that "we recommend that applications not activate their session from a thread where a long blocking operation will be problematic."

So now I'm playing and I hit pause from my app's UI. Then I quickly bring up the Now Playing center, and I see I'm the "Now Playing" app, but the play/pause button shows the pause icon instead of the play icon, even though I'm in the paused state. I do set MPNowPlayingInfoCenter's playbackState to MPNowPlayingPlaybackStatePaused when I pause. Not surprisingly, this doesn't work; the documentation states it is for macOS only.

So the only way to get MPRemoteCommandCenter to show the "play" image for the play/pause button is to deactivate my AVAudioSession when I pause playback? Since I change the active state of my audio session in an NSOperation, the play/pause toggle in the remote command center won't immediately update, because I'm doing it on another thread. It feels inappropriate for a play/pause button to wait on an NSOperation before updating its UI when I already know my play/paused state; it should update right away, like the button in my app does. Wouldn't it be nicer to just use MPNowPlayingInfoCenter's playbackState property on iOS too? If I'm no longer the now-playing app/active audio session, it doesn't matter, since I'm not in the Now Playing UI; just ignore it.

Also, is it recommended that I deactivate my audio session explicitly every time the user pauses audio in my app (when I'm in the foreground)? When I do deactivate the audio session, I get an error, AVAudioSessionErrorCodeIsBusy (but the button in the Now Playing center updates to the proper image). I do this:

    - (void)pause {
        [self.playerNode pause];
        [self runOperationToDeactivateAudioSession];

        // This does nothing on iOS:
        MPNowPlayingInfoCenter *nowPlayingCenter = [MPNowPlayingInfoCenter defaultCenter];
        nowPlayingCenter.playbackState = MPNowPlayingPlaybackStatePaused;
    }

In -runOperationToDeactivateAudioSession I get the AVAudioSessionErrorCodeIsBusy. According to the documentation: "Starting in iOS 8, if the session has running I/Os at the time that deactivation is requested, the session will be deactivated, but the method will return NO and populate the NSError with the code property set to AVAudioSessionErrorCodeIsBusy to indicate the misuse of the API."

So pausing the player node isn't enough to meet the deactivation criteria; I guess I have to pause or stop the audio engine. I could probably wait until I receive a scene-went-to-background notification or something before deactivating my audio session (which is async, so the button may not update to the correct image in time). This seems like a lot of code to write to get a play/pause toggle to update, especially in the iPad multi-window scene environment.

What's the recommended approach? Should I always pause the AudioEngine instead of the player node? Should I always explicitly deactivate my audio session when the user pauses playback from my app's UI, even if I'm in the foreground? I personally like the idea of just being able to set [MPNowPlayingInfoCenter defaultCenter].playbackState = MPNowPlayingPlaybackStatePaused; but maybe that's because it would make things easier on me. This does feel overcomplicated, though. If anyone can share some tips on how I should handle this, I'd appreciate it.
Replies: 4 · Boosts: 0 · Views: 687 · Feb ’25
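A hedged sketch of one pause path consistent with the documentation quoted in the post: stop the running I/O before deactivating, so setActive(false) no longer reports AVAudioSessionErrorCodeIsBusy. Whether to deactivate on every foreground pause remains the open question; this is written in Swift for brevity.

    import AVFoundation
    import MediaPlayer

    final class PlaybackController {
        let engine = AVAudioEngine()
        let playerNode = AVAudioPlayerNode()

        func pauseAndDeactivate() {
            playerNode.pause()
            engine.pause()   // pausing only the node leaves the engine's I/O running
            do {
                try AVAudioSession.sharedInstance().setActive(false,
                                                              options: .notifyOthersOnDeactivation)
            } catch {
                print("Deactivation failed:", error)
            }
        }
    }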