
Creating an initial Now Playing state of paused - impossible?
I am working on an app which plays audio - https://youtu.be/VbAfUk_eYl0?si=nJg5ayy2faWE78-g - and one of its features is that, on restart, if a file was paused (or playing) when the app was previously shut down, the paused state and position in the file are restored exactly as they were. The functionality works. However, it seems impossible to get iOS's "now playing" information into the right state to reflect that via the MediaPlayer API.

On restart, handlers are attached to the play/pause/togglePlayPause actions on MPRemoteCommandCenter.shared(), and the map of media info is updated on MPNowPlayingInfoCenter.default().nowPlayingInfo. What happens is that iOS's media view shows the audio as playing and offers a pause button - even though the play action is enabled and the pause action is disabled. Once playback has actually been initiated, the controls behave correctly (my workaround is to have the pause action toggle the play state, since otherwise you wouldn't be able to initiate playback from controls in a car without first initiating it from the device).

I've created a simplified white-noise-player demo to illustrate the problem - simply build and deploy it, then start the app, lock your device and look at the playback controls on the lock screen. It will show a pause button - the same behavior I've described. https://github.com/timboudreau/ios-play-pause-demo

I've tried a few things to narrow down the source of the issue - for example, thinking that not setting MPNowPlayingInfoPropertyPlaybackProgress and MPMediaItemPropertyPlaybackDuration on startup might do the trick (since the system interpolates elapsed time and it's recommended to update those properties infrequently) - but the result is the same, just without a duration or progress shown.

What governs this behavior, and is there some way to explicitly tell the media player API that your current state is paused?
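For reference, a minimal sketch of the kind of setup described above (the handler body and info values are placeholder assumptions, not the app's actual code); the key line is setting MPNowPlayingInfoPropertyPlaybackRate to 0.0, which should signal a paused state:

    import MediaPlayer

    func restorePausedNowPlayingState(title: String, position: TimeInterval, duration: TimeInterval) {
        let center = MPRemoteCommandCenter.shared()
        center.playCommand.isEnabled = true
        center.pauseCommand.isEnabled = false
        _ = center.playCommand.addTarget { _ in
            // Resume playback here.
            return .success
        }

        // A playback rate of 0.0 should indicate "paused", yet the
        // lock-screen controls still present the audio as playing.
        MPNowPlayingInfoCenter.default().nowPlayingInfo = [
            MPMediaItemPropertyTitle: title,
            MPMediaItemPropertyPlaybackDuration: duration,
            MPNowPlayingInfoPropertyElapsedPlaybackTime: position,
            MPNowPlayingInfoPropertyPlaybackRate: 0.0
        ]
    }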
Replies: 0 · Boosts: 0 · Views: 41 · Apr ’25
Audio player app is silent if device connected via CarPlay
I have a SwiftUI app - (https://youtu.be/VbAfUk_eYl0?si=JxUBh0Bpb-vc1E1U) - which I thought was almost ready for release - a manager for airdropped audio files from Logic Pro or other music creation applications. It uses AVAudioEngine and AVAudioPlayerNode to play audio, and the MediaPlayer API to integrate with car audio and similar, all of which works well. It does not currently have an explicit CarPlay integration (and I'm slightly horrified at the amount of work that is going to require).

I had the good or bad luck of getting a loaner car with CarPlay yesterday while mine is being repaired, and lo and behold, when connected to the vehicle via CarPlay, there is no audio output in the vehicle at all. The now-playing panel correctly shows the information my app provides about the currently playing song; the player node believes it is playing; the AVAudioSession is configured as it should be. But there is no sound. Obviously I cannot ship it in this state.

I've tried fiddling with the parameters the AVAudioSession is configured with, in case some parameter was preventing audio output, to no avail. Currently:

    var options = AVAudioSession.CategoryOptions()
    options.insert(.allowAirPlay)
    options.insert(.allowBluetooth)
    options.insert(.allowBluetoothA2DP)
    try session.setCategory(.playback, mode: .default, options: options)
    try? session.setPreferredIOBufferDuration(0.002) // ~88 samples at 44.1kHz
    try? session.setPrefersNoInterruptionsFromSystemAlerts(true)
    try? session.setPrefersInterruptionOnRouteDisconnect(false)
    try session.setActive(true, options: [.notifyOthersOnDeactivation])

All diagnostics within the app show the player operating correctly - files are played and flushed; AVAudioPlayerNodeCompletionCallbacks are called when they should be. But the output is not audible in the vehicle.

I would much prefer to ship this app without full-blown CarPlay integration, but with working audio when connected via CarPlay, and work on full CarPlay integration for the next release. Is there some secret handshake I am just missing to make this work?
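One diagnostic worth running while connected (a hypothetical sketch, not code from the app) is logging the session's current route, to confirm whether iOS is actually routing output to the car at all:

    import AVFoundation

    func logCurrentRoute() {
        // When CarPlay is active, one of these should be a carAudio port.
        for output in AVAudioSession.sharedInstance().currentRoute.outputs {
            print("output:", output.portType.rawValue, output.portName)
        }
    }

    // Keep a reference to this token; re-log whenever the route changes.
    let routeObserver = NotificationCenter.default.addObserver(
        forName: AVAudioSession.routeChangeNotification,
        object: nil, queue: .main) { _ in logCurrentRoute() }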
Replies: 1 · Boosts: 0 · Views: 48 · Mar ’25
Making sense of AVAudioSession interruption notifications
I have an app under development - demo here - https://youtu.be/VbAfUk_eYl0?si=s6EDBx-4G6P_QbZO - which is sort of an audio player for airdropped files - something useful to musicians who dump work in progress to their phone, make notes, revise and update.

I've been testing my handling of audio session interruption notifications, but there seems to be a lot of inconsistency in how, when and why iOS delivers them, and I'm wondering if there is some rhyme or reason to it that I'm just not detecting.

For example: I am playing a song in my app. I switch to Apple Music and start playing a song there. My app gets an interruption-began notification - this is consistent. I switch back to my app, and about half the time I will get an interruption-ended notification (often coupled with a blast of the tail of whatever audio buffer was partially played when the interruption started, even though the engine was stopped - followed by a call to my AVAudioPlayerNodeCompletionCallback - is there some way to avoid this?). The other half of the time I don't get an interruption-ended notification at all; my app can (as expected) end the interruption itself by activating the AVAudioSession and playing something.

I have not been able to determine any pattern to this behavior, other than that if my app started playing using AVAudioPlayerNode.scheduleSegment rather than scheduleFile, I think the notification is consistently delivered on app activation rather than when I activate the session programmatically.

I would like my app to behave deterministically, and would appreciate any help in deciphering what causes the inconsistent notification behavior from iOS.
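For context, a minimal sketch of the kind of interruption handling being described (the pause/resume bodies are placeholders, not the app's actual code):

    import AVFoundation

    // Keep a reference to this token for the lifetime of the subscription.
    let interruptionObserver = NotificationCenter.default.addObserver(
        forName: AVAudioSession.interruptionNotification,
        object: AVAudioSession.sharedInstance(), queue: .main) { note in
        guard let info = note.userInfo,
              let raw = info[AVAudioSessionInterruptionTypeKey] as? UInt,
              let type = AVAudioSession.InterruptionType(rawValue: raw) else { return }
        switch type {
        case .began:
            // Pause the engine / player node here.
            break
        case .ended:
            // .shouldResume is only sometimes present, which matches
            // the inconsistency described above.
            let rawOpts = info[AVAudioSessionInterruptionOptionKey] as? UInt ?? 0
            let opts = AVAudioSession.InterruptionOptions(rawValue: rawOpts)
            if opts.contains(.shouldResume) {
                // Reactivate the session and resume playback here.
            }
        @unknown default:
            break
        }
    }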
Replies: 2 · Boosts: 0 · Views: 277 · Mar ’25
Accessing/scanning the iOS Downloads folder (the one airdropped-to) from an app
Here's the problem I'm trying to solve: create an iOS app which can scan the Downloads folder (where airdropped audio files arrive), identify audio media files, and play them, retaining some of its own metadata about them (basically, creating textual notes mapped to timestamps and storing that information in the app's own storage).

I am not able to access that folder. I am able to get a path (and build a URL from it) from

    NSSearchPathForDirectoriesInDomains(FileManager.SearchPathDirectory.downloadsDirectory, FileManager.SearchPathDomainMask(arrayLiteral: FileManager.SearchPathDomainMask.userDomainMask), true)

but

    let fileUrls = try fileManager.contentsOfDirectory(at: downloads, includingPropertiesForKeys: [])

fails with an error that the folder does not actually exist, with or without a call to downloadsUrl.startAccessingSecurityScopedResource().

Determining whether this is a permissions issue, or whether I'm getting a URL to an application-container-local folder that has nothing to do with the one I'm looking for, is compounded by the fact that if I set the build setting Enable App Sandbox, deployment to my phone fails with "Failed to verify code signature". I have spent hours trying every possible combination of certificates and deployment profiles, and ensured that every possibly relevant certificate is trusted on my phone. Disable app sandbox and it deploys fine, either with automatic signing or an explicit cert and profile.

I have an entitlements file with the following - though, without the ability to enable app sandbox and run it on a phone with actual contents in the downloads folder, it is probably not affecting anything:

    <key>com.apple.security.files.downloads.read-only</key>
    <true/>
    <key>com.apple.security.files.user-selected.read-only</key>
    <true/>
    <key>com.apple.security.app-sandbox</key>
    <true/>

So, questions:

Should the URL returned by the above call be the Downloads/ folder airdropped files land in, in the first place? Or is it a URL to some app-local folder that does not exist?

Does the entitlement com.apple.security.files.downloads.read-only even allow an app to list all files in the downloads directory (presumably asking the user's permission the first time), or does the permission only get requested when using a picker dialog? (The point here is to find any new audio files without making the user jump through hoops - see the sketch after this post's backstory.)

If I could get it deployed with app sandbox enabled, would the above code work?

Backstory: I'm a software engineer, audio plugin author, Logic Pro user and musician. My workflow (and probably that of many other Logic users) for work-in-progress music is to airdrop a mix to my phone, listen to it in a variety of places, make notes about what to change, edit - rinse and repeat. For years I used VLC for iOS to keep and play these in-progress mixes - you could airdrop and select VLC as the destination (yes, Logic can add to your Apple Music library, but trust me, you do not want 20 revisions of the same song cluttering your music library and synced to all your devices). Last year, the behavior of Airdrop changed so that the target app for audio is always Files, period, wrecking that workflow.

While I eventually discovered that, with an elaborate and non-obvious dance of steps, it is possible to copy files into VLC's folders and make them available that way, it is inconvenient, to say the least - and VLC is less than fabulous anyway. It would be nice to have an app that could associate to-do notes with specific timestamps in a tune, A/B compare sections between old and new versions, and things like that. So, figuring that sooner or later I was going to get into a car accident futzing with the Files app to listen to mixes while driving, perhaps I should write that app. But the ability to do that at all relies on the ability of an app to list and access the Downloads folder that airdropped audio files land in (assuming the user has given permission to access it - but that should only be needed once).
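For what it's worth, a sketch of the picker-based route raised in the questions above - letting the user grant access to the folder once via UIDocumentPickerViewController and persisting a security-scoped bookmark. The class, key name, and wiring are hypothetical placeholders, and this still requires one explicit user action rather than silent scanning:

    import UIKit
    import UniformTypeIdentifiers

    final class DownloadsAccess: NSObject, UIDocumentPickerDelegate {
        // Present this once so the user can grant access to a folder.
        func makePicker() -> UIDocumentPickerViewController {
            let picker = UIDocumentPickerViewController(forOpeningContentTypes: [.folder])
            picker.delegate = self
            return picker
        }

        func documentPicker(_ controller: UIDocumentPickerViewController,
                            didPickDocumentsAt urls: [URL]) {
            guard let folder = urls.first,
                  folder.startAccessingSecurityScopedResource() else { return }
            defer { folder.stopAccessingSecurityScopedResource() }

            // Persist a bookmark to regain access on later launches via
            // URL(resolvingBookmarkData:options:relativeTo:bookmarkDataIsStale:).
            if let bookmark = try? folder.bookmarkData() {
                UserDefaults.standard.set(bookmark, forKey: "downloadsBookmark")
            }

            // With access held, listing the folder contents works.
            let files = (try? FileManager.default.contentsOfDirectory(
                at: folder, includingPropertiesForKeys: nil)) ?? []
            print(files)
        }
    }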
Replies: 4 · Boosts: 0 · Views: 494 · Jan ’25
Can Logic Pro load an Audio Unit v3 in-process?
After investing more than a week into getting a bunch of audio unit projects converted into app + appex + framework, they are all now correctly loaded in-process in the demo host app that is part of Xcode's template. However, Logic Pro adamantly refuses to load them in-process.

Does Logic Pro simply never do that, or is there some hint or configuration my plugins need to provide to enable it? If it is unsupported, will it be supported in some future version of Logic? The entire point of investing that week was performance, which is moot if it is impossible to test the impact of loading in-process in a real-world usage scenario.
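For reference, the host-side opt-in looks something like this (a sketch of what the template's demo host does; componentDescription is assumed to describe the plugin - whether Logic performs the equivalent is exactly the question):

    import AVFoundation

    // Only the host can request in-process loading; a plugin cannot force it.
    AVAudioUnit.instantiate(
        with: componentDescription,
        options: .loadInProcess) { audioUnit, error in
        if let audioUnit = audioUnit {
            // Attach audioUnit to the engine here.
            print("loaded:", audioUnit.name)
        } else {
            print("failed:", error?.localizedDescription ?? "unknown")
        }
    }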
Replies: 1 · Boosts: 0 · Views: 615 · Oct ’24
Creating an installer for a V3 AudioUnit
I've got a bunch of AudioUnit projects approaching release, and am attempting to build an installer for them. All are based on the AudioUnit template in Xcode 14. What actually governs how the system detects an AudioUnit?

The instructions I have seen say that the built .appex should be renamed to have the extension .component and installed into /Library/Audio/Plug-Ins/Components/. Great - I am able to build a signed installer that does that (i.e. strips out the built Application project that is part of the AudioUnit template but useless to, say, a Logic Pro user) and includes the .appex that declares the plugin and embeds a Framework containing the actual code (so it can be loaded in-process). But auval -l does not show it after running the installer, nor does Console show anything logged suggesting that it was found but malformed or anything like that.

Meanwhile, simply building the project causes auval -l to show an install of it in the build directory - and I have noticed that if I delete that, auval -l will still show the plugin installed, but now in the location where I exported an archive of the project (!!). What black magic is this? However, after deleting both the recent build and the archive and running the installer, there is no indication that AudioComponentRegistry even sees the copy in one of the two locations actually documented to be valid install locations for an AudioUnit. I have, however, installed one third-party free AUv3 which installed into /Library/Audio/Plug-Ins/Components/, so that location evidently can work.

Am I misunderstanding something about how this works? Is there some string other than AudioComponentRegistry I should filter on in Console that might provide a clue why my AudioUnit installed there is not picked up? Must I ship the semi-pointless Application that is part of the Xcode template project, so that whatever magical mechanism detects it when I build will work its magic on end-users' machines? Or could the problem be that the Framework with the actual code is under Contents/Frameworks inside the audio unit, rather than installed independently into /Library/Frameworks?
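One way to see what the component registry actually knows about, beyond auval (a diagnostic sketch, not part of the installer): query AVAudioUnitComponentManager with a wildcard description - zeroed fields act as wildcards - and check whether the installed component appears at all:

    import AVFoundation

    // A fully zeroed description matches every registered component.
    let anyComponent = AudioComponentDescription(
        componentType: 0, componentSubType: 0,
        componentManufacturer: 0, componentFlags: 0, componentFlagsMask: 0)

    for component in AVAudioUnitComponentManager.shared().components(matching: anyComponent) {
        print(component.manufacturerName, component.name, component.versionString)
    }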
Replies: 2 · Boosts: 0 · Views: 656 · Sep ’24
Using non-modular libraries in an audio-unit
I've got a bunch of Audio Units I've been developing over the last few months, in Xcode 14, based on the Audio Unit template that ships with Xcode. The DSP heavy lifting is all done in Rust libraries, which I build for Intel and Apple Silicon, merge using lipo, and build XCFrameworks from. There are several of these - one with the DSP code, and several others used from the UI - a mix of SwiftUI and Cocoa. Getting the integration of Rust, Objective-C, C++ and Swift working and automated took a few weeks (my first foray into C++ since the 1990s), but it has been solid, reliable and working well - everything loads in Logic Pro and GarageBand and works.

But now I'm attempting the (woefully underdocumented) process of making 13 audio unit projects loadable in-process - which means moving all of the actual code into a Framework. And therein lies the problem: none of the xcframeworks are modular, which leads to the dreaded "include of non-modular header inside framework module". Imported directly into the app extension project with "Allow Non-modular Includes in Framework Modules" enabled, they work fine - but inside a framework this seems to be verboten.

So the obvious solution would be to add a module map to each xcframework, and then, poof, they're modular. BUT... due to a peculiar limitation of Xcode's build system that I've spent days searching for a workaround for, it is only possible to have ONE xcframework containing a module.modulemap file in any project. More than that, and xcodebuild will try to clobber them with each other and the build will fail. And there appears to be NO WAY to name the file anything other than module.modulemap inside an xcframework and have it be detected. So I cannot modularize my frameworks, because there is more than one of them.

How the heck does one work around this? A custom module map file somewhere (that the build should somehow find and understand applies to four xcframeworks)? Something else? I've seen one dreadful workaround - https://medium.com/@florentmorin/integrating-swift-framework-with-non-modular-libraries-d18098049e18 - and given that I'm generating a lot of the C and Objective-C code for the audio in Rust, I suppose I could write a tool that parses the header files and generates Objective-C code that imports each framework and declares one method for every single Rust call. But it seems to me there has to be a correct way to do this without jumping through such hoops. Thoughts?
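For concreteness, the kind of module map being discussed - a minimal sketch with hypothetical names of what would go into each xcframework slice, if only more than one per project were allowed:

    // module.modulemap, placed in the framework slice's Modules/ directory.
    // "RustDSP" and the header name are hypothetical placeholders.
    module RustDSP {
        header "rust_dsp.h"
        export *
    }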
Replies: 2 · Boosts: 0 · Views: 639 · Sep ’24