
Document-based sample code doesn't work... workaround?
I just tried the "Building a document-based app with SwiftUI" sample code for iOS 18: https://developer.apple.com/documentation/swiftui/building-a-document-based-app-with-swiftui

I can create a document and then close it, but once I open it back up, I can't navigate back to the document browser. It also struggles to open documents: I tap multiple times and nothing happens. This happens on both the simulator and a device. I will file a bug, but does anyone know of a workaround? I can't use a document browser that is this broken.
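For anyone reproducing this, the sample boils down to a DocumentGroup scene along these lines (a minimal sketch, not the sample's exact code; the plain-text document type here is a stand-in):

import SwiftUI
import UniformTypeIdentifiers

// Stand-in document type; the sample's own type differs, but the
// DocumentGroup wiring being exercised is the same.
struct TextDocument: FileDocument {
    static var readableContentTypes: [UTType] { [.plainText] }
    var text: String

    init(text: String = "") { self.text = text }

    init(configuration: ReadConfiguration) throws {
        guard let data = configuration.file.regularFileContents,
              let string = String(data: data, encoding: .utf8) else {
            throw CocoaError(.fileReadCorruptFile)
        }
        text = string
    }

    func fileWrapper(configuration: WriteConfiguration) throws -> FileWrapper {
        FileWrapper(regularFileWithContents: Data(text.utf8))
    }
}

@main
struct DocumentApp: App {
    var body: some Scene {
        // The system-provided document browser that misbehaves is supplied
        // implicitly by DocumentGroup; there is no app-side navigation code.
        DocumentGroup(newDocument: TextDocument()) { file in
            TextEditor(text: file.$document.text)
        }
    }
}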
1 reply · 1 boost · 430 views · Nov ’24
Xcode UIKit Document App template crashes under Swift 6
I'm trying to switch to UIKit's document lifecycle due to serious bugs in SwiftUI's version. However, the template project from Xcode isn't compatible with Swift 6 (I've already migrated my app to Swift 6). To reproduce:

1. File -> New -> Project
2. Select "Document App" under iOS
3. Set "Interface: UIKit"
4. In Build Settings, change Swift Language Version to Swift 6
5. Run the app
6. Tap "Create Document"

Observe: crash in _dispatch_assert_queue_fail. Does anyone know of a workaround other than downgrading to Swift 5?
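If the assertion is firing because a completion handler lands off the main queue, one pattern is to hop the UI work back onto the main actor explicitly. A sketch against the template's presentDocument(at:) (Document is the template's UIDocument subclass; that this is the actual crash site is an assumption on my part):

import UIKit

// Sketch: the template's presentDocument(at:) with the open-completion
// hopped back to the main actor. Assumption: the dispatch assertion fires
// because this completion can run off the main queue under Swift 6.
func presentDocument(at documentURL: URL) {
    let document = Document(fileURL: documentURL)  // template's UIDocument subclass
    document.open { success in
        Task { @MainActor in
            guard success else { return }
            // ...present the document view controller here, as the template does...
        }
    }
}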
0 replies · 1 boost · 83 views · Apr ’25
black screen after switching to UIKit app lifecycle
I had to switch from the SwiftUI app lifecycle to the UIKit lifecycle due to this issue: https://developer.apple.com/forums/thread/742580

When I switch to UIKit, I get a black screen on startup. It's the inverse of this issue: https://openradar.appspot.com/FB9692750

For development, I can work around this by deleting and reinstalling the app, but I can't ship an app that shows users a black screen when they update. Does anyone know of a workaround? I've filed FB13462315.
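In case it helps others rule things out: with the UIKit lifecycle, a black screen on launch is exactly what you get if no window is created and made key. A minimal sketch of the scene-delegate setup to verify (class names are placeholders, and this assumes the UIApplicationSceneManifest in Info.plist points at this delegate):

import UIKit

// Minimal UIKit-lifecycle scene delegate. The window must be created
// explicitly here; the SwiftUI lifecycle did this for you.
class SceneDelegate: UIResponder, UIWindowSceneDelegate {
    var window: UIWindow?

    func scene(_ scene: UIScene,
               willConnectTo session: UISceneSession,
               options connectionOptions: UIScene.ConnectionOptions) {
        guard let windowScene = scene as? UIWindowScene else { return }
        let window = UIWindow(windowScene: windowScene)
        window.rootViewController = UIViewController()  // placeholder root
        window.makeKeyAndVisible()
        self.window = window
    }
}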
7 replies · 2 boosts · 1.4k views · Dec ’23
SwiftUI full screen animation uses less energy than Metal Game template
I've got a full-screen animation of a bunch of circles filled with gradients, with plenty of (careless) overdraw, plus real-time audio processing driving the animation, plus the overhead of SwiftUI's dependency analysis, and that app uses less energy (on an iPhone 13) than the Xcode "Metal Game" template, which renders a rotating textured cube, a trivial GPU workload. Why is that? How can I investigate further? Does Core Animation have access to a compositor fast path that a Metal app cannot use?

Maybe another data point: when I do the same circles animation using SwiftUI's Canvas, the energy use is "Very High" and GPU utilization is also quite high. Eventually the phone's thermal state goes to "Serious" and I get a message on the device that "Charging will resume when iPhone returns to normal temperature".
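One difference worth checking: the Metal template's MTKView redraws at the display refresh rate whether or not anything changed, while Core Animation only re-composites when layer content changes. Switching an MTKView to on-demand drawing is standard MTKView API; whether it explains the whole energy gap here is an assumption:

import MetalKit

// Configure an MTKView to render only when explicitly asked, instead of
// redrawing every vsync like the Metal Game template does.
func configureOnDemandDrawing(for view: MTKView) {
    view.isPaused = true               // stop the internal display link
    view.enableSetNeedsDisplay = true  // draw(_:) runs only after setNeedsDisplay()
}

// Later, whenever the scene actually changes:
// view.setNeedsDisplay()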
0 replies · 5 boosts · 1.1k views · May ’24
IAP in App Extension
How should an App Extension (in this case an Audio Unit extension) determine whether an IAP has been purchased in the containing app? (Related: can an IAP be purchased from within the extension?) On macOS, I suppose I could share the receipt file with the extension, and on iOS I suppose I could write some data to shared UserDefaults in an app group. Is there any official guidance on this? Thanks!
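For the iOS route mentioned above, the app-group UserDefaults approach looks roughly like this (a sketch; the suite name and keys are placeholders, and both the app and the extension must be members of the app group):

import Foundation

// Shared defaults for the app group. Placeholder suite name.
let sharedDefaults = UserDefaults(suiteName: "group.com.example.myapp")

// Containing app: record entitlement state after (re)validating the purchase.
func recordPurchase(_ purchased: Bool) {
    sharedDefaults?.set(purchased, forKey: "com.example.iap.proUnlocked")
}

// Extension: read the last-known state. Note this is a cache, not proof of
// purchase; a determined user could edit it, so treat it accordingly.
func isProUnlocked() -> Bool {
    sharedDefaults?.bool(forKey: "com.example.iap.proUnlocked") ?? false
}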
1 reply · 1 boost · 1.4k views · Aug ’22
testing multichannel AudioUnit output with AVAudioEngine
I'm extending an AudioUnit to generate multi-channel output and trying to write a unit test using AVAudioEngine. My test installs a tap on the AVAudioNode's output bus and ensures the output is not silence. This works for stereo. I've currently got:

auto avEngine = [[AVAudioEngine alloc] init];
[avEngine attachNode:avAudioUnit];
auto format = [[AVAudioFormat alloc] initStandardFormatWithSampleRate:44100. channels:channelCount];
[avEngine connect:avAudioUnit to:avEngine.mainMixerNode format:format];

where avAudioUnit is my AU. It seems I need to do more than simply set the channel count on the format when connecting, because after this code, [avAudioUnit outputFormatForBus:0].channelCount is still 2. Printing the graph yields:

AVAudioEngineGraph 0x600001e0a200: initialized = 1, running = 1, number of nodes = 3

******** output chain ********

node 0x600000c09a80 {'auou' 'ahal' 'appl'}, 'I'
    inputs = 1
        (bus0, en1) <- (bus0) 0x600000c09e00, {'aumx' 'mcmx' 'appl'}, [ 2 ch, 44100 Hz, 'lpcm' (0x00000029) 32-bit little-endian float, deinterleaved]

node 0x600000c09e00 {'aumx' 'mcmx' 'appl'}, 'I'
    inputs = 1
        (bus0, en1) <- (bus0) 0x600000c14300, {'augn' 'brnz' 'brnz'}, [ 2 ch, 44100 Hz, 'lpcm' (0x00000029) 32-bit little-endian float, deinterleaved]
    outputs = 1
        (bus0, en1) -> (bus0) 0x600000c09a80, {'auou' 'ahal' 'appl'}, [ 2 ch, 44100 Hz, 'lpcm' (0x00000029) 32-bit little-endian float, deinterleaved]

node 0x600000c14300 {'augn' 'brnz' 'brnz'}, 'I'
    outputs = 1
        (bus0, en1) -> (bus0) 0x600000c09e00, {'aumx' 'mcmx' 'appl'}, [ 2 ch, 44100 Hz, 'lpcm' (0x00000029) 32-bit little-endian float, deinterleaved]

So AVAudioEngine silently ignores whatever channel count I pass to it. If I do:

auto numHardwareOutputChannels = [avEngine.outputNode outputFormatForBus:0].channelCount;
NSLog(@"hardware output channels %d\n", numHardwareOutputChannels);

I get 30, because I have an audio interface connected, so I would expect AVAudioEngine to support more than two channels. I've also tried setting the format explicitly on the connection between the mainMixerNode and the outputNode, to no avail.
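For reference, the "tap and check for silence" test described above looks roughly like this (a Swift sketch of the test's shape, not the actual Objective-C test; engine and au stand in for avEngine and avAudioUnit):

import AVFoundation

// Sketch of the tap-based silence check described in the post: pull a few
// buffers off the AU's output bus and assert that at least one sample is
// non-zero.
func assertProducesAudio(engine: AVAudioEngine, au: AVAudioNode) throws {
    var heardAudio = false
    let format = au.outputFormat(forBus: 0)
    au.installTap(onBus: 0, bufferSize: 4096, format: format) { buffer, _ in
        guard let channels = buffer.floatChannelData else { return }
        for ch in 0..<Int(buffer.format.channelCount) {
            for frame in 0..<Int(buffer.frameLength) where channels[ch][frame] != 0 {
                heardAudio = true
            }
        }
    }
    try engine.start()
    Thread.sleep(forTimeInterval: 0.5)   // let a few buffers flow through the tap
    au.removeTap(onBus: 0)
    engine.stop()
    assert(heardAudio, "expected non-silent output from the AU")
}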
0 replies · 2 boosts · 1.6k views · Jun ’22