SwiftData ModelActor causes 5-10 second UI freeze when opening sheet
I'm experiencing a significant UI freeze in my SwiftUI app, which uses SwiftData with CloudKit sync. When users tap a button to present a sheet for the first time after app launch, the entire UI becomes unresponsive for 5-10 seconds. Subsequent sheet presentations work fine.

App Architecture

- Service layer: an @Observable class marked with @MainActor that orchestrates operations
- Persistence layer: a @ModelActor actor that handles SwiftData operations
- SwiftUI views: use @Environment to access the service layer

The structure looks like this:

```swift
@Observable @MainActor
final class MyServices {
    let persistence: DataPersistence

    init(modelContainer: ModelContainer) {
        self.persistence = DataPersistence(modelContainer: modelContainer)
    }

    func addItem(title: String) async {
        // Creates and saves an item through the persistence layer
    }
}

@ModelActor
actor DataPersistence {
    func saveItem(_ item: Item) async {
        // Save to SwiftData
    }
}
```

The app initializes the ModelContainer at the Scene level and passes it through the environment:

```swift
@main
struct MyApp: App {
    let container = try! ModelContainer(for: Item.self)

    var body: some Scene {
        WindowGroup {
            ContentView()
                .modelContainer(container)
                .environment(MyServices(modelContainer: container))
        }
    }
}
```

The Problem

When a user taps the "Add Item" button, which presents a sheet:

```swift
Button("Add Item") {
    showingAddSheet = true
}
.sheet(isPresented: $showingAddSheet) {
    AddItemView(onAdd: { title in
        await services.addItem(title: title)
    })
}
```

the UI freezes completely for 5-10 seconds on the first presentation. During this time:

- the button remains in its pressed state,
- no UI interactions work, and
- the app appears completely frozen.

After the freeze, the sheet opens and everything works normally. This only happens on the first sheet presentation after app launch. I suspect it's related to SwiftData's ModelContext initialization happening on the main thread despite the use of @ModelActor, but I'm not sure why that would happen, given that @ModelActor is supposed to handle background execution.

Environment

- iOS 18
- SwiftData with CloudKit sync enabled
- Xcode 16
- Swift 6

Has anyone experienced similar freezes with SwiftData and @ModelActor? Is there something wrong with how I'm structuring the initialization of these components? The documentation suggests @ModelActor should handle background operations automatically, but the freeze suggests otherwise. Any insights would be greatly appreciated!
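In case it's useful, here is a stripped-down sketch of AddItemView, simplified for this post. The important point is that nothing in it touches SwiftData directly; it only calls back into the service layer through onAdd, and the freeze happens during presentation, before its save path ever runs:

```swift
struct AddItemView: View {
    @Environment(\.dismiss) var dismiss
    @State var title = ""
    let onAdd: (String) async -> Void

    var body: some View {
        NavigationStack {
            Form {
                TextField("Title", text: $title)
            }
            .toolbar {
                Button("Save") {
                    Task {
                        // Hand the new title to the service layer, then close the sheet.
                        await onAdd(title)
                        dismiss()
                    }
                }
            }
        }
    }
}
```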
Replies: 2 · Boosts: 0 · Views: 118 · Activity: Oct ’25
EKReminder semantics: startDateComponents vs dueDateComponents vs alarms
Hi everyone,

I'm building a task management app that layers on top of EventKit/Reminders (I also moderate /r/AppleReminders). I see confusion around the semantics of dates on both the developer side and the user side, and I'm trying to map the standard productivity mental model onto the EKReminder implementation, but I'm hitting some walls.

In productivity contexts, a task tends to have three distinct dates:

- Start Date: when the task becomes actionable. Don't alert the user before this date.
- Notification: when the device should buzz/ping the user, meaning they can get started on the task.
- Due Date: the hard deadline. If the system works well, tasks should rarely be past their deadline; productivity systems are about meeting deadlines rather than missing them.

The EventKit Reality

Here is what I'm seeing in practice, and I'm hoping someone can correct me if I'm wrong:

- startDateComponents: the docs describe it as the "start date of the task". In practice it seems unused: I can set it via the API, but the Reminders app UI ignores it, and it doesn't seem to trigger visibility in "Today" or Smart Lists.
- dueDateComponents: the docs describe it as the "date by which the reminder should be completed". In practice it's conflated: it acts as the "Date" you see in the list, and it functions as the start date (shows in Today), the due date (turns red once it has passed), AND the notification time (unless early alerts are set).
- alarms: inherited from EKCalendarItem. This seems to be what drives the actual notifications, including "Early Reminders", but it's tightly coupled to the due date in the UI.

My Questions

1. Is startDateComponents effectively a dead field? Is there any native behavior (Smart List filtering, sorting, visibility) that respects it, or is it purely metadata storage for third-party apps?
2. Smart List logic: I was hoping to create a Smart List that shows "actionable" items (i.e., Start Date <= Today). However, the Smart List filters only offer a generic "Date" field, which maps to dueDateComponents. Has anyone successfully filtered by startDateComponents in a native Smart List?
3. Conflation: Is there any "blessed" way to set a due date that is distinct from the notification time without fighting the system (e.g., due Friday, but remind me Wednesday)? There's a concrete sketch of what I mean at the end of this post.

Any insight into the intended semantics here would be huge. I'm trying to avoid fighting the framework, but the "one date to rule them all" approach in the Reminders app is making it tricky to support separate start/due dates. Thanks!
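To make question 3 concrete, this is roughly what I'm trying to express through the API. The function name, title, and dates are just placeholders, and I'm assuming Reminders access has already been granted:

```swift
import EventKit

func makeSampleReminder(in store: EKEventStore) throws {
    let reminder = EKReminder(eventStore: store)
    reminder.calendar = store.defaultCalendarForNewReminders()
    reminder.title = "Submit report"

    // Start date (Monday): when the task becomes actionable.
    // As far as I can tell, the Reminders UI ignores this field.
    reminder.startDateComponents = DateComponents(year: 2025, month: 6, day: 2)

    // Due date (Friday): the hard deadline, and the one "Date" the Reminders app surfaces.
    reminder.dueDateComponents = DateComponents(year: 2025, month: 6, day: 6)

    // Notification (Wednesday morning): the alert I actually want, decoupled from the due date.
    let wednesdayMorning = Calendar.current.date(
        from: DateComponents(year: 2025, month: 6, day: 4, hour: 9))!
    reminder.addAlarm(EKAlarm(absoluteDate: wednesdayMorning))

    try store.save(reminder, commit: true)
}
```

This saves without complaint, but in the Reminders UI the alarm just shows up as an early reminder hanging off the due date, and the start date doesn't seem to influence anything, which is exactly the coupling described above.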
Replies: 1 · Boosts: 0 · Views: 61 · Activity: 4d
Understanding the number of input channels in Core Audio
Hello everyone,

I'm new to Core Audio and still haven't found my footing. I'm learning how to capture audio from the default device using Audio Units. On my MacBook, the default audio input is mono, but when I write code to capture audio using AUHAL, I find that I need to provide an AudioBufferList with two channels, not one. Likewise, when I try to capture audio from an audio interface with 20 inputs, I must still provide an AudioBufferList with two channels, not 20.

To investigate, I wrote a small diagnostic program that opens the default audio device and probes it for the number of channels. Depending on how I probe, I get different results: when I probe the stream format, it reports 1 channel, but when I probe the input audio unit, it reports 2 input channels. Here's my program to demonstrate the issue:

```objc
// InputDeviceChannels.m
// Compile with:
//   clang -framework CoreAudio -framework AudioToolbox -framework CoreFoundation -framework AudioUnit -o InputDeviceChannels InputDeviceChannels.m
//
// On my system, this prints:
//   Device Name: MacBook Pro Microphone
//   Number of Channels (Stream Format): 1
//   Number of Elements (Element Count): 2

#import <AudioToolbox/AudioToolbox.h>
#import <AudioUnit/AudioUnit.h>
#import <CoreAudio/CoreAudio.h>
#import <Foundation/Foundation.h>

void printDeviceInfo(AudioUnit audioUnit) {
  UInt32 size;
  OSStatus err;

  AudioStreamBasicDescription streamFormat;
  size = sizeof(streamFormat);
  err = AudioUnitGetProperty(audioUnit, kAudioUnitProperty_StreamFormat,
                             kAudioUnitScope_Input, 1, &streamFormat, &size);
  if (err != noErr) {
    printf("Error getting stream format\n");
    exit(1);
  }
  int numChannels = streamFormat.mChannelsPerFrame;

  UInt32 elementCount;
  size = sizeof(elementCount);
  err = AudioUnitGetProperty(audioUnit, kAudioUnitProperty_ElementCount,
                             kAudioUnitScope_Input, 0, &elementCount, &size);
  if (err != noErr) {
    printf("Error getting element count\n");
    exit(1);
  }

  printf("Number of Channels (Stream Format): %d\n", numChannels);
  printf("Number of Elements (Element Count): %d\n", elementCount);
}

void printDeviceName(AudioDeviceID deviceID) {
  UInt32 size;
  OSStatus err;
  CFStringRef deviceName = NULL;
  size = sizeof(deviceName);
  err = AudioObjectGetPropertyData(
      deviceID,
      &(AudioObjectPropertyAddress){kAudioDevicePropertyDeviceNameCFString,
                                    kAudioObjectPropertyScopeGlobal,
                                    kAudioObjectPropertyElementMain},
      0, NULL, &size, &deviceName);
  if (err != noErr) {
    printf("Error getting device name\n");
    exit(1);
  }

  char deviceNameStr[256];
  if (!CFStringGetCString(deviceName, deviceNameStr, sizeof(deviceNameStr),
                          kCFStringEncodingUTF8)) {
    printf("Error converting device name to C string\n");
    exit(1);
  }
  CFRelease(deviceName);

  printf("Device Name: %s\n", deviceNameStr);
}

int main(int argc, const char *argv[]) {
  @autoreleasepool {
    OSStatus err;

    // Get the default input device ID
    AudioDeviceID input_device_id = kAudioObjectUnknown;
    {
      UInt32 property_size = sizeof(input_device_id);
      AudioObjectPropertyAddress input_device_property = {
          kAudioHardwarePropertyDefaultInputDevice,
          kAudioObjectPropertyScopeGlobal,
          kAudioObjectPropertyElementMain,
      };
      err = AudioObjectGetPropertyData(kAudioObjectSystemObject,
                                       &input_device_property, 0, NULL,
                                       &property_size, &input_device_id);
      if (err != noErr || input_device_id == kAudioObjectUnknown) {
        printf("Error getting default input device ID\n");
        exit(1);
      }
    }

    // Print the device name using the input device ID
    printDeviceName(input_device_id);

    // Open audio unit for the input device
    AudioComponentDescription desc = {kAudioUnitType_Output,
                                      kAudioUnitSubType_HALOutput,
                                      kAudioUnitManufacturer_Apple, 0, 0};
    AudioComponent component = AudioComponentFindNext(NULL, &desc);
    AudioUnit audioUnit;
    err = AudioComponentInstanceNew(component, &audioUnit);
    if (err != noErr) {
      printf("Error creating AudioUnit\n");
      exit(1);
    }

    // Enable IO for input on the AudioUnit and disable output
    UInt32 enableInput = 1;
    UInt32 disableOutput = 0;
    err = AudioUnitSetProperty(audioUnit, kAudioOutputUnitProperty_EnableIO,
                               kAudioUnitScope_Input, 1, &enableInput,
                               sizeof(enableInput));
    if (err != noErr) {
      printf("Error enabling input on AudioUnit\n");
      exit(1);
    }
    err = AudioUnitSetProperty(audioUnit, kAudioOutputUnitProperty_EnableIO,
                               kAudioUnitScope_Output, 0, &disableOutput,
                               sizeof(disableOutput));
    if (err != noErr) {
      printf("Error disabling output on AudioUnit\n");
      exit(1);
    }

    // Set the current device to the input device
    err = AudioUnitSetProperty(audioUnit, kAudioOutputUnitProperty_CurrentDevice,
                               kAudioUnitScope_Global, 0, &input_device_id,
                               sizeof(input_device_id));
    if (err != noErr) {
      printf("Error setting device for AudioUnit\n");
      exit(1);
    }

    // Initialize AudioUnit
    err = AudioUnitInitialize(audioUnit);
    if (err != noErr) {
      printf("Error initializing AudioUnit\n");
      exit(1);
    }

    // Print device info
    printDeviceInfo(audioUnit);

    // Clean up
    AudioUnitUninitialize(audioUnit);
    AudioComponentInstanceDispose(audioUnit);
  }
  return 0;
}
```

It prints:

```
Device Name: MacBook Pro Microphone
Number of Channels (Stream Format): 1
Number of Elements (Element Count): 2
```

I tried to set the number of channels to 1 on the input unit, but it didn't change anything: after calling setNumberOfChannels(1, audioUnit), I still get the same output. (A sketch of what I mean by that helper is at the end of this post.)

Note 1: I know that I can ignore one channel, etc. My purpose here is not to "somehow get it to work"; I already did that. My purpose is to understand the API so that I can write code that handles any number of audio inputs.

Note 2: I've already read a bunch of documentation, especially this technote: https://developer.apple.com/library/archive/technotes/tn2091/ Perhaps the channel map could help here, but I can't make sense of it; I tried to use it based on my understanding, but I only got a -50 OSStatus back.

How should I understand this? Is it that the audio unit is an abstraction layer that automatically converts the mono input into stereo? Can I ask AUHAL to provide the same number of input channels that the audio device has?
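For reference, this is roughly what I mean by setNumberOfChannels: a simplified sketch that rewrites the client-side stream format (output scope of input element 1) to request a given channel count. It only touches mChannelsPerFrame, which assumes the default non-interleaved Float32 format:

```objc
static OSStatus setNumberOfChannels(UInt32 channels, AudioUnit audioUnit) {
  AudioStreamBasicDescription format;
  UInt32 size = sizeof(format);

  // Read the client-side format: output scope of element 1 (the input element).
  OSStatus err = AudioUnitGetProperty(audioUnit, kAudioUnitProperty_StreamFormat,
                                      kAudioUnitScope_Output, 1, &format, &size);
  if (err != noErr) {
    return err;
  }

  // Request the desired channel count and write the format back.
  format.mChannelsPerFrame = channels;
  return AudioUnitSetProperty(audioUnit, kAudioUnitProperty_StreamFormat,
                              kAudioUnitScope_Output, 1, &format, sizeof(format));
}
```

Whether the output scope of element 1 is even the right place to set this is part of what I'm unsure about.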
Replies: 1 · Boosts: 0 · Views: 1k · Activity: Sep ’24