Post · Replies · Boosts · Views · Activity

Some sharing extensions disabled when running iOS app with Mac Catalyst
When I run my iOS app on a Mac using Mac Catalyst, several sharing options that appear in the share sheet on an iOS device are absent on the Mac. Clicking Edit Extensions, I see Mail, Messages, and AirDrop; their switches are on but disabled. All three items show up when I share from Safari or Notes. How can I make Mail, Messages, and AirDrop available in my app? For example, when sharing data, no share extensions are shown at all; for text, only Simulator, Shortcuts, and Copy are shown.
0 replies · 0 boosts · 42 views · Jun ’25
Prevent backing up large Xcode files
I'm primarily an iOS developer. Every day that I develop, Time Machine on my Mac backs up a gigabyte or more of data, and I'm trying to reduce that as much as possible. No data involving the simulators seems important enough to back up. If I ever needed to restore Xcode, I'd reinstall it rather than restore from Time Machine, but I'd still want to back up code snippets, etc. What are the best practices to prevent large amounts of Xcode or simulator data from being backed up?
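One common approach, sketched below as an assumption rather than an official recommendation, is to exclude the regenerable Xcode and simulator directories from Time Machine with `tmutil`. The paths shown are the usual default locations; code snippets live elsewhere (`~/Library/Developer/Xcode/UserData`) and are deliberately not excluded here, so they still get backed up.

```shell
# Hedged sketch: exclude regenerable Xcode/simulator data from Time Machine.
# Paths are the default locations; adjust if yours differ.

# Build products and indexes -- regenerated by Xcode on demand.
tmutil addexclusion ~/Library/Developer/Xcode/DerivedData

# Simulator runtimes and per-device data -- recreated by Xcode/simctl.
tmutil addexclusion ~/Library/Developer/CoreSimulator

# Device support files copied from connected iPhones/iPads.
tmutil addexclusion ~/Library/Developer/Xcode/iOS\ DeviceSupport

# Verify an exclusion took effect:
tmutil isexcluded ~/Library/Developer/Xcode/DerivedData
```

Note that `addexclusion` without `-p` marks the item itself with an exclusion attribute, so the exclusion follows the folder if it moves; `tmutil removeexclusion` undoes it.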
1 reply · 0 boosts · 55 views · Jun ’25
How to switch between Core Data Persistent Stores?
What is the best way to switch between Core Data Persistent Stores? My use case is that I have a multi-user app that stores thousands of data items unique to each user. To me, having Persistent Stores for each user seems like the best design to keep their data separate and private. (If anyone believes that storing the data for all users in one Persistent Store is a better design, I'd appreciate hearing from them.) Customers might switch users 5 to 10 times a day. Switching users must be fast, say a second or two at most.
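A minimal sketch of the store-per-user idea, assuming one SQLite file per user; the model name "Model", the file naming scheme, and the `UserStoreManager` type are illustrative assumptions, not from the post. Switching users amounts to building a container whose store description points at the new user's file:

```swift
import CoreData

// Hedged sketch: one SQLite store file per user, loaded on demand.
// "Model" and the per-user file naming are assumptions for illustration.
final class UserStoreManager {
    private(set) var container: NSPersistentContainer?

    // URL of the store file for a given user, kept in Application Support.
    private func storeURL(for userID: String) -> URL {
        let dir = FileManager.default.urls(for: .applicationSupportDirectory,
                                           in: .userDomainMask)[0]
        return dir.appendingPathComponent("\(userID).sqlite")
    }

    // Switch users by loading a fresh container pointed at that user's file.
    func switchUser(to userID: String,
                    completion: @escaping (Error?) -> Void) {
        let newContainer = NSPersistentContainer(name: "Model")
        let description = NSPersistentStoreDescription(url: storeURL(for: userID))
        newContainer.persistentStoreDescriptions = [description]
        newContainer.loadPersistentStores { _, error in
            if error == nil { self.container = newContainer }
            completion(error)
        }
    }
}
```

Discarding the old container releases the previous user's store; make sure no managed object contexts from the previous user are still in use before switching.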
1 reply · 0 boosts · 51 views · Jun ’25
How to upgrade an iPad to iOS 17 for testing?
Hi. I have an iPad 7th gen running iPadOS 14.8.1. For testing purposes, I want to upgrade it to the latest release of iPadOS 17, even though the device wants me to upgrade to iPadOS 18.5. Is there a way to upgrade it to iPadOS 17? Btw, this would free up a newer iPad currently running iPadOS 17 so I can install the iPadOS 26 beta. Thank you.
1 reply · 0 boosts · 35 views · Jun ’25
Eye Tracking Availability and App Requirements in iPadOS 18
Very excited about the new eye tracking in iPadOS and iOS 18. Some general eye tracking questions. Does the initial iPadOS 18 beta include eye tracking? If not, in which beta will it be included? Do developers need to do anything to their app for users to control their app using eye tracking? Will all standard UIKit and SwiftUI views and controls work with eye tracking without code changes? Will custom subclasses of UIControl work with eye tracking without code changes? Looking forward to testing eye tracking.
4 replies · 4 boosts · 1.6k views · Jun ’24
AVAudioEngine connect:to:format: fails with error -10868 when using AVAudioPCMFormatInt16
I've been unable to get AVAudioEngine connect:to:format: to work when using an AVAudioFormat created with initWithCommonFormat:AVAudioPCMFormatInt16. The method always produces a kAudioUnitErr_FormatNotSupported error:

ERROR: AVAudioNode.mm:521: AUSetFormat: error -10868
*** Terminating app due to uncaught exception 'com.apple.coreaudio.avfaudio', reason: 'error -10868'

What do I need to do to play a sound buffer in AVAudioPCMFormatInt16 format using AVAudioEngine?

```objc
AVAudioEngine *engine = [[AVAudioEngine alloc] init];
AVAudioPlayerNode *player = [[AVAudioPlayerNode alloc] init];
[engine attachNode:player];

AVAudioFormat *format = [[AVAudioFormat alloc]
    initWithCommonFormat:AVAudioPCMFormatInt16
              sampleRate:(double)22050.
                channels:(AVAudioChannelCount)1
             interleaved:NO];

AVAudioPCMBuffer *buffer = [[AVAudioPCMBuffer alloc]
    initWithPCMFormat:format frameCapacity:(AVAudioFrameCount)1024];
buffer.frameLength = (AVAudioFrameCount)buffer.frameCapacity;
memset(buffer.int16ChannelData[0], 0,
       buffer.frameLength * format.streamDescription->mBytesPerFrame); // zero fill

AVAudioMixerNode *mainMixer = [engine mainMixerNode];

// The following line results in a kAudioUnitErr_FormatNotSupported -10868 error
[engine connect:player to:mainMixer format:buffer.format];

[player scheduleBuffer:buffer completionHandler:nil];

NSError *error;
[engine startAndReturnError:&error];
[player play];
```

As background, my app needs to queue audio buffers generated by third-party software to play sequentially. The audio buffers play fine (individually) using AVAudioPlayer. The AVAudioFormat settings in the above code come from inspecting the AVAudioPlayer settings property when playing a generated audio buffer. I am new to Core Audio and AVAudioEngine.
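A commonly suggested workaround, sketched here as an assumption rather than a confirmed fix: mixer connections generally expect the engine's Float32 format, so connect the player using a Float32 format and convert each Int16 buffer with AVAudioConverter before scheduling. The 22050 Hz mono settings mirror the format in the question.

```swift
import AVFoundation

// Hedged sketch: connect with a Float32 format (which the mixer accepts)
// and convert each incoming Int16 buffer before scheduling it.
let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
engine.attach(player)

let int16Format = AVAudioFormat(commonFormat: .pcmFormatInt16,
                                sampleRate: 22050, channels: 1,
                                interleaved: false)!
let floatFormat = AVAudioFormat(commonFormat: .pcmFormatFloat32,
                                sampleRate: 22050, channels: 1,
                                interleaved: false)!

// Connecting with Float32 avoids the -10868 error seen with Int16.
engine.connect(player, to: engine.mainMixerNode, format: floatFormat)

let converter = AVAudioConverter(from: int16Format, to: floatFormat)!

// Convert one Int16 buffer to Float32 and schedule it on the player.
func schedule(_ int16Buffer: AVAudioPCMBuffer) throws {
    let out = AVAudioPCMBuffer(pcmFormat: floatFormat,
                               frameCapacity: int16Buffer.frameLength)!
    try converter.convert(to: out, from: int16Buffer)
    player.scheduleBuffer(out, completionHandler: nil)
}
```

`convert(to:from:)` only handles conversions where the input and output sample rates match, as they do here; for sample-rate conversion you'd use the block-based `convert(to:error:withInputFrom:)` instead.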
4 replies · 1 boost · 6.8k views · Dec ’15