When I run my iOS app on a Mac using Mac Catalyst, several sharing options that appear in the share sheet on an iOS device are absent on the Mac. Clicking Edit Extensions, I see Mail, Messages, and AirDrop; their switches are on but disabled. All three items show up when I share from Safari or Notes.
How can I make Mail, Messages, and AirDrop available?
For example, when sharing data, no share extensions are shown at all; for text, only Simulator, Shortcuts, and Copy are shown.
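For reference, this is roughly how the app presents the share sheet; a minimal sketch, where the text and the presenting view controller are placeholders:

// Minimal share-sheet presentation from a UIViewController subclass.
NSString *text = @"Sample text to share";
UIActivityViewController *activityVC =
    [[UIActivityViewController alloc] initWithActivityItems:@[text]
                                      applicationActivities:nil];
// On Mac Catalyst and iPad the controller is shown as a popover, so a source is required.
activityVC.popoverPresentationController.sourceView = self.view;
[self presentViewController:activityVC animated:YES completion:nil];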
I'm primarily an iOS developer. Every day that I develop, Time Machine on my Mac backs up a gigabyte or more of data, and I'm trying to reduce that as much as possible. None of the simulator data seems important enough to back up, and if I ever needed to restore Xcode, I'd reinstall it rather than restore from Time Machine. But I'd still want to back up code snippets, etc.
What are the best practices to prevent large amounts of Xcode or simulator data from being backed up?
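One approach (my own suggestion, not an official list) is to exclude the large, regenerable directories with tmutil. These are the usual locations for derived data, simulator content, device support files, and Xcode caches:

tmutil addexclusion ~/Library/Developer/Xcode/DerivedData
tmutil addexclusion ~/Library/Developer/CoreSimulator
tmutil addexclusion ~/Library/Developer/Xcode/iOS\ DeviceSupport
tmutil addexclusion ~/Library/Caches/com.apple.dt.Xcode

Code snippets live in ~/Library/Developer/Xcode/UserData, which these exclusions leave untouched, so they still get backed up.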
I need to direct text-to-speech audio from my app to a Bluetooth speaker AND to the internal iPad speaker simultaneously. The app uses AVSpeechSynthesizer and several third-party speech engines. What's the best way to do this?
I noticed the outputChannels property on AVSpeechSynthesizer. Are there any examples of how to use it?
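I haven't found official sample code either, but here is a hedged sketch of the idea as I understand it: activate a multi-route audio session, collect the channel descriptions for every output port on the current route, and assign them to outputChannels. Note that AVAudioSessionCategoryMultiRoute has historically not supported Bluetooth A2DP outputs, so whether this actually reaches a Bluetooth speaker needs testing:

#import <AVFoundation/AVFoundation.h>

AVAudioSession *session = [AVAudioSession sharedInstance];
NSError *error = nil;
// MultiRoute allows addressing several outputs at once (an assumption that it fits this case).
[session setCategory:AVAudioSessionCategoryMultiRoute error:&error];
[session setActive:YES error:&error];

// Gather the channels of every output on the current route.
NSMutableArray<AVAudioSessionChannelDescription *> *channels = [NSMutableArray array];
for (AVAudioSessionPortDescription *port in session.currentRoute.outputs) {
    [channels addObjectsFromArray:port.channels];
}

AVSpeechSynthesizer *synthesizer = [[AVSpeechSynthesizer alloc] init];
synthesizer.outputChannels = channels; // render speech to all collected channels
[synthesizer speakUtterance:[AVSpeechUtterance speechUtteranceWithString:@"Hello"]];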
What is the best way to switch between Core Data Persistent Stores?
My use case is a multi-user app that stores thousands of data items unique to each user. To me, a separate persistent store for each user seems like the best design for keeping their data separate and private. (If anyone believes that storing all users' data in one persistent store is a better design, I'd appreciate hearing from them.)
Customers might switch users 5 to 10 times a day. Switching users must be fast, say a second or two at most.
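For what it's worth, here is a sketch of the per-user-store approach I'd try, assuming a shared managed object model and a store file named after a hypothetical userID; caching loaded containers should make repeat switches nearly instant:

#import <CoreData/CoreData.h>

// Cache of loaded containers so a repeat switch doesn't reload the store.
static NSMutableDictionary<NSString *, NSPersistentContainer *> *containerCache;

NSPersistentContainer *ContainerForUser(NSString *userID, NSManagedObjectModel *model) {
    if (containerCache == nil) containerCache = [NSMutableDictionary dictionary];
    NSPersistentContainer *cached = containerCache[userID];
    if (cached != nil) return cached; // already loaded; switching is effectively instant

    NSPersistentContainer *container =
        [[NSPersistentContainer alloc] initWithName:@"Model" managedObjectModel:model];
    NSURL *docs = [[NSFileManager defaultManager] URLsForDirectory:NSDocumentDirectory
                                                         inDomains:NSUserDomainMask].firstObject;
    // One store file per user, e.g. Documents/<userID>.sqlite
    NSURL *storeURL = [docs URLByAppendingPathComponent:
                          [userID stringByAppendingPathExtension:@"sqlite"]];
    container.persistentStoreDescriptions =
        @[[NSPersistentStoreDescription persistentStoreDescriptionWithURL:storeURL]];
    [container loadPersistentStoresWithCompletionHandler:^(NSPersistentStoreDescription *desc,
                                                           NSError *error) {
        if (error != nil) { /* handle the error appropriately */ }
    }];
    containerCache[userID] = container;
    return container;
}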
Hi. I have an iPad (7th generation) running iPadOS 14.8.1. For testing purposes, I want to upgrade it to the latest release of iPadOS 17, even though the device offers to upgrade to iPadOS 18.5. Is there a way to upgrade it to iPadOS 17?
By the way, this would free up a newer iPad currently running iPadOS 17 so I can install the iPadOS 26 beta on it.
Thank you.
Is there a way for developers to generate IPA notation from user voice input like in the Settings app (Accessibility > VoiceOver > Speech > Pronunciations)?
I thought this might be a useful option for AAC apps.
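I'm not aware of a public API that generates the notation from recorded speech, but on the consumption side AVSpeechUtterance accepts IPA via an attributed string; a small sketch, where the IPA value and range are illustrative:

#import <AVFoundation/AVFoundation.h>

NSMutableAttributedString *text =
    [[NSMutableAttributedString alloc] initWithString:@"Hello, iPhone"];
// Apply a custom pronunciation to the word "iPhone".
[text addAttribute:AVSpeechSynthesisIPANotationAttribute
             value:@"ˈaɪ.ˈfoʊn"
             range:NSMakeRange(7, 6)];
AVSpeechUtterance *utterance =
    [[AVSpeechUtterance alloc] initWithAttributedString:text];
AVSpeechSynthesizer *synthesizer = [[AVSpeechSynthesizer alloc] init]; // keep a strong reference in real code
[synthesizer speakUtterance:utterance];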
I'm very excited about the new eye tracking in iPadOS and iOS 18. Some general eye-tracking questions:
Does the initial iPadOS 18 beta include eye tracking? If not, in which beta will it be included?
Do developers need to do anything to their app for users to control their app using eye tracking?
Will all standard UIKit and SwiftUI views and controls work with eye tracking without code changes?
Will custom subclasses of UIControl work with eye tracking without code changes?
Looking forward to testing eye tracking.
My iPad (8th generation) running iPadOS 17.0 RC does not list Personal Voice in the Accessibility settings. Is Personal Voice creation supported on iPad? If so, on which iPad models is it supported?
In the Settings app, when I go to Accessibility, the only item listed in the SPEECH section is Live Speech. Can I create a personal voice on an iPad?
I am using an iPad 8th gen (32 GB) running iPadOS 17 beta 2 (17.0, 21A5268h).
To which session is "- While Yusef helped us add some style to our user interfaces." referring? It seems interesting and I'd like to watch the video.
I've been unable to get AVAudioEngine connect:to:format: to work when using an AVAudioFormat created with initWithCommonFormat:AVAudioPCMFormatInt16. The method always produces a kAudioUnitErr_FormatNotSupported error:

ERROR: AVAudioNode.mm:521: AUSetFormat: error -10868
*** Terminating app due to uncaught exception 'com.apple.coreaudio.avfaudio', reason: 'error -10868'

What do I need to do to play a sound buffer in AVAudioPCMFormatInt16 format using AVAudioEngine?

AVAudioEngine *engine = [[AVAudioEngine alloc] init];
AVAudioPlayerNode *player = [[AVAudioPlayerNode alloc] init];
[engine attachNode:player];
AVAudioFormat *format = [[AVAudioFormat alloc] initWithCommonFormat:AVAudioPCMFormatInt16
                                                         sampleRate:22050.0
                                                           channels:(AVAudioChannelCount)1
                                                        interleaved:NO];
AVAudioPCMBuffer *buffer = [[AVAudioPCMBuffer alloc] initWithPCMFormat:format
                                                         frameCapacity:(AVAudioFrameCount)1024];
buffer.frameLength = buffer.frameCapacity;
memset(buffer.int16ChannelData[0], 0, buffer.frameLength * format.streamDescription->mBytesPerFrame); // zero fill
AVAudioMixerNode *mainMixer = [engine mainMixerNode];
// The following line results in a kAudioUnitErr_FormatNotSupported -10868 error
[engine connect:player to:mainMixer format:buffer.format];
[player scheduleBuffer:buffer completionHandler:nil];
NSError *error;
[engine startAndReturnError:&error];
[player play];

As background, my app needs to queue audio buffers generated by third-party software to play sequentially. The buffers play fine (individually) using AVAudioPlayer. The AVAudioFormat settings in the code above come from inspecting the AVAudioPlayer settings property while playing a generated buffer. I am new to Core Audio and AVAudioEngine.
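In case it helps others, the workaround I'd try (unverified): the engine's mixer works in Float32, so convert each Int16 buffer with AVAudioConverter and connect the player using the float format instead. A sketch reusing the engine, player, format, buffer, and mainMixer variables from the code above:

// Float32 equivalent of the Int16 format (same sample rate and channel count).
AVAudioFormat *floatFormat = [[AVAudioFormat alloc] initWithCommonFormat:AVAudioPCMFormatFloat32
                                                              sampleRate:22050.0
                                                                channels:(AVAudioChannelCount)1
                                                             interleaved:NO];
AVAudioConverter *converter = [[AVAudioConverter alloc] initFromFormat:format toFormat:floatFormat];
AVAudioPCMBuffer *floatBuffer = [[AVAudioPCMBuffer alloc] initWithPCMFormat:floatFormat
                                                              frameCapacity:buffer.frameCapacity];
NSError *convertError = nil;
// Same sample rate on both sides, so the simple buffer-to-buffer conversion applies.
[converter convertToBuffer:floatBuffer fromBuffer:buffer error:&convertError];

[engine connect:player to:mainMixer format:floatFormat]; // Float32 is accepted here
[player scheduleBuffer:floatBuffer completionHandler:nil];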