In the WWDC 2025 session "Build a UIKit app with the new design", at the 23:22 mark, the presenter says:
And finally, when you no longer need the glass on screen, animate it out by setting the effect to nil.
The video shows a UIVisualEffectView whose effect is set to a UIGlassEffect animating away as its effect is set to nil. But when I do this in my app (or a sample app), setting effect to nil does not remove the glass appearance. Is this expected? Is the video out of date? Or is this a bug?
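For reference, here's roughly what I'm doing (glassView is my UIVisualEffectView, whose effect was set to a UIGlassEffect earlier):

// Simplified version of what I'm doing. glassView already has its
// effect set to a UIGlassEffect instance.
UIView.animate(withDuration: 0.3) {
    glassView.effect = nil   // I expected this to animate the glass away
}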
I work on an iOS app that displays images that often contain text, and I'm adding support for ImageAnalysisInteraction as described in this WWDC 2022 session. I have gotten as far as making the interaction show up and being able to select text and get the system selection menu, and even add my own action to the menu via the buildMenuWithBuilder API. But what I really want to do with my custom action is get the selected text and do a custom lookup-like thing to check the text against other content in my app.
So how do I get the selected text from an ImageAnalysisInteraction on a UIImageView? The docs show methods to check if there is selected text, but I want to know what the text is.
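For context, my setup so far looks roughly like this (imageView and image are my own; the flow follows the session):

import VisionKit

// Simplified version of my current setup: attach the interaction,
// analyze the image for text, and enable text selection.
let interaction = ImageAnalysisInteraction()
imageView.addInteraction(interaction)

let analyzer = ImageAnalyzer()
let configuration = ImageAnalyzer.Configuration([.text])
let analysis = try await analyzer.analyze(image, configuration: configuration)
interaction.analysis = analysis
interaction.preferredInteractionTypes = [.textSelection]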
I'm experimenting with downloading an audio file of spoken content, using the Speech framework to transcribe it, then using FoundationModels to clean up the formatting to add paragraph breaks and such. I have this code to do that cleanup:
private func cleanupText(_ text: String) async throws -> String? {
    print("Cleaning up text of length \(text.count)...")
    let session = LanguageModelSession(instructions: "The content you read is a transcription of a speech. Separate it into paragraphs by adding newlines. Do not modify the content - only add newlines.")
    let response = try await session.respond(to: .init(text), generating: String.self)
    return response.content
}
The content length is about 29,000 characters. And I get this error:
InferenceError::inferenceFailed::Failed to run inference: Context length of 4096 was exceeded during singleExtend..
Is 4096 a reference to a max input length? Or is this a bug?
This is running on an M1 iPad Air, with iPadOS 26 Seed 1.
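If it is an input cap, I assume I'd have to chunk the transcript myself - something like this rough, untested sketch (the chunk size and sentence-based splitting are guesses on my part):

// Hypothetical workaround, not tested: split the transcript into chunks
// that should fit the context window, clean up each chunk in its own
// session, and rejoin the results.
private func cleanupLongText(_ text: String, chunkSize: Int = 3000) async throws -> String {
    var chunks: [String] = []
    var current = ""
    for sentence in text.split(separator: ".", omittingEmptySubsequences: true) {
        if current.count + sentence.count > chunkSize, !current.isEmpty {
            chunks.append(current)
            current = ""
        }
        current += String(sentence) + "."
    }
    if !current.isEmpty { chunks.append(current) }

    var cleaned: [String] = []
    for chunk in chunks {
        // Fresh session per chunk so context doesn't accumulate.
        let session = LanguageModelSession(instructions: "The content you read is a transcription of a speech. Separate it into paragraphs by adding newlines. Do not modify the content - only add newlines.")
        let response = try await session.respond(to: .init(chunk), generating: String.self)
        cleaned.append(response.content)
    }
    return cleaned.joined(separator: "\n")
}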
I have an app that uses UITextView for some text editing. I have some custom operations I can do on the text that I want to be able to undo, and I'm representing those operations in a way that plugs into NSUndoManager nicely. For example, if I have a button that appends an emoji to the text, it looks something like this:
func addEmoji() {
    let inserting = NSAttributedString(string: "😀")
    self.textStorage.append(inserting)
    let len = inserting.length
    let range = NSRange(location: self.textStorage.length - len, length: len)
    self.undoManager?.registerUndo(withTarget: self, handler: { view in
        view.textStorage.deleteCharacters(in: range)
    })
}
My goal is something like this:
1. Type some text
2. Press the emoji button to add the emoji
3. Trigger undo (via gesture or keyboard shortcut) and the emoji is removed
4. Trigger undo again and the typing from step 1 is reversed
If I just type and then trigger undo, the typing is reversed as you'd expect. And if I just add the emoji and trigger undo, the emoji is removed. But if I do the sequence above, step 3 works but step 4 doesn't. The emoji is removed but the typing isn't reversed.
Notably, if the operation in step 2 only changes attributes of the text, like applying a strikethrough to a selection, then the full undo chain works. I can type, apply strikethrough, undo strikethrough, and undo typing.
It's almost as if changing the text invalidates the undo manager's previous operations?
How do I insert my own changes into UITextView's NSUndoManager without invalidating its chain of other operations?
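One hypothetical alternative I've considered is routing the edit through UITextView's text-input API instead of touching textStorage directly, so the view's own undo manager records the change alongside typing:

// Hypothetical alternative: use the UITextInput API so UITextView's
// built-in undo manager registers the edit itself.
func addEmoji() {
    let end = self.endOfDocument
    if let range = self.textRange(from: end, to: end) {
        self.replace(range, withText: "😀")
    }
}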
I notice that in iOS 14 beta 4, collection views show an index view on their trailing edge if the data source implements indexTitlesForCollectionView: (https://developer.apple.com/documentation/uikit/uicollectionviewdatasource/2851455-indextitlesforcollectionview?language=objc) and collectionView:indexPathForIndexTitle:atIndex: (https://developer.apple.com/documentation/uikit/uicollectionviewdatasource/2851456-collectionview?language=objc). But I don't see a way to control any aspect of its appearance. On UITableView, this is exposed through the sectionIndexColor property (https://developer.apple.com/documentation/uikit/uitableview/1614915-sectionindexcolor?language=objc).
Does this property exist on UICollectionView somewhere I didn't think to look? Or is it just not there (yet)?
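For reference, the data source side is just the two methods (simplified, in Swift; sectionTitles is my own array):

// Provide index titles and map them back to sections.
func indexTitles(for collectionView: UICollectionView) -> [String]? {
    return sectionTitles   // e.g. ["A", "B", "C"]
}

func collectionView(_ collectionView: UICollectionView,
                    indexPathForIndexTitle title: String,
                    at index: Int) -> IndexPath {
    return IndexPath(item: 0, section: index)
}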
FB8284500
UILargeContentViewerInteraction, added in iOS 13, works out of the box on the tab bar in a UITabBarController, and it's easy to set up on a custom view using the UILargeContentViewerItem properties on UIView. But how do I set it up on a UITabBar that's not connected to a UITabBarController? There don't appear to be any relevant properties on UITabBarItem.
To try this, I made a sample app, added a tab bar, set up some items, set their largeContentSizeImage properties for good measure, ran the app, set the text size to a large accessibility value, and long-pressed on the items. No large content viewer appears.
I also tried adding a UILargeContentViewerInteraction to the tab bar and implementing the viewControllerForInteraction method in its delegate.
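Simplified, the test setup looks like this (names are mine; the delegate is the view controller):

// Standalone tab bar, not managed by a UITabBarController.
let tabBar = UITabBar()
let item = UITabBarItem(title: "Books", image: UIImage(systemName: "book"), tag: 0)
item.largeContentSizeImage = UIImage(systemName: "book.fill")
tabBar.items = [item]

// Also tried attaching the interaction explicitly.
tabBar.addInteraction(UILargeContentViewerInteraction(delegate: self))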
I want to add a Control Center widget for my app that will open the app to a particular feature. I'm looking at the "Open your app with a control" example here, which seems like exactly what I want:
Set your control’s action to an app intent that conforms to OpenIntent to open your app when someone uses a control. Using OpenIntent allows you to take someone to a specific area of your app when a control performs its action.
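For context, the intent in that docs example looks something like this (paraphrased from memory; LaunchAppEnum is an AppEnum of launchable destinations):

// Paraphrased from the docs: an OpenIntent whose target parameter
// selects the destination to open. OpenIntent supplies the default
// perform() that launches the app.
struct LaunchAppIntent: OpenIntent {
    static let title: LocalizedStringResource = "Launch App"

    @Parameter(title: "Target")
    var target: LaunchAppEnum
}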
The example doesn't show exactly how to hook up the LaunchAppIntent to a control widget, but I'm guessing it's something like this:
@available(iOS 18.0, *)
struct OpenFeatureControl: ControlWidget {
    var body: some ControlWidgetConfiguration {
        StaticControlConfiguration(kind: "com.example.OpenFeature") {
            ControlWidgetButton(action: LaunchAppIntent()) {
                Image(systemName: "book")
            }
        }
        .displayName("Launch Feature")
    }
}
But there's one critical piece missing here: how is the target feature actually opened? My initial assumption would have been that once the app launches or resumes, there's a call to some method like continueUserActivity that has a user-info dict with some key whose value is the LaunchAppEnum. But I've put breakpoints on all those methods in my app and none of them get called (I'm using UIKit scene lifecycle).
I also tried a regular AppIntent with a perform method that talks to my app directly:
@available(iOS 18.0, *)
struct OpenFeatureIntent: AppIntent {
    static let title: LocalizedStringResource = "Open My Feature"
    static let opensAppWhenRun: Bool = true

    init() {}

    func perform() async throws -> some IntentResult {
        // MAIN_APP is defined in Active Compilation Conditions in build settings
        #if MAIN_APP
        let url = URL(string: "myapp://openfeature")!
        UrlHandler.instance().handle(url)
        #endif
        return .result()
    }
}
But when run, this simply does nothing.
Launching an app directly to a particular view or feature seems like a common use-case for control widgets, and there are apps doing it, but I can't find an example of how it's supposed to work. And the docs are really not helpful. Can anyone provide the missing piece here? What's the expected plumbing in an OpenIntent that actually launches particular UI in the app?
I've got a UIBarButtonItem in my app that currently presents an action sheet of items. I want to switch this to a UIMenu with iOS 14's new APIs for buttons and bar button items. But some of the contents of the action sheet change based on the state of the view controller when the action sheet is triggered.
With the action sheet, I generate a different action sheet every time the bar button item's target action is called. But with the new APIs, I have to generate the menu ahead of time and assign it to the button.
What's the best way to get the menu to update every time it's presented? I tried UIDeferredMenuElement, but it caches the result of its provider.
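Roughly what I tried (currentActions() stands in for my own helper that builds the actions from the view controller's current state):

// The deferred element's provider only runs the first time the menu is
// shown; after that the cached result is reused.
barButtonItem.menu = UIMenu(children: [
    UIDeferredMenuElement { completion in
        completion(self.currentActions())   // hypothetical helper
    }
])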
Feedback FB7824467 suggests adding a property to UIDeferredMenuElement to disable the caching of the provider result.
In the session on StoreKit 2 (which looks amazing!), the presenter says:
In fact, if your app is running when a purchase is made on another device, you'll be notified about the new transaction.
This seems to mean that when an app uses the listener API to be notified of transactions, it will get transactions that happened on other devices.
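That is, a listener along these lines, where register(_:) stands in for our own call that reports the purchase to our account system:

// StoreKit 2 transaction listener. register(_:) is our hypothetical
// reporting hook.
let updatesTask = Task.detached {
    for await result in Transaction.updates {
        if case .verified(let transaction) = result {
            await register(transaction)   // would this also fire for purchases made on other devices?
        }
    }
}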
My app offers purchases across other platforms in addition to iOS, and when a purchase happens we register it with our own account system. If a user has the app running on both their iPad and iPhone and makes a purchase on the phone, and the iPad is notified of that transaction the same way it would be for a purchase made on the iPad, then both devices will try to report it to our system. This seems undesirable.
What's the recommended approach here? Should we just make sure our system disregards duplicate transaction reports? Or is there a way to know whether a transaction originated on this device? I don't see a property on the transaction type that looks like it could accomplish this. Maybe the deviceVerification properties? But that seems more like the new edition of transaction receipt verification - failing that check would presumably mean that the purchase is invalid, not that it didn't happen on this device...?
I'm working on enabling Catalyst for my existing iOS app. When I try to archive and export Catalyst as a Developer ID-signed Mac app, I get the following error:
Cannot create a Mac Catalyst Developer ID provisioning profile for "[my bundle ID]".
The Siri capability is not available for Mac Catalyst Developer ID provisioning profiles. Disable this feature and try again.
My iOS app uses SiriKit to donate a Siri intent, so Siri is among the capabilities listed in the Signing and Capabilities tab in the project inspector in Xcode. I don't see a way to turn that capability off only for Catalyst (like you can link some frameworks only for Catalyst or iOS), and I don't want to disable Siri entirely in my iOS app.
What's going on here? What do I need to do to "disable this feature" for Catalyst?
I have a UICollectionView tied to a UICollectionViewDiffableDataSource, and I'm generating and applying snapshots to the datasource on a background serial queue, and generating cells using UICollectionViewCellRegistration. I'm working on supporting reordering of the contents of the collection view via drag and drop, and I'm having trouble with what to do in collectionView:performDropWithCoordinator: so the reorder animation looks right.
Normally, I would do something like this:
- (void)collectionView:(UICollectionView *)collectionView performDropWithCoordinator:(id<UICollectionViewDropCoordinator>)coordinator
{
    NSIndexPath *sourcePath = (NSIndexPath *)coordinator.items.firstObject.dragItem.localObject;
    NSInteger fromIndex = sourcePath.item;
    NSInteger toIndex = coordinator.destinationIndexPath.item;
    NSNumber *fromItem = [self.datasource itemIdentifierForIndexPath:sourcePath];
    NSNumber *toItem = [self.datasource itemIdentifierForIndexPath:coordinator.destinationIndexPath];

    // Do the move in the data model
    [MyModel moveItemFrom:fromIndex to:toIndex];

    // Do the move in the datasource. This is the data source equivalent of:
    // [collectionView moveItemAtIndexPath:sourcePath toIndexPath:coordinator.destinationIndexPath];
    NSDiffableDataSourceSnapshot *snap = self.datasource.snapshot;
    if (toIndex < fromIndex)
        [snap moveItemWithIdentifier:fromItem beforeItemWithIdentifier:toItem];
    else
        [snap moveItemWithIdentifier:fromItem afterItemWithIdentifier:toItem];
    [self.datasource applySnapshot:snap animatingDifferences:YES];

    // Drop the item
    [coordinator dropItem:coordinator.items.firstObject.dragItem toItemAtIndexPath:coordinator.destinationIndexPath];
}
But because my datasource updates happen on a background queue, I have to do at least the snapshot generation and application asynchronously, and I'd like to do the actual data model modification there too to avoid hangs. And I need to call dropItem on the coordinator on the main queue in this method. This results in an odd animation where the dropped item momentarily disappears (when drop is called) and then reappears (when the data source is updated on the background queue).
The best idea I have so far is to use UICollectionViewDropPlaceholder to hold the place in the collection view until the data source is updated. But to create a placeholder I need a cell reuse identifier (docs on init method), and I don't have one of those because I'm creating my cells using cell registrations.
So my question: what do I do in the performDrop method to make this work correctly? If the placeholder is the right idea, how do I use it in this situation?
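If the placeholder is the right idea, my best guess at wiring it up looks like this (an untested sketch, in Swift; registering a plain cell class just to get a reuse identifier is my own workaround idea):

// Untested sketch: use a throwaway reuse identifier for the placeholder,
// then commit the insertion once the background queue has applied the snapshot.
collectionView.register(UICollectionViewCell.self,
                        forCellWithReuseIdentifier: "DropPlaceholder")

if let destination = coordinator.destinationIndexPath,
   let dragItem = coordinator.items.first?.dragItem {
    let placeholder = UICollectionViewDropPlaceholder(insertionIndexPath: destination,
                                                      reuseIdentifier: "DropPlaceholder")
    placeholder.cellUpdateHandler = { cell in
        // Temporary appearance while the real datasource update is in flight.
    }
    let context = coordinator.drop(dragItem, to: placeholder)
    // Later, on the main queue, after the snapshot has been applied:
    // context.commitInsertion { indexPath in /* final cell is dequeued normally */ }
}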
How can I use UICollectionViewDiffableDataSource reorderingHandlers with a custom compositional layout?
As of iOS 14, UICollectionViewDiffableDataSource has a reorderingHandlers property. It's demonstrated in some sample code and discussed in the WWDC 2020 session Advances in Diffable Data Sources. The presenter states that you have to provide canReorderItem and didReorder closures to enable the feature.
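That is, handlers along these lines (the persistence in didReorder is schematic):

// Enabling reordering per the session: provide both handlers.
dataSource.reorderingHandlers.canReorderItem = { item in true }
dataSource.reorderingHandlers.didReorder = { transaction in
    // Persist the new order, e.g. from transaction.finalSnapshot
}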
The sample code uses it in a collection view with a list layout configuration, and configures the list cells with reorder accessories. Both closures are called as expected. But if I remove the reorder accessories from the cells, reordering no longer works - neither closure gets called. It also doesn't work in my app, where I have a grid layout built with a compositional layout.
How do I enable reordering on UICollectionViewDiffableDataSource without list cells and reorder accessories?
I have an iPad app in which I'm starting to support multiple windows / scenes. I have one main window type, let's say MainScene, and at least one secondary window type for opening specific types of content, say DetailScene.
I have not declared my scene types in Info.plist. I have implemented application:configurationForConnectingSceneSession:options: like this:
- (UISceneConfiguration *)application:(UIApplication *)application configurationForConnectingSceneSession:(UISceneSession *)connectingSceneSession options:(UISceneConnectionOptions *)options
{
    NSUserActivity *activity = options.userActivities.anyObject;
    NSString *activityType = activity.activityType;
    if ([activityType isEqualToString:@"detailType"])
        return [DetailSceneDelegate makeSceneConfiguration];
    return [MainSceneDelegate makeSceneConfiguration];
}
Say I perform these steps:
1. Launch the app for the first time. I get a call to configurationForConnectingSceneSession, and the activity type is nil, so it returns a MainScene.
2. Open a new window for some piece of content. That uses the detail scene activity type, so configurationForConnectingSceneSession returns a DetailScene. Creating the new scene looks like this:

NSUserActivity *activity = [[NSUserActivity alloc] initWithActivityType:@"detailType"];
activity.userInfo = @{@"content_id": @(contentRowId)};
[[UIApplication sharedApplication] requestSceneSessionActivation:nil userActivity:activity options:nil errorHandler:nil];

3. Suspend the app and open the app switcher. Discard (by flicking up) first the main window and then the detail window. The app is now killed.
4. Relaunch the app.
At this point I do not get a call to configurationForConnectingSceneSession. I get the detail scene back, restored from its user activity, with calls straight to DetailSceneDelegate.
My question: how do I control what scene gets restored in this situation? I want my main scene to come back.
Messages and Mail and Notes all do this. If you open Messages and drag a conversation out into a new window, you get a window for that conversation with a Done button in the corner that will dismiss the window. If you perform my steps above with Messages, you will relaunch to the full Messages view. Are they converting the detail view to a main view on the fly? Or is there a way to tell the system that the detail scene is secondary and should not be restored first, or that I should get asked what I want to restore via configurationForConnectingSceneSession? Or something else?
I have an iOS app that can play audio books as FairPlay DRM'd audio streams. I'm using AVContentKeySession to fetch keys and AVQueuePlayer to play. It works great on iOS, but on Catalyst (Monterey 12.3.1, M1 Max MacBook Pro) it crashes. When it happens in debug, I get a message "Too few bits left in input buffer" printed to the console, and an exception is thrown in the guts of what appears to be core media. Here's the output from [NSThread callStackSymbols] when paused on an exception breakpoint at that point:
0 ??? 0x0000000156b8cc9c 0x0 + 5749918876,
1 BRFree 0x000000010445cf08 main + 0,
2 AudioCodecs 0x0000000157918c24 _ZL11GetPropertyPvjPjS_ + 52,
3 AudioToolboxCore 0x00000001862536a4 _ZN15ADTSAudioStream11ParseHeaderER27AudioFileStreamContinuation + 1060,
4 AudioToolboxCore 0x000000018624259c AudioFileStreamParseBytes + 412,
5 MediaToolbox 0x00000001915f7d98 FigManifoldCreateForICY + 2176,
6 MediaToolbox 0x00000001915f7718 FigManifoldCreateForICY + 512,
7 MediaToolbox 0x00000001916bbd78 FigPlayerStreamCreate + 315920,
8 MediaToolbox 0x0000000191a269a0 FigMetadataConverterCreateForQuickTimeToFromiTunes + 45888,
9 MediaToolbox 0x0000000191a4a290 FigMetadataConverterCreateForQuickTimeToFromiTunes + 191536,
10 MediaToolbox 0x0000000191a2a470 FigMetadataConverterCreateForQuickTimeToFromiTunes + 60944,
11 MediaToolbox 0x0000000191a28c34 FigMetadataConverterCreateForQuickTimeToFromiTunes + 54740,
12 MediaToolbox 0x0000000191a413b4 FigMetadataConverterCreateForQuickTimeToFromiTunes + 154964,
13 MediaToolbox 0x0000000191a2f144 FigMetadataConverterCreateForQuickTimeToFromiTunes + 80612,
14 MediaToolbox 0x00000001917cae64 FigAlternateFilterMonitorCreateForThermalNotification + 30428,
15 MediaToolbox 0x00000001917ccb44 FigAlternateFilterMonitorCreateForThermalNotification + 37820,
16 MediaToolbox 0x00000001917cb134 FigAlternateFilterMonitorCreateForThermalNotification + 31148,
17 CFNetwork 0x0000000189937cd4 _CFHostIsDomainTopLevelForCertificatePolicy + 27728,
18 Foundation 0x0000000185ba980c __NSBLOCKOPERATION_IS_CALLING_OUT_TO_A_BLOCK__ + 24,
19 Foundation 0x0000000185ba96b4 -[NSBlockOperation main] + 104,
20 Foundation 0x0000000185ba9644 __NSOPERATION_IS_INVOKING_MAIN__ + 24,
21 Foundation 0x0000000185ba88dc -[NSOperation start] + 788,
22 Foundation 0x0000000185ba85c0 __NSOPERATIONQUEUE_IS_STARTING_AN_OPERATION__ + 24,
23 Foundation 0x0000000185ba8478 __NSOQSchedule_f + 184,
24 libdispatch.dylib 0x00000001099f60f4 _dispatch_block_async_invoke2 + 148,
25 libdispatch.dylib 0x00000001099e2394 _dispatch_client_callout + 20,
26 libdispatch.dylib 0x00000001099eb778 _dispatch_lane_serial_drain + 980,
27 libdispatch.dylib 0x00000001099ec814 _dispatch_lane_invoke + 492,
28 libdispatch.dylib 0x00000001099f9fc8 _dispatch_root_queue_drain + 408,
29 libdispatch.dylib 0x00000001099f9d3c _dispatch_worker_thread + 264,
30 libsystem_pthread.dylib 0x00000001094e1890 _pthread_start + 148,
31 libsystem_pthread.dylib 0x00000001094ebaa8 thread_start + 8
In the Debug Navigator, the top of the stack looks a little different:
#0 0x0000000184bdeb3c in __cxa_throw ()
#1 0x000000015791941c in ACMP4AACLowComplexityDecoder::GetProperty(unsigned int, unsigned int&, void*) ()
#2 0x0000000157918c24 in GetProperty(void*, unsigned int, unsigned int*, void*) ()
#3 0x00000001862536a4 in ADTSAudioStream::ParseHeader(AudioFileStreamContinuation&) ()
#4 0x000000018624259c in AudioFileStreamParseBytes ()
#5 0x00000001915f7d98 in ___lldb_unnamed_symbol598$$MediaToolbox ()
What's going wrong here? Can anyone give me direction on what to fix? Or is FairPlay audio not even expected to work in Catalyst at this point?
I have been getting crash reports from users of my Mac app on Sonoma 14.0 and 14.1 when typing into an NSTextView subclass. The crash logs I have show involvement of the spell-checking system - NSTextCheckingController, NSSpellChecker, and NSCorrectionPanel. The crash is due to an exception being thrown. The throwing method is either [NSString getParagraphStart:end:contentsEnd:forRange:] or [NSTextStorage ensureAttributesAreFixedInRange:].
I have not yet been able to reproduce the crash myself, even after modifying the reference-finding process to simply link every word, via NSStringEnumerationByWords.
The text view in question recognizes certain things in the entered text and adds hyperlinks to the text while the user is typing. It re-parses and re-adds the links on every key press (via overriding the didChangeText method), on a background thread.
From user reports, I have learned that:
The crash only occurs on macOS 14.0 and 14.1, not on previous versions
The call stack always involves the spell checker, and sometimes involves adding recognized links to the text storage (the call to DispatchQueue.main.async in the code below)
The crash stops happening if the user turns off the system spell checker in System Settings -> Keyboard -> Edit on an Input Source -> Correct Spelling Automatically switch
The crash does not happen when there are no links in the text view.
Here is the relevant code:
extension NSMutableAttributedString {
    func batchUpdates(_ updates: () -> ()) {
        self.beginEditing()
        updates()
        self.endEditing()
    }
}

class MyTextView: NSTextView {
    override func didChangeText() {
        super.didChangeText()
        findReferences()
    }

    var parseToken: CancelationToken? = nil
    let parseQueue = DispatchQueue(label: "com.myapp.ref_parser")

    private func findReferences() {
        guard let storage = self.textStorage else { return }
        self.parseToken?.requestCancel()
        let token = CancelationToken()
        self.parseToken = token
        let text = storage.string
        self.parseQueue.async {
            if token.cancelRequested { return }
            let refs = RefParser.findReferences(inText: text, cancelationToken: token)
            DispatchQueue.main.async {
                if !token.cancelRequested {
                    storage.batchUpdates {
                        // Remove all existing links...
                        var linkRanges: [NSRange] = []
                        storage.enumerateAttribute(.link, in: NSRange(location: 0, length: storage.length), options: []) { linkValue, linkRange, _ in
                            if linkValue is NSURL {
                                linkRanges.append(linkRange)
                            }
                        }
                        for rng in linkRanges {
                            storage.removeAttribute(.link, range: rng)
                        }
                        // ...then add the freshly parsed ones.
                        for r in refs {
                            storage.addAttribute(.link, value: r.url, range: r.range)
                        }
                    }
                    self.parseToken = nil
                }
            }
        }
    }
}
I've filed this as FB13306015 if any engineers see this. Can anyone suggest what might be going wrong here, or how to work around it?