
StoreKit 2: Handle unfinished consumables
I have non-consumable and consumable in-app purchases in my app. The tutorial I was following stated that Transaction.currentEntitlements includes unfinished consumables, which is incorrect according to the documentation. Is the correct way to handle unfinished consumables (and non-consumables) to implement Transaction.updates and call finish() once the transaction is verified? The documentation says that listener receives unfinished transactions once upon app launch, so do I understand correctly that you do not need to implement Transaction.unfinished unless you want to look for unfinished transactions manually later on? Otherwise, what is the correct and most recommended way to handle unfinished consumables? And is there a way to test that scenario in Xcode?
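A minimal sketch of the flow being asked about, assuming a verified result means the purchase content can be granted before finishing (the grant step is only a placeholder comment):

```swift
import StoreKit

func listenForTransactions() -> Task<Void, Never> {
    Task.detached {
        // Per the docs, Transaction.updates also delivers unfinished
        // transactions once shortly after launch.
        for await result in Transaction.updates {
            guard case .verified(let transaction) = result else { continue }
            // Grant the consumable's content here, then mark it finished.
            await transaction.finish()
        }
    }
}
```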
1 reply · 0 boosts · 187 views · Jun ’25
Widget error upon restoring iPhone: The file "Name.sqlite" couldn't be opened
I have an app that uses NSPersistentCloudKitContainer stored in a shared location via App Groups so my widget can fetch data to display. It works. But if you reset your iPhone and restore it from a backup, an error occurs: The file "Name.sqlite" couldn't be opened. I suspect this happens because the widget is created before the app's data is restored. Restarting the iPhone is the only fix, though; opening the app and reloading the timelines does not help. Is there anything I can do so it doesn't require turning the phone off and on again?
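Not a fix for the restore race itself, but a defensive sketch of the widget-side loading (group identifier and model name are hypothetical), so a failed store load can fall back to a placeholder timeline entry instead of surfacing the error:

```swift
import CoreData

// Returns nil if the shared store can't be opened yet (e.g. right after a
// restore), so the widget can render a placeholder instead of failing.
func makeSharedContainer() -> NSPersistentCloudKitContainer? {
    guard let url = FileManager.default
        .containerURL(forSecurityApplicationGroupIdentifier: "group.com.example.app")?
        .appendingPathComponent("Name.sqlite") else { return nil }

    let container = NSPersistentCloudKitContainer(name: "Name")
    container.persistentStoreDescriptions = [NSPersistentStoreDescription(url: url)]

    var loadError: Error?
    container.loadPersistentStores { _, error in loadError = error }
    return loadError == nil ? container : nil
}
```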
12 replies · 0 boosts · 237 views · Jul ’25
How to replace layoutManager with textLayoutManager for a flexible dynamic height UITextView
In order to create a UITextView like that of the Messages app, whose height grows to fit its contents (number of lines), I subclassed UITextView and customized intrinsicContentSize like so:

```swift
override var intrinsicContentSize: CGSize {
    var size = super.intrinsicContentSize
    if size.height == UIView.noIntrinsicMetric {
        layoutManager.glyphRange(for: textContainer)
        size.height = layoutManager.usedRect(for: textContainer).height
            + textContainerInset.top + textContainerInset.bottom
    }
    return size
}
```

As noted at WWDC, accessing layoutManager forces a fallback to TextKit 1; we should instead use textLayoutManager. How can this code be migrated to support TextKit 2?
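One possible migration, sketched under the assumption that the view's textLayoutManager is non-nil (i.e. TextKit 2 hasn't already been forced out) and that usageBoundsForTextContainer reflects the laid-out text:

```swift
override var intrinsicContentSize: CGSize {
    var size = super.intrinsicContentSize
    if size.height == UIView.noIntrinsicMetric, let textLayoutManager {
        // Force layout for the whole document, then measure the used bounds.
        textLayoutManager.ensureLayout(for: textLayoutManager.documentRange)
        size.height = textLayoutManager.usageBoundsForTextContainer.height
            + textContainerInset.top + textContainerInset.bottom
    }
    return size
}
```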
3 replies · 0 boosts · 153 views · Jun ’25
How to initialize OpenIntent parameter when returning OpensIntent in perform
I have an app that lets you create cars. I have a CarEntity, an OpenCarIntent, and a CreateCarIntent. I want to support the Open When Run option when creating a car. I understand that to do this, you update the return type of your perform function to include & OpensIntent, then change your return value to include opensIntent: OpenCarIntent(target: carEntity). When I do this, I get a compile-time error:

Cannot convert value of type 'CarEntity' to expected argument type 'IntentParameter<CarEntity>'

What am I doing wrong here?

```swift
struct CreateCarIntent: ForegroundContinuableIntent {
    static let title: LocalizedStringResource = "Create Car"

    @Parameter(title: "Name")
    var name: String

    @MainActor
    func perform() async throws -> some IntentResult & ReturnsValue<CarEntity> & OpensIntent {
        let managedObjectContext = PersistenceController.shared.container.viewContext
        let car = Car(context: managedObjectContext)
        car.name = name
        try await managedObjectContext.perform {
            try managedObjectContext.save()
        }
        let carEntity = CarEntity(car: car)
        return .result(
            value: carEntity,
            opensIntent: OpenCarIntent(target: carEntity) // FIXME: Won't compile
        )
    }
}

struct OpenCarIntent: OpenIntent {
    static let title: LocalizedStringResource = "Open Car"

    @Parameter(title: "Car")
    var target: CarEntity

    @MainActor
    func perform() async throws -> some IntentResult {
        await UIApplication.shared.open(URL(string: "carapp://cars/view?id=\(target.id)")!)
        return .result()
    }
}
```
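One thing to try, sketched on the assumption that the error comes from the synthesized memberwise initializer (which expects the IntentParameter wrapper, not the wrapped value): create the intent first, then assign the parameter directly.

```swift
// Build the open intent, assigning the @Parameter's wrapped value directly
// instead of going through the synthesized initializer.
let openIntent = OpenCarIntent()
openIntent.target = carEntity
return .result(value: carEntity, opensIntent: openIntent)
```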
2 replies · 0 boosts · 173 views · Jun ’25
How to donate IndexedEntity, if required in iOS 26
In the Get to Know App Intents WWDC session, it was said:

"New this year, you can now add Spotlight indexing keys directly on properties. Annotating properties allows Spotlight to show more relevant information to customers. When donating indexed entities, the framework will handle creating the searchable item and attribute set for you. After donating entities, they can be found in Spotlight."

How do you donate indexed app entities? Making app entities available in Spotlight seems to state it's not necessary to donate entities:

"The system can automatically extract the keys for Spotlight indexing at compile time and store them in the App Intents metadata that Xcode generates as part of your app's bundle. As a result, Spotlight indexing is faster and can find your app entities without launching your app, and without you having to explicitly donate the entities to Spotlight. You also don't need to manually update or remove entities from the Spotlight index when your app's data changes."

Say I have a CarEntity. The user can create/update/delete cars at any time. What is the modern way to get cars to appear in Spotlight in iOS 26?
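For contrast, a sketch of the explicit donation path that predates iOS 26 (IndexedEntity, introduced in iOS 18; the conformance shown is an assumption about how CarEntity is declared). The property-level indexing keys described in the session would be annotations on CarEntity's @Property declarations instead:

```swift
import AppIntents
import CoreSpotlight

// Conform the entity so Spotlight can index it.
extension CarEntity: IndexedEntity {}

// Donate (and re-donate on create/update) the entities.
func donate(_ cars: [CarEntity]) async {
    do {
        try await CSSearchableIndex.default().indexAppEntities(cars)
    } catch {
        print("Spotlight donation failed: \(error)")
    }
}
```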
2 replies · 0 boosts · 167 views · Jun ’25
How to show confirmationDialog from a Button in a Menu
I have a More button in my nav bar that contains a Delete action, and when you tap that I want to show a confirmation dialog before performing the deletion. In other words, I have a toolbar containing a toolbar item containing a menu containing a button that, when tapped, needs to show a confirmation dialog. In iOS 26, you're supposed to add the confirmationDialog on the view that presents the action sheet so that it can point to the source view or morph out of it if it's Liquid Glass. But when I do that, the confirmation dialog does not appear. Is that a bug, or am I doing something wrong?

```swift
struct ContentView: View {
    @State private var showingDeleteConfirmation = false

    var body: some View {
        NavigationStack {
            Text("👋, 🌎!")
                .toolbar {
                    ToolbarItem {
                        Menu {
                            Button(role: .destructive) {
                                showingDeleteConfirmation.toggle()
                            } label: {
                                Label("Delete", systemImage: "trash")
                            }
                            .confirmationDialog("Are you sure?", isPresented: $showingDeleteConfirmation) {
                                Button(role: .destructive) {
                                    print("Delete it")
                                } label: {
                                    Text("Delete")
                                }
                                Button(role: .cancel, action: {})
                            }
                        } label: {
                            Label("More", systemImage: "ellipsis")
                        }
                    }
                }
        }
    }
}
```
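A workaround sketch that has historically worked: hang the dialog off a view that stays in the hierarchy, since the menu's content is torn down when the menu closes (the trade-off is likely losing the morph-from-source behavior the new placement promises):

```swift
struct ContentView: View {
    @State private var showingDeleteConfirmation = false

    var body: some View {
        NavigationStack {
            Text("👋, 🌎!")
                // Attached outside the Menu, so it survives the menu's dismissal.
                .confirmationDialog("Are you sure?", isPresented: $showingDeleteConfirmation) {
                    Button("Delete", role: .destructive) { print("Delete it") }
                    Button("Cancel", role: .cancel) {}
                }
                .toolbar {
                    ToolbarItem {
                        Menu {
                            Button(role: .destructive) {
                                showingDeleteConfirmation.toggle()
                            } label: {
                                Label("Delete", systemImage: "trash")
                            }
                        } label: {
                            Label("More", systemImage: "ellipsis")
                        }
                    }
                }
        }
    }
}
```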
1 reply · 0 boosts · 167 views · Jul ’25
How to hide scroll edge effect until scroll down
I present a view in a sheet that consists of a navigation stack and a scroll view, with a photo pushed to the top by setting .ignoresSafeArea(edges: .top). The problem is that the top of the photo is blurry due to the scroll edge effect. I would like to hide the scroll edge effect so the photo is fully visible when scrolled to the top, but let the effect become visible upon scrolling down. Is that possible?

```swift
struct ContentView: View {
    @State private var showingSheet = false

    var body: some View {
        VStack {
            Button("Present Sheet") {
                showingSheet = true
            }
        }
        .sheet(isPresented: $showingSheet) {
            SheetView()
        }
    }
}

struct SheetView: View {
    @Environment(\.dismiss) private var dismiss

    var body: some View {
        NavigationStack {
            ScrollView {
                VStack {
                    Image("Photo")
                        .resizable()
                        .scaledToFill()
                }
            }
            .ignoresSafeArea(edges: .top)
            .toolbar {
                ToolbarItem(placement: .cancellationAction) {
                    Button(role: .close) {
                        dismiss()
                    }
                }
                ToolbarItem {
                    EditButton()
                }
            }
        }
    }
}
```
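Not an answer to whether the effect itself can be toggled, but a building-block sketch: tracking whether the scroll view is at its top with onScrollGeometryChange (iOS 18), which a conditional edge treatment could then key off:

```swift
struct SheetView: View {
    @State private var isScrolledToTop = true

    var body: some View {
        ScrollView {
            Image("Photo")
                .resizable()
                .scaledToFill()
        }
        .onScrollGeometryChange(for: Bool.self) { geometry in
            // At (or above) the top edge, including any top content inset.
            geometry.contentOffset.y <= -geometry.contentInsets.top
        } action: { _, atTop in
            isScrolledToTop = atTop
        }
        .ignoresSafeArea(edges: .top)
    }
}
```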
1 reply · 0 boosts · 120 views · Jul ’25
Different toolbar item placement for iPhone vs iPad
On iPhone, I would like to have a More button at the top right of the navigation bar, a search field in the bottom toolbar, and a plus button to the right of the search field. I've achieved this via the code below. But on iPad, they should be in the navigation bar at the trailing edge, from left to right: plus, More, search field. Just like the Shortcuts app, if there's not enough horizontal space, the search field should collapse into a button, and with even less space the search bar should become full-width under the navigation bar. Right now on iPad, the search bar is full-width under the navigation bar, More is at the top right, and plus is at the bottom middle, no matter how big the window is. How can I achieve this? Is there a way to specify the items so the system does the right thing more automatically, or would I need to check specifically for iPhone vs iPad (UIDevice) and change the code?

```swift
struct ContentView: View {
    @State private var searchText = ""

    var body: some View {
        NavigationStack {
            VStack {
                Text("Hello, world!")
            }
            .navigationTitle("Test App")
            .searchable(text: $searchText)
            .toolbar {
                ToolbarItem {
                    Menu {
                        //...
                    } label: {
                        Label("More", systemImage: "ellipsis")
                    }
                }
                DefaultToolbarItem(kind: .search, placement: .bottomBar)
                ToolbarSpacer(.fixed, placement: .bottomBar)
                ToolbarItem(placement: .bottomBar) {
                    Button {
                        print("Add tapped")
                    } label: {
                        Label("Add", systemImage: "plus")
                    }
                }
            }
        }
    }
}
```
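One direction to try, sketched under the assumption that branching on the horizontal size class (rather than UIDevice) is acceptable; the regular-width arrangement here only approximates the Shortcuts-style layout:

```swift
struct ContentView: View {
    @Environment(\.horizontalSizeClass) private var sizeClass
    @State private var searchText = ""

    var body: some View {
        NavigationStack {
            Text("Hello, world!")
                .navigationTitle("Test App")
                .searchable(text: $searchText)
                .toolbar {
                    if sizeClass == .compact {
                        // iPhone-style: search and plus live in the bottom bar.
                        DefaultToolbarItem(kind: .search, placement: .bottomBar)
                        ToolbarItem(placement: .bottomBar) {
                            Button("Add", systemImage: "plus") { print("Add tapped") }
                        }
                    } else {
                        // Regular width: keep the plus in the trailing nav bar
                        // and let the system place the search field.
                        ToolbarItem(placement: .primaryAction) {
                            Button("Add", systemImage: "plus") { print("Add tapped") }
                        }
                    }
                }
        }
    }
}
```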
3 replies · 0 boosts · 132 views · Aug ’25
MKReverseGeocodingRequest and CNPostalAddress from MKMapItem
My app currently uses CLGeocoder to get a CLPlacemark, then uses placemark.postalAddress with CNPostalAddressFormatter to get an attributed string for the full address; I then enumerate its attributes to pull out specific elements like just the street or state or zip, etc. This is deprecated in iOS 26, with MKReverseGeocodingRequest being the intended replacement. That API returns an MKMapItem, which doesn't provide a CNPostalAddress: you can get the full address as a String, but not structured address data as far as I can see. Am I missing some way to get the postal address? Or is it a non-goal to provide that anymore? Thanks!
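For reference, a sketch of the deprecated path being described (assuming a CLLocation in hand), which is what a replacement would need to match:

```swift
import Contacts
import CoreLocation

// Deprecated in iOS 26: reverse geocode, then read structured address fields.
func structuredAddress(for location: CLLocation) async throws -> CNPostalAddress? {
    let placemarks = try await CLGeocoder().reverseGeocodeLocation(location)
    guard let postalAddress = placemarks.first?.postalAddress else { return nil }
    // CNPostalAddressFormatter's attributed output tags each component
    // (street, city, state, postal code) for enumeration.
    _ = CNPostalAddressFormatter().attributedString(
        from: postalAddress,
        withDefaultAttributes: [:]
    )
    return postalAddress
}
```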
5 replies · 0 boosts · 176 views · Aug ’25
BGContinuedProcessingTask launchHandler invocation
I'm trying to understand how the API works to perform a function that can continue running if the user closes the app. For a very simple example, consider a function that increments a number on screen every second, counting from 1 to 100, reaching completion at 100. The user can stay in the app for 100 seconds watching it work to completion, or the user can close the app after, say, 2 seconds and do other things while watching it work to completion in the Live Activity. To do this, when the user taps a Start Counting button, you'd:

1. Call BGTaskScheduler.shared.register(forTaskWithIdentifier:using:launchHandler:).

Question 1: Do I understand correctly that all of the logic to perform this counting operation would exist entirely in the launchHandler block (noting you could call another function you define, passing it the task, to be able to update its progress)? I am confused because the documentation states "The system runs the block of code for the launch handler when it launches the app in the background," but the app is already open in the foreground. This made me think the block would not be invoked until the user closes the app, to inform you it's okay to continue processing in the background, but then how would you know where to pick up? I want to confirm my thinking was wrong: all the logic should be in this block, from start to completion of the operation, and that's fine even if the app stays in the foreground the whole time.

2. Create a BGContinuedProcessingTaskRequest and set request.strategy = .fail, because in this example the task needs to start immediately per the user's explicit tap on the Start Counting button.

3. Call BGTaskScheduler.shared.submit(request).

Question 2: If the submit function throws an error, should you handle it by just performing the counting operation logic (calling your function without passing a task)? I understand this can happen if for some reason the system couldn't immediately run it, like if there are already too many pending task requests. It seems you should not show an error message to the user; you should still perform the requested work, just without background continued processing support (and perhaps show a light warning like "this operation can't be continued in the background, so keep the app open"). Or should you still queue it up even though the user wants to start counting now? That leads to my next question.

Question 3: In what scenario would you not want the operation to start immediately (the queue behavior, which is the default), given the app is already in the foreground and the user requested some operation? I'm struggling to think of an example: say a button titled Compress Photos Whenever You Can, where it may start immediately or maybe it won't? While waiting for the launchHandler to be invoked, should the UI just show 0% progress or "Pending" until the system can get to this task in the queue? I'm struggling to understand the use cases here: why make the user wait to start processing when they might not even intend to close the app during the operation? Thanks for any insights!

As an aside, a sample project with a couple of use cases would have been incredibly helpful for understanding how this API is expected to be used.
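A sketch of the register-then-submit flow described above, written assuming the answer to Question 1 is that the entire operation lives in the launch handler (identifier and strings are hypothetical):

```swift
import BackgroundTasks

let identifier = "com.example.counting" // must be declared for the app

// Step 1: register before submitting. The handler owns the whole operation,
// whether or not the app ever leaves the foreground.
BGTaskScheduler.shared.register(forTaskWithIdentifier: identifier, using: nil) { task in
    guard let task = task as? BGContinuedProcessingTask else { return }
    task.progress.totalUnitCount = 100

    let work = Task {
        for count in 1...100 where !Task.isCancelled {
            try? await Task.sleep(for: .seconds(1))
            task.progress.completedUnitCount = Int64(count)
        }
        task.setTaskCompleted(success: !Task.isCancelled)
    }
    task.expirationHandler = { work.cancel() }
}

// Steps 2 + 3: submit with .fail so it starts now or throws, per the explicit tap.
let request = BGContinuedProcessingTaskRequest(
    identifier: identifier,
    title: "Counting",
    subtitle: "Counting to 100"
)
request.strategy = .fail

do {
    try BGTaskScheduler.shared.submit(request)
} catch {
    // Question 2 territory: perhaps run in-process only and warn the user.
    print("Continued processing unavailable: \(error)")
}
```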
8 replies · 0 boosts · 292 views · Oct ’25
How to add view below navigation bar to extend scroll edge effect
Hello! What UIKit API enables you to add a view below the navigation bar and extend the scroll edge effect below it in iOS 26? safeAreaBar is how you do it in SwiftUI, but I need to achieve this design in my UIKit app (which has a collection view in a view controller in a navigation controller).

```swift
struct ContentView: View {
    let segments = ["First", "Second", "Third"]
    @State private var selectedSegment = "First"

    var body: some View {
        NavigationStack {
            List(0..<50, id: \.self) { i in
                Text("Row \(i + 1)")
            }
            .safeAreaBar(edge: .top) {
                Picker("Segment", selection: $selectedSegment) {
                    ForEach(segments, id: \.self) {
                        Text($0)
                    }
                }
                .pickerStyle(.segmented)
                .padding(.horizontal)
                .padding(.bottom, 8)
            }
            .navigationTitle("Title")
            .navigationBarTitleDisplayMode(.inline)
        }
    }
}
```
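If memory serves from the iOS 26 UIKit sessions, UIScrollEdgeElementContainerInteraction is the piece intended for views layered over a scroll edge; a sketch under that assumption (property names should be verified against the current headers, and the container view here is hypothetical):

```swift
// Hypothetical usage: attach the interaction to the accessory view so the
// top scroll edge effect accounts for (and extends beneath) it.
let interaction = UIScrollEdgeElementContainerInteraction()
interaction.scrollView = collectionView
interaction.edge = .top
segmentedControlContainer.addInteraction(interaction)
```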
3 replies · 0 boosts · 198 views · 1h
App Store Connect release notes RSS feed
I want to keep an eye on the App Store Connect release notes to find out when builds created with the Xcode 26.2 RC will be accepted. I tried adding https://developer.apple.com/help/app-store-connect/release-notes/ to my RSS reader, but the items listed are not the release notes; they're the latest news items from Apple Developer instead. Can we get an RSS feed, please? It seems like it would be useful for monitoring these release notes over time.
4 replies · 0 boosts · 186 views · 6d
Most performant method to fetch just images from a collection
It was discussed in What's New in Photos APIs at WWDC that we should avoid using NSPredicate for custom fetch options if at all possible, for performance purposes. In my app, I'm using NSPredicate to get only images from the user's library. I'm not seeing an API that would allow me to get assets from a specific collection filtered to just images without the use of NSPredicate, though. Is there a more efficient way to perform this query that I'm not seeing?

```swift
let photoLibraryFetchResult = PHAssetCollection.fetchAssetCollections(with: .smartAlbum, subtype: .smartAlbumUserLibrary, options: nil)
let assetCollection = photoLibraryFetchResult.firstObject!
let fetchOptions = PHFetchOptions()
fetchOptions.predicate = NSPredicate(format: "mediaType = %d", PHAssetMediaType.image.rawValue)
let fetchResults = PHAsset.fetchAssets(in: assetCollection, options: fetchOptions)
```
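For the whole-library case there is a predicate-free variant; whether it helps depends on whether the user-library smart album is really the only collection needed, since fetchAssets(in:options:) has no media-type overload:

```swift
import Photos

// Fetches only images across the library without an NSPredicate.
let imagesOnly = PHAsset.fetchAssets(with: .image, options: nil)
```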
3 replies · 0 boosts · 2.6k views · Jan ’22
Is it possible to get a song's bit rate from MPMediaItem?
I'd like to get a song's bit rate, for example 256 kbps, from an MPMediaItem retrieved via MPMediaPickerController. Is this possible? I tried to get it via:

```swift
AVAsset(url: mediaItem.assetURL).tracks.first?.estimatedDataRate
```

but this is 0 for most songs I've tried, and it's 127999 for a song that's really 64 kbps. I can get the sample rate of 44100 via:

```swift
let trackDescription = AVAsset(url: url).tracks.first?.formatDescriptions.first
let basicDescription = CMAudioFormatDescriptionGetStreamBasicDescription(trackDescription as! CMAudioFormatDescription)?.pointee
let sampleRate = basicDescription.mSampleRate
```

Supposedly one can calculate the bit rate given the sample rate, bit depth, and channel count, but I'm seeing mBitsPerChannel is always 0 in my testing.
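One alternative sketch: estimate the average bit rate from the audio track's total sample data length and its duration (this assumes the MPMediaItem has a non-nil assetURL, i.e. a local, readable file):

```swift
import AVFoundation

// Average bits per second ≈ (bytes of audio data × 8) / duration in seconds.
func estimatedBitRate(for url: URL) -> Double? {
    guard let track = AVAsset(url: url).tracks(withMediaType: .audio).first else {
        return nil
    }
    let seconds = CMTimeGetSeconds(track.timeRange.duration)
    guard seconds > 0 else { return nil }
    return Double(track.totalSampleDataLength) * 8 / seconds
}
```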
0 replies · 0 boosts · 824 views · Jun ’21