I have non-consumable and consumable in-app purchases in my app. The tutorial I was following stated that Transaction.currentEntitlements includes unfinished consumables, which is incorrect according to the documentation. Is the correct way to handle unfinished consumables (and non-consumables) to implement Transaction.updates and call finish() once the transaction is verified? The documentation says that listener receives unfinished transactions once upon app launch, so do I understand correctly that you don't need to implement Transaction.unfinished unless you want to look for unfinished transactions manually later on? Otherwise, what is the correct and most recommended way to handle unfinished consumables? And is there a way to test that scenario in Xcode?
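For context, this is roughly the listener I have in mind (a minimal sketch, assuming it's started once at app launch, e.g. from the App initializer):
import StoreKit
// Minimal sketch: observe transaction updates for the life of the app.
// Per the documentation, this sequence also delivers unfinished
// transactions once at launch.
func listenForTransactions() -> Task<Void, Never> {
    Task(priority: .background) {
        for await result in Transaction.updates {
            switch result {
            case .verified(let transaction):
                // Grant the purchased content here, then finish so the
                // transaction stops being delivered as unfinished.
                await transaction.finish()
            case .unverified:
                // Failed verification: don't grant content.
                break
            }
        }
    }
}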
I have an AppIntent that edits an object in my app. The intent accepts an app entity as a parameter, so if you run the intent it asks which object you want to edit; you select one from the list and it shows a dialog that it was edited successfully. I use this same intent in my Home Screen widget, initializing it with an ObjectEntity. The code needs to run in the app's process, not the widget extension's process, so the file is added to both targets and the intent conforms to ForegroundContinuableIntent, which is supposed to ensure it always runs in the app's process. This works great when run from the Shortcuts app and when invoked via a button in the Home Screen widget, exactly as expected. Here is that app intent:
@available(iOS 17.0, *)
struct EditObjectIntent: AppIntent {
    static let title: LocalizedStringResource = "Edit Object"

    @Parameter(title: "Object", requestValueDialog: "Which object do you want to edit?", inputConnectionBehavior: .connectToPreviousIntentResult)
    var objectEntity: ObjectEntity

    init() {
        print("INIT")
    }

    init(objectEntity: ObjectEntity) {
        self.objectEntity = objectEntity
    }

    @MainActor
    func perform() async throws -> some IntentResult & ReturnsValue<ObjectEntity> & ProvidesDialog {
        // Edit the object from objectEntity.id...
        return .result(value: objectEntity, dialog: "Done")
    }
}
@available(iOS 17.0, *)
@available(iOSApplicationExtension, unavailable)
extension EditObjectIntent: ForegroundContinuableIntent { }
I now want to create a ControlButton that uses this intent:
struct EditObjectControlWidget: ControlWidget {
    var body: some ControlWidgetConfiguration {
        StaticControlConfiguration(kind: "EditObjectControlWidget") {
            ControlWidgetButton(action: EditObjectIntent()) {
                Label("Edit Object", systemImage: "pencil")
            }
        }
    }
}
When I add the button to Control Center and tap it (on iOS 18), init is called 3x in the app process and 2x in the widget process, yet the perform function is not invoked in either process. No error appears in console logs for the app's process, but this appears for the widget process:
LaunchServices: store <private> or url <private> was nil: Error Domain=NSOSStatusErrorDomain Code=-54 "process may not map database" UserInfo={NSDebugDescription=process may not map database, _LSLine=72, _LSFunction=_LSServer_GetServerStoreForConnectionWithCompletionHandler}
Attempt to map database failed: permission was denied. This attempt will not be retried.
Failed to initialize client context with error Error Domain=NSOSStatusErrorDomain Code=-54 "process may not map database" UserInfo={NSDebugDescription=process may not map database, _LSLine=72, _LSFunction=_LSServer_GetServerStoreForConnectionWithCompletionHandler}
What am I doing wrong here? Thanks!
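One thing I've considered (not sure it's correct) is giving the control its own intent that uses openAppWhenRun instead of ForegroundContinuableIntent, since Control Center may not present the continuation prompt that ForegroundContinuableIntent relies on. A sketch, where EditObjectFromControlIntent is a hypothetical variant of the intent above:
@available(iOS 18.0, *)
struct EditObjectFromControlIntent: AppIntent {
    static let title: LocalizedStringResource = "Edit Object"
    // Launch the app so perform() runs in the app's process.
    static let openAppWhenRun: Bool = true

    @MainActor
    func perform() async throws -> some IntentResult {
        // Edit the object in the app's process...
        return .result()
    }
}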
I have a More button in my nav bar that contains a Delete action, and when you tap that I want to show a confirmation dialog before performing the deletion. In other words, I have a toolbar containing a toolbar item containing a menu containing a button that, when tapped, needs to show a confirmation dialog.
In iOS 26, you're supposed to add the confirmationDialog on the view that presents the action sheet so that it can point to the source view or morph out of it if it's liquid glass. But when I do that, the confirmation dialog does not appear. Is that a bug or am I doing something wrong?
struct ContentView: View {
    @State private var showingDeleteConfirmation = false

    var body: some View {
        NavigationStack {
            Text("👋, 🌎!")
                .toolbar {
                    ToolbarItem {
                        Menu {
                            Button(role: .destructive) {
                                showingDeleteConfirmation.toggle()
                            } label: {
                                Label("Delete", systemImage: "trash")
                            }
                            .confirmationDialog("Are you sure?", isPresented: $showingDeleteConfirmation) {
                                Button(role: .destructive) {
                                    print("Delete it")
                                } label: {
                                    Text("Delete")
                                }
                                Button(role: .cancel, action: {})
                            }
                        } label: {
                            Label("More", systemImage: "ellipsis")
                        }
                    }
                }
        }
    }
}
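For comparison, the pre-iOS 26 pattern of attaching the dialog to a view outside the Menu (sketch below) presumably still presents it, but then it can't point to or morph out of the source button, which is the point of the new guidance:
// Sketch of the alternative placement: dialog attached outside the Menu.
Text("👋, 🌎!")
    .confirmationDialog("Are you sure?", isPresented: $showingDeleteConfirmation) {
        Button("Delete", role: .destructive) {
            print("Delete it")
        }
    }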
I present a view in a sheet that consists of a navigation stack and a scroll view, with a photo pushed to the top by setting .ignoresSafeArea(edges: .top). The problem is that the top of the photo is blurry due to the scroll edge effect. I would like to hide the scroll edge effect so the photo is fully visible when scrolled to the top, but let the effect become visible upon scrolling down. Is that possible?
struct ContentView: View {
    @State private var showingSheet = false

    var body: some View {
        VStack {
            Button("Present Sheet") {
                showingSheet = true
            }
        }
        .sheet(isPresented: $showingSheet) {
            SheetView()
        }
    }
}

struct SheetView: View {
    @Environment(\.dismiss) private var dismiss

    var body: some View {
        NavigationStack {
            ScrollView {
                VStack {
                    Image("Photo")
                        .resizable()
                        .scaledToFill()
                }
            }
            .ignoresSafeArea(edges: .top)
            .toolbar {
                ToolbarItem(placement: .cancellationAction) {
                    Button(role: .close) {
                        dismiss()
                    }
                }
                ToolbarItem {
                    EditButton()
                }
            }
        }
    }
}
In what scenario will an app receive the limitExceeded PHPhotosError code? This case was added in iOS 26.1 and is not currently documented. What PhotoKit APIs can encounter this error and how should it be handled?
I'd like to get a song's bit rate, for example 256 kbps, from a MPMediaItem retrieved via MPMediaPickerController. Is this possible?
I tried to get it via:
AVAsset(url: mediaItem.assetURL).tracks.first?.estimatedDataRate
but this is 0 for most songs I've tried, and it's 127999 for a song that's really 64 kbps.
I can get the sample rate of 44100 via:
let trackDescription = AVAsset(url: url).tracks.first?.formatDescriptions.first
let basicDescription = CMAudioFormatDescriptionGetStreamBasicDescription(trackDescription as! CMAudioFormatDescription)?.pointee
let sampleRate = basicDescription?.mSampleRate
Supposedly one can calculate the bit rate given the sample rate, bit depth, and channel count, but I'm seeing mBitsPerChannel is always 0 in my testing.
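As a possible fallback, I'm considering computing an average bit rate from the track's total sample data length and duration instead (a sketch, assuming a local, non-DRM item whose assetURL is non-nil):
import AVFoundation
import MediaPlayer

// Sketch: average bit rate in bits per second, derived from the audio
// track's total sample data size and its duration.
func averageBitRate(for mediaItem: MPMediaItem) async throws -> Double? {
    guard let url = mediaItem.assetURL else { return nil }
    let asset = AVURLAsset(url: url)
    guard let track = try await asset.loadTracks(withMediaType: .audio).first else { return nil }
    let (dataLength, timeRange) = try await track.load(.totalSampleDataLength, .timeRange)
    let seconds = timeRange.duration.seconds
    guard seconds > 0 else { return nil }
    return Double(dataLength) * 8.0 / seconds
}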
I have a UIKit Mac Catalyst app, optimized for the Mac idiom, with an NSToolbar manually added to the windowScene. Is it possible to implement a full-height inspector sidebar with this setup? In my testing it always appears underneath the toolbar, even if I remove my NSToolbar and let the system create a toolbar from a NavigationStack. It works on iOS: the inspector stretches all the way up the window, splitting the app into two columns.
var body: some View {
    NavigationStack {
        AnimalTable(state: $state)
            .inspector(isPresented: $state.inspectorPresented) {
                AnimalInspectorForm(animal: $state.binding())
            }
            .toolbar {
                Button {
                    state.inspectorPresented.toggle()
                } label: {
                    Label("Toggle Inspector", systemImage: "info.circle")
                }
            }
    }
}
During the WWDC Q&As I asked how I could add an inspector to my UIKit app, which uses a UISplitViewController in the double-column style with a sidebar and a detail view controller. I initially tried a full-height inspector (by putting my split view controller into a SwiftUI view, applying the inspector on that, and embedding that in a UIHostingController as the rootViewController), but this caused a bunch of UI bugs (seemingly related to optimizations made for a size class that doesn't match the actual appearance), and it doesn't extend into the NSToolbar on Mac Catalyst anyway. I now want to try implementing the under-the-toolbar solution. An engineer said:
For an under toolbar appearance, you should be able to use .inspector on the detail view controller (after wrapping it in a SwiftUI view), but you may have to do manual toolbar management here (hiding and showing) to make sure you don't end up with stacked toolbars/UINavigationBars
I have indeed run into the problem with two navigation bars in my inspector. I want to keep the navigation bar visible in the detail screen, but I do not want any navigation bars visible in my inspector, since I'm going to provide my own button to toggle the inspector (a button in the detail view on iOS and an NSToolbarItem on macOS). Is this layout possible? I tried applying .toolbar(.hidden) on the inspector's view, but this doesn't do anything; there are still two stacked navigation bars (tested on iPadOS 17 beta 2). I think even if that worked it would only hide the inner navigation bar, and I'd still have an undesirable navigation bar. :/ Wishing there were a UIKit API so I could avoid the interop complexity, ha.
In the sample project attached to FB12447791, the root view controller is a UISplitViewController. The primary view controller is a UINavigationController containing a sidebar. The secondary view controller is a UINavigationController containing a UIHostingController whose root view is a SecondaryColumnView. SecondaryColumnView is a Form that has a button to Toggle Inspector, a navigation title, and an inspector. The inspector is a Form that has .toolbar(.hidden).
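Condensed, SecondaryColumnView looks roughly like this (simplified from the sample; the inspector's Form content is a stand-in):
struct SecondaryColumnView: View {
    @State private var inspectorPresented = false

    var body: some View {
        Form {
            Button("Toggle Inspector") {
                inspectorPresented.toggle()
            }
        }
        .navigationTitle("Detail")
        .inspector(isPresented: $inspectorPresented) {
            Form {
                Text("Inspector content")
            }
            // The attempt to hide the inspector's navigation bar:
            .toolbar(.hidden, for: .navigationBar)
        }
    }
}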
I've provided several screen recordings in the feedback report as well. Thanks for your help and insight!
The updated Photos access dialog in iOS 17 states:
Photos may contain metadata, such as location, depth information, or captions.
How do I access the caption a user added to a photo in my app? This wasn't possible in iOS 16; is there new API in iOS 17? I previously requested this ability via metadata in FB10205012 and via PHAsset in FB8244665. In case it remains inaccessible, I've submitted FB12437093 to request that captions be removed from this wording.
In iOS 17 when you search Spotlight for Notes you can see it has an App Shortcut titled New Note that simply opens the Notes app and starts composing a new note. When you open the Shortcuts app, create a new shortcut, search for and tap Notes, notice the New Note action is ONLY included at the top - it's not in the list of actions underneath. There is another intent in the list titled Create Note which will create a new note using the content you specify without opening the Notes app.
I want to achieve this same thing in my app - an App Shortcut to open the app and start creating a new item, and a shortcut action to create a new item without opening the app. How can this be done?
So far I have created two AppIntents, NewItem and CreateItem. My AppShortcutsProvider only includes NewItem. This works great when searching for my app in Spotlight. But when I open Shortcuts and go to add an action from my app to a shortcut, it includes Add Item at the top as an App Shortcut but also in the actions list underneath. Create Item is included in the list as well which is confusing. I don't want Add Item to be an available action because it's fairly useless to open the app and start creating an item, instead they should use Create Item to create an item in the background.
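For reference, my provider currently looks roughly like this (NewItemIntent is the intent described above; the phrase and image are illustrative):
struct MyAppShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        // Only the open-the-app intent is promoted as an App Shortcut;
        // CreateItemIntent is intentionally not listed here.
        AppShortcut(
            intent: NewItemIntent(),
            phrases: ["New item in \(.applicationName)"],
            shortTitle: "New Item",
            systemImageName: "plus"
        )
    }
}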
Do I need to instead create a single shortcut that behaves differently in Spotlight vs Shortcuts, is that possible?
Is it possible to get the original date created for an IntentFile? The following code always gets the date for right now, surely because it's copied into a temporary directory so that's when it was created at that location.
if let fileURL = file.fileURL, fileURL.startAccessingSecurityScopedResource() {
    if let attributes = try? FileManager.default.attributesOfItem(atPath: fileURL.path), let date = attributes[.creationDate] as? Date {
        print(date)
    }
    fileURL.stopAccessingSecurityScopedResource()
}
Is it possible to sort the user library assets by date captured? The Photos app in iOS 18 lets you choose between Date Captured and Recently Added and I want to offer that same choice in my app. This seems to always sort them by creation date (which I believe is the same as recently added):
let assetCollection = PHAssetCollection.fetchAssetCollections(with: .smartAlbum, subtype: .smartAlbumUserLibrary, options: nil).firstObject!
let fetchResult = PHAsset.fetchAssets(in: assetCollection, options: PHFetchOptions.imageMediaType())
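Even with an explicit sort descriptor the result is the same; creationDate appears to be the only relevant documented sort key (sketch below; imageMediaType() above is my own options helper):
let options = PHFetchOptions()
options.predicate = NSPredicate(format: "mediaType == %d", PHAssetMediaType.image.rawValue)
options.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
let byCreationDate = PHAsset.fetchAssets(in: assetCollection, options: options)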
I've had an app that edits photos in your library since the PhotoKit API was released in iOS 8. I know that if you preserved photo metadata, it was required to change the value of Orientation to 1 (up); otherwise PhotoKit would fail to perform the asset change request. When I remove this code, I'm seeing that Orientation gets changed to 1 automatically, both at the root and in the TIFF dictionary (tested with iOS 18). I wanted to confirm this is expected behavior: does the system do this for us now? If so, can I remove this code for iOS 15+, or did this start happening in a more recent iOS version? Thanks!
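For context, the normalization I'm referring to looks roughly like this (simplified sketch; inputImage is the full-size image's CIImage):
// Force Orientation to 1 (up) at the root and in the TIFF dictionary
// before writing the edited image.
var properties = inputImage.properties
properties[kCGImagePropertyOrientation as String] = 1
if var tiff = properties[kCGImagePropertyTIFFDictionary as String] as? [String: Any] {
    tiff[kCGImagePropertyTIFFOrientation as String] = 1
    properties[kCGImagePropertyTIFFDictionary as String] = tiff
}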
I've created a closed source iOS SDK from a local Swift package, which has dependencies on other Swift packages, and successfully created a binary XCFramework following the solution from my previous post. I would now like to create a Package.swift to vend this XCFramework and test it in an example app to verify it works as expected before I upload it to a public repo for distribution.
I understand that binaryTarget does not support dependencies so we need to use a wrapper. I created a directory containing the following:
Package.swift
MyFramework.xcframework/
MyFrameworkWrapper/
└─ dummy.swift
Package.swift contains:
// swift-tools-version: 5.10
// The swift-tools-version declares the minimum version of Swift required to build this package.
import PackageDescription

let package = Package(
    name: "MyFramework",
    platforms: [
        .iOS(.v14)
    ],
    products: [
        .library(
            name: "MyFramework",
            targets: ["MyFramework", "MyFrameworkWrapper"]
        )
    ],
    dependencies: [
        .package(url: "https://github.com/gordontucker/FittedSheets.git", from: "2.6.1")
    ],
    targets: [
        .target(
            name: "MyFrameworkWrapper",
            dependencies: [
                "FittedSheets"
            ],
            path: "MyFrameworkWrapper"
        ),
        .binaryTarget(
            name: "MyFramework",
            path: "MyFramework.xcframework"
        )
    ]
)
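In case it's relevant, dummy.swift is just a placeholder; a minimal version might simply import the dependency so the wrapper target actually links it (an assumption on my part, not a confirmed requirement):
// MyFrameworkWrapper/dummy.swift (hypothetical minimal content)
// Importing the dependency is intended to make SwiftPM build and link
// FittedSheets for any app that links the MyFramework library product.
import FittedSheets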
I created a new iOS app, selected the project, Package Dependencies > + > Add Local, and added the directory containing this Package.swift. Xcode resolves the dependencies and lists them in the sidebar. I added code to import and use the framework. It builds successfully but the app crashes when run:
dyld[63959]: Library not loaded: @rpath/FittedSheets.framework/FittedSheets
Referenced from: <7DE247FC-DAFF-3946-AD21-E80F5AF841C9> /Users/Jordan/Library/Developer/Xcode/DerivedData/MyFramework-Example-gaeeymnqzenzrbbmhuebpodqctsz/Build/Products/Debug-iphonesimulator/MyFramework.framework/MyFramework
How do I get this working? Is my package set up properly to vend the framework and specify its dependencies, and is my XCFramework created correctly?
The Package.swift for the framework's source code contains:
// swift-tools-version: 5.10
// The swift-tools-version declares the minimum version of Swift required to build this package.
import PackageDescription

let package = Package(
    name: "MyFramework",
    platforms: [
        .iOS(.v14)
    ],
    products: [
        .library(
            name: "MyFramework",
            type: .dynamic,
            targets: ["MyFramework"]
        )
    ],
    dependencies: [
        .package(url: "https://github.com/gordontucker/FittedSheets.git", from: "2.6.1")
    ],
    targets: [
        .target(
            name: "MyFramework",
            dependencies: [
                "FittedSheets"
            ],
            path: "Sources"
        )
    ]
)
And I created the XCFramework following the steps in that previous thread:
1. Create an archive from the package via xcodebuild archive -workspace "$PACKAGE_PATH" -scheme "$FRAMEWORK_NAME" -destination 'generic/platform=iOS' -archivePath "$ARCHIVE_PATH/iOS" SKIP_INSTALL=NO BUILD_LIBRARY_FOR_DISTRIBUTION=YES ENABLE_USER_SCRIPT_SANDBOXING=NO ENABLE_MODULE_VERIFIER=NO OTHER_SWIFT_FLAGS=-no-verify-emitted-module-interface
2. Create the Modules directory in the framework via mkdir -p "$ARCHIVE_PATH/iOS.xcarchive/Products/usr/local/lib/$FRAMEWORK_NAME.framework/Modules"
3. Copy the Swift interface files into the framework from the build in DerivedData via cp -a "$BUILD_PRODUCTS_PATH/Build/Intermediates.noindex/ArchiveIntermediates/$FRAMEWORK_NAME/BuildProductsPath/Release-iphoneos/$FRAMEWORK_NAME.swiftmodule" "$ARCHIVE_PATH/iOS.xcarchive/Products/usr/local/lib/$FRAMEWORK_NAME.framework/Modules"
4. Repeat steps 1-3 for the iOS Simulator.
5. Create the XCFramework via xcodebuild -create-xcframework -framework "$ARCHIVE_PATH/iOS.xcarchive/Products/usr/local/lib/$FRAMEWORK_NAME.framework" -framework "$ARCHIVE_PATH/iOS_Simulator.xcarchive/Products/usr/local/lib/$FRAMEWORK_NAME.framework" -output "$ARCHIVE_PATH/$FRAMEWORK_NAME.xcframework"
I have an app that allows the user to change a photo's EXIF metadata. To do this, I request a content editing input, get the full-size image, modify its properties, create a content editing output, write the output image to the rendered content URL, then call performChanges on the PHPhotoLibrary, creating an asset change request for that asset and setting its content editing output. This works as expected for regular photos, but Live Photos get converted to regular photos.
To address this, I'm doing something similar, but changing the properties of the .photo frame of the Live Photo instead. I detect when the content editing input has a Live Photo, create a Live Photo editing context, and set a frame processor that, when the frame type is photo, returns the frame's image with its properties set to the updated properties. Then I create the content editing output and save the Live Photo to that output. It modifies the Live Photo successfully, but the metadata is not updated. If you get the full-size image again, the properties are the original properties, and if you look at the EXIF metadata in an app like Metapho, it remains unchanged. What am I doing wrong here? Thanks!
let imageURL = contentEditingInput.fullSizeImageURL!
let inputImage = CIImage(contentsOf: imageURL, options: [.applyOrientationProperty: true])!
var metadata: [AnyHashable: Any] = inputImage.properties

// Edit the metadata as desired...

let editingContext = PHLivePhotoEditingContext(livePhotoEditingInput: contentEditingInput)!
editingContext.frameProcessor = { frame, error -> CIImage? in
    // Edit only the still photo
    if frame.type == .photo {
        return frame.image.settingProperties(metadata)
    }
    return frame.image
}

let contentEditingOutput = try await withCheckedThrowingContinuation { continuation in
    let editingOutput = PHContentEditingOutput(contentEditingInput: contentEditingInput)
    editingOutput.adjustmentData = adjustmentData
    editingContext.saveLivePhoto(to: editingOutput) { success, error in
        if success {
            continuation.resume(returning: editingOutput)
        } else {
            continuation.resume(throwing: error!)
        }
    }
}

try await PHPhotoLibrary.shared().performChanges {
    let request = PHAssetChangeRequest(for: asset)
    request.contentEditingOutput = contentEditingOutput
}