It was discussed in What's New in Photos APIs at WWDC that we should avoid using NSPredicate for custom fetch options if at all possible, for performance reasons. In my app, I'm using NSPredicate to get only images from the user's library. I'm not seeing an API that would allow me to get assets from a specific collection filtered to just images without the use of NSPredicate, though. Is there a more efficient way to perform this query that I'm not seeing?

let photoLibraryFetchResult = PHAssetCollection.fetchAssetCollections(with: .smartAlbum, subtype: .smartAlbumUserLibrary, options: nil)
let assetCollection = photoLibraryFetchResult.firstObject!
let fetchOptions = PHFetchOptions()
fetchOptions.predicate = NSPredicate(format: "mediaType = %d", PHAssetMediaType.image.rawValue)
let fetchResults = PHAsset.fetchAssets(in: assetCollection, options: fetchOptions)
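For the whole library I know there's a type-based fetch that avoids NSPredicate, but it doesn't take a collection (shown for reference):

import Photos

// Library-wide fetch filtered to images, no NSPredicate - but no way to
// scope it to a specific collection, which is what I need.
let allImages = PHAsset.fetchAssets(with: .image, options: nil)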
I'd like to get a song's bit rate, for example 256 kbps, from a MPMediaItem retrieved via MPMediaPickerController. Is this possible?
I tried to get it via:
AVAsset(url: mediaItem.assetURL).tracks.first?.estimatedDataRate
but this is 0 for most songs I've tried, and it's 127999 for a song that's really 64 kbps.
I can get the sample rate of 44100 via:
let trackDescription = AVAsset(url: url).tracks.first?.formatDescriptions.first
let basicDescription = CMAudioFormatDescriptionGetStreamBasicDescription(trackDescription as! CMAudioFormatDescription)?.pointee
let sampleRate = basicDescription?.mSampleRate
Supposedly one can calculate the bit rate given the sample rate, bit depth, and channel count, but I'm seeing mBitsPerChannel is always 0 in my testing.
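For what it's worth, here's what that PCM-style calculation would look like, which comes up empty for the reason above (a sketch; `track` is the AVAssetTrack):

import AVFoundation
import CoreMedia

// PCM bit rate: sampleRate * bitsPerChannel * channelsPerFrame.
// Compressed formats (e.g. AAC) report mBitsPerChannel == 0, so this returns nil.
func pcmBitRate(for track: AVAssetTrack) -> Double? {
    guard let description = track.formatDescriptions.first,
          let asbd = CMAudioFormatDescriptionGetStreamBasicDescription(description as! CMAudioFormatDescription)?.pointee,
          asbd.mBitsPerChannel > 0 else { return nil }
    return asbd.mSampleRate * Double(asbd.mBitsPerChannel) * Double(asbd.mChannelsPerFrame)
}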
The Music macOS app shows various info about a song in the Get Info window. Most of this metadata is available in the iOS SDK via MPMediaItem. I want to access the information displayed in the File tab, but several pieces of data appear to be missing from the API. Is this possible?
□ Kind - Apple Music AAC audio file - ?
☑︎ Duration - 3:00 - playbackDuration
□ Size - 10 MB - ?
□ Bit rate - 256 kbps - ?
□ Sample rate - 44.100 kHz - ?
□ Date modified - 1/1/2001 - ?
☑︎ Date added - 1/1/2001 - dateAdded
□ Cloud status - Apple Music - ?
☑︎ Location - Cloud - isCloudItem
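For reference, the three fields from the list above that MPMediaItem does expose:

import MediaPlayer

// The fields that have known MPMediaItem equivalents:
func knownFileInfo(for item: MPMediaItem) {
    print(item.playbackDuration)  // Duration, in seconds
    print(item.dateAdded)         // Date added
    print(item.isCloudItem)       // Location: cloud vs. local
}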
Is it possible for an app using NSPersistentCloudKitContainer to enable sync in the background, if so, how?
Install app on your iPhone and iPad, create some data, it automatically syncs to both and life is good
Close the iPad app
Modify the data on iPhone
Desired behavior: The backgrounded iPad app should sync (even if it takes a while) and be informed that its local database has finished syncing or similarly that changes were made.
The use case: I want to reload my widget when data changes so it's always up to date, which means I need the app to sync in the background and then notify me when the sync completes so I can trigger the widget reload.
I'm concerned it will be a poor widget experience if it always shows stale data until the user manually opens the app to initiate a sync - that kind of defeats the purpose of widgets, ha.
According to this post, they found sync is never run in the background. Is this not the case, or has it changed in iOS 15?
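For the notification half, here's a sketch of one way to detect import completion and reload the widget, assuming eventChangedNotification fires as documented (`container` is my NSPersistentCloudKitContainer):

import CoreData
import WidgetKit

// Sketch: retain the observer token and reload the widget when a CloudKit
// import event finishes.
let syncToken = NotificationCenter.default.addObserver(
    forName: NSPersistentCloudKitContainer.eventChangedNotification,
    object: container, queue: .main
) { notification in
    guard let event = notification.userInfo?[NSPersistentCloudKitContainer.eventNotificationUserInfoKey]
            as? NSPersistentCloudKitContainer.Event else { return }
    // An event with a non-nil endDate has completed.
    if event.type == .import, event.endDate != nil, event.succeeded {
        WidgetCenter.shared.reloadAllTimelines()
    }
}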
Thanks!
I have a UIKit Mac Catalyst app, optimized for the Mac idiom, with an NSToolbar manually added to the windowScene. Is it possible to implement a full-height inspector sidebar with this setup? In my testing it always appears underneath the toolbar, even if I remove my NSToolbar and let the system create a toolbar from a NavigationStack. It works on iOS - the inspector stretches all the way up the window, splitting the app into two columns.
var body: some View {
    NavigationStack {
        AnimalTable(state: $state)
            .inspector(isPresented: $state.inspectorPresented) {
                AnimalInspectorForm(animal: $state.binding())
            }
            .toolbar {
                Button {
                    state.inspectorPresented.toggle()
                } label: {
                    Label("Toggle Inspector", systemImage: "info.circle")
                }
            }
    }
}
During the WWDC Q&As I asked how I could add an inspector to my UIKit app, which uses a UISplitViewController in the double-column style with a sidebar and a detail view controller. I initially tried a full-height inspector (by putting my split view controller into a SwiftUI view, applying the inspector on that, and embedding it all in a UIHostingController as the rootViewController), but this caused a bunch of UI bugs (seemingly related to optimizations made for a size class that doesn't match the actual appearance), and it doesn't extend into the NSToolbar on Mac Catalyst anyway. I now want to try implementing the under-the-toolbar solution. An engineer said:
For an under toolbar appearance, you should be able to use .inspector on the detail view controller (after wrapping it in a SwiftUI view), but you may have to do manual toolbar management here (hiding and showing) to make sure you don't end up with stacked toolbars/UINavigationBars
I have indeed run into the problem of two navigation bars in my inspector. I want to keep the navigation bar visible in the detail screen, but I don't want any navigation bars visible in my inspector, since I'm going to provide my own button to toggle it (a button in the detail view on iOS and an NSToolbarItem on macOS). Is this layout possible? I tried applying .toolbar(.hidden) on the inspector's view but this doesn't do anything; there are still two stacked navigation bars (tested on iPadOS 17 beta 2). I think even if that worked it would only hide the inner navigation bar, and I'd still have an undesirable outer one. :/ Wishing there were a UIKit API so I could avoid the interop complexity, ha.
In the sample project attached to FB12447791, the root view controller is a UISplitViewController. The primary view controller is a UINavigationController containing a sidebar. The secondary view controller is a UINavigationController containing a UIHostingController whose root view is a SecondaryColumnView. SecondaryColumnView is a Form that has a button to Toggle Inspector, a navigation title, and an inspector. The inspector is a Form that has .toolbar(.hidden).
I've provided several screen recordings in the feedback report as well. Thanks for your help and insight!
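For context, a simplified sketch of the under-the-toolbar attempt described above (DetailView and InspectorView stand in for my real views):

import SwiftUI

// Wrapping the detail content and applying .inspector there, per the
// engineer's suggestion.
struct DetailContainer: View {
    @State private var showInspector = false

    var body: some View {
        DetailView()
            .inspector(isPresented: $showInspector) {
                InspectorView()
                    .toolbar(.hidden) // no effect - still two stacked navigation bars
            }
    }
}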
The updated Photos access dialog in iOS 17 states:
Photos may contain metadata, such as location, depth information, or captions.
How do I access the caption a user added to a photo in my app? This wasn't possible in iOS 16 - is there new API in iOS 17? I previously requested this ability via metadata in FB10205012 and via PHAsset in FB8244665. If it remains inaccessible, I've submitted FB12437093 to request that captions be removed from this wording.
Is it possible to get the original date created for an IntentFile? The following code always gets the date for right now, surely because it's copied into a temporary directory so that's when it was created at that location.
if let fileURL = file.fileURL, fileURL.startAccessingSecurityScopedResource() {
    if let attributes = try? FileManager.default.attributesOfItem(atPath: fileURL.path),
       let date = attributes[.creationDate] as? Date {
        print(date)
    }
    fileURL.stopAccessingSecurityScopedResource()
}
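Reading it via URLResourceValues would presumably hit the same temporary-copy issue (a sketch, using the same fileURL):

if let values = try? fileURL.resourceValues(forKeys: [.creationDateKey]),
   let date = values.creationDate {
    print(date) // still the copy's creation date, presumably
}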
I've had an app that edits photos in your library since the PhotoKit API was released in iOS 8. I know that if you preserved photo metadata, you were required to change the value of Orientation to 1 (up), otherwise PhotoKit would fail to perform the asset change request. When I remove this code, I'm seeing that Orientation is changed to 1 automatically, both at the root level and in the TIFF dictionary (tested with iOS 18). I wanted to confirm this is expected behavior - does the system do this for us now? If so, can I remove this code for iOS 15+, or did this start happening in a more recent iOS version? Thanks!
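For reference, a sketch of the kind of workaround I'm talking about removing (assuming `properties` is the mutable metadata dictionary about to be written):

import ImageIO

// The historical workaround: force Orientation to 1 (up) at the root and in
// the TIFF dictionary before writing, so the asset change request succeeds.
properties[kCGImagePropertyOrientation as String] = 1
if var tiff = properties[kCGImagePropertyTIFFDictionary as String] as? [String: Any] {
    tiff[kCGImagePropertyTIFFOrientation as String] = 1
    properties[kCGImagePropertyTIFFDictionary as String] = tiff
}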
I've created a closed source iOS SDK from a local Swift package, which has dependencies on other Swift packages, and successfully created a binary XCFramework following the solution from my previous post.
Now I'm proceeding with the process to distribute this SDK. I believe I want to upload the XCFramework to a public repo alongside a Package.swift file and an Example app project that uses the XCFramework. So each time I go to create a new release I’ll create a new XCFramework replacing the current one, verify it's working properly in the example app, then commit, tag, and push to the repo.
My question is how do I set this up as a Swift package that includes an example app that uses the local XCFramework (not a remote url to a zip of the framework) and properly resolves dependencies?
So far I created a directory containing MyFramework.xcframework and Package.swift containing:
// swift-tools-version: 5.10
// The swift-tools-version declares the minimum version of Swift required to build this package.

import PackageDescription

let package = Package(
    name: "MyFramework",
    platforms: [
        .iOS(.v14)
    ],
    products: [
        .library(
            name: "MyFramework",
            targets: ["MyFramework"]
        )
    ],
    dependencies: [
        .package(url: "https://github.com/example/example.git", from: "1.0.0")
    ],
    targets: [
        .binaryTarget(
            name: "MyFramework",
            path: "MyFramework.xcframework"
        )
    ]
)
I then created an Example iOS app project in that directory and now need to integrate the local XCFramework. I wondered if I could do that via File > Add Package Dependencies > Add Local, but when I navigate to that Package.swift and click Add Package it says
The selected package cannot be a direct ancestor of the project.
Do I need a different Package.swift for the Example app, and if so, how do I get that set up? I created a separate Package.swift (contents below) alongside the xcodeproj but when I try to add that in Xcode I get the same error, despite the fact this package is a sibling of the project not an ancestor.
// swift-tools-version: 5.10
// The swift-tools-version declares the minimum version of Swift required to build this package.

import PackageDescription

let package = Package(
    name: "MyFramework-Example",
    platforms: [
        .iOS(.v14)
    ],
    dependencies: [
        .package(name: "MyFramework", path: "../")
    ],
    targets: [
        .target(
            name: "MyFramework-Example",
            dependencies: ["MyFramework"]
        )
    ]
)
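For reference, the remote form I'm trying to avoid looks like this (the URL and checksum are placeholders):

// Remote binary target - a zip URL plus a checksum, rather than a local path.
.binaryTarget(
    name: "MyFramework",
    url: "https://example.com/MyFramework.xcframework.zip",
    checksum: "<output of swift package compute-checksum>"
)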
I have an app that allows the user to change a photo's EXIF metadata. To do this, I request a content editing input, get the full size image, modify its properties, create a content editing output, write the output image to the rendered content URL, then call performChanges on the PHPhotoLibrary, creating an asset change request for that asset and setting its content editing output. This works as expected for regular photos, but Live Photos get converted to regular photos.
To address this, I’m doing something similar by changing the properties of the .photo image in the Live Photo. I detect when the content editing input has a Live Photo, create a Live Photo editing context, set a frame processor that returns the frame’s image after setting its properties to the updated properties when the frame type is photo, then I create the content editing output and save the Live Photo to that output. It modifies the Live Photo successfully, but the metadata is not updated. If you get the full size image again the properties are the original properties. If you look at the EXIF metadata using an app like Metapho it remains unchanged. What am I doing wrong here? Thanks!
let imageURL = contentEditingInput.fullSizeImageURL!
let inputImage = CIImage(contentsOf: imageURL, options: [.applyOrientationProperty: true])!
var metadata: [AnyHashable: Any] = inputImage.properties

// Edit the metadata as desired...

let editingContext = PHLivePhotoEditingContext(livePhotoEditingInput: contentEditingInput)!
editingContext.frameProcessor = { frame, error -> CIImage? in
    // Edit only the still photo
    if frame.type == .photo {
        return frame.image.settingProperties(metadata)
    }
    return frame.image
}

let contentEditingOutput = try await withCheckedThrowingContinuation { continuation in
    let editingOutput = PHContentEditingOutput(contentEditingInput: contentEditingInput)
    editingOutput.adjustmentData = adjustmentData
    editingContext.saveLivePhoto(to: editingOutput) { success, error in
        if success {
            continuation.resume(returning: editingOutput)
        } else {
            continuation.resume(throwing: error!)
        }
    }
}

try await PHPhotoLibrary.shared().performChanges {
    let request = PHAssetChangeRequest(for: asset)
    request.contentEditingOutput = contentEditingOutput
}
When trying to edit some Live Photos, calling PHLivePhotoEditingContext.saveLivePhoto results in the following error:
Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo={NSLocalizedFailureReason=An unknown error occurred (-12815), NSLocalizedDescription=The operation could not be completed, NSUnderlyingError=0x300d05380 {Error Domain=NSOSStatusErrorDomain Code=-12815 "(null)"}}
I was able to reproduce it on my device by taking a new Live Photo. Not sure what's wrong with that one specifically; not all Live Photos trigger the issue.
I've submitted FB15880825 with a sysdiagnose and a Photos Diagnostics as well. Any ideas what's going on here? It's impacting multiple customers. Thanks!
I have an app that displays artwork via MPMediaItem.artwork, requesting an image with a specific size. How do I get a media item's MPMediaItemAnimatedArtwork, and how do I get its preview image and video to display to the user?
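For reference, the static request I'm using today (a sketch; `mediaItem` is the MPMediaItem):

import MediaPlayer

// Static artwork at a requested size - the animated equivalent is what I'm after.
let artworkImage = mediaItem.artwork?.image(at: CGSize(width: 300, height: 300))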
I have an AppIntent that edits an object in my app. The intent accepts an app entity as a parameter, so if you run the intent it asks which object you want to edit; you select one from the list, and it shows a dialog confirming it was edited successfully. I use this same intent in my Home Screen widget, initializing it with an objectEntity. The code needs to run in the app's process, not the widget extension's process, so the file is added to both targets and the intent conforms to ForegroundContinuableIntent, which is supposed to ensure it always runs in the app process. This works great when run from the Shortcuts app and when invoked via a button in the Home Screen widget, exactly as expected. Here is that app intent:
@available(iOS 17.0, *)
struct EditObjectIntent: AppIntent {
    static let title: LocalizedStringResource = "Edit Object"

    @Parameter(title: "Object", requestValueDialog: "Which object do you want to edit?", inputConnectionBehavior: .connectToPreviousIntentResult)
    var objectEntity: ObjectEntity

    init() {
        print("INIT")
    }

    init(objectEntity: ObjectEntity) {
        self.objectEntity = objectEntity
    }

    @MainActor
    func perform() async throws -> some IntentResult & ReturnsValue<ObjectEntity> & ProvidesDialog {
        // Edit the object from objectEntity.id...
        return .result(value: objectEntity, dialog: "Done")
    }
}

@available(iOS 17.0, *)
@available(iOSApplicationExtension, unavailable)
extension EditObjectIntent: ForegroundContinuableIntent { }
I now want to create a ControlButton that uses this intent:
struct EditObjectControlWidget: ControlWidget {
    var body: some ControlWidgetConfiguration {
        StaticControlConfiguration(kind: "EditObjectControlWidget") {
            ControlWidgetButton(action: EditObjectIntent()) {
                Label("Edit Object", systemImage: "pencil")
            }
        }
    }
}
When I add the button to Control Center and tap it (on iOS 18), init is called 3x in the app process and 2x in the widget process, yet the perform function is not invoked in either process. No error appears in console logs for the app's process, but this appears for the widget process:
LaunchServices: store <private> or url <private> was nil: Error Domain=NSOSStatusErrorDomain Code=-54 "process may not map database" UserInfo={NSDebugDescription=process may not map database, _LSLine=72, _LSFunction=_LSServer_GetServerStoreForConnectionWithCompletionHandler}
Attempt to map database failed: permission was denied. This attempt will not be retried.
Failed to initialize client context with error Error Domain=NSOSStatusErrorDomain Code=-54 "process may not map database" UserInfo={NSDebugDescription=process may not map database, _LSLine=72, _LSFunction=_LSServer_GetServerStoreForConnectionWithCompletionHandler}
What am I doing wrong here? Thanks!
iOS 26 seems to have changed the way action extension icons appear in the share sheet. My icon is now too small compared to the Copy button in Safari (and Shortcuts' icons are too small too - a bug?). How do you update it, and how do you ensure it still looks fine on iOS 18 and earlier?
My current icon is an AppIcon in the asset catalog, single size 1024x1024, with about 130px of padding around it.