I'd like to get a song's bit rate, for example 256 kbps, from an MPMediaItem retrieved via MPMediaPickerController. Is this possible?
I tried to get it via:
AVAsset(url: mediaItem.assetURL).tracks.first?.estimatedDataRate
but this is 0 for most songs I've tried, and it's 127999 for a song that's really 64 kbps.
I can get the sample rate of 44100 via:
let trackDescription = AVAsset(url: url).tracks.first?.formatDescriptions.first
let basicDescription = CMAudioFormatDescriptionGetStreamBasicDescription(trackDescription as! CMAudioFormatDescription)?.pointee
let sampleRate = basicDescription?.mSampleRate
Supposedly one can calculate the bit rate given the sample rate, bit depth, and channel count, but I'm seeing mBitsPerChannel is always 0 in my testing.
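In case it helps, this is the kind of fallback I'm considering: deriving an average bit rate from the audio track's total sample data length and its duration (totalSampleDataLength appears to be available on iOS 15+; estimatedBitRate(for:) is just an illustrative helper, and I'm keeping the synchronous track loading from the code above for brevity):

import AVFoundation
import MediaPlayer

// Illustrative fallback: average bit rate (bits per second) from the audio
// track's total sample data length divided by its duration.
func estimatedBitRate(for mediaItem: MPMediaItem) -> Double? {
    guard let url = mediaItem.assetURL,
          let track = AVAsset(url: url).tracks(withMediaType: .audio).first else { return nil }
    let seconds = CMTimeGetSeconds(track.timeRange.duration)
    guard seconds > 0 else { return nil }
    return Double(track.totalSampleDataLength) * 8 / seconds
}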
I have a UIKit Mac Catalyst app, optimized for the Mac idiom, with an NSToolbar manually added to the windowScene. Is it possible to implement a full-height inspector sidebar with this setup? In my testing it always appears underneath the toolbar, even if I remove my NSToolbar and let the system create a toolbar from a NavigationStack. It works on iOS: the inspector stretches all the way up the window, splitting the app into two columns.
var body: some View {
    NavigationStack {
        AnimalTable(state: $state)
            .inspector(isPresented: $state.inspectorPresented) {
                AnimalInspectorForm(animal: $state.binding())
            }
            .toolbar {
                Button {
                    state.inspectorPresented.toggle()
                } label: {
                    Label("Toggle Inspector", systemImage: "info.circle")
                }
            }
    }
}
During the WWDC Q&As I asked how I could add an inspector to my UIKit app, which uses a UISplitViewController with the double-column style (a sidebar and a detail view controller). I initially tried a full-height inspector (by putting my split view controller into a SwiftUI view, applying the inspector to that, and embedding it in a UIHostingController used as the rootViewController), but this caused a bunch of UI bugs (seemingly related to optimizations made for a size class that doesn't match the actual appearance), and it doesn't extend into the NSToolbar on Mac Catalyst anyway. I now want to try implementing the under-the-toolbar solution. An engineer said:
For an under toolbar appearance, you should be able to use .inspector on the detail view controller (after wrapping it in a SwiftUI view), but you may have to do manual toolbar management here (hiding and showing) to make sure you don't end up with stacked toolbars/UINavigationBars
I have indeed run into the problem with two navigation bars in my inspector. I want to keep the navigation bar visible in the detail screen, but I don't want any navigation bars visible in my inspector, since I'm going to provide my own button to toggle the inspector (a button in the detail view on iOS and an NSToolbarItem on macOS). Is this layout possible? I tried applying .toolbar(.hidden) on the inspector's view, but it doesn't do anything; there are still two stacked navigation bars (tested on iPadOS 17 beta 2). I think even if that worked it would only hide the inner navigation bar, and I'd still have an undesirable outer one. :/ I wish there were a UIKit API so I could avoid the interop complexity, ha.
In the sample project attached to FB12447791, the root view controller is a UISplitViewController. The primary view controller is a UINavigationController containing a sidebar. The secondary view controller is a UINavigationController containing a UIHostingController whose root view is a SecondaryColumnView. SecondaryColumnView is a Form that has a button to Toggle Inspector, a navigation title, and an inspector. The inspector is a Form that has .toolbar(.hidden).
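For reference, SecondaryColumnView boils down to something like this (a simplified sketch; the inspector content and names are illustrative, the full version is in the sample project):

import SwiftUI

struct SecondaryColumnView: View {
    @State private var inspectorPresented = false

    var body: some View {
        Form {
            Button("Toggle Inspector") {
                inspectorPresented.toggle()
            }
        }
        .navigationTitle("Detail")
        .inspector(isPresented: $inspectorPresented) {
            Form {
                Text("Inspector content")
            }
            .toolbar(.hidden) // has no visible effect; two stacked bars remain
        }
    }
}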
I've provided several screen recordings in the feedback report as well. Thanks for your help and insight!
The updated Photos access dialog in iOS 17 states:
Photos may contain metadata, such as location, depth information, or captions.
How do I access the caption a user added to a photo in my app? This wasn't possible in iOS 16; is there a new API in iOS 17? I previously requested this ability via metadata in FB10205012 and via PHAsset in FB8244665. If it remains inaccessible, I've submitted FB12437093 to request that captions be removed from this wording.
In iOS 17 when you search Spotlight for Notes you can see it has an App Shortcut titled New Note that simply opens the Notes app and starts composing a new note. When you open the Shortcuts app, create a new shortcut, search for and tap Notes, notice the New Note action is ONLY included at the top - it's not in the list of actions underneath. There is another intent in the list titled Create Note which will create a new note using the content you specify without opening the Notes app.
I want to achieve this same thing in my app - an App Shortcut to open the app and start creating a new item, and a shortcut action to create a new item without opening the app. How can this be done?
So far I have created two AppIntents, NewItem and CreateItem. My AppShortcutsProvider only includes NewItem. This works great when searching for my app in Spotlight. But when I open Shortcuts and go to add an action from my app to a shortcut, it includes Add Item at the top as an App Shortcut, but it also appears in the actions list underneath. Create Item is included in the list as well, which is confusing. I don't want Add Item to be an available action because it's fairly useless to open the app and start creating an item; instead, users should use Create Item to create an item in the background.
Do I need to instead create a single shortcut that behaves differently in Spotlight vs Shortcuts, is that possible?
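For reference, here's roughly how the two intents and the provider are set up right now (simplified; the titles, phrase, and parameter are illustrative):

import AppIntents

// Opens the app and starts composing a new item; exposed as an App Shortcut.
struct NewItem: AppIntent {
    static var title: LocalizedStringResource = "New Item"
    static var openAppWhenRun: Bool = true

    func perform() async throws -> some IntentResult {
        // Navigate to the new-item screen here.
        return .result()
    }
}

// Creates an item in the background without opening the app.
struct CreateItem: AppIntent {
    static var title: LocalizedStringResource = "Create Item"

    @Parameter(title: "Content")
    var content: String

    func perform() async throws -> some IntentResult {
        // Save the item here.
        return .result()
    }
}

struct MyAppShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: NewItem(),
            phrases: ["New item in \(.applicationName)"],
            shortTitle: "New Item",
            systemImageName: "plus"
        )
    }
}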
Is it possible to get the original creation date of an IntentFile? The following code always returns the current date, surely because the file is copied into a temporary directory, so that's when it was created at that location.
if let fileURL = file.fileURL, fileURL.startAccessingSecurityScopedResource() {
    if let attributes = try? FileManager.default.attributesOfItem(atPath: fileURL.path), let date = attributes[.creationDate] as? Date {
        print(date)
    }
    fileURL.stopAccessingSecurityScopedResource()
}
Is it possible to sort the user library assets by date captured? The Photos app in iOS 18 lets you choose between Date Captured and Recently Added and I want to offer that same choice in my app. This seems to always sort them by creation date (which I believe is the same as recently added):
let assetCollection = PHAssetCollection.fetchAssetCollections(with: .smartAlbum, subtype: .smartAlbumUserLibrary, options: nil).firstObject!
let fetchResult = PHAsset.fetchAssets(in: assetCollection, options: PHFetchOptions.imageMediaType())
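For reference, explicitly sorting on creationDate gives the same ordering I'm seeing; I haven't found a PHAsset sort key that corresponds to Date Captured (the predicate below is just an illustrative stand-in for my imageMediaType() options):

let options = PHFetchOptions()
// Limit to images and sort by creation date, which appears to match "Recently Added".
options.predicate = NSPredicate(format: "mediaType == %d", PHAssetMediaType.image.rawValue)
options.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
let sortedByCreation = PHAsset.fetchAssets(in: assetCollection, options: options)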
I've had an app that edits photos in your library since the PhotoKit API was released in iOS 8. I know that if you preserve photo metadata, you were required to change the value of Orientation to 1 (up), otherwise PhotoKit would fail to perform the asset change request. When I remove this code, I'm seeing that Orientation is changed to 1 automatically, both at the root and in the TIFF dictionary (tested with iOS 18). I wanted to confirm this is expected behavior: does the system do this for us now? If so, can I remove this code for iOS 15+, or did this behavior only start in a recent iOS version? Thanks!
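The code I'm referring to is essentially this kind of adjustment (simplified and illustrative, where metadata is the image's properties dictionary) applied before writing the rendered output:

// Force Orientation to 1 (up) at the root and in the TIFF dictionary.
metadata[kCGImagePropertyOrientation as String] = 1
if var tiff = metadata[kCGImagePropertyTIFFDictionary as String] as? [String: Any] {
    tiff[kCGImagePropertyTIFFOrientation as String] = 1
    metadata[kCGImagePropertyTIFFDictionary as String] = tiff
}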
I've created a closed source iOS SDK from a local Swift package, which has dependencies on other Swift packages, and successfully created a binary XCFramework following the solution from my previous post. I would now like to create a Package.swift to vend this XCFramework and test it in an example app to verify it works as expected before I upload it to a public repo for distribution.
I understand that binaryTarget does not support dependencies so we need to use a wrapper. I created a directory containing the following:
Package.swift
MyFramework.xcframework/
MyFrameworkWrapper/
└─ dummy.swift
Package.swift contains:
// swift-tools-version: 5.10
// The swift-tools-version declares the minimum version of Swift required to build this package.
import PackageDescription

let package = Package(
    name: "MyFramework",
    platforms: [
        .iOS(.v14)
    ],
    products: [
        .library(
            name: "MyFramework",
            targets: ["MyFramework", "MyFrameworkWrapper"]
        )
    ],
    dependencies: [
        .package(url: "https://github.com/gordontucker/FittedSheets.git", from: "2.6.1")
    ],
    targets: [
        .target(
            name: "MyFrameworkWrapper",
            dependencies: [
                "FittedSheets"
            ],
            path: "MyFrameworkWrapper"
        ),
        .binaryTarget(
            name: "MyFramework",
            path: "MyFramework.xcframework"
        )
    ]
)
I created a new iOS app, selected the project, Package Dependencies > + > Add Local, and added the directory containing this Package.swift. Xcode resolves the dependencies and lists them in the sidebar. I added code to import and use the framework. It builds successfully but the app crashes when run:
dyld[63959]: Library not loaded: @rpath/FittedSheets.framework/FittedSheets
Referenced from: <7DE247FC-DAFF-3946-AD21-E80F5AF841C9> /Users/Jordan/Library/Developer/Xcode/DerivedData/MyFramework-Example-gaeeymnqzenzrbbmhuebpodqctsz/Build/Products/Debug-iphonesimulator/MyFramework.framework/MyFramework
How do I get this working? I'm wondering whether my package is set up properly to vend the framework and specify its dependencies, and whether my XCFramework was created correctly.
The Package.swift for the framework's source code contains:
// swift-tools-version: 5.10
// The swift-tools-version declares the minimum version of Swift required to build this package.
import PackageDescription

let package = Package(
    name: "MyFramework",
    platforms: [
        .iOS(.v14)
    ],
    products: [
        .library(
            name: "MyFramework",
            type: .dynamic,
            targets: ["MyFramework"]
        )
    ],
    dependencies: [
        .package(url: "https://github.com/gordontucker/FittedSheets.git", from: "2.6.1")
    ],
    targets: [
        .target(
            name: "MyFramework",
            dependencies: [
                "FittedSheets"
            ],
            path: "Sources"
        )
    ]
)
And I created the XCFramework following the steps in that previous thread:
1. Create an archive from the package via xcodebuild archive -workspace "$PACKAGE_PATH" -scheme "$FRAMEWORK_NAME" -destination 'generic/platform=iOS' -archivePath "$ARCHIVE_PATH/iOS" SKIP_INSTALL=NO BUILD_LIBRARY_FOR_DISTRIBUTION=YES ENABLE_USER_SCRIPT_SANDBOXING=NO ENABLE_MODULE_VERIFIER=NO OTHER_SWIFT_FLAGS=-no-verify-emitted-module-interface
2. Create the Modules directory in the framework via mkdir -p "$ARCHIVE_PATH/iOS.xcarchive/Products/usr/local/lib/$FRAMEWORK_NAME.framework/Modules"
3. Copy the Swift interface files into the framework from the build in DerivedData via cp -a "$BUILD_PRODUCTS_PATH/Build/Intermediates.noindex/ArchiveIntermediates/$FRAMEWORK_NAME/BuildProductsPath/Release-iphoneos/$FRAMEWORK_NAME.swiftmodule" "$ARCHIVE_PATH/iOS.xcarchive/Products/usr/local/lib/$FRAMEWORK_NAME.framework/Modules"
4. Repeat steps 1-3 for the iOS Simulator.
5. Create an XCFramework via xcodebuild -create-xcframework -framework "$ARCHIVE_PATH/iOS.xcarchive/Products/usr/local/lib/$FRAMEWORK_NAME.framework" -framework "$ARCHIVE_PATH/iOS_Simulator.xcarchive/Products/usr/local/lib/$FRAMEWORK_NAME.framework" -output "$ARCHIVE_PATH/$FRAMEWORK_NAME.xcframework"
I have an app that allows the user to change a photo's EXIF metadata. To do this, I request a content editing input, get the full-size image, modify its properties, create a content editing output, write the output image to the rendered content URL, then call performChanges on the PHPhotoLibrary, creating an asset change request for that asset and setting its content editing output. This works as expected for regular photos, but Live Photos get turned into regular photos (the Live Photo is effectively turned off).
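For reference, the flow that works for regular photos is essentially this (a simplified sketch; contentEditingInput, adjustmentData, and asset come from the surrounding code, and the color space handling here is illustrative):

let imageURL = contentEditingInput.fullSizeImageURL!
let inputImage = CIImage(contentsOf: imageURL, options: [.applyOrientationProperty: true])!
var metadata = inputImage.properties
// Edit the metadata as desired...
let output = PHContentEditingOutput(contentEditingInput: contentEditingInput)
output.adjustmentData = adjustmentData
// Write the image with its updated properties to the rendered content URL.
let edited = inputImage.settingProperties(metadata)
try CIContext().writeJPEGRepresentation(of: edited,
                                        to: output.renderedContentURL,
                                        colorSpace: edited.colorSpace ?? CGColorSpaceCreateDeviceRGB())
try await PHPhotoLibrary.shared().performChanges {
    let request = PHAssetChangeRequest(for: asset)
    request.contentEditingOutput = output
}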
To address this, I'm doing something similar by changing the properties of the .photo image in the Live Photo. I detect when the content editing input has a Live Photo, create a Live Photo editing context, and set a frame processor that, when the frame type is photo, returns the frame's image after applying the updated properties; then I create the content editing output and save the Live Photo to that output. It modifies the Live Photo successfully, but the metadata is not updated. If you get the full-size image again, the properties are the original ones. If you look at the EXIF metadata using an app like Metapho, it remains unchanged. What am I doing wrong here? Thanks!
let imageURL = contentEditingInput.fullSizeImageURL!
let inputImage = CIImage(contentsOf: imageURL, options: [.applyOrientationProperty: true])!
var metadata: [AnyHashable: Any] = inputImage.properties
// Edit the metadata as desired...

let editingContext = PHLivePhotoEditingContext(livePhotoEditingInput: contentEditingInput)!
editingContext.frameProcessor = { frame, error -> CIImage? in
    // Edit only the still photo
    if frame.type == .photo {
        return frame.image.settingProperties(metadata)
    }
    return frame.image
}

let contentEditingOutput = try await withCheckedThrowingContinuation { continuation in
    let editingOutput = PHContentEditingOutput(contentEditingInput: contentEditingInput)
    editingOutput.adjustmentData = adjustmentData
    editingContext.saveLivePhoto(to: editingOutput) { success, error in
        if success {
            continuation.resume(returning: editingOutput)
        } else {
            continuation.resume(throwing: error!)
        }
    }
}

try await PHPhotoLibrary.shared().performChanges {
    let request = PHAssetChangeRequest(for: asset)
    request.contentEditingOutput = contentEditingOutput
}
I have an app that displays artwork via MPMediaItem.artwork, requesting an image with a specific size. How do I get a media item's MPMediaItemAnimatedArtwork, and how do I get its preview image and video to display to the user?
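For context, this is how I currently request the static artwork (the size and image view are illustrative):

if let artwork = mediaItem.artwork {
    // Request a rendered image at the display size.
    artworkImageView.image = artwork.image(at: CGSize(width: 300, height: 300))
}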
iOS 26 seems to have changed the way action extension icons appear in the share sheet. My icon is too small now compared to the Copy button in Safari (and Shortcuts’ icons are too small too, a bug?). How do you update it, and how do you ensure it looks fine in iOS 18 and earlier?
My current icon is an AppIcon in the asset catalog, single size 1024x1024, with about 130px padding around it.
The documentation for CLGeocoder states:
Geocoding requests are rate-limited for each app, so making too many requests in a short period of time may cause some of the requests to fail. (When the maximum rate is exceeded, the geocoder returns an error object with the CLError.Code.network error to the associated completion handler.)
And it provides helpful guidance on how and when to submit geocoding requests.
The documentation for MKReverseGeocodingRequest does not mention that requests are rate-limited. Does this mean it is not rate-limited? If it is rate-limited, is it similar to CLGeocoder, and what is its behavior?
It is important to understand the behavior of this API in order to understand the impact on my app's use case and how users will be affected should I change the implementation. Thanks!
I discovered that when editing photos with the PhotoKit API, PHContentEditingOutput's renderedContentURL points to a file in the app container's tmp directory with a filename that seems to follow the format render.<uuid>.JPG, and that file does not get deleted if the edit does not complete successfully (the user cancels the edit request, an error occurs, the app crashes, etc.). I understand the system is supposed to delete tmp files automatically every once in a while, but some users are noticing my app's Documents & Data size inflating, so I'm considering deleting these render files each time the app is launched. But I don't want to delete everything in the tmp directory, as there could be other data in there.
What's the best way to remove these temporary files? Does the filename always start with render., regardless of device language? I thought I'd delete files in NSTemporaryDirectory() with that prefix, but then I discovered that on Mac Catalyst the location is not the tmp directory directly; they're in tmp/TemporaryItems/<bundleid>.
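This is the kind of cleanup I have in mind, assuming the render. prefix is stable (which is exactly what I'd like to confirm) and setting aside the Mac Catalyst location question for now:

let tmp = URL(fileURLWithPath: NSTemporaryDirectory())
let fileManager = FileManager.default
if let files = try? fileManager.contentsOfDirectory(at: tmp, includingPropertiesForKeys: nil) {
    // Remove only the leftover PhotoKit render files, not everything in tmp.
    for file in files where file.lastPathComponent.hasPrefix("render.") {
        try? fileManager.removeItem(at: file)
    }
}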
Thanks!
We are creating a watch party app that allows you to video chat with your friends and play a YouTube video at the same time. The video is played using Google's youtube-ios-player-helper library, which uses a WKWebView with their iframe API, as that's the only way to play it without violating the Terms of Service. We need the ability to change the volume of the YouTube video separately from the video chat, so you can hear your friends over the video, for example.
Unfortunately it's not possible to change the volume directly, because iOS does not support changing the volume via JavaScript (https://developer.apple.com/library/archive/documentation/AudioVideo/Conceptual/Using_HTML5_Audio_Video/Device-SpecificConsiderations/Device-SpecificConsiderations.html#//apple_ref/doc/uid/TP40009523-CH5-SW10), unlike macOS. Setting the volume doesn't do anything and getting it always returns 1. Users can change the volume with the hardware buttons, but this applies to all audio, including the video chat, not just the YouTube video.
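For context, this is the kind of JavaScript attempt that has no effect on iOS (assuming it's evaluated in whatever frame actually hosts the video element; the selector here is illustrative):

// Setting the element's volume is ignored on iOS, and reading it back yields 1.
webView.evaluateJavaScript("document.querySelector('video').volume = 0.25;")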
Someone found a workaround (https://stackoverflow.com/a/37315071/1795356) to get the underlying AVPlayer and change its volume natively. This worked with UIWebView but does not work now that it uses WKWebView.
What can be done to change the volume of the YouTube video?