iOS 14 users can change the default email app. What effect, if any, does this have on MFMailComposeViewController?
Currently, if canSendMail() returns false, I instruct the user to set up an account in the Mail app. Maybe the wording needs to be different?
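For context, here's a sketch of my current flow with the mailto: fallback I'm considering, since opening a mailto: URL should route to whatever default mail client the user chose in iOS 14 (the support address and presentation details are placeholders):

import MessageUI
import UIKit

func composeEmail(from viewController: UIViewController) {
    if MFMailComposeViewController.canSendMail() {
        // Delegate handling omitted for brevity.
        let composer = MFMailComposeViewController()
        composer.setToRecipients(["support@example.com"])
        viewController.present(composer, animated: true)
    } else if let url = URL(string: "mailto:support@example.com"),
              UIApplication.shared.canOpenURL(url) {
        // Falls back to the user's chosen default mail app on iOS 14.
        UIApplication.shared.open(url)
    } else {
        // This is where I currently tell the user to set up an account
        // in the Mail app - the wording in question.
    }
}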
I'd like to get a song's bit rate, for example 256 kbps, from an MPMediaItem retrieved via MPMediaPickerController. Is this possible?
I tried to get it via:
AVAsset(url: mediaItem.assetURL).tracks.first?.estimatedDataRate
but this is 0 for most songs I've tried, and it's 127999 for a song that's really 64 kbps.
I can get the sample rate of 44100 via:
let url = mediaItem.assetURL! // assuming a non-nil asset URL
let trackDescription = AVAsset(url: url).tracks.first?.formatDescriptions.first
let basicDescription = CMAudioFormatDescriptionGetStreamBasicDescription(trackDescription as! CMAudioFormatDescription)?.pointee
let sampleRate = basicDescription?.mSampleRate
Supposedly one can calculate the bit rate given the sample rate, bit depth, and channel count, but I'm seeing mBitsPerChannel is always 0 in my testing.
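For reference, here's the calculation I'd expect to work for uncompressed audio, e.g. CD quality would be 44,100 × 16 × 2 = 1,411,200 bps (~1,411 kbps). It falls apart here because mBitsPerChannel is 0 for compressed formats like AAC:

import AVFoundation
import CoreMedia

func calculatedBitRate(for asset: AVAsset) -> Double? {
    guard let track = asset.tracks(withMediaType: .audio).first,
          let formatDescription = track.formatDescriptions.first,
          let basic = CMAudioFormatDescriptionGetStreamBasicDescription(
              formatDescription as! CMAudioFormatDescription)?.pointee
    else { return nil }
    // bit rate = sample rate * bit depth * channel count
    let bitsPerChannel = Double(basic.mBitsPerChannel) // 0 for AAC in my testing
    guard bitsPerChannel > 0 else { return nil }
    return basic.mSampleRate * bitsPerChannel * Double(basic.mChannelsPerFrame)
}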
Given an Apple Music trackId, is it possible to query the user’s media library to see if they’ve added it to their library? Something like:
let predicate = MPMediaPropertyPredicate(value: "1440818675", forProperty: MPMediaItemPropertyPersistentID)
let query = MPMediaQuery(filterPredicates: Set([predicate]))
let songs = query.items ?? []
let isInLibrary = !songs.isEmpty
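Or should I be filtering on MPMediaItemPropertyPlaybackStoreID instead, since persistent IDs are device-local? A sketch of what I mean (I'm not sure this property actually matches catalog IDs for library items):

import MediaPlayer

let predicate = MPMediaPropertyPredicate(value: "1440818675",
                                         forProperty: MPMediaItemPropertyPlaybackStoreID)
let query = MPMediaQuery.songs()
query.addFilterPredicate(predicate)
let isInLibrary = !(query.items?.isEmpty ?? true)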
The Music macOS app shows various info about a song in the Get Info window. Most of this metadata is available in the iOS SDK via MPMediaItem. I want to access the information displayed in the File tab, but several of those values don't appear to be exposed in the API. Is this possible?
□ Kind - Apple Music AAC audio file - ?
☑︎ Duration - 3:00 - playbackDuration
□ Size - 10 MB - ?
□ Bit rate - 256 kbps - ?
□ Sample rate - 44.100 kHz - ?
□ Date modified - 1/1/2001 - ?
☑︎ Date added - 1/1/2001 - dateAdded
□ Cloud status - Apple Music - ?
☑︎ Location - Cloud - isCloudItem
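For the rows I can get, I'm reading them like this; the rows marked with an empty box above are the ones I can't find in the API:

import MediaPlayer

func logKnownFileInfo(for item: MPMediaItem) {
    print("Duration:", item.playbackDuration)  // seconds, e.g. 180.0
    print("Date added:", item.dateAdded)
    print("Is cloud item:", item.isCloudItem)
}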
I have shipped an app that uses Core Data with CloudKit via NSPersistentCloudKitContainer. I now want to add a widget that can query the current data to display. It's my understanding that you need to migrate the store to a new location available to a shared App Group. How do you do this? My current setup:
container = NSPersistentCloudKitContainer(name: "AppName")
container.loadPersistentStores { description, error in
    // handle error
}
container.viewContext.mergePolicy = NSMergeByPropertyObjectTrumpMergePolicy
container.viewContext.automaticallyMergesChangesFromParent = true
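Here's my best guess at what the migration would look like (an untested sketch; the App Group identifier is a placeholder). Is this the right idea?

import CoreData

func makeSharedContainer() -> NSPersistentCloudKitContainer {
    let container = NSPersistentCloudKitContainer(name: "AppName")
    let fileManager = FileManager.default

    let groupURL = fileManager
        .containerURL(forSecurityApplicationGroupIdentifier: "group.com.example.AppName")!
        .appendingPathComponent("AppName.sqlite")

    // One-time migration of the existing store into the App Group container.
    if let description = container.persistentStoreDescriptions.first,
       let oldURL = description.url,
       fileManager.fileExists(atPath: oldURL.path),
       !fileManager.fileExists(atPath: groupURL.path) {
        let coordinator = container.persistentStoreCoordinator
        if let oldStore = try? coordinator.addPersistentStore(
               ofType: NSSQLiteStoreType, configurationName: nil,
               at: oldURL, options: nil),
           let migrated = try? coordinator.migratePersistentStore(
               oldStore, to: groupURL, options: nil, withType: NSSQLiteStoreType) {
            // Detach so loadPersistentStores can re-add it with CloudKit options.
            try? coordinator.remove(migrated)
        }
    }

    // Keep the existing description (and its CloudKit options); just repoint it.
    container.persistentStoreDescriptions.first?.url = groupURL

    container.loadPersistentStores { description, error in
        // handle error
    }
    container.viewContext.mergePolicy = NSMergeByPropertyObjectTrumpMergePolicy
    container.viewContext.automaticallyMergesChangesFromParent = true
    return container
}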
I have a UIViewController that initially does not display any scrollable content, but later on I add a child view controller that does scroll: a UIHostingController whose rootView is a GeometryReader containing a ScrollView. The problem is that when you scroll, the UINavigationBar remains transparent, presumably because the system couldn't find a scroll view in the view hierarchy. There is an API to specify which scroll view to use, but it takes a UIScrollView. How can I tell it about my SwiftUI scroll view?
viewController.setContentScrollView(scrollView, for: .top)
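The workaround I'm considering is digging the backing UIScrollView out of the hosting controller's view hierarchy, which feels fragile since it relies on private view structure; hence the question:

import UIKit

func firstScrollView(in view: UIView) -> UIScrollView? {
    if let scrollView = view as? UIScrollView { return scrollView }
    for subview in view.subviews {
        if let found = firstScrollView(in: subview) { return found }
    }
    return nil
}

// After adding the child UIHostingController:
// if let scrollView = firstScrollView(in: hostingController.view) {
//     setContentScrollView(scrollView, for: .top)
// }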
Is it possible for an app using NSPersistentCloudKitContainer to enable sync in the background? If so, how? The scenario:
1. Install the app on your iPhone and iPad, create some data; it automatically syncs to both and life is good.
2. Close the iPad app.
3. Modify the data on iPhone.
Desired behavior: The backgrounded iPad app should sync (even if it takes a while) and be informed that its local database has finished syncing or similarly that changes were made.
The use case: I want to reload my widget when data changes so it stays up to date. That means the app needs to sync in the background, then notify me when the sync completes so I can trigger the widget reload.
I am concerned it will be a poor widget experience if it's always showing stale data until the user manually opens the app to initiate sync - that kind of defeats the purpose of widgets, ha.
According to this post, they found sync is never run in the background. Is this not the case, or has it changed in iOS 15?
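For concreteness, here's the direction I've been sketching, assuming a BGAppRefreshTask is even allowed to do this (the container name is from my app; everything else is a placeholder, and observer cleanup on success is omitted for brevity):

import BackgroundTasks
import CoreData
import WidgetKit

func handleAppRefresh(_ task: BGAppRefreshTask) {
    // Loading the container should let the CloudKit mirroring delegate import.
    let container = NSPersistentCloudKitContainer(name: "AppName")
    container.loadPersistentStores { _, _ in }

    // Reload the widget once an import event finishes.
    let observer = NotificationCenter.default.addObserver(
        forName: NSPersistentCloudKitContainer.eventChangedNotification,
        object: container, queue: .main) { notification in
        guard let event = notification.userInfo?[
                NSPersistentCloudKitContainer.eventNotificationUserInfoKey]
                as? NSPersistentCloudKitContainer.Event,
              event.type == .import, event.endDate != nil else { return }
        WidgetCenter.shared.reloadAllTimelines()
        task.setTaskCompleted(success: true)
    }
    task.expirationHandler = {
        NotificationCenter.default.removeObserver(observer)
        task.setTaskCompleted(success: false)
    }
}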
Thanks!
It was discussed in What's New in Photos APIs at WWDC that we should avoid using NSPredicate for custom fetch options if at all possible, for performance reasons. In my app, I'm using NSPredicate to get only images from the user's library. I'm not seeing an API that would allow me to get assets from a specific collection filtered to just images without the use of NSPredicate, though. Is there a more efficient way to perform this query that I'm not seeing?
let photoLibraryFetchResult = PHAssetCollection.fetchAssetCollections(with: .smartAlbum, subtype: .smartAlbumUserLibrary, options: nil)
let assetCollection = photoLibraryFetchResult.firstObject!
let fetchOptions = PHFetchOptions()
fetchOptions.predicate = NSPredicate(format: "mediaType = %d", PHAssetMediaType.image.rawValue)
let fetchResults = PHAsset.fetchAssets(in: assetCollection, options: fetchOptions)
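For comparison, the only predicate-free image fetch I know of operates on the whole library rather than a specific collection, which is exactly the gap I'm asking about:

import Photos

let allImages = PHAsset.fetchAssets(with: .image, options: nil)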
I am trying to figure out how to programmatically install a per-user launchd agent. I have an executable Swift script I wrote, and I need macOS to enforce that it is always running. I found the SMJobBless sample code, which I could play with to see how this works, but it hasn't been updated since it was last built with Xcode 4.6. As you can imagine, it doesn't compile in Xcode 10. I was able to get it to build by upgrading to the recommended project settings, increasing the deployment target, and selecting my team for the two targets. Following the ReadMe, I need to run ./SMJobBlessUtil.py setreq to configure the Info.plists appropriately. These instructions are out of date, but eskimo was kind enough to provide updated instructions here for finding the .app URL. But when I do this and run the command, I receive the following output:
MacBook:SMJobBless Jordan$ ./SMJobBlessUtil.py setreq /Users/Jordan/Library/Developer/Xcode/DerivedData/SMJobBless-dffakkidazmiowcishyrborysygm/Build/Products/Debug/SMJobBlessApp.app SMJobBlessApp/SMJobBlessApp-Info.plist SMJobBlessHelper/SMJobBlessHelper-Info.plist
Traceback (most recent call last):
File "./SMJobBlessUtil.py", line 424, in
main()
File "./SMJobBlessUtil.py", line 418, in main
setreq(appArgs[1], appArgs[2], appArgs[3:])
File "./SMJobBlessUtil.py", line 360, in setreq
appToolDict[bundleID] = toolNameToReqMap[bundleID]
KeyError: '$(PRODUCT_BUNDLE_IDENTIFIER)'
It would seem this Python script isn't able to work with the newer project structure, not surprisingly. I wasn't able to find any other information on how to accomplish this task these days. So could you please explain how to go about it? 🙂
I have an executable .swift file and a .plist that works when loaded from ~/Library/LaunchAgents/, ready to be added to an existing Xcode project. Thanks!
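For reference, the manual flow I'm trying to automate looks roughly like this (a sketch with placeholder paths and label; I don't know if this is the sanctioned approach, hence the question):

import Foundation

func installLaunchAgent() throws {
    let fileManager = FileManager.default
    let agentsDir = fileManager.homeDirectoryForCurrentUser
        .appendingPathComponent("Library/LaunchAgents")
    try fileManager.createDirectory(at: agentsDir, withIntermediateDirectories: true)

    // Copy the agent plist bundled with the app into ~/Library/LaunchAgents.
    let plistURL = agentsDir.appendingPathComponent("com.example.myagent.plist")
    let bundledPlist = Bundle.main.url(forResource: "com.example.myagent",
                                       withExtension: "plist")!
    if !fileManager.fileExists(atPath: plistURL.path) {
        try fileManager.copyItem(at: bundledPlist, to: plistURL)
    }

    // Load the agent into the current user's session.
    let launchctl = Process()
    launchctl.executableURL = URL(fileURLWithPath: "/bin/launchctl")
    launchctl.arguments = ["load", plistURL.path]
    try launchctl.run()
    launchctl.waitUntilExit()
}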
Is it OK to call requestContentEditingInput for a lot of PHAssets to get URLs for their full-size images? It seems odd because I would not be using the content editing input to actually modify these images. Is that OK, or are there implications to be aware of?
Use case:
I want to allow the user to share multiple PHAssets via UIActivityViewController. I can download and share an array of UIImage, which works, but I found that if you tap Copy, the app freezes for about 1 second per photo (10 seconds if you shared 10 photos). Profiling the app, it looks like iOS is spending the time creating a PNG for each image. It's also probably not a good idea to hold huge images in memory like that. I figured I'd try sharing an array of URLs to the images. Seemingly the only way to get a URL for a photo is by requesting a content editing input for the asset and accessing its fullSizeImageURL property. Is this a good idea, and is this the right approach to share PHAssets?
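Concretely, here's what I'm doing per asset, where asset is one of the selected PHAssets:

import Photos

let options = PHContentEditingInputRequestOptions()
options.isNetworkAccessAllowed = true // may download the original from iCloud

asset.requestContentEditingInput(with: options) { input, info in
    guard let url = input?.fullSizeImageURL else { return }
    // Collect these URLs and pass them to UIActivityViewController.
    print(url)
}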
I believe when trait collections were first introduced, the values were unknown initially, so you could put code that accessed those values in traitCollectionDidChange because it always changed from unknown to known values.
An iOS update changed this behavior to provide an estimated initial value, so traitCollectionDidChange would only get called if its value changed from its initial value. This required us to optimize for the trait collection in viewDidLoad for example to handle its initial value and handle changes in traitCollectionDidChange.
In iOS 17, it’s stated that if you access traits before the view is added to the hierarchy, the values won’t be up to date. It’s recommended to use viewIsAppearing instead of viewDidLoad and viewWillAppear. traitCollectionDidChange is still invoked but deprecated, replaced with a new registration API for being informed when a value changes.
My question is, will the code written using the previous approach still work when compiled with the iOS 17 SDK? Meaning, does the system still provide an estimated initial value and inform you if it changed upon getting added to the view hierarchy? Or is this a breaking change in behavior that will require us to rewrite our logic moving code that accesses the traitCollection from viewDidLoad to viewIsAppearing (and be really careful in doing so because this function is called every time the view appears not just once)? Are there any scenarios where the code written for iOS 16 would stop working once compiled for iOS 17 if you access trait values in viewDidLoad and handle changes in traitCollectionDidChange?
I’m trying to understand if I can keep my existing code and use the new approach going forward or if I need to revisit existing code that utilizes trait collections. Thanks!
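For context, here's the new-style code I'd be migrating to, as I understand the iOS 17 APIs (an untested sketch; applyLayout stands in for my real trait-dependent code):

import UIKit

class MyViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        // Replaces traitCollectionDidChange for subsequent changes.
        registerForTraitChanges([UITraitHorizontalSizeClass.self]) {
            (self: MyViewController, _: UITraitCollection) in
            self.applyLayout(for: self.traitCollection)
        }
    }

    override func viewIsAppearing(_ animated: Bool) {
        super.viewIsAppearing(animated)
        // Traits are up to date here; do the initial configuration.
        applyLayout(for: traitCollection)
    }

    private func applyLayout(for traits: UITraitCollection) {
        // Hypothetical helper for my trait-dependent layout.
    }
}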
I have a UIKit Mac Catalyst app, optimized for the Mac idiom, with an NSToolbar manually added to the windowScene. Is it possible to implement a full-height inspector sidebar with this setup? In my testing it always appears underneath the toolbar, even if I remove my NSToolbar and let the system create a toolbar from a NavigationStack. It works on iOS: the inspector stretches all the way up the window, splitting the app into two columns.
var body: some View {
    NavigationStack {
        AnimalTable(state: $state)
            .inspector(isPresented: $state.inspectorPresented) {
                AnimalInspectorForm(animal: $state.binding())
            }
            .toolbar {
                Button {
                    state.inspectorPresented.toggle()
                } label: {
                    Label("Toggle Inspector", systemImage: "info.circle")
                }
            }
    }
}
During the WWDC Q&As I asked how I could add an inspector to my UIKit app, which uses a UISplitViewController in the double-column style with a sidebar and detail view controller. I initially tried a full-height inspector (by putting my split view controller into a SwiftUI view, applying the inspector on that, and embedding it all in a UIHostingController as the rootViewController), but this caused a bunch of UI bugs (seemingly related to optimizations made for a size class that doesn’t match the actual appearance), and it doesn’t extend into the NSToolbar on Mac Catalyst anyway. I now want to try implementing the under-the-toolbar solution. An engineer said:
For an under toolbar appearance, you should be able to use .inspector on the detail view controller (after wrapping it in a SwiftUI view), but you may have to do manual toolbar management here (hiding and showing) to make sure you don't end up with stacked toolbars/UINavigationBars
I have indeed run into the problem with two navigation bars in my inspector. I want to keep the navigation bar visible in the detail screen, but I do not want any navigation bars visible in my inspector, since I’m going to provide my own button to toggle the inspector (via a button in the detail view on iOS and an NSToolbarItem on macOS). Is this layout possible? I tried applying .toolbar(.hidden) on the inspector’s view, but this doesn’t do anything; there are still two stacked navigation bars (tested on iPadOS 17 beta 2). I think even if that worked it would only hide the inner navigation bar, so I’d still have an undesirable one. :/ I wish there were a UIKit API so I could avoid the interop complexity, ha.
In the sample project attached to FB12447791, the root view controller is a UISplitViewController. The primary view controller is a UINavigationController containing a sidebar. The secondary view controller is a UINavigationController containing a UIHostingController whose root view is a SecondaryColumnView. SecondaryColumnView is a Form that has a button to Toggle Inspector, a navigation title, and an inspector. The inspector is a Form that has .toolbar(.hidden).
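For reference, here's a reconstructed sketch of that SecondaryColumnView (approximated from the description above; the real project is attached to the feedback):

import SwiftUI

struct SecondaryColumnView: View {
    @State private var inspectorPresented = false

    var body: some View {
        Form {
            Button("Toggle Inspector") { inspectorPresented.toggle() }
        }
        .navigationTitle("Detail")
        .inspector(isPresented: $inspectorPresented) {
            Form {
                Text("Inspector content")
            }
            .toolbar(.hidden) // no effect: two stacked bars remain
        }
    }
}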
I've provided several screen recordings in the feedback report as well. Thanks for your help and insight!
The updated Photos access dialog in iOS 17 states:
Photos may contain metadata, such as location, depth information, or captions.
How do I access the caption a user added to a photo in my app? This wasn’t possible in iOS 16; is there a new API in 17? I previously requested this ability via metadata in FB10205012 and via PHAsset in FB8244665. If it remains inaccessible, I’ve submitted FB12437093 to request that captions be removed from this wording.
I have an app that uses NSPersistentCloudKitContainer and a widget that displays a record. Using interactive widgets in iOS 17, I want to add a button that modifies the visible record via an AppIntent. When I do this, the app logs:
CoreData: debug: CoreData+CloudKit: -[NSCloudKitMirroringDelegate managedObjectContextSaved:](2945): <NSCloudKitMirroringDelegate: 0x2818002a0>: Observed context save: <NSPersistentStoreCoordinator: 0x280a05180> - <NSManagedObjectContext: 0x281a00410>
It does not automatically sync this change to iCloud until I manually return the app to the foreground, even if I delay returning from the perform() function. Is there a way to sync NSPersistentCloudKitContainer while the app is in the background as a result of this change triggered in the widget? Thanks!
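For reference, my intent looks roughly like this (a sketch; PersistenceController and the record mutation are placeholders for my real code):

import AppIntents
import CoreData

struct ToggleRecordIntent: AppIntent {
    static var title: LocalizedStringResource = "Toggle Record"

    func perform() async throws -> some IntentResult {
        let context = PersistenceController.shared.container.viewContext
        try await context.perform {
            // ... modify the visible record ...
            try context.save()
        }
        // Even delaying the return here doesn't let the export run.
        return .result()
    }
}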