The documentation for CLGeocoder states:
Geocoding requests are rate-limited for each app, so making too many requests in a short period of time may cause some of the requests to fail. (When the maximum rate is exceeded, the geocoder returns an error object with the CLError.Code.network error to the associated completion handler.)
And it provides helpful guidance on how and when to submit geocoding requests.
The documentation for MKReverseGeocodingRequest does not mention that requests are rate-limited. Does this mean it is not rate-limited? If it is rate-limited, is its behavior similar to CLGeocoder's?
It is important to understand the behavior of the API in order to understand the impact on my app's use case and how users will be affected should I change the implementation. Thanks!
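For concreteness, here's roughly how I'd expect to detect that rate-limit error (a sketch; location is a placeholder CLLocation):
import CoreLocation

let geocoder = CLGeocoder()
geocoder.reverseGeocodeLocation(location) { placemarks, error in
    if let error = error as? CLError, error.code == .network {
        // Possibly rate-limited: back off and retry later instead of resubmitting immediately.
    }
}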
My app currently uses CLGeocoder to get a CLPlacemark, then uses placemark.postalAddress with CNPostalAddressFormatter to get an attributed string for the full address; I then enumerate its attributes to pull out specific elements like just the street, state, or ZIP code.
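Roughly, the current code looks like this (a simplified sketch of the flow just described):
import Contacts
import CoreLocation

let geocoder = CLGeocoder()
geocoder.reverseGeocodeLocation(location) { placemarks, _ in
    guard let postalAddress = placemarks?.first?.postalAddress else { return }
    let formatter = CNPostalAddressFormatter()
    let address = formatter.attributedString(from: postalAddress, withDefaultAttributes: [:])
    let key = NSAttributedString.Key(CNPostalAddressPropertyAttribute)
    address.enumerateAttribute(key, in: NSRange(location: 0, length: address.length)) { value, range, _ in
        // value is a property name like CNPostalAddressStreetKey or CNPostalAddressPostalCodeKey
    }
}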
This is deprecated in iOS 26, with MKReverseGeocodingRequest being the intended replacement. That API returns an MKMapItem, which doesn't provide a CNPostalAddress; you can get the full address as a String, but not structured address data, as far as I can see. Am I missing some way to get the postal address? Or is it a non-goal to provide that anymore? Thanks!
Hello! What UIKit API enables you to add a view below the navigation bar and extend the scroll edge effect below it in iOS 26? safeAreaBar is how you do it in SwiftUI but I need to achieve this design in my UIKit app (which has a collection view in a view controller in a navigation controller).
import SwiftUI

struct ContentView: View {
    let segments = ["First", "Second", "Third"]
    @State private var selectedSegment = "First"

    var body: some View {
        NavigationStack {
            List(0..<50, id: \.self) { i in
                Text("Row \(i + 1)")
            }
            .safeAreaBar(edge: .top) {
                Picker("Segment", selection: $selectedSegment) {
                    ForEach(segments, id: \.self) {
                        Text($0)
                    }
                }
                .pickerStyle(.segmented)
                .padding(.horizontal)
                .padding(.bottom, 8)
            }
            .navigationTitle("Title")
            .navigationBarTitleDisplayMode(.inline)
        }
    }
}
If you add a new string in your app (for example String(localized: "contact_support_message", defaultValue: "Please contact support")), then later you change that default value and rebuild, the string catalog updates to match as expected.
But once that string is translated, changing the default value in code and rebuilding does not update the catalog. You seemingly have to manually change the English default value in the catalog to match the code (which marks the translation as Needs Review).
Is there a better way? Or is there a way to determine what strings have default values in code that do not match the catalog values to see if any were missed as wording was tweaked over time?
I am trying to figure out how to programmatically install a per-user launchd agent. I have an executable Swift script I wrote, and I need macOS to ensure it is always running. I found the SMJobBless sample code, which I could play with to see how this works, but it hasn't been updated since it was last built with Xcode 4.6. As you can imagine, it doesn't compile in Xcode 10. I was able to get it to build by upgrading to the recommended project settings, increasing the deployment target, and selecting my team for the two targets.
Following the ReadMe, I need to run ./SMJobBlessUtil.py setreq to configure the Info.plists appropriately. These instructions are out of date, but eskimo was kind enough to provide updated instructions here for finding the .app URL. But when I do this and run the command, I receive the following output:
MacBook:SMJobBless Jordan$ ./SMJobBlessUtil.py setreq /Users/Jordan/Library/Developer/Xcode/DerivedData/SMJobBless-dffakkidazmiowcishyrborysygm/Build/Products/Debug/SMJobBlessApp.app SMJobBlessApp/SMJobBlessApp-Info.plist SMJobBlessHelper/SMJobBlessHelper-Info.plist
Traceback (most recent call last):
  File "./SMJobBlessUtil.py", line 424, in <module>
    main()
  File "./SMJobBlessUtil.py", line 418, in main
    setreq(appArgs[1], appArgs[2], appArgs[3:])
  File "./SMJobBlessUtil.py", line 360, in setreq
    appToolDict[bundleID] = toolNameToReqMap[bundleID]
KeyError: '$(PRODUCT_BUNDLE_IDENTIFIER)'
It would seem this Python script isn't able to work with the newer project structure, not surprisingly. I wasn't able to find any other information on how to accomplish this task these days. So could you please explain how to go about this? 🙂
I have an executable .swift file and a .plist that works when loaded from ~/Library/LaunchAgents/, ready to be added to an existing Xcode project. Thanks!
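For reference, my agent plist looks roughly like this (label and program path changed to placeholders; KeepAlive is what makes launchd relaunch the script if it exits):
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>Label</key>
    <string>com.example.myagent</string>
    <key>ProgramArguments</key>
    <array>
        <string>/usr/local/bin/myagent.swift</string>
    </array>
    <key>KeepAlive</key>
    <true/>
</dict>
</plist>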
Given an MPMediaItem the user selected from MPMediaPickerController or from MPMusicPlayerController.systemMusicPlayer.nowPlayingItem, is it possible to find out if this song is lossless and if it supports Spatial Audio? Thanks!
Is it OK to call requestContentEditingInput for a lot of PHAssets just to get URLs for their full-size images? It seems odd because I would not be using the content editing input to actually modify these images. Is that OK, or are there implications to be aware of?
Use case:
I want to allow the user to share multiple PHAssets via UIActivityViewController. I can download and share an array of UIImage, which works, but I found that if you tap Copy the app freezes for about 1 second per photo (10 seconds if you shared 10 photos). Profiling the app, it looks like iOS spends that time creating a PNG for each image. Also, it's probably not a good idea to hold huge images in memory like that. I figured I'd try sharing an array of URLs to the images instead. Seemingly the only way to get a URL for a photo is to request a content editing input for the asset and access its fullSizeImageURL property. Is this a good idea, and is this the right approach to sharing PHAssets?
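For reference, the approach I'm describing looks like this (a sketch; asset is a placeholder PHAsset):
import Photos

let options = PHContentEditingInputRequestOptions()
options.isNetworkAccessAllowed = true  // allow downloading the full-size image from iCloud
asset.requestContentEditingInput(with: options) { input, info in
    if let url = input?.fullSizeImageURL {
        // collect URLs like this one, then hand the array to UIActivityViewController
    }
}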
Our app is not localized but we want to begin the localization process starting with push notifications we are going to integrate. The documentation notes:
you can store your message strings in the Localizable.strings file of your app bundle and use the title-loc-key, subtitle-loc-key, and loc-key payload keys to specify which strings you want to display
String Catalogs in Xcode 15 supersede Localizable.strings. How do you support this when using String Catalogs? Do you just manually add a Localizable.xcstrings file to your project, then manually add a new entry for your loc-key, and the system will find this string without issue? Or will we need to have a Localizable.strings file too?
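For context, the kind of payload we're planning looks like this (the key names are placeholders):
{
  "aps": {
    "alert": {
      "title-loc-key": "notification_title",
      "loc-key": "notification_body"
    }
  }
}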
I believe when trait collections were first introduced, the values were unknown initially, so you could put code that accessed those values in traitCollectionDidChange because it always changed from unknown to known values.
An iOS update changed this behavior to provide an estimated initial value, so traitCollectionDidChange would only get called if a value changed from its initial value. This required us to optimize for the trait collection in viewDidLoad, for example, to handle its initial value, and to handle changes in traitCollectionDidChange.
In iOS 17, it's stated that if you access traits before the view is added to the hierarchy, the values won't be up to date. It's recommended to use viewIsAppearing instead of viewDidLoad and viewWillAppear. traitCollectionDidChange is still invoked but deprecated, replaced by a new registration API that informs you when a value changes.
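My understanding of that registration API, following the documented pattern (a sketch; MyViewController and updateForSizeClass() are hypothetical names):
// Called in viewDidLoad, for example, to observe a specific trait going forward.
registerForTraitChanges([UITraitVerticalSizeClass.self]) { (self: MyViewController, previousTraitCollection: UITraitCollection) in
    self.updateForSizeClass()
}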
My question is, will code written using the previous approach still work when compiled with the iOS 17 SDK? Meaning, does the system still provide an estimated initial value and inform you if it changed upon being added to the view hierarchy? Or is this a breaking change in behavior that will require us to rewrite our logic, moving code that accesses the traitCollection from viewDidLoad to viewIsAppearing (and to be really careful in doing so, because that method is called every time the view appears, not just once)? Are there any scenarios where code written for iOS 16 would stop working once compiled for iOS 17 if you access trait values in viewDidLoad and handle changes in traitCollectionDidChange?
I’m trying to understand if I can keep my existing code and use the new approach going forward or if I need to revisit existing code that utilizes trait collections. Thanks!
I’m implementing App Shortcuts in my iOS app to let you add and find plants. In an attempt to create a “Find Plants” shortcut, I created a query that conforms to EnumerableEntityQuery and set that as the defaultQuery in my PlantAppEntity. I have the typeDisplayRepresentation set to TypeDisplayRepresentation(name: "Plant", numericFormat: "\(placeholder: .int) plants"). I added a Localizable.stringsdict to the app target, added Plant and %lld plants as the header comments show, then clicked Localize, so English is now selected in the Localization section.
But when I run the app, then open Shortcuts and tap my app, there’s a Find Plant shortcut, where I expected it to be titled Find Plants. When I tap the info button, it shows “plant” instead of “plants” in every parameter description. When you add that action to a shortcut, the placeholder is All Plant, unlike similar shortcuts from Reminders and Contacts that say “All Reminders” and “All Contacts”. The action works properly and returns an array of plants; the only issue is that it uses the singular form of plant in places where it should be plural. Have I done something wrong, am I missing anything, or is this a bug? (FB12908309)
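For reference, the setup looks roughly like this (simplified; the query body is stubbed out here):
import AppIntents

struct PlantAppEntity: AppEntity {
    static var typeDisplayRepresentation = TypeDisplayRepresentation(
        name: "Plant",
        numericFormat: "\(placeholder: .int) plants"
    )
    static var defaultQuery = PlantQuery()

    var id: UUID
    var name: String

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(name)")
    }
}

struct PlantQuery: EnumerableEntityQuery {
    func allEntities() async throws -> [PlantAppEntity] { [] }
    func entities(for identifiers: [UUID]) async throws -> [PlantAppEntity] { [] }
}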
I have an iOS app and I added Vision Pro as a supported destination. I'm ready to add an app icon. When I select my existing AppIcon there's no option to add visionOS assets to it. I went ahead and created a new visionOS App Icon titled VisionAppIcon. Now how do I configure the project to use VisionAppIcon for visionOS while continuing to use AppIcon for iOS?
When I select the target and go to Build Settings, there's Primary App Icon Set Name, currently set to AppIcon. When I run the visionOS app, no app icon appears. If I change that setting to VisionAppIcon then the icon appears, of course, but I don't see a way to add variants for it other than Debug and Release.
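One approach I'm considering but haven't confirmed: Primary App Icon Set Name corresponds to the ASSETCATALOG_COMPILER_APPICON_NAME build setting, and build settings can be conditioned on the SDK, so something like this in an xcconfig (or per-SDK conditions in the Build Settings editor) might work:
// Hypothetical xcconfig: use a different icon set when building for the visionOS SDK.
ASSETCATALOG_COMPILER_APPICON_NAME = AppIcon
ASSETCATALOG_COMPILER_APPICON_NAME[sdk=xros*] = VisionAppIcon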
In the WWDC 24 session "Use HDR for dynamic image experiences in your app" it's noted this is how you save edits for Adaptive HDR:
SDR + HDR: writeHEIFRepresentation(of: sdrImage, to: url, colorSpace: p3Space, options: [.hdrImage: hdrImage])
SDR + Gain: writeHEIFRepresentation(of: sdrImage, to: url, colorSpace: p3Space, options: [.hdrGainMapImage: gainImage])
This won't compile because the format argument is missing. What format should be used?
In the WWDC 23 session "Support HDR images in your app", RGBAf, RGBAh, RGBA16, and RGB10 were mentioned, but I'm not sure which one to use.
If relevant, I'm editing photos from the user's photo library, so the image was probably taken on iPhone but perhaps not. Thanks!
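For illustration, my guess at the complete call, assuming the SDR + gain map variant where the base image is 8-bit and .RGBA8 would therefore seem plausible (context, sdrImage, url, p3Space, and gainImage as above):
// Assumption, not confirmed: .RGBA8 for the 8-bit SDR base image in the gain map case.
try context.writeHEIFRepresentation(of: sdrImage, to: url, format: .RGBA8, colorSpace: p3Space, options: [.hdrGainMapImage: gainImage])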
In a photo editing extension, is it possible to display the photo in HDR? In this context you only have a placeholder UIImage and a PHContentEditingInput which has a displaySizeImage and fullSizeImageURL. The displaySizeImage has isHighDynamicRange false.
I'm updating my Photo Editing Extension to support HDR. To do this I set imageView.preferredImageDynamicRange = .high. But you can turn off the option to view HDR photos in the complete dynamic range in Settings > Photos. When you do that, open a photo, and tap the edit button, it does not appear in the full range, as expected; but when you select my app from More > Extensions, it unexpectedly does appear in the complete dynamic range. I need to set imageView.preferredImageDynamicRange = .standard when View Full HDR is off, but I don't see any way to read that setting in my PHContentEditingController.
I have an app that allows you to edit your photos. To preserve HDR, I edit both the SDR image and gain map image, like so:
let sdrImage = CIImage(data: data, options: [.applyOrientationProperty: true])
let gainMapImage = CIImage(data: data, options: [.applyOrientationProperty: true, .auxiliaryHDRGainMap: true])
// edit them...
try CIContext().writeHEIFRepresentation(of: sdrImage, to: url, format: .RGBA8, colorSpace: colorSpace, options: [.hdrGainMapImage: gainMapImage])
I also support editing the still photo in Live Photos. To do this you create a PHLivePhotoEditingContext, set the frameProcessor block, which gives you a CIImage that I edit when frame.type is .photo, then you create a PHContentEditingOutput and call saveLivePhoto. I'm not seeing any way to preserve HDR here. Interestingly, the frame processor is called twice with the .photo frame type, but I don't see any difference between those images. How can I edit a gain map image to preserve HDR in the still photo of a Live Photo?
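For reference, the Live Photo flow I'm describing (a sketch; input is the PHContentEditingInput and edit(_:) is a hypothetical function applying the same adjustments as above):
let context = PHLivePhotoEditingContext(livePhotoEditingInput: input)!
context.frameProcessor = { frame, _ in
    guard frame.type == .photo else { return frame.image }
    return edit(frame.image)  // no apparent way to also process the gain map here
}
let output = PHContentEditingOutput(contentEditingInput: input)
context.saveLivePhoto(to: output) { success, error in
    // finish the editing session with `output` on success
}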