
Using SMJobBless in modern Xcode
I am trying to figure out how to programmatically install a per-user launchd agent. I have an executable Swift script I wrote, and I need macOS to ensure it is always running. I found the SMJobBless sample code, which I could play with to see how this works, but it hasn't been updated since it was last built with Xcode 4.6, and as you can imagine it doesn't compile in Xcode 10. I was able to get it to build by upgrading to the recommended project settings, increasing the deployment target, and selecting my team for the two targets.

Following the ReadMe, I need to run ./SMJobBlessUtil.py setreq to configure the Info.plists appropriately. Those instructions are out of date, but eskimo was kind enough to provide updated instructions here for finding the .app URL. But when I do this and run the command, I receive the following output:

```
MacBook:SMJobBless Jordan$ ./SMJobBlessUtil.py setreq /Users/Jordan/Library/Developer/Xcode/DerivedData/SMJobBless-dffakkidazmiowcishyrborysygm/Build/Products/Debug/SMJobBlessApp.app SMJobBlessApp/SMJobBlessApp-Info.plist SMJobBlessHelper/SMJobBlessHelper-Info.plist
Traceback (most recent call last):
  File "./SMJobBlessUtil.py", line 424, in <module>
    main()
  File "./SMJobBlessUtil.py", line 418, in main
    setreq(appArgs[1], appArgs[2], appArgs[3:])
  File "./SMJobBlessUtil.py", line 360, in setreq
    appToolDict[bundleID] = toolNameToReqMap[bundleID]
KeyError: '$(PRODUCT_BUNDLE_IDENTIFIER)'
```

It would seem this Python script isn't able to work with the newer project structure, not surprisingly. I wasn't able to find any other information on how to accomplish this task in modern Xcode, so could you please explain how to go about it? 🙂

I have an executable .swift file and a .plist that works when loaded from ~/Library/LaunchAgents/, ready to be added to an existing Xcode project. Thanks!
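For reference, the manual flow I'm trying to automate looks roughly like this. This is a sketch only, with placeholder names and paths, and it assumes the app isn't sandboxed:

```swift
import Foundation

// Sketch: install the bundled agent plist into ~/Library/LaunchAgents and load it.
// "com.example.myagent" is a placeholder name.
let fm = FileManager.default
let agentsDir = fm.homeDirectoryForCurrentUser
    .appendingPathComponent("Library/LaunchAgents", isDirectory: true)
let installedPlist = agentsDir.appendingPathComponent("com.example.myagent.plist")

if let bundled = Bundle.main.url(forResource: "com.example.myagent", withExtension: "plist") {
    try fm.createDirectory(at: agentsDir, withIntermediateDirectories: true)
    try? fm.removeItem(at: installedPlist) // replace any previous copy
    try fm.copyItem(at: bundled, to: installedPlist)
}

// Load the agent so launchd starts keeping the script alive immediately.
let launchctl = Process()
launchctl.executableURL = URL(fileURLWithPath: "/bin/launchctl")
launchctl.arguments = ["load", installedPlist.path]
try launchctl.run()
launchctl.waitUntilExit()
```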
3 replies · 0 boosts · 3.2k views · May ’22
Is it ok to get fullSizeImageURL by requesting content editing input to share photos?
Is it ok to call requestContentEditingInput for a lot of PHAssets to get URLs for their full-size images? It seems odd because I would not be using the content editing input to actually modify these images. Is that ok, or are there implications to be aware of?

Use case: I want to allow the user to share multiple PHAssets via UIActivityViewController. I can download and share an array of UIImage, which works, but I found that if you tap Copy the app freezes for about 1 second per photo (10 seconds if you shared 10 photos). Profiling the app, it looks like iOS is spending the time creating a PNG for each image. It's also probably not a good idea to hold huge images in memory like that. So I figured I'd try sharing an array of URLs to the images. Seemingly the only way to get a URL for a photo is to request a content editing input for the asset and access its fullSizeImageURL property. Is this a good idea, and is this the right approach to share PHAssets?
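To be concrete, this is the kind of request I mean, sketched out with simplified error handling:

```swift
import Photos

// Sketch: get a shareable full-size image URL without actually editing the asset.
func shareURL(for asset: PHAsset, completion: @escaping (URL?) -> Void) {
    let options = PHContentEditingInputRequestOptions()
    options.isNetworkAccessAllowed = true // download from iCloud if needed

    asset.requestContentEditingInput(with: options) { input, _ in
        // The URL would go into UIActivityViewController's activityItems.
        completion(input?.fullSizeImageURL)
    }
}
```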
1 reply · 0 boosts · 1.7k views · Aug ’22
Localizing push notification alert messages with String Catalogs
Our app is not localized, but we want to begin the localization process starting with the push notifications we are going to integrate. The documentation notes:

"you can store your message strings in the Localizable.strings file of your app bundle and use the title-loc-key, subtitle-loc-key, and loc-key payload keys to specify which strings you want to display"

String Catalogs in Xcode 15 supersede Localizable.strings. How do you support this when using String Catalogs? Do you just manually add a Localizable.xcstrings file to your project, then manually add a new entry for your loc-key, and the system will find the string without issue? Or will we need to have a Localizable.strings file too?
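For concreteness, the kind of payload we'd send looks like this, where push_title and push_body are hypothetical keys we'd add to the catalog:

```json
{
  "aps": {
    "alert": {
      "title-loc-key": "push_title",
      "loc-key": "push_body"
    }
  }
}
```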
3 replies · 0 boosts · 3.1k views · Aug ’23
Trait collection changes in iOS 17 and code compatibility
I believe when trait collections were first introduced, the values were initially unknown, so you could put code that accessed those values in traitCollectionDidChange because it always changed from unknown to known values. An iOS update changed this behavior to provide an estimated initial value, so traitCollectionDidChange would only get called if the value changed from its initial value. This required us to handle the trait collection’s initial value in viewDidLoad, for example, and handle changes in traitCollectionDidChange.

In iOS 17, it’s stated that if you access traits before the view is added to the hierarchy, the values won’t be up to date. It’s recommended to use viewIsAppearing instead of viewDidLoad and viewWillAppear. traitCollectionDidChange is still invoked but deprecated, replaced with a new registration API to be informed when a value changes.

My question is: will code written using the previous approach still work when compiled with the iOS 17 SDK? Meaning, does the system still provide an estimated initial value and inform you if it changed upon getting added to the view hierarchy? Or is this a breaking change in behavior that will require us to rewrite our logic, moving code that accesses the traitCollection from viewDidLoad to viewIsAppearing (and being really careful in doing so, because that function is called every time the view appears, not just once)?

Are there any scenarios where code written for iOS 16 would stop working once compiled for iOS 17, if you access trait values in viewDidLoad and handle changes in traitCollectionDidChange? I’m trying to understand if I can keep my existing code and use the new approach going forward, or if I need to revisit existing code that utilizes trait collections. Thanks!
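To make the comparison concrete, here's my understanding of the new approach as a sketch (MyViewController and applyTraitDependentLayout are placeholders):

```swift
import UIKit

class MyViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        // New iOS 17 registration API: fires whenever the horizontal size class changes.
        registerForTraitChanges([UITraitHorizontalSizeClass.self]) {
            (self: MyViewController, _: UITraitCollection) in
            self.applyTraitDependentLayout()
        }
    }

    override func viewIsAppearing(_ animated: Bool) {
        super.viewIsAppearing(animated)
        // Traits are up to date here; note this runs on every appearance, not just once.
        applyTraitDependentLayout()
    }

    private func applyTraitDependentLayout() {
        // Placeholder for layout that depends on the trait collection.
    }
}
```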
1 reply · 0 boosts · 2.8k views · Jun ’23
EnumerableEntityQuery references AppEntity type in the singular form when it should be plural
I’m implementing App Shortcuts in my iOS app to allow you to add and find plants. In an attempt to get a “Find Plants” shortcut, I created a query that conforms to EnumerableEntityQuery and set it as the defaultQuery in my PlantAppEntity. I have the typeDisplayRepresentation set to TypeDisplayRepresentation(name: "Plant", numericFormat: "\(placeholder: .int) plants"). I added a Localizable.stringsdict to the app target, added Plant and %lld plants as the header comment shows, then clicked Localize, so now English is selected in the Localization section.

But when I run the app, then open Shortcuts and tap my app, there’s a Find Plant shortcut, but I expected it to be titled Find Plants. When I tap the info button, it shows “plant” instead of “plants” in every parameter description. When you add the action to a shortcut, the placeholder is All Plant, unlike similar shortcuts from Reminders and Contacts that say “All Reminders” and “All Contacts”. The action works properly, returning an array of plants; the only issue is that it uses the singular form of plant in places it should be plural. Have I done something wrong, am I missing anything, or is this a bug? (FB12908309)
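For reference, the relevant parts of my setup, trimmed down to a sketch (PlantQuery is simplified to a stub here):

```swift
import AppIntents
import Foundation

struct PlantAppEntity: AppEntity {
    // "Plant" / "%lld plants" are the entries localized in the stringsdict.
    static var typeDisplayRepresentation = TypeDisplayRepresentation(
        name: "Plant",
        numericFormat: "\(placeholder: .int) plants"
    )
    static var defaultQuery = PlantQuery()

    var id: UUID
    var name: String

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(name)")
    }
}

struct PlantQuery: EnumerableEntityQuery {
    // Stubbed out; the real implementations fetch from my data store.
    func allEntities() async throws -> [PlantAppEntity] { [] }
    func entities(for identifiers: [UUID]) async throws -> [PlantAppEntity] { [] }
}
```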
1 reply · 0 boosts · 956 views · Aug ’23
How to configure visionOS app icon in existing iOS project
I have an iOS app and I added Vision Pro as a supported destination. I'm ready to add an app icon. When I select my existing AppIcon, there's no option to add visionOS assets to it, so I went ahead and created a new visionOS App Icon titled VisionAppIcon. Now how do I configure the project to use VisionAppIcon for visionOS while continuing to use AppIcon for iOS?

When I select the target and go to Build Settings, there's Primary App Icon Set Name, currently set to AppIcon. When I run the visionOS app, no app icon appears. If I change that setting to VisionAppIcon, then of course the icon appears. But I don't see a way to add variants for that setting other than Debug and Release.
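The closest thing I've found is conditionalizing that build setting per SDK, something like this in an xcconfig, though I'm not sure it's the intended approach and the xros condition is my assumption:

```
// Sketch: per-platform app icon set names (the [sdk=xros*] condition is an assumption)
ASSETCATALOG_COMPILER_APPICON_NAME = AppIcon
ASSETCATALOG_COMPILER_APPICON_NAME[sdk=xros*] = VisionAppIcon
```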
2 replies · 0 boosts · 2.3k views · Jan ’24
What format for writeHEIFRepresentation preserves HDR?
In the WWDC 24 session "Use HDR for dynamic image experiences in your app" it's noted that this is how you save edits for Adaptive HDR:

```swift
// SDR + HDR:
writeHEIFRepresentation(of: sdrImage, to: url, colorSpace: p3Space, options: [.hdrImage: hdrImage])

// SDR + Gain:
writeHEIFRepresentation(of: sdrImage, to: url, colorSpace: p3Space, options: [.hdrGainMapImage: gainImage])
```

This won't compile because the format argument is missing. What format should be used? In the WWDC 23 session "Support HDR images in your app", RGBAf, RGBAh, RGBA16, and RGB10 were mentioned, but I'm not sure which one to use. If relevant, I'm editing photos from the user's photo library, so the image was probably taken on iPhone, but perhaps not. Thanks!
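For illustration, here's the complete call I'm attempting, with .RGBA8 as my guess at the format; the placeholder images and URL just make the sketch self-contained:

```swift
import CoreImage

// Placeholders standing in for the real edited images and destination URL.
let sdrImage = CIImage(color: CIColor(red: 0.5, green: 0.5, blue: 0.5))
    .cropped(to: CGRect(x: 0, y: 0, width: 200, height: 200))
let gainImage = CIImage(color: CIColor(red: 1, green: 1, blue: 1))
    .cropped(to: CGRect(x: 0, y: 0, width: 200, height: 200))
let url = URL(fileURLWithPath: NSTemporaryDirectory()).appendingPathComponent("edit.heic")

// .RGBA8 (8-bit SDR base plus gain map) is my assumption, not confirmed by the session.
let p3Space = CGColorSpace(name: CGColorSpace.displayP3)!
try CIContext().writeHEIFRepresentation(
    of: sdrImage,
    to: url,
    format: .RGBA8,
    colorSpace: p3Space,
    options: [.hdrGainMapImage: gainImage]
)
```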
1 reply · 0 boosts · 711 views · Oct ’24
Get View Full HDR state from Settings > Photos to properly set preferredImageDynamicRange in editing extension
I'm updating my Photo Editing Extension to support HDR. To do this I set imageView.preferredImageDynamicRange = .high. But you can turn off the option to view HDR photos in their full dynamic range via View Full HDR in Settings > Photos. When you do that, open a photo, and tap the edit button, the photo does not appear in the full range, as expected; but when you select my app from More > Extensions, it does appear in the full dynamic range, unexpectedly. I need to set imageView.preferredImageDynamicRange = .standard when View Full HDR is off, but I don't see any way to read that setting in my PHContentEditingController.
1 reply · 0 boosts · 655 views · Oct ’24
How to edit gain map image to preserve HDR in Live Photo
I have an app that allows you to edit your photos. To preserve HDR, I edit both the SDR image and the gain map image, like so:

```swift
let sdrImage = CIImage(data: data, options: [.applyOrientationProperty: true])
let gainMapImage = CIImage(data: data, options: [.applyOrientationProperty: true, .auxiliaryHDRGainMap: true])

// edit them...

try CIContext().writeHEIFRepresentation(of: sdrImage, to: url, format: .RGBA8, colorSpace: colorSpace, options: [.hdrGainMapImage: gainMapImage])
```

I also support editing the still photo in Live Photos. To do this, you create a PHLivePhotoEditingContext and set its frameProcessor block, which gives you a CIImage that I edit when the frame.type is .photo; then you create a PHContentEditingOutput and call saveLivePhoto. I’m not seeing any way to preserve HDR here. Interestingly, the frame processor is called twice with the .photo frame type, but I don’t see any difference between those images. How can I edit a gain map image to preserve HDR in the still photo of a Live Photo?
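For reference, a sketch of the Live Photo flow I described, where applyEdit stands in for my actual edit:

```swift
import Photos
import CoreImage

// Sketch: `input` is the PHContentEditingInput for the Live Photo.
func editLivePhoto(input: PHContentEditingInput,
                   applyEdit: @escaping (CIImage) -> CIImage,
                   completion: @escaping (PHContentEditingOutput?) -> Void) {
    guard let context = PHLivePhotoEditingContext(livePhotoEditingInput: input) else {
        completion(nil)
        return
    }
    context.frameProcessor = { frame, _ in
        // Runs for every video frame, and (twice, in my testing) for the still photo.
        frame.type == .photo ? applyEdit(frame.image) : frame.image
    }
    let output = PHContentEditingOutput(contentEditingInput: input)
    context.saveLivePhoto(to: output, options: nil) { success, _ in
        completion(success ? output : nil)
    }
}
```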
1 reply · 0 boosts · 767 views · Nov ’24
Failed to launch Photo Editing Extension from Mac Catalyst app
I have an iOS app that includes a Photo Editing Extension and is optimized for Mac Catalyst so you can edit photos in the Photos app on your Mac. This has worked really well, but now I am encountering an error alert trying to open the photo editing extension:

```
RBSLaunchRequest error trying to launch plugin com.company.TestEditor.TestPhotoEditor (B7A616A7-25A8-4E02-8B32-5CAB37C8B4B2): ErrorDomain=RBSRequestErrorDomain Code=5 "Launch failed." UserInfo={NSLocalizedFailureReason=Launch failed., NSUnderlyingError=0x7f08fafd0 {ErrorDomain=NSPOSIXErrorDomain Code=153 "Unknown error: 153" UserInfo={NSLocalizedDescription=Launchd job spawn failed}}}
```

Steps to reproduce:

1. Create a new iOS app project in Xcode
2. Create a new target and choose iOS > Photo Editing Extension
3. For both targets in the project, add Mac Catalyst as a supported destination
4. Run the app on My Mac (Mac Catalyst)
5. Open the Photos app, double click a photo, click Edit, click the more plugins button, and click TestPhotoEditor in the list

macOS 15.4.1 + Xcode 16.3
1 reply · 0 boosts · 181 views · May ’25
PHAssetChangeRequest revertAssetContentToOriginal without original asset content
The documentation for PHAssetChangeRequest.revertAssetContentToOriginal says it will fail if the original asset content is not on the current device, so you should use PHAssetResourceManager to download it first. But this no longer seems to be the case in the latest iOS versions: no error occurs when I take a photo on my iPhone, edit it, open Photos on my iPad and let it sync, then open my app on iPad and call revertAssetContentToOriginal for that asset. Does the system now take care of downloading the original when needed?
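Concretely, this is the call that now succeeds for me without the manual download step (sketched, with error handling trimmed):

```swift
import Photos

// Sketch: revert a previously fetched PHAsset to its original content.
func revert(_ asset: PHAsset) {
    PHPhotoLibrary.shared().performChanges({
        let request = PHAssetChangeRequest(for: asset)
        request.revertAssetContentToOriginal()
    }) { success, error in
        // Per the docs this should fail when the original isn't local, but it no longer does.
        print("reverted:", success, error ?? "no error")
    }
}
```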
1 reply · 0 boosts · 130 views · Jun ’25