I'm updating my Photo Editing Extension to support HDR. To do this I set imageView.preferredImageDynamicRange = .high. However, you can turn off viewing HDR photos in their full dynamic range via the View Full HDR option in Settings > Photos. When that option is off and you open a photo and tap the edit button, the photo does not appear in the full range, as expected; but when you select my app from More > Extensions, it unexpectedly does appear in the full dynamic range. I need to set imageView.preferredImageDynamicRange = .standard when View Full HDR is off, but I don't see any way to read that setting from my PHContentEditingController.
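For reference, a minimal sketch of the setup described above; the conditional I'd like to write is shown as a comment, and userHasViewFullHDROff is hypothetical since I can't find any API exposing that setting:
// Current approach in the extension's PHContentEditingController (iOS 17+).
imageView.preferredImageDynamicRange = .high
// What I'd like to do, but userHasViewFullHDROff is hypothetical; there is
// no API I can find that exposes Settings > Photos > View Full HDR:
// imageView.preferredImageDynamicRange = userHasViewFullHDROff ? .standard : .high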
In a photo editing extension, is it possible to display the photo in HDR? In this context you only have a placeholder UIImage and a PHContentEditingInput, which has a displaySizeImage and a fullSizeImageURL. The displaySizeImage has isHighDynamicRange == false.
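A minimal sketch of one loading path that might work, assuming iOS 17's UIImageReader with prefersHighDynamicRange; I haven't confirmed whether this behaves any differently inside an extension:
import UIKit

// Try decoding the full-size image with HDR enabled (iOS 17+).
var configuration = UIImageReader.Configuration()
configuration.prefersHighDynamicRange = true
let reader = UIImageReader(configuration: configuration)
if let url = contentEditingInput.fullSizeImageURL,
   let hdrImage = reader.image(contentsOf: url) {
    imageView.preferredImageDynamicRange = .high
    imageView.image = hdrImage
}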
In the WWDC 24 session "Use HDR for dynamic image experiences in your app" it's noted this is how you save edits for Adaptive HDR:
SDR + HDR: writeHEIFRepresentation(of: sdrImage, to: url, colorSpace: p3Space, options: [.hdrImage: hdrImage])
SDR + Gain: writeHEIFRepresentation(of: sdrImage, to: url, colorSpace: p3Space, options: [.hdrGainMapImage: gainImage])
This won't compile because the format argument is missing. What format should be used?
In the WWDC 23 session "Support HDR images in your app", RGBAf, RGBAh, RGBA16, and RGB10 were mentioned, but I'm not sure which one to use.
If relevant, I'm editing photos from the user's photo library, so the image was probably taken on iPhone but perhaps not. Thanks!
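For what it's worth, a sketch of the complete call, assuming format: .RGBA8 (the format a gain-map example in a later post on this page uses) and that context, sdrImage, hdrImage, p3Space, and url are as in the session's snippet; .RGBA8 is a guess, not a confirmed answer:
// Assumed completion of the session's snippet; .RGBA8 is a guess based on
// a gain-map example elsewhere on this page, not a confirmed answer.
try context.writeHEIFRepresentation(of: sdrImage,
                                    to: url,
                                    format: .RGBA8,
                                    colorSpace: p3Space,
                                    options: [.hdrImage: hdrImage])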
I have an app that allows you to edit your photos. To preserve HDR, I edit both the SDR image and gain map image, like so:
import CoreImage

// CIImage(data:) is failable, so unwrap both images before writing.
guard let sdrImage = CIImage(data: data, options: [.applyOrientationProperty: true]),
      let gainMapImage = CIImage(data: data, options: [.applyOrientationProperty: true, .auxiliaryHDRGainMap: true]) else { return }
// Edit them...
try CIContext().writeHEIFRepresentation(of: sdrImage, to: url, format: .RGBA8, colorSpace: colorSpace, options: [.hdrGainMapImage: gainMapImage])
I also support editing the still photo in Live Photos. To do this you create a PHLivePhotoEditingContext and set its frameProcessor block, which gives you a CIImage that I edit when frame.type is .photo; then you create a PHContentEditingOutput and call saveLivePhoto. I'm not seeing any way to preserve HDR here. Interestingly, the frame processor is called twice with the .photo frame type, but I don't see any difference between those two images. How can I edit a gain map image to preserve HDR in the still photo of a Live Photo?
I have an app that allows the user to change a photo's EXIF metadata. To do this, I request a content editing input, get the full size image, modify its properties, create a content editing output, write the output image to the rendered content URL, then call performChanges on the PHPhotoLibrary, creating an asset change request for that asset and setting its content editing output. This works as expected for regular photos, but Live Photos get converted to regular photos.
To address this, I'm doing something similar by changing the properties of the .photo image in the Live Photo. I detect when the content editing input has a Live Photo, create a Live Photo editing context, and set a frame processor that, when the frame type is photo, returns the frame's image after setting its properties to the updated properties. Then I create the content editing output and save the Live Photo to that output. It modifies the Live Photo successfully, but the metadata is not updated: if you get the full size image again, the properties are the original properties, and if you look at the EXIF metadata using an app like Metapho, it remains unchanged. What am I doing wrong here? Thanks!
let imageURL = contentEditingInput.fullSizeImageURL!
let inputImage = CIImage(contentsOf: imageURL, options: [.applyOrientationProperty: true])!
var metadata: [AnyHashable: Any] = inputImage.properties
// Edit the metadata as desired...
let editingContext = PHLivePhotoEditingContext(livePhotoEditingInput: contentEditingInput)!
editingContext.frameProcessor = { frame, error -> CIImage? in
    // Edit only the still photo
    if frame.type == .photo {
        return frame.image.settingProperties(metadata)
    }
    return frame.image
}
let contentEditingOutput = try await withCheckedThrowingContinuation { continuation in
    let editingOutput = PHContentEditingOutput(contentEditingInput: contentEditingInput)
    editingOutput.adjustmentData = adjustmentData
    editingContext.saveLivePhoto(to: editingOutput) { success, error in
        if success {
            continuation.resume(returning: editingOutput)
        } else {
            continuation.resume(throwing: error!)
        }
    }
}
try await PHPhotoLibrary.shared().performChanges {
    let request = PHAssetChangeRequest(for: asset)
    request.contentEditingOutput = contentEditingOutput
}
I have an app that edits photos in your library. When I call
try CIContext().writeHEIFRepresentation(of: editedImage, to: fileURL, format: .RGBA8, colorSpace: originalImage.colorSpace!)
the following is logged to the console:
writeImageAtIndex:1012: ⭕️ ERROR: 'App' is trying to save an opaque image (5712x4284) with 'AlphaLast'. This would unnecessarily increase the file size and will double (!!!) the required memory when decoding the image --> ignoring alpha.
What does that mean and how can I resolve it?
Xcode Version 16.0 (16A242d)
iOS 18.1 (22B82)
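One possible way to address the warning (my assumption, not a confirmed fix): mark the image opaque before writing so Core Image doesn't encode the unused alpha channel, using CIImage's settingAlphaOne(in:):
// Possible workaround (assumption): declare the image opaque over its
// full extent so the HEIF encoder can skip the alpha channel.
let opaqueImage = editedImage.settingAlphaOne(in: editedImage.extent)
try CIContext().writeHEIFRepresentation(of: opaqueImage, to: fileURL, format: .RGBA8, colorSpace: originalImage.colorSpace!)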
I have an iOS app that includes a Photo Editing Extension and is optimized for Mac Catalyst so you can edit photos in the Photos app on your Mac. This has worked really well, but now I'm encountering an error alert when trying to open the photo editing extension:
RBSLaunchRequest error trying to launch plugin com.company.TestEditor. TestPhotoEditor (B7A616A7-25A8-4E02-8B32-5CAB37C8B4B2): ErrorDomain=RBSRequestErrorDomain Code=5 "Launch failed." UserInfo={NSLocalizedFailureReason=Launch failed., NSUnderlyingError=0x7f08fafd0 {ErrorDomain=NSPOSIXErrorDomain Code=153 "Unknown error: 153" UserInfo={NSLocalizedDescription=Launchd job spawn failed}}}
Steps to reproduce:
Create a new iOS app project in Xcode
Create a new target and choose iOS > Photo Editing Extension
For both targets in the project, add Mac Catalyst as a supported destination
Run the app on My Mac (Mac Catalyst)
Open the Photos app, double click a photo, click Edit, click the more plugins button, and click TestPhotoEditor in the list
macOS 15.4.1 + Xcode 16.3
I set iOS 26 to install overnight, put my iPhone 16 Pro on the MagSafe charger, watched it charge just fine, and went to sleep. When I woke up, the iPhone showed the "plug into power" dead-battery screen. I took it off MagSafe and put it back on. A half hour later the phone was warm but still wouldn't power on; it just showed the battery screen with a little red in it. I took it off MagSafe and plugged it into my iPad charging brick with a USB cable to give it more power, but it still did not turn on. I tried holding all the buttons to force a restart, but that didn't work either.
For anyone else encountering this, do this to enter DFU mode and restore it. I had to do it a few times before I got the timing right.
Plug into your Mac and open Finder (or apparently a PC with Apple Devices or iTunes)
Press and quickly release volume up
Press and quickly release volume down
Press and hold right side button
When the battery disappears and screen goes black, hold volume down and continue holding side button
After a couple seconds release the side button and continue holding volume down
A prompt to allow connecting to the iPhone should appear after a couple seconds, click Allow, and it’ll say the iPhone entered DFU mode - proceed to restore the firmware
When trying to edit some Live Photos, calling PHLivePhotoEditingContext.saveLivePhoto results in the following error:
Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo={NSLocalizedFailureReason=An unknown error occurred (-12815), NSLocalizedDescription=The operation could not be completed, NSUnderlyingError=0x300d05380 {Error Domain=NSOSStatusErrorDomain Code=-12815 "(null)"}}
I was able to replicate it on my device by taking a new Live Photo. I'm not sure what's wrong with that one specifically; not all Live Photos reproduce the issue.
I've submitted FB15880825 with a sysdiagnose and a Photos Diagnostics as well. Any ideas what's going on here? It's impacting multiple customers. Thanks!
Some users have reported an error editing portrait photo assets in my app:
The operation couldn’t be completed. (CINonLocalizedDescriptionKey error 3.)
What is that error? Will affected photos always encounter this error (due to data corruption, for example), or can it be resolved in a future iOS update?
FB16241301
The documentation for PHAssetChangeRequest.revertAssetContentToOriginal says it will fail if the original asset content is not on the current device, so you should use PHAssetResourceManager to download it first. This no longer seems to be the case in the latest iOS versions: no error occurs when I take a photo on my iPhone, edit it, open Photos on my iPad and let it sync, then open my app on the iPad and call revertAssetContentToOriginal for that asset. Does the system now take care of downloading the original when needed?
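For context, a minimal sketch of the revert call I'm making, assuming asset is the PHAsset in question:
// Revert the asset's content to the original (PhotoKit).
try await PHPhotoLibrary.shared().performChanges {
    let request = PHAssetChangeRequest(for: asset)
    request.revertAssetContentToOriginal()
}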
iOS 26 added smoothness to CIRoundedRectangleGenerator, for use with CIFilter.roundedRectangleGenerator. What should the smoothness value be to achieve the same corner curve as CALayerCornerCurve.continuous? Does it need to be calculated based on the extent size, and if so, how?
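For reference, a sketch of the filter usage in question, assuming the smoothness property is exposed on the generator as described (iOS 26); the 0.6 value is an arbitrary placeholder, not the answer being asked for:
import CoreImage.CIFilterBuiltins

let filter = CIFilter.roundedRectangleGenerator()
filter.extent = CGRect(x: 0, y: 0, width: 300, height: 200)
filter.radius = 40
filter.color = .white
filter.smoothness = 0.6 // placeholder; the question is what value matches CALayerCornerCurve.continuous
let image = filter.outputImage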
I have an app that displays artwork via MPMediaItem.artwork, requesting an image with a specific size. How do I get a media item's MPMediaItemAnimatedArtwork, and how do I get its preview image and video to display to the user?
I have non-consumable and consumable in-app purchases in my app. The tutorial I was following stated that Transaction.currentEntitlements includes unfinished consumables, which is incorrect according to the documentation. Is the correct way to handle unfinished consumables (and non-consumables) to implement Transaction.updates and call finish() once the transaction is verified? The documentation says that listener receives unfinished transactions once upon app launch, so do I understand correctly that you do not need to implement Transaction.unfinished unless you want to look for unfinished transactions manually later on? Otherwise, what is the correct and recommended way to handle unfinished consumables? And is there a way to test that scenario in Xcode?
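For what it's worth, a minimal sketch of the pattern described above, assuming StoreKit 2; grantEntitlement(for:) is a hypothetical helper standing in for whatever unlocks the purchase:
import StoreKit

// Listen for transaction updates; per the documentation, unfinished
// transactions are delivered to this sequence once at app launch.
let updatesTask = Task.detached {
    for await result in Transaction.updates {
        if case .verified(let transaction) = result {
            await grantEntitlement(for: transaction) // hypothetical helper
            await transaction.finish()
        }
    }
}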
In order to create a UITextView like that of the Messages app, whose height grows to fit its contents (number of lines), I subclassed UITextView and customized the intrinsicContentSize like so:
override var intrinsicContentSize: CGSize {
    var size = super.intrinsicContentSize
    if size.height == UIView.noIntrinsicMetric {
        // Force layout so usedRect(for:) reports the up-to-date height
        layoutManager.glyphRange(for: textContainer)
        size.height = layoutManager.usedRect(for: textContainer).height + textContainerInset.top + textContainerInset.bottom
    }
    return size
}
As noted at WWDC, accessing layoutManager forces TextKit 1; we should instead use textLayoutManager. How can this code be migrated to support TextKit 2?
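To frame the question, here is a minimal sketch of what I imagine the TextKit 2 version might look like, assuming iOS 16's NSTextLayoutManager APIs (ensureLayout(for:) and usageBoundsForTextContainer); I haven't confirmed it matches the TextKit 1 metrics exactly:
override var intrinsicContentSize: CGSize {
    var size = super.intrinsicContentSize
    if size.height == UIView.noIntrinsicMetric, let layout = textLayoutManager {
        // Force TextKit 2 layout, then measure the used bounds (assumed to
        // correspond to TextKit 1's usedRect(for:)).
        layout.ensureLayout(for: layout.documentRange)
        size.height = layout.usageBoundsForTextContainer.height
            + textContainerInset.top + textContainerInset.bottom
    }
    return size
}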