I'm processing a 4K video with a complex Core Image pipeline that also invokes a neural style transfer Core ML model. This works very well, but sometimes, for very few frames, the model execution fails with the following error messages:
Execution of the command buffer was aborted due to an error during execution. Internal Error (0000000e:Internal Error)
Error: command buffer exited with error status.
The Metal Performance Shaders operations encoded on it may not have completed.
Error:
(null)
Internal Error (0000000e:Internal Error)
<CaptureMTLCommandBuffer: 0x280b95d90> -> <AGXG15FamilyCommandBuffer: 0x108f143c0>
label = <none>
device = <AGXG15Device: 0x106034e00>
name = Apple A16 GPU
commandQueue = <AGXG15FamilyCommandQueue: 0x1206cee40>
label = <none>
device = <AGXG15Device: 0x106034e00>
name = Apple A16 GPU
retainedReferences = 1
[espresso] [Espresso::handle_ex_plan] exception=Espresso exception: "Generic error": Internal Error (0000000e:Internal Error); code=1 status=-1
[coreml] Error computing NN outputs -1
[coreml] Failure in -executePlan:error:.
It's really hard to reproduce since it only happens occasionally. I also didn't find a way to get more details on the Internal Error mentioned above, so I don't know the real reason why it fails.
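For context, this is roughly where the error surfaces per frame (a simplified sketch; model stands in for our actual style-transfer MLModel):
import CoreML
// Simplified per-frame call. The "Internal Error (0000000e)" above is what the thrown prediction error reports.
func stylize(_ input: MLFeatureProvider, with model: MLModel) -> MLFeatureProvider? {
    do {
        return try model.prediction(from: input)
    } catch {
        print("Style transfer failed for this frame: \(error)")
        return nil // for now we just skip or repeat the frame
    }
}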
Any advice would be appreciated!
When using the heif10Representation and writeHEIF10Representation APIs of CIContext, the resulting image doesn’t contain an alpha channel.
When using the heifRepresentation and writeHEIFRepresentation APIs, the alpha channel is properly preserved, i.e., the resulting HEIC will contain a urn:mpeg:hevc:2015:auxid:1 auxiliary image. This image is missing when exporting as HEIF10.
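For reference, this is roughly how we export in both cases (the image, URLs, and color space are simplified placeholders):
import CoreImage
// Simplified export sketch comparing the two code paths.
func export(_ image: CIImage, to heicURL: URL, and heic10URL: URL) throws {
    let context = CIContext()
    let colorSpace = CGColorSpace(name: CGColorSpace.displayP3)!
    // heifRepresentation path: alpha ends up as an auxiliary image.
    try context.writeHEIFRepresentation(of: image, to: heicURL, format: .RGBA8, colorSpace: colorSpace, options: [:])
    // heif10Representation path: alpha is missing from the result.
    try context.writeHEIF10Representation(of: image, to: heic10URL, colorSpace: colorSpace, options: [:])
}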
Is this a bug or is this intentional? If I understand the spec correctly, HEIF10 should be able to support alpha via auxiliary image (like HEIF8).
Apple's new Journal app was introduced with the iOS 17.2 beta.
In the release notes, the following is mentioned:
If your app donates activities or interactions to SiriKit or CallKit or if someone authorizes your app to save data to HealthKit, some data might show up as part of Journaling Suggestions.
Is there any documentation on how this works exactly? What kind of activities can be featured in Journal? How does the system decide what to feature?
For instance, if I have an app that allows the user to create art images, can I somehow make those images appear in the Journaling Suggestions?
With iOS 18, TipKit got explicit support for syncing tip state via iCloud.
However, before that, TipKit already did iCloud syncing implicitly, as far as I know.
How does the new explicit syncing relate to the previous mechanism? Do we have to enable iCloud syncing manually now to retain the functionality in iOS 18? Is there a way to sync with the state that was already stored by TipKit in iCloud on iOS 17?
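If I understand the new option correctly, opting in would look something like the following; please correct me if I'm misreading the API (the container identifier is just a placeholder):
import TipKit
// Assumption: the new iOS 18 configuration option for CloudKit-based tip syncing.
try Tips.configure([
    .cloudKitContainer(.named("iCloud.com.example.MyApp")), // placeholder container ID
    .displayFrequency(.immediate)
])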
Our apps can currently be installed on Apple Silicon Macs via the iPad app on Mac feature (“Designed for iPad”). Now we are working on “proper” (universal) Catalyst-based Mac apps that will be available on the Mac App Store.
How does the transition work for users that currently have the iPad version installed? Will they automatically update to the Mac Catalyst app once it’s available, or do they need to re-install the app from the Mac App Store?
One of our users reported a very strange bug where our app freezes and eventually crashes on some screen transitions.
From different crash logs we could determine that the app freezes up when we call view.layoutIfNeeded() for animating constraint changes. It then gets killed by the watchdog 10 seconds later:
Exception Type: EXC_CRASH (SIGKILL)
Exception Codes: 0x0000000000000000, 0x0000000000000000
Termination Reason: FRONTBOARD 2343432205
<RBSTerminateContext| domain:10 code:0x8BADF00D explanation:scene-update watchdog transgression: app<bundleID(2A01F261-3554-44C0-B5A9-EBEB446484AD)>:6921 exhausted real (wall clock) time allowance of 10.00 seconds
ProcessVisibility: Background
ProcessState: Running
WatchdogEvent: scene-update
WatchdogVisibility: Background
WatchdogCPUStatistics: (
"Elapsed total CPU time (seconds): 24.320 (user 18.860, system 5.460), 29% CPU",
"Elapsed application CPU time (seconds): 10.630, 12% CPU"
) reportType:CrashLog maxTerminationResistance:Interactive>
The crash stack trace looks slightly different, depending on the UI transition that is happening. Here are the two we observed so far. Both are triggered by the layoutIfNeeded() call.
Thread 0 name: Dispatch queue: com.apple.main-thread
Thread 0 Crashed:
0 CoreAutoLayout 0x1b09f90e4 -[NSISEngine valueForEngineVar:] + 8
1 UIKitCore 0x18f919478 -[_UIViewLayoutEngineRelativeAlignmentRectOriginCache origin] + 372
2 UIKitCore 0x18f918f18 -[UIView _nsis_center:bounds:inEngine:forLayoutGuide:] + 1372
3 UIKitCore 0x18f908e9c -[UIView(Geometry) _applyISEngineLayoutValuesToBoundsOnly:] + 248
4 UIKitCore 0x18f9089e0 -[UIView(Geometry) _resizeWithOldSuperviewSize:] + 148
5 CoreFoundation 0x18d0cd6a4 __NSARRAY_IS_CALLING_OUT_TO_A_BLOCK__ + 24
6 CoreFoundation 0x18d0cd584 -[__NSArrayM enumerateObjectsWithOptions:usingBlock:] + 432
7 UIKitCore 0x18f8e62b0 -[UIView(Geometry) resizeSubviewsWithOldSize:] + 128
8 UIKitCore 0x18f977194 -[UIView(AdditionalLayoutSupport) _is_layout] + 124
9 UIKitCore 0x18f976c2c -[UIView _updateConstraintsAsNecessaryAndApplyLayoutFromEngine] + 800
10 UIKitCore 0x18f903944 -[UIView(CALayerDelegate) layoutSublayersOfLayer:] + 2728
11 QuartzCore 0x18ec15498 CA::Layer::layout_if_needed(CA::Transaction*) + 496
12 UIKitCore 0x18f940c10 -[UIView(Hierarchy) layoutBelowIfNeeded] + 312
Thread 0 name: Dispatch queue: com.apple.main-thread
Thread 0 Crashed:
0 QuartzCore 0x18ec2cfe0 -[CALayer animationForKey:] + 176
1 UIKitCore 0x18fa5b258 UniqueAnimationKeyForLayer + 192
2 UIKitCore 0x18fa5ab7c __67-[_UIViewAdditiveAnimationAction runActionForKey:object:arguments:]_block_invoke_2 + 468
3 UIKitCore 0x18fa5ba5c -[_UIViewAdditiveAnimationAction runActionForKey:object:arguments:] + 1968
4 QuartzCore 0x18eb9e938 CA::Layer::set_bounds(CA::Rect const&, bool) + 428
5 QuartzCore 0x18eb9e760 -[CALayer setBounds:] + 132
6 UIKitCore 0x18f941770 -[UIView _backing_setBounds:] + 64
7 UIKitCore 0x18f940404 -[UIView(Geometry) setBounds:] + 340
8 UIKitCore 0x18f908f84 -[UIView(Geometry) _applyISEngineLayoutValuesToBoundsOnly:] + 480
9 UIKitCore 0x18f9089e0 -[UIView(Geometry) _resizeWithOldSuperviewSize:] + 148
10 CoreFoundation 0x18d0cd6a4 __NSARRAY_IS_CALLING_OUT_TO_A_BLOCK__ + 24
11 CoreFoundation 0x18d132488 -[__NSSingleObjectArrayI enumerateObjectsWithOptions:usingBlock:] + 92
12 UIKitCore 0x18f8e62b0 -[UIView(Geometry) resizeSubviewsWithOldSize:] + 128
13 UIKitCore 0x18f977194 -[UIView(AdditionalLayoutSupport) _is_layout] + 124
14 UIKitCore 0x18f976c2c -[UIView _updateConstraintsAsNecessaryAndApplyLayoutFromEngine] + 800
15 UIKitCore 0x18f916258 -[UIView(Hierarchy) layoutSubviews] + 204
16 UIKitCore 0x18f903814 -[UIView(CALayerDelegate) layoutSublayersOfLayer:] + 2424
17 QuartzCore 0x18ec15498 CA::Layer::layout_if_needed(CA::Transaction*) + 496
18 UIKitCore 0x18f940c10 -[UIView(Hierarchy) layoutBelowIfNeeded] + 312
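For context, the call that hangs is the standard constraint-animation pattern, roughly this (view and constraint names simplified):
import UIKit
// Simplified version of the transition code that triggers the hang.
func animateTransition(in containerView: UIView, heightConstraint: NSLayoutConstraint) {
    heightConstraint.constant = 320
    UIView.animate(withDuration: 0.3) {
        containerView.layoutIfNeeded() // the app never returns from this call
    }
}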
So far, we only know of one iPad Air M1 where this is happening. But we don't know how many users experience this issue without reporting it.
Does anyone know what could cause Auto Layout or Core Animation to block in those calls? We have no clue so far...
The newish PHPicker is a great way to access the users’ photo library without requiring full access permissions. However, there is currently no way to access the original or unadjusted version of an asset. The preferredAssetRepresentationMode of the PHPickerConfiguration only allows the options automatic, compatible, and current, where current still returns the asset with previous adjustments applied. The option only seems to impact potential asset transcoding.
In contrast, when fetching PHAsset data, one can specify the PHImageRequestOptionsVersion unadjusted or original, which give access to the underlying untouched image. It would be great to have these options in the PHPicker interface as well.
The alternative would be to load the image through PHAsset via the identifier returned by the picker, but that would require full library access, which I want to avoid.
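For reference, that PHAsset route would look roughly like this (a simplified sketch), and it is exactly what triggers the full-access prompt:
import Photos
import PhotosUI
import UIKit
// Fallback sketch: load the unadjusted original via PHAsset, using the identifier
// from the picker result. Requires full photo library access.
func loadOriginal(for result: PHPickerResult, completion: @escaping (UIImage?) -> Void) {
    guard let identifier = result.assetIdentifier,
          let asset = PHAsset.fetchAssets(withLocalIdentifiers: [identifier], options: nil).firstObject
    else { return completion(nil) }
    let options = PHImageRequestOptions()
    options.version = .original // or .unadjusted
    options.isNetworkAccessAllowed = true
    PHImageManager.default().requestImage(for: asset, targetSize: PHImageManagerMaximumSize, contentMode: .default, options: options) { image, _ in
        completion(image)
    }
}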
Or is there another way to access the original image without these permissions?
More and more iOS devices can capture content with high/extended dynamic range (HDR/EDR) now, and even more devices have screens that can display that content properly.
Apple also gave us developers the means to correctly display and process this EDR content in our apps on macOS and now also on iOS 16.
There are a lot of EDR-related sessions from WWDC 2021 and 2022.
However, most of them focus on HDR video but not images—even though Camera captures HDR images by default on many devices.
Interestingly, those HDR images seem to use a proprietary format that relies on EXIF metadata and an embedded HDR gain map image for displaying the HDR effect in Photos.
Some observations:
Only Photos will display those metadata-driven HDR images in their proper brightness range. Files, for instance, does not.
Photos will not display other HDR formats like OpenEXR or HEIC with BT.2100-PQ color space in their proper brightness.
When using the PHPicker, it will even automatically tone-map the EDR values of OpenEXR images to SDR. The only way to load those images is to request the original image via PHAsset, which requires photo library access.
And here comes my main point: there is no API that enables us developers to load iPhone HDR images (with metadata and gain map) and decode image + metadata into EDR pixel values. That means we cannot display and edit those images in our app the same way Photos does.
There are ways to extract and embed the HDR gain maps from/into images using Image I/O APIs. But we don't know the algorithm used to blend the gain map with the image's SDR pixel values to get the EDR result.
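For example, extracting the gain map data works roughly like this (the URL is a placeholder):
import Foundation
import ImageIO
// Sketch: read the HDR gain map auxiliary data of an iPhone HEIC via Image I/O.
// The dictionary contains the gain map pixels, description, and metadata,
// but not the math for combining it with the SDR base image.
func gainMapInfo(at url: URL) -> CFDictionary? {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil) else { return nil }
    return CGImageSourceCopyAuxiliaryDataInfoAtIndex(source, 0, kCGImageAuxiliaryDataTypeHDRGainMap)
}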
It would be very helpful to know how decoding and encoding from SDR + gain map to HDR and back works.
Alternatively (or in addition), it would be great if common image loading APIs like Image I/O and Core Image would provide APIs to load those images into an EDR image representation (16-bit float linear sRGB with extended values, for example) and write EDR images into SDR + gain map images so that they are correctly displayed in Photos.
Thanks for your consideration! We really want to support HDR content in our image editing apps, but without the proper APIs, we can only guess how image HDR works on iOS.
With macOS 13, the CIColorCube and CIColorCubeWithColorSpace filters gained the extrapolate property for supporting EDR content.
When setting this property, we observe that the outputImage of the filter sometimes (~1 in 3 tries) just returns nil. And sometimes it “just” causes artifacts to appear when rendering EDR content (see screenshot below). The artifacts even appear sometimes when extrapolate was not set.
[Screenshots: input | correct output | broken output]
This was reproduced on Intel-based and M1 Macs.
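For reference, the filter setup is essentially this (the LUT data and dimension come from our LUT files):
import CoreImage
import CoreImage.CIFilterBuiltins
// Simplified sketch of our LUT filter setup.
func applyLUT(to input: CIImage, lutData: Data, dimension: Int) -> CIImage? {
    let filter = CIFilter.colorCubeWithColorSpace()
    filter.inputImage = input
    filter.cubeDimension = Float(dimension)
    filter.cubeData = lutData
    if let colorSpace = CGColorSpace(name: CGColorSpace.extendedLinearSRGB) {
        filter.colorSpace = colorSpace
    }
    filter.extrapolate = true // outputImage sometimes returns nil when this is set
    return filter.outputImage
}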
All of our LUT-based filters in our apps are broken in this way, and we could not find a workaround for the issue so far. Does anyone experience the same?
We recently had to change our MLModel's architecture to include custom layers, which means the model can't run on the Neural Engine anymore.
After the change, we observed a lot of crashes being reported on A13 devices. It turns out that the memory consumption when running the prediction with the new model on the GPU is much higher than before, when it was running on the Neural Engine. Before, the peak memory load was ~350 MB, now it spikes over 2 GB, leading to a crash most of the time. This only seems to happen on the A13. When forcing the model to only run on the CPU, the memory consumption is still high, but the same as running the old model on the CPU (~750 MB peak). All tested on iOS 16.1.2.
We profiled the process in Instruments and found that there are a lot of memory buffers allocated by Core ML that are not freed after the prediction.
The allocation stack trace for those buffers is the following:
We ran the same model on a different device and found the same buffers in Instruments, but there they are only 4 KB in size. It seems Core ML is somehow massively over-allocating memory when run on the A13 GPU.
So far we limit the model to only run on CPU for those devices, but this is far from ideal. Is there any other model setting or workaround that we can use to avoid this issue?
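For completeness, the current workaround is just this (the device check is simplified):
import CoreML
// Force CPU-only execution on the affected devices; everywhere else we let Core ML decide.
func loadStyleModel(at url: URL, isAffectedDevice: Bool) throws -> MLModel {
    let config = MLModelConfiguration()
    config.computeUnits = isAffectedDevice ? .cpuOnly : .all
    return try MLModel(contentsOf: url, configuration: config)
}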
Is there a way to observe the currentEDRHeadroom property of UIScreen for changes? KVO is not working for this property...
I understand that I can query the current headroom in the draw(...) method to adapt the rendering. However, our apps only render on-demand when the user changes parameters. But we would also like to re-render when the current EDR headroom changes to adapt the tone mapping to the new environment.
The only solution we've found so far is to continuously query the screen for changes, which doesn't seem ideal. It would be better if the property were observable via KVO or if there were a system notification to listen for.
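For completeness, the polling workaround looks roughly like this (the interval is arbitrary):
import UIKit
// Polls the screen's EDR headroom and calls back when it changes.
final class EDRHeadroomObserver {
    private var timer: Timer?
    private var lastHeadroom: CGFloat = 1.0
    func start(on screen: UIScreen, onChange: @escaping (CGFloat) -> Void) {
        timer = Timer.scheduledTimer(withTimeInterval: 0.5, repeats: true) { [weak self] _ in
            guard let self = self else { return }
            let headroom = screen.currentEDRHeadroom
            if headroom != self.lastHeadroom {
                self.lastHeadroom = headroom
                onChange(headroom)
            }
        }
    }
    deinit { timer?.invalidate() }
}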
Thanks!
When initializing a CIColor with a dynamic UIColor (like the system colors that resolve differently based on light/dark mode) on macOS 14 (Mac Catalyst), the resulting CIColor is invalid/uninitialized. For instance:
po CIColor(color: UIColor.systemGray2)
→ <uninitialized>
po CIColor(color: UIColor.systemGray2.resolvedColor(with: .current))
→ <CIColor 0x60000339afd0 (0.388235 0.388235 0.4 1) ExtendedSRGB>
But also, not all colors work even when resolved:
po CIColor(color: UIColor.systemGray.resolvedColor(with: .current))
→ <uninitialized>
I think this is caused by the color space of the resulting UIColor:
po UIColor.systemGray.resolvedColor(with: .current)
→ kCGColorSpaceModelRGB 0.596078 0.596078 0.615686 1
po UIColor.systemGray2.resolvedColor(with: .current)
→ UIExtendedSRGBColorSpace 0.388235 0.388235 0.4 1
This worked correctly before in macOS 13.
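One possible workaround (a sketch, not tested with all system colors) is to convert the resolved CGColor to extended sRGB ourselves before creating the CIColor:
import UIKit
import CoreImage
// Resolve the dynamic color, convert it to extended sRGB, then hand it to Core Image.
func ciColor(from dynamicColor: UIColor, traits: UITraitCollection) -> CIColor? {
    let resolved = dynamicColor.resolvedColor(with: traits).cgColor
    guard let extendedSRGB = CGColorSpace(name: CGColorSpace.extendedSRGB),
          let converted = resolved.converted(to: extendedSRGB, intent: .defaultIntent, options: nil)
    else { return nil }
    return CIColor(cgColor: converted)
}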
Tips presented using the popoverTip view modifier can't be styled using other tip view modifiers (as of beta 8).
For instance, the last two modifiers don't have any effect here:
Image(systemName: "wand.and.stars")
.popoverTip(tip)
.tipBackground(.red)
.tipCornerRadius(30)
It will look like this:
Whereas applying the same modifiers to a TipView changes its look:
TipView(tip, arrowEdge: .bottom)
.tipBackground(.red)
.tipCornerRadius(30)
Is this intended behavior? How can we change the appearance of popup tips?
In iOS 17.1 (and 17.2 beta), the arrowEdge parameter of the SwiftUI popoverTip doesn't work anymore.
This code
button
.popoverTip(tip, arrowEdge: .bottom)
looks like this on iOS 17.0
and like this on 17.1 and up.
I checked permittedArrowDirections of the corresponding UIPopoverPresentationController (via the Memory Graph): It's .down on iOS 17.0 and .any (the default) on 17.1. It seems the parameter of popoverTip is not properly propagated to the popover controller anymore.
I have an IOSurface and I want to turn that into a CIImage. However, the constructor of CIImage takes an IOSurfaceRef instead of an IOSurface.
On most platforms, this is not an issue because the two types are toll-free bridgeable... except for Mac Catalyst, where this fails.
I observed the same back in Xcode 13 on macOS. But there I could force-cast the IOSurface to an IOSurfaceRef:
let image = CIImage(ioSurface: surface as! IOSurfaceRef)
This cast fails at runtime on Catalyst.
I found that unsafeBitCast(surface, to: IOSurfaceRef.self) actually works on Catalyst, but it feels very wrong.
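Wrapped up, the workaround looks like this (just a sketch):
import CoreImage
import IOSurface
extension CIImage {
    // Workaround for Catalyst: bridge IOSurface to IOSurfaceRef via unsafeBitCast,
    // since the toll-free bridging cast fails at runtime there.
    convenience init(bridging surface: IOSurface) {
        self.init(ioSurface: unsafeBitCast(surface, to: IOSurfaceRef.self))
    }
}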
Am I missing something? Why aren't the types bridgeable on Catalyst?
Also, there should ideally be an init for CIImage that takes an IOSurface instead of a ref.