AssistantIntent for Photos without library access
The new .photos AssistantSchema for intents allows integrating App Intents for Photos-related actions with Apple Intelligence. I was wondering if it would be possible to create intents that do not require full library access. Our app supports loading images from Photos via the PHPicker, which doesn't require any user permission. Now we want to support the .photos.openAsset schema in an app intent to allow interactions like "Open this image in BeCasso and apply preset X". Would that be possible without full library access?
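For context, here is roughly the intent shape we would like to adopt (a sketch only; AssetEntity stands for our app-defined entity conforming to the .photos.asset schema, and the open question is whether it could be backed by picker-provided images instead of PHAssets):

    import AppIntents

    // Sketch: a schema-conforming open intent. Whether `target` can reference
    // images obtained via PHPicker (no permission) instead of PHAsset
    // (library access) is exactly what we'd like to know.
    @AssistantIntent(schema: .photos.openAsset)
    struct OpenAssetIntent: OpenIntent {
        var target: AssetEntity // app-defined @AssistantEntity(schema: .photos.asset)

        @MainActor
        func perform() async throws -> some IntentResult {
            // navigate to the editor and load the asset referenced by `target`
            return .result()
        }
    }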
0 replies · 0 boosts · 673 views · Jul ’24
CGImageDestinationAddImageFromSource causes issues in iOS 18 / macOS 15
There seems to be an issue in iOS 18 / macOS 15 related to image thumbnail generation and/or HEIC. We are transcoding JPEG images to HEIC when they are loaded into our app (HEIC has a much lower memory footprint when loaded by Core Image, for some reason). We use Image I/O for that:

    guard let source = CGImageSourceCreateWithURL(inputURL as CFURL, nil),
          let destination = CGImageDestinationCreateWithURL(outputURL as CFURL, UTType.heic.identifier as CFString, 1, nil)
    else { throw <error> }
    let primaryImageIndex = CGImageSourceGetPrimaryImageIndex(source)
    CGImageDestinationAddImageFromSource(destination, source, primaryImageIndex, nil)
    // (followed by CGImageDestinationFinalize(destination))

When we use CGImageDestinationAddImageFromSource, we get the following warnings on the console:

    createImage:1445: *** ERROR: bad image size (0 x 0) rb: 0
    CGImageSourceCreateThumbnailAtIndex:5195: *** ERROR: CGImageSourceCreateThumbnailAtIndex[0] - 'HJPG' - failed to create thumbnail [-67] {alw:-1, abs: 1 tra:-1 max:4620}
    writeImageAtIndex:1025: ⭕️ ERROR: '<app>' is trying to save an opaque image (4620x3466) with 'AlphaPremulLast'. This would unnecessarily increase the file size and will double (!!!) the required memory when decoding the image --> ignoring alpha.

It seems that CGImageDestinationAddImageFromSource is trying to extract/create a thumbnail, which somehow fails. I rewrote the last part like this:

    guard let primaryImage = CGImageSourceCreateImageAtIndex(source, primaryImageIndex, nil),
          let properties = CGImageSourceCopyPropertiesAtIndex(source, primaryImageIndex, nil)
    else { throw <error> }
    CGImageDestinationAddImage(destination, primaryImage, properties)

This doesn't cause any warnings. An issue that might be related has been reported here. I've also heard from others having issues with CGImageSourceCreateThumbnailAtIndex.
0 replies · 0 boosts · 910 views · Nov ’24
Resize Image Playground sheet
When using the imagePlaygroundSheet modifier in SwiftUI, the system presents the image playground in a fixed size. Especially on macOS, this modal is rather small and doesn't utilize the available space efficiently. Is there a way to make the modal bigger, or to allow the user to resize the dialog? I tried presentationDetents, but this would need to be applied to the content of the sheet, which is provided by the system... I guess this question applies to other system-provided sheets like the photo picker as well.
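For reference, here is roughly how I present the sheet (minimal sketch; the view name is a placeholder):

    import SwiftUI
    import ImagePlayground

    struct GenerateButton: View {
        @State private var showPlayground = false

        var body: some View {
            Button("Generate Image") { showPlayground = true }
                // The system controls the sheet's content and size;
                // presentationDetents would have to be applied inside it.
                .imagePlaygroundSheet(isPresented: $showPlayground) { url in
                    // load the created image from `url`
                }
        }
    }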
2 replies · 0 boosts · 746 views · Jan ’25
UI freeze during layout
One of our users reported a very strange bug where our app freezes and eventually crashes on some screen transitions. From different crash logs we could determine that the app freezes up when we call view.layoutIfNeeded() for animating constraint changes. It then gets killed by the watchdog 10 seconds later:

    Exception Type:  EXC_CRASH (SIGKILL)
    Exception Codes: 0x0000000000000000, 0x0000000000000000
    Termination Reason: FRONTBOARD 2343432205
    <RBSTerminateContext| domain:10 code:0x8BADF00D explanation:scene-update watchdog transgression: app<bundleID(2A01F261-3554-44C0-B5A9-EBEB446484AD)>:6921 exhausted real (wall clock) time allowance of 10.00 seconds
    ProcessVisibility: Background
    ProcessState: Running
    WatchdogEvent: scene-update
    WatchdogVisibility: Background
    WatchdogCPUStatistics: (
        "Elapsed total CPU time (seconds): 24.320 (user 18.860, system 5.460), 29% CPU",
        "Elapsed application CPU time (seconds): 10.630, 12% CPU"
    ) reportType:CrashLog maxTerminationResistance:Interactive>

The crash stack trace looks slightly different, depending on the UI transition that is happening. Here are the two we observed so far. Both are triggered by the layoutIfNeeded() call.

    Thread 0 name:  Dispatch queue: com.apple.main-thread
    Thread 0 Crashed:
    0  CoreAutoLayout  0x1b09f90e4 -[NSISEngine valueForEngineVar:] + 8
    1  UIKitCore       0x18f919478 -[_UIViewLayoutEngineRelativeAlignmentRectOriginCache origin] + 372
    2  UIKitCore       0x18f918f18 -[UIView _nsis_center:bounds:inEngine:forLayoutGuide:] + 1372
    3  UIKitCore       0x18f908e9c -[UIView(Geometry) _applyISEngineLayoutValuesToBoundsOnly:] + 248
    4  UIKitCore       0x18f9089e0 -[UIView(Geometry) _resizeWithOldSuperviewSize:] + 148
    5  CoreFoundation  0x18d0cd6a4 __NSARRAY_IS_CALLING_OUT_TO_A_BLOCK__ + 24
    6  CoreFoundation  0x18d0cd584 -[__NSArrayM enumerateObjectsWithOptions:usingBlock:] + 432
    7  UIKitCore       0x18f8e62b0 -[UIView(Geometry) resizeSubviewsWithOldSize:] + 128
    8  UIKitCore       0x18f977194 -[UIView(AdditionalLayoutSupport) _is_layout] + 124
    9  UIKitCore       0x18f976c2c -[UIView _updateConstraintsAsNecessaryAndApplyLayoutFromEngine] + 800
    10 UIKitCore       0x18f903944 -[UIView(CALayerDelegate) layoutSublayersOfLayer:] + 2728
    11 QuartzCore      0x18ec15498 CA::Layer::layout_if_needed(CA::Transaction*) + 496
    12 UIKitCore       0x18f940c10 -[UIView(Hierarchy) layoutBelowIfNeeded] + 312

    Thread 0 name:  Dispatch queue: com.apple.main-thread
    Thread 0 Crashed:
    0  QuartzCore      0x18ec2cfe0 -[CALayer animationForKey:] + 176
    1  UIKitCore       0x18fa5b258 UniqueAnimationKeyForLayer + 192
    2  UIKitCore       0x18fa5ab7c __67-[_UIViewAdditiveAnimationAction runActionForKey:object:arguments:]_block_invoke_2 + 468
    3  UIKitCore       0x18fa5ba5c -[_UIViewAdditiveAnimationAction runActionForKey:object:arguments:] + 1968
    4  QuartzCore      0x18eb9e938 CA::Layer::set_bounds(CA::Rect const&, bool) + 428
    5  QuartzCore      0x18eb9e760 -[CALayer setBounds:] + 132
    6  UIKitCore       0x18f941770 -[UIView _backing_setBounds:] + 64
    7  UIKitCore       0x18f940404 -[UIView(Geometry) setBounds:] + 340
    8  UIKitCore       0x18f908f84 -[UIView(Geometry) _applyISEngineLayoutValuesToBoundsOnly:] + 480
    9  UIKitCore       0x18f9089e0 -[UIView(Geometry) _resizeWithOldSuperviewSize:] + 148
    10 CoreFoundation  0x18d0cd6a4 __NSARRAY_IS_CALLING_OUT_TO_A_BLOCK__ + 24
    11 CoreFoundation  0x18d132488 -[__NSSingleObjectArrayI enumerateObjectsWithOptions:usingBlock:] + 92
    12 UIKitCore       0x18f8e62b0 -[UIView(Geometry) resizeSubviewsWithOldSize:] + 128
    13 UIKitCore       0x18f977194 -[UIView(AdditionalLayoutSupport) _is_layout] + 124
    14 UIKitCore       0x18f976c2c -[UIView _updateConstraintsAsNecessaryAndApplyLayoutFromEngine] + 800
    15 UIKitCore       0x18f916258 -[UIView(Hierarchy) layoutSubviews] + 204
    16 UIKitCore       0x18f903814 -[UIView(CALayerDelegate) layoutSublayersOfLayer:] + 2424
    17 QuartzCore      0x18ec15498 CA::Layer::layout_if_needed(CA::Transaction*) + 496
    18 UIKitCore       0x18f940c10 -[UIView(Hierarchy) layoutBelowIfNeeded] + 312

So far, we only know of one iPad Air M1 where this is happening. But we don't know how many users experience this issue without reporting it. Does anyone know what could cause Auto Layout or Core Animation to block in those calls? We have no clue so far...
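For context, the transitions that trigger this use the standard constraint-animation pattern, roughly like this (simplified sketch; heightConstraint stands for whatever constraint the transition changes):

    import UIKit

    final class TransitionViewController: UIViewController {
        var heightConstraint: NSLayoutConstraint! // placeholder for the animated constraint

        func animatePanel(to newHeight: CGFloat) {
            heightConstraint.constant = newHeight
            UIView.animate(withDuration: 0.3) {
                // The freeze (and eventual watchdog kill) happens inside this call.
                self.view.layoutIfNeeded()
            }
        }
    }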
1 reply · 0 boosts · 137 views · Apr ’25
Compiling CI kernels at runtime
In the "Explore Core Image kernel improvements" session, David mentioned that it is now possible to compile [[stitchable]] CI kernels at runtime. However, I fail to get it working. The kernel requires the #import of <CoreImage/CoreImage.h> and linking against the CoreImage Metal library. But I don't know how to link against the library when compiling my kernel at runtime. Also, according to the Metal Best Practices Guide, "the #include directive is not supported at runtime for user files." Any guidance on how the runtime compilation works is much appreciated! 🙂
3 replies · 0 boosts · 1.6k views · Sep ’21
Allow 16-bit RGBA image formats as input/output of MLModels
Starting in iOS 16 and macOS Ventura, OneComponent16Half will be a new scalar type for images. Ideally, we would also like to use the 16-bit support for RGBA images. As of now, we need to go through an indirection using an MLMultiArray with Float (or Float16 after the update) as its type and copy the data into the desired image buffer. Direct support for 16-bit RGBA predictions in Image format would be ideal for applications requiring high-precision outputs, like models that are trained on EDR image data. This is also useful when integrating Core ML into Core Image pipelines, since CI's internal image format is 16-bit RGBA by default. When passing that into a neural style transfer model with an (8-bit) RGBA image input/output type, conversions are always necessary (as demonstrated in WWDC2022-10027). If we could modify the models to use 16-bit RGBA images instead, no conversion would be necessary anymore. Thanks for the consideration!
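To illustrate the indirection, here is a rough sketch of what we currently have to do after each prediction (the function name and layout assumptions are ours; the actual channel copy is elided):

    import CoreML
    import CoreVideo

    // Sketch: allocate a 16-bit half-float RGBA buffer and manually
    // interleave the model's CHW MLMultiArray output into it.
    func pixelBuffer64RGBAHalf(from array: MLMultiArray, width: Int, height: Int) -> CVPixelBuffer? {
        var buffer: CVPixelBuffer?
        CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                            kCVPixelFormatType_64RGBAHalf, nil, &buffer)
        guard let buffer else { return nil }
        CVPixelBufferLockBaseAddress(buffer, [])
        defer { CVPixelBufferUnlockBaseAddress(buffer, []) }
        // ... copy each channel plane of `array` into the interleaved
        // RGBA half-float rows of `buffer` ...
        return buffer
    }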
3 replies · 0 boosts · 1.4k views · Jun ’22
Core ML model execution sometimes fails under load
I'm processing a 4K video with a complex Core Image pipeline that also invokes a neural style transfer Core ML model. This works very well, but sometimes, for very few frames, the model execution fails with the following error messages:

    Execution of the command buffer was aborted due to an error during execution. Internal Error (0000000e:Internal Error)
    Error: command buffer exited with error status.
        The Metal Performance Shaders operations encoded on it may not have completed.
        Error: (null) Internal Error (0000000e:Internal Error)
        <CaptureMTLCommandBuffer: 0x280b95d90> -> <AGXG15FamilyCommandBuffer: 0x108f143c0>
            label = <none>
            device = <AGXG15Device: 0x106034e00>
                name = Apple A16 GPU
            commandQueue = <AGXG15FamilyCommandQueue: 0x1206cee40>
                label = <none>
                device = <AGXG15Device: 0x106034e00>
                    name = Apple A16 GPU
            retainedReferences = 1
    [espresso] [Espresso::handle_ex_plan] exception=Espresso exception: "Generic error": Internal Error (0000000e:Internal Error); code=1 status=-1
    [coreml] Error computing NN outputs -1
    [coreml] Failure in -executePlan:error:.

It's really hard to reproduce since it only happens occasionally. I also didn't find a way to access that Internal Error mentioned in the logs, so I don't know the real reason why it fails. Any advice would be appreciated!
1 reply · 0 boosts · 1.9k views · Oct ’22
CIColorCube sometimes producing no or broken output in macOS 13
With macOS 13, the CIColorCube and CIColorCubeWithColorSpace filters gained the extrapolate property for supporting EDR content. When setting this property, we observe that the outputImage of the filter sometimes (~1 in 3 tries) just returns nil. And sometimes it "just" causes artifacts to appear when rendering EDR content (see screenshots below: input | correct output | broken output). The artifacts sometimes even appear when extrapolate was not set. This was reproduced on Intel-based and M1 Macs. All of the LUT-based filters in our apps are broken in this way, and we could not find a workaround for the issue so far. Does anyone experience the same?
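For reference, this is roughly how we configure the filter (a sketch with a trivial identity cube; inputImage is a placeholder):

    import CoreImage
    import CoreImage.CIFilterBuiltins

    // Sketch: minimal identity color cube (dimension 2) with EDR extrapolation.
    let dimension = 2
    var values: [Float] = []
    for b in 0..<dimension {
        for g in 0..<dimension {
            for r in 0..<dimension {
                values += [Float(r), Float(g), Float(b), 1.0]
            }
        }
    }

    let filter = CIFilter.colorCubeWithColorSpace()
    filter.cubeDimension = Float(dimension)
    filter.cubeData = values.withUnsafeBufferPointer { Data(buffer: $0) }
    filter.colorSpace = CGColorSpace(name: CGColorSpace.extendedLinearSRGB)
    filter.extrapolate = true // new in macOS 13
    filter.inputImage = inputImage // placeholder
    let output = filter.outputImage // sometimes nil (~1 in 3 tries)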
2 replies · 0 boosts · 1.4k views · Oct ’22
Issues with new MLE5Engine in Core ML
There seems to be a new MLE5Engine in iOS 17 and macOS 14 that causes issues with our style transfer models:

1. The output is wrong (just gray pixels) and not the same as on iOS 16.
2. There is a large memory leak. The memory consumption is increasing rapidly with each new frame.

Concerning 2): There are a lot of CVPixelBuffers leaking during prediction. Those buffers somehow have references to themselves and are not released properly. Here is a stack trace of how the buffers are created:

    0  _malloc_zone_malloc_instrumented_or_legacy
    1  _CFRuntimeCreateInstance
    2  CVObject::alloc(unsigned long, _CFAllocator const*, unsigned long, unsigned long)
    3  CVPixelBuffer::alloc(_CFAllocator const*)
    4  CVPixelBufferCreate
    5  +[MLMultiArray(ImageUtils) pixelBufferBGRA8FromMultiArrayCHW:channelOrderIsBGR:error:]
    6  MLE5OutputPixelBufferFeatureValueByCopyingTensor
    7  -[MLE5OutputPortBinder _makeFeatureValueFromPort:featureDescription:error:]
    8  -[MLE5OutputPortBinder _makeFeatureValueAndReturnError:]
    9  __36-[MLE5OutputPortBinder featureValue]_block_invoke
    10 _dispatch_client_callout
    11 _dispatch_lane_barrier_sync_invoke_and_complete
    12 -[MLE5OutputPortBinder featureValue]
    13 -[MLE5OutputPort featureValue]
    14 -[MLE5ExecutionStreamOperation outputFeatures]
    15 -[MLE5Engine _predictionFromFeatures:options:usingStream:operation:error:]
    16 -[MLE5Engine _predictionFromFeatures:options:error:]
    17 -[MLE5Engine predictionFromFeatures:options:error:]
    18 -[MLDelegateModel predictionFromFeatures:options:error:]
    19 StyleModel.prediction(input:options:)

When manually disabling the use of the MLE5Engine, the models run as expected. Is this an issue caused by our model, or is it a bug in Core ML?
4 replies · 0 boosts · 2.5k views · Oct ’23
arrowEdge of popoverTip not working anymore on iOS 17.1
In iOS 17.1 (and the 17.2 beta), the arrowEdge parameter of SwiftUI's popoverTip modifier doesn't work anymore. This code

    button
        .popoverTip(tip, arrowEdge: .bottom)

places the popover arrow at the bottom edge on iOS 17.0, but not on 17.1 and up (see screenshots). I checked permittedArrowDirections of the corresponding UIPopoverPresentationController (via the Memory Graph): it's .down on iOS 17.0 and .any (the default) on 17.1. It seems the parameter of popoverTip is no longer properly propagated to the popover controller.
2 replies · 1 boost · 1.8k views · Jul ’24
IOSurface vs. IOSurfaceRef on Catalyst
I have an IOSurface and I want to turn it into a CIImage. However, the constructor of CIImage takes an IOSurfaceRef instead of an IOSurface. On most platforms this is not an issue because the two types are toll-free bridgeable... except on Mac Catalyst, where the bridging fails. I observed the same back in Xcode 13 on macOS, but there I could force-cast the IOSurface to an IOSurfaceRef:

    let image = CIImage(ioSurface: surface as! IOSurfaceRef)

This cast fails at runtime on Catalyst. I found that

    unsafeBitCast(surface, to: IOSurfaceRef.self)

actually works on Catalyst, but it feels very wrong. Am I missing something? Why aren't the types bridgeable on Catalyst? Also, there should ideally be an init for CIImage that takes an IOSurface instead of a ref.
2 replies · 1 boost · 955 views · Jun ’24
Transition from "Designed for iPad" to "Mac Catalyst"
Our apps can currently be installed on Apple Silicon Macs via the "iPad Apps on Mac" feature ("Designed for iPad"). Now we are working on "proper" (universal) Catalyst-based Mac apps that will be available on the Mac App Store. How does the transition work for users who currently have the iPad version installed? Will they automatically be updated to the Mac Catalyst app once it's available, or do they need to re-install the app from the Mac App Store?
1 reply · 1 boost · 810 views · Jul ’24
Decode video frames in lower resolution before processing
We are processing videos with Core Image filters in our apps, using an AVMutableVideoComposition (for playback/preview and export). For older devices, we want to limit the resolution at which the video frames are processed, for performance and memory reasons. Ideally, we would tell AVFoundation to feed video frames with a defined maximum size into our composition. We thought setting the renderSize property of the composition to the desired size would do that. However, this only changes the size of the output frames, not the size of the source frames that come into the composition's handler block. For example:

    let composition = AVMutableVideoComposition(asset: asset, applyingCIFiltersWithHandler: { request in
        let input = request.sourceImage // <- this still has the video's original size
        // ...
    })
    composition.renderSize = CGSize(width: 1280, height: 720) // for example

So if the user selects a 4K video, our filter chain gets 4K input frames. Sure, we can scale them down inside our pipeline, but this costs resources and especially a lot of memory. It would be way better if AVFoundation could decode the video frames at the desired size before passing them into the composition handler. Is there a way to tell AVFoundation to load smaller video frames?
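For illustration, this is the workaround we currently use inside the handler (a sketch; the 1280-point target width is just an example):

    import AVFoundation
    import CoreImage

    let composition = AVMutableVideoComposition(asset: asset, applyingCIFiltersWithHandler: { request in
        var image = request.sourceImage // still at the video's native size, e.g. 4K

        // Downscale inside the pipeline, after the full-size frame
        // has already been decoded.
        let targetWidth: CGFloat = 1280
        let scale = min(1, targetWidth / image.extent.width)
        image = image.transformed(by: CGAffineTransform(scaleX: scale, y: scale))

        // ... apply the filter chain to `image` ...
        request.finish(with: image, context: nil)
    })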
0 replies · 1 boost · 603 views · Nov ’24
Image Playground not available for "Designed for iPad" apps?
I'm currently trying to add support for Image Playground to our apps. It seems that it's not working in an app that is "Designed for iPad" and runs on a Mac. The modal just shows a spinner, and the following is logged to the console:

    Private sandbox for com.apple.GenerativePlaygroundApp.remoteUIExtension : <none>
    Private sandbox for com.apple.GenerativePlaygroundApp.remoteUIExtension : <none>
    Private sandbox for com.apple.GenerativePlaygroundApp.remoteUIExtension : <none>
    Private sandbox for com.apple.GenerativePlaygroundApp.remoteUIExtension : <none>
    GP extension could not be loaded: Extension (platform: 2) could not be found (in update)
    dealloc Query controller [C32BA176-6A3E-465D-B3C5-0F8D91068B89]

ImagePlaygroundViewController.isAvailable returns true, however. In a "real" Mac Catalyst app, it's working. Just not when the app is actually an iPad app. Is this a bug?
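For reference, the presentation code (a sketch from a UIKit view controller; I'm assuming the delegate protocol is ImagePlaygroundViewController.Delegate, though its exact shape doesn't matter here):

    import ImagePlayground
    import UIKit

    // Sketch: availability check and presentation. On an iPad app running
    // on macOS, the check passes, but the sheet only ever shows a spinner.
    func presentPlayground(from presenter: UIViewController & ImagePlaygroundViewController.Delegate) {
        guard ImagePlaygroundViewController.isAvailable else { return }
        let playground = ImagePlaygroundViewController()
        playground.delegate = presenter
        presenter.present(playground, animated: true)
    }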
4 replies · 2 boosts · 1.5k views · Jan ’25
Unable to change Photos permission of iPad app on Mac
Users can run our apps on Macs with Apple Silicon via the "iPad Apps on Mac" feature. The apps use PHPhotoLibrary.requestAuthorization(for: .addOnly, handler: callback) to request write-only access to the user's Photo Library during image export. This works as intended on macOS, but a huge problem arises when the user denies access (by accident or intentionally) and later decides that they want us to add their image to Photos: There is no way to grant this permission again. In System Preferences → Privacy & Security → Photos, the app is just not listed – in fact, none of the "iPad Apps on Mac" apps appear here. Not even

    tccutil reset all my.bundle.id

works. It just reports:

    tccutil: Failed to reset all approval status for my.bundle.id

Uninstalling, restarting the Mac, and reinstalling the app also doesn't work. The system seems to remember the initial decision. Is this an oversight in the integration of those apps with macOS, or are we missing something fundamental here? Is there maybe a way to prompt the user again?
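For reference, the request is a plain add-only authorization at export time (a sketch):

    import Photos

    // Sketch: write-only Photos access. Once denied on "iPad Apps on Mac",
    // the status stays .denied, because the app never shows up under
    // Privacy & Security → Photos where the user could flip it back.
    PHPhotoLibrary.requestAuthorization(for: .addOnly) { status in
        guard status == .authorized else {
            // Nothing we can do here: there is no settings pane entry
            // and no way to re-prompt.
            return
        }
        // perform the PHAssetChangeRequest-based save
    }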
1 reply · 3 boosts · 1.5k views · Sep ’23