Image I/O


Read and write most image file formats, manage color, and access image metadata using Image I/O.

Posts under Image I/O tag

31 Posts

Post · Replies · Boosts · Views · Activity

ExcUserFault and corrupted data when using UIImage#heicData
Over the last two weeks, I’ve had sporadic reports from users who suddenly have a corrupt image in my app’s database. It’s only affecting a few users and may have been fixed in iOS 26.4.1 (I’m not sure). In any case, this started suddenly with the release of iOS 26.4; our app has not been changed in several months. I wanted to share what’s happening in case others are experiencing this.

My app lets users import photos from the camera roll, photo albums, etc. Once the user has selected an image, the app saves it to a SQLite3 database using “image.heicData()”. For the four or five users who have been affected by this problem, the heicData call returns successfully, with a non-nil Data value, but the image itself is corrupt and unreadable. When the user later opens a screen containing the image, the app crashes as it tries to read it (using “UIImage(data: heicData)”). I’ve had to manually guide each user through tracking down and removing the affected item or items, which is a bad experience for them and time-consuming for us.

All users who have this problem have had an ExcUserFault file in their crash reports with our app name in it. It’s not possible to symbolicate this file, but I’ve included an excerpt at the bottom of this post. I was able to extract some of the raw data saved when this error occurs. Running “file corrupt_image.heic” on it prints:

AmigaOS bitmap font "rtypheic", fc_YSize 0, 35001 elements

which definitely doesn’t seem right. On a valid HEIC file, I get:

ISO Media, HEIF Image HEVC Main or Main Still Picture Profile

Is anyone else experiencing this? Or does anyone have suggestions about what could be happening? I submitted feedback FB22667639 about this.
ExcUserFault example

Exception Type: EXC_GUARD
Exception Subtype: GUARD_TYPE_USER
Exception Message: namespc 7 reason_code 0x0000000000000009
Exception Codes: 0x6000000000000007, 0x0000000000000009
Termination Reason: Namespace LIBXPC, Code 9, XPC_EXIT_REASON_FAULT

Thread 0:
0   ???  0x231fa997c  0x180000000 + 2985990524
1   ???  0x197eb98b4  0x180000000 + 401316020
2   ???  0x197ec4e04  0x180000000 + 401362436
3   ???  0x197ec5ea0  0x180000000 + 401366688
4   ???  0x19066bdb8  0x180000000 + 275168696
5   ???  0x19066b968  0x180000000 + 275167592
6   ???  0x19b5c19a4  0x180000000 + 459020708
7   ???  0x19b5cfa2c  0x180000000 + 459078188
8   ???  0x19b5cf838  0x180000000 + 459077688
9   ???  0x197ec7c74  0x180000000 + 401374324
10  ???  0x197ec991c  0x180000000 + 401381660
11  ???  0x1bd74222c  0x180000000 + 1031021100
12  ???  0x1bd744ba4  0x180000000 + 1031031716
13  ???  0x1bd730e18  0x180000000 + 1030950424
14  ???  0x1bd7458f8  0x180000000 + 1031035128
15  ???  0x1bd730e18  0x180000000 + 1030950424
16  ???  0x1bd731ae4  0x180000000 + 1030953700
17  ???  0x1bd73bdac  0x180000000 + 1030995372
18  ???  0x1bd73b6ac  0x180000000 + 1030993580
19  ???  0x1e23283b0  0x180000000 + 1647477680
20  ???  0x1e23278c0  0x180000000 + 1647474880
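Until the root cause is understood, one defensive option is to sanity-check the bytes before persisting them. A minimal Foundation-only sketch (the function name and brand list are my own, and this only checks the container header, not deeper structural corruption): a HEIC file is an ISO base media file whose bytes 4–7 spell "ftyp" and whose major brand (bytes 8–11) is a HEIF-family brand. The corrupt sample quoted above would plausibly fail this check, since its "ftyp" marker appears damaged (note the "rtypheic" in the `file` output).

```swift
import Foundation

// Defensive check before writing heicData() output to the database.
// Verifies the ISO base media "ftyp" box and a HEIF-family major brand.
func looksLikeHEIC(_ data: Data) -> Bool {
    guard data.count >= 12 else { return false }
    // Bytes 4..7 must spell "ftyp" in an ISO base media file.
    guard String(data: data.subdata(in: 4..<8), encoding: .ascii) == "ftyp" else {
        return false
    }
    // Bytes 8..11 carry the major brand.
    let brand = String(data: data.subdata(in: 8..<12), encoding: .ascii)
    let heifBrands: Set<String> = ["heic", "heix", "hevc", "mif1", "msf1"]
    return brand.map { heifBrands.contains($0) } ?? false
}
```

In the save path this could gate the write, e.g. `guard let data = image.heicData(), looksLikeHEIC(data) else { /* retry or report */ }`, so a corrupt blob never reaches SQLite and the later UIImage(data:) crash is avoided.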
Replies: 1 · Boosts: 0 · Views: 167 · Activity: 1d
Supported public API to open containing iOS app from Share Extension for image/PDF share sheet imports
Hello Apple Developer Forums,

We are building an iOS app that needs to receive images and PDFs shared from the system share sheet. The sources include Screenshots, Photos, Files, and third-party apps. The desired user experience is similar to apps such as ChatGPT or Claude: when the user taps our app in the share sheet, the main containing app opens and starts importing or uploading the shared image or PDF. We are trying to understand the supported public API for this behavior.

Why opening the containing app is important

For our use case, it is important that the containing app opens during the share flow. The import/upload operation depends on the user’s authenticated session. If the Share Extension attempts to upload the file directly, the auth token available to the extension could be missing, expired, or invalid. We would prefer not to make the Share Extension responsible for authentication-dependent behavior such as:

- validating the user session
- refreshing tokens
- handling expired credentials
- presenting login or re-authentication UI
- owning upload retry logic tied to auth state

In our architecture, authentication and token refresh are owned by the containing app. The Share Extension should ideally only receive the shared file, persist it in an app group container, and hand off to the main app. The main app would then validate auth state, refresh tokens if needed, and perform the import/upload.

So the desired flow is:

Share Extension receives image/PDF → Share Extension stores file in app group container → containing app opens → containing app validates auth/session state → containing app imports/uploads the file

The alternative flow is problematic for us:

Share Extension receives image/PDF → Share Extension attempts upload directly → upload may fail if auth token is expired or unavailable → Share Extension would need auth/session responsibilities

We are trying to avoid having an authentication dependency inside the Share Extension implementation.
What we have tried

CFBundleDocumentTypes

We added document type support for: public.image, public.png, public.jpeg, public.heic, public.heif, and com.adobe.pdf. This works for some document-open flows, such as opening files from Files or Photos in certain cases. However, it does not make the app appear reliably as a share target from Screenshot Share or from some third-party app share sheets.

App Intents

We tried using App Intents with IntentFile and:

static var openAppWhenRun: Bool = true

However, this does not seem to create a general-purpose share-sheet receiver for arbitrary image or PDF NSItemProvider payloads.

Share Extension

We also implemented a Share Extension that:

- receives the shared NSItemProvider
- stores the image or PDF in an app group container
- attempts to open the containing app

However, NSExtensionContext.open(_:completionHandler:) does not appear to foreground the containing app from a Share Extension in the way we need. We also tested responder-chain openURL: trampoline approaches, but those do not work reliably and appear to be unsupported as a public API contract.

Questions

- Is there a supported public API for an iOS app to appear as a share target for arbitrary image/PDF NSItemProvider payloads and then directly open the containing app?
- If apps such as ChatGPT or Claude appear to switch directly into the main app from the share sheet, is that behavior achievable using public APIs available to third-party developers?
- If directly opening the containing app is not supported, what is the recommended architecture when the import/upload depends on authenticated app state?
- Is Apple’s recommended design that the Share Extension itself must perform the full import/upload operation, even when that operation depends on auth validation or token refresh?
- Is there a supported handoff mechanism where the Share Extension can persist the file in an app group container and then ask the system to open the containing app to continue the flow?
- Are App Intents intended to support this kind of share-sheet attachment import flow, either currently or in a future iOS version?

Reproduction Steps

We created a focused sample project to reproduce the issue.

1. Build and run the app on a physical iPhone. Leave the app installed.
2. Capture a screenshot.
3. Tap the screenshot thumbnail, then tap the Share button.
4. Choose the app’s Share Extension from the share sheet.
5. Observe that the Share Extension receives the image payload.
6. Attempt to open the containing app from the extension.

Expected Result

The Share Extension receives the shared image or PDF, stores it in an app group container, and the containing app foregrounds. The containing app then validates the user’s authenticated session, refreshes tokens if needed, and performs the import/upload.

Actual Result

The Share Extension receives the image payload and logs the provider type identifiers, but the containing app does not reliably foreground. NSExtensionContext.open does not provide the desired transition, and responder-chain URL-opening workarounds do not appear to be supported or reliable.

Minimal Question

For image/PDF imports from the iOS share sheet, where upload/import requires authenticated app state, what is the supported implementation? Is it expected to be:

Share Extension receives the file → Share Extension performs auth-dependent upload/import itself

or is there a supported way to implement:

Share Extension receives the file → Share Extension stores the file in the app group container → Share Extension opens or hands off to the containing app → main app performs auth validation and upload/import

Any guidance on the supported architecture would be appreciated. Thank you.
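For what it’s worth, the store-and-drain half of the desired flow can be implemented with plain Foundation against the app group container. A sketch under stated assumptions: the PendingImports directory name and the PendingImport shape are illustrative, and in the real extension the container URL would come from FileManager.default.containerURL(forSecurityApplicationGroupIdentifier:).

```swift
import Foundation

// Manifest describing one shared item awaiting import by the main app.
struct PendingImport: Codable {
    let fileName: String
    let typeIdentifier: String   // e.g. "public.jpeg" or "com.adobe.pdf"
    let createdAt: Date
}

enum HandoffStore {
    static func pendingURL(in container: URL) -> URL {
        container.appendingPathComponent("PendingImports", isDirectory: true)
    }

    // Extension side: copy the item's bytes and drop a JSON manifest next to it.
    static func enqueue(_ item: PendingImport, fileData: Data, in container: URL) throws {
        let dir = pendingURL(in: container)
        try FileManager.default.createDirectory(at: dir, withIntermediateDirectories: true)
        try fileData.write(to: dir.appendingPathComponent(item.fileName))
        let manifest = try JSONEncoder().encode(item)
        try manifest.write(to: dir.appendingPathComponent(item.fileName + ".json"))
    }

    // Main-app side: read all manifests on next launch/foreground and import them.
    static func drain(in container: URL) throws -> [PendingImport] {
        let dir = pendingURL(in: container)
        let decoder = JSONDecoder()
        let files = (try? FileManager.default.contentsOfDirectory(
            at: dir, includingPropertiesForKeys: nil)) ?? []
        return try files
            .filter { $0.pathExtension == "json" }
            .map { url -> PendingImport in
                let data = try Data(contentsOf: url)
                return try decoder.decode(PendingImport.self, from: data)
            }
    }
}
```

This still leaves the open-the-containing-app step, which is exactly what the questions above are about; with this layout the main app can at least drain PendingImports on its next launch or foreground and run the auth-dependent upload there.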
Replies: 1 · Boosts: 0 · Views: 77 · Activity: 4d
EXC_BAD_ACCESS in drawHierarchy(in:afterScreenUpdates:) on iOS 26.3.1+ — IOSurface CIF10 decompression crash
We're experiencing an EXC_BAD_ACCESS (SIGSEGV) crash in UIView.drawHierarchy(in:afterScreenUpdates:) (with afterScreenUpdates: false) that occurs only on iOS 26.3.1 and later. It does not reproduce on iOS 26.3.0 or earlier.

Crash Stack

Thread 0 (Main Thread) — EXC_BAD_ACCESS KERN_INVALID_ADDRESS
0   libvDSP.dylib  vConvert_XRGB2101010ToARGB8888_vec
1   ImageIO        IIOIOSurfaceWrapper_CIF10::CopyImageBlockSetWithOptions
2   ImageIO        IIOImageProviderInfo::CopyImageBlockSetWithOptions
3   ImageIO        CGImageReadGetBytesAtOffset
4   CoreGraphics   CGAccessSessionGetBytes
5   CoreGraphics   img_data_lock
6   CoreGraphics   CGSImageDataLock
7   CoreGraphics   ripc_AcquireImage
8   CoreGraphics   ripc_DrawImage
9   CoreGraphics   CGContextDrawImage
10  UIKitCore      -[UIView(Rendering) drawHierarchy:afterScreenUpdates:]

The crash occurs during 10-bit CIF10 → 8-bit ARGB8888 pixel conversion when the IOSurface backing a UIImageView in the view hierarchy is deallocated mid-render.

How to Reproduce

1. Display a scrollable list with multiple UIImageViews loaded via an async image library.
2. Call drawHierarchy(in: bounds, afterScreenUpdates: false) on visible cells periodically.
3. Scroll to trigger image recycling.
4. The crash occurs sporadically — more likely under memory pressure or rapid image recycling.

What We've Tried

Both UIKit off-screen rendering approaches crash on iOS 26.3.1:

- drawHierarchy(afterScreenUpdates: false): EXC_BAD_ACCESS in CIF10 IOSurface decompression
- view.layer.render(in:): EXC_BAD_ACCESS in Metal (agxaAssertBufferIsValid)

iOS Version Correlation

- iOS 26.3.0 and earlier: no crash
- iOS 26.3.1 (23D8133) and later: crash occurs (~5 events per 7 days)

We suspect the ImageIO security patches in iOS 26.3 (CVE-2026-20675, CVE-2026-20634) may have changed IOSurface lifecycle timing, exposing a race condition between drawHierarchy's composited buffer read and asynchronous IOSurface reclamation by the OS.
Crash Data

We sampled 3 crash events:

- Event 1 (iOS 26.3.1): 71 MB free memory — memory pressure
- Event 2 (iOS 26.3.1): 88 MB free memory — memory pressure
- Event 3 (iOS 26.3.2): 768 MB free memory — NOT memory pressure

Event 3 shows this isn't purely a low-memory issue. The IOSurface can be reclaimed even with ample free memory, likely due to async image recycling.

Questions

- Is this a known regression in iOS 26.3.1?
- Is there a safe way to snapshot a view hierarchy containing IOSurface-backed images without risking EXC_BAD_ACCESS?
- Should drawHierarchy gracefully handle the case where an IOSurface backing store is reclaimed during the render?

Any guidance or workarounds would be appreciated. We've also filed this as Feedback (will update with the FB number after submission).
Replies: 2 · Boosts: 0 · Views: 278 · Activity: 2w
Display 17 PRO series
Dear colleagues, when will you add the ability to manually adjust the display's color temperature? Everyone is familiar with the color-rendering issue on the 17 series: many complain about the display's yellowish tint. I'm one of those people, with a G9N panel, and I can't get whites right; there's a persistent yellow tint. True Tone only solves this problem under cool, white lighting conditions. So the display might work as intended, but how can I make it work consistently? If there were a way to manually adjust True Tone, many users wouldn't be so upset when buying a new device, fearing the display would be yellow. I'd like users to be able to choose their preferred color temperature, warm or cool, with a scale to adjust it. This would solve the yellowish-tint issue on 17-series displays. Thank you.
Replies: 0 · Boosts: 0 · Views: 155 · Activity: Feb ’26
App Store Connect rejects screenshot upload: “incorrect size” (subscription purchase flow) — tried all documented sizes
Hello Apple Developer Forums, I’m preparing to submit an app update that includes an in-app subscription. As part of the submission, I need to provide screenshots showing where the user initiates and completes the subscription purchase flow. The issue is that App Store Connect keeps rejecting my screenshot upload with an “incorrect size” (or size invalid) error. I have already tried exporting the screenshot in all sizes and resolutions described in Apple’s documentation, but none of them have been accepted so far. Could you please advise:

1. What exact pixel dimensions and format requirements does App Store Connect currently enforce for these screenshots (including file type and color profile, if relevant)?
2. Are there any known issues or common causes for this error (e.g., metadata, alpha channel, scaling, or export settings)?
3. Is there a recommended workflow or tool to generate a compliant screenshot that reliably uploads?

Thank you in advance for your help.
Replies: 3 · Boosts: 0 · Views: 175 · Activity: Jan ’26
ProRAW: Demystify 48MP vs 12MP binning based on lighting?
Hi everyone, does anybody have any resources I could check out regarding the 48->12MP binning behavior on supported sensors? I know the 48MP sensor on iPhone can automatically bin pixels for better low-light performance, but I'm not sure how to reliably make this happen in practice. On iPhone 14 Pro and later with a 48MP sensor, I want the best of both worlds for ProRAW:

∙ Bright light: 48MP full resolution
∙ Low light: 12MP pixel-binned for better noise

photoOutput.maxPhotoDimensions = CMVideoDimensions(width: 8064, height: 6048)
let settings = AVCapturePhotoSettings(rawPixelFormatType: proRawFormat, processedFormat: [...])
settings.photoQualityPrioritization = .quality
// NOT setting settings.maxPhotoDimensions — always get 12MP

When I omit maxPhotoDimensions, iOS always returns 12MP regardless of lighting. When I set it to 48MP, I always get 48MP. Is there an API to let iOS automatically choose the optimal resolution based on conditions, or should I detect low light myself (via device.iso / exposureDuration) and set maxPhotoDimensions accordingly? Any help or direction would be much appreciated!
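Absent a documented auto-select mode, the self-detection route suggested above can be sketched as a small pure heuristic. The ISO and exposure thresholds here are illustrative assumptions, not Apple guidance; the dimensions are the 48MP/12MP sizes from the post, and exposure is passed in seconds (i.e. CMTimeGetSeconds(device.exposureDuration)).

```swift
import Foundation

// Hypothetical low-light heuristic: prefer 48MP in bright scenes,
// fall back to the 12MP pixel-binned readout in low light.
struct CaptureDimensions: Equatable {
    let width: Int32
    let height: Int32
}

func preferredProRAWDimensions(iso: Float, exposureDuration: Double) -> CaptureDimensions {
    // Thresholds are illustrative; tune against real captures.
    let lowLight = iso > 400 || exposureDuration > (1.0 / 30.0)
    return lowLight
        ? CaptureDimensions(width: 4032, height: 3024)   // 12MP, pixel-binned
        : CaptureDimensions(width: 8064, height: 6048)   // 48MP full resolution
}
```

The result would feed settings.maxPhotoDimensions right before each capture, based on the current device.iso and device.exposureDuration.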
Replies: 0 · Boosts: 0 · Views: 701 · Activity: Jan ’26
Cannot make my app appear in “Share with App” action in Shortcuts – How to allow receiving images from Shortcuts?
Hi, I’m trying to integrate my iOS app with Shortcuts. My goal: in the Shortcuts app, create a shortcut, select an image, and share the image directly to my app for analysis. However, when I try to add the “Share with App” / “Open in App” / “Send to App” action in Shortcuts, my app does NOT appear in the list of available apps. I want my app to be selectable so that Shortcuts can send an image (UIImage / file) to my app.

What I have tried

- My app supports receiving images using UIActivityViewController and a Share Extension.
- I created an App Intents extension (AppIntent + @Parameter(file)...) but the app still does not appear in Shortcuts’ “Share with App”.
- I also checked the Info.plist but didn’t find any permission related to Shortcuts.
- The app is installed on the device and works normally.

My question

What permission, Info.plist entry, or capability is required so that my app becomes visible in the Shortcuts app as a target for image sharing? More specifically:

- Which extension type should be used for receiving images from Shortcuts: an App Intents extension, a Share Extension, or an Intents extension?
- Do I need a specific NSExtensionPointIdentifier for Shortcuts integration?
- Do I need to declare a custom Uniform Type Identifier (UTI) or add supported content types so Shortcuts knows my app can handle images?
- Are there any required entitlements / capabilities to make the app appear inside the “Share with App” action?

Goal Summary

I simply want: Shortcuts → pick image → send to my app → app receives the image and processes it. But currently my app cannot be selected in Shortcuts. Thanks in advance for any guidance!
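On the Share Extension route specifically: visibility as a share target is controlled by the extension's activation rule in its Info.plist, not by an entitlement. A minimal sketch of the relevant keys (the extension point identifier and activation-rule keys are the documented ones for share extensions; the max counts are placeholders, and whether Shortcuts' share surface uses exactly the same matching as the system share sheet is part of the open question here):

```xml
<key>NSExtension</key>
<dict>
    <key>NSExtensionPointIdentifier</key>
    <string>com.apple.share-services</string>
    <key>NSExtensionAttributes</key>
    <dict>
        <key>NSExtensionActivationRule</key>
        <dict>
            <!-- Accept up to one shared image -->
            <key>NSExtensionActivationSupportsImageWithMaxCount</key>
            <integer>1</integer>
            <!-- Accept generic file payloads as well -->
            <key>NSExtensionActivationSupportsFileWithMaxCount</key>
            <integer>1</integer>
        </dict>
    </dict>
</dict>
```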
Replies: 3 · Boosts: 0 · Views: 398 · Activity: Dec ’25
Since iOS 18.3, icons are no longer generated correctly with QLThumbnailGenerator
Since iOS 18.3, icons are no longer generated correctly with QLThumbnailGenerator. No error is returned either, but this message now appears in the console:

Error returned from iconservicesagent image request: <ISTypeIcon: 0x3010f91a0>, Type: com.adobe.pdf - <ISImageDescriptor: 0x302f188c0> - (36.00, 36.00)@3x v:1 l:5 a:0:0:0:0 t:() b:0 s:2 ps:0 digest: B19540FD-0449-3E89-AC50-38F92F9760FE error: Error Domain=NSOSStatusErrorDomain Code=-609 "Client is disallowed from making such an icon request" UserInfo={NSLocalizedDescription=Client is disallowed from making such an icon request}

Does anyone know this error? Is there a workaround? Are there new permissions to consider? Here is how the icons are generated:

let request = QLThumbnailGenerator.Request(fileAt: url, size: size, scale: scale, representationTypes: self.thumbnailType)
request.iconMode = true
let generator = QLThumbnailGenerator.shared
generator.generateRepresentations(for: request) { [weak self] thumbnail, _, error in
    // handle thumbnail / error
}
Replies: 16 · Boosts: 5 · Views: 1.8k · Activity: Nov ’25
iOS 26 + BGContinuedProcessingTask: Why does a CPU/ML-intensive job run 4-5× slower in background?
Hello All, I’m a mobile-app developer working with iOS 26+ and I’m using BGContinuedProcessingTask to perform background work. My app’s workflow includes the following business logic:

1. Loading images via PHImageRequest.
2. Using a CLIP model to extract image embeddings.
3. Using an .mlmodel-based model to further process those embeddings.

For both model inferences I set computeUnits = .cpuAndNeuralEngine. When the app is moved to the background, I observe that the same workload (all three stages) becomes on average 4-5× slower than in the foreground.

In an attempt to diagnose the slowdown, I tried to profile with Xcode Instruments, but with a debugger attached, background performance appeared nearly identical to foreground. Even with the debugger detached, the measured system resource metrics (process CPU usage, system CPU usage, memory, QoS class, thermal state) showed no meaningful difference. Some of the metrics I captured:

- Process CPU: 177% (foreground) → 153% (background), a drop of about 24 percentage points; still more than 1.5 cores of work
- System CPU: 56.1% → 38.4%, a drop of about 17.7 percentage points
- Process memory: 244.8 MB → 218.1 MB
- QoS class: userInitiated in both cases
- Thermal state: nominal in both cases

Given these results, I’m finding it hard to pinpoint why the overall latency is so much worse when the app is backgrounded, even though the obvious metrics show little variation. I suspect the cause may involve P-core vs. E-core scheduling, or internal throttling or limiting of Neural Engine usage, but I cannot find clear documentation or logging to confirm this.

My question: does anyone know why a CPU- and Neural Engine-intensive job like this would slow down so dramatically when using BGContinuedProcessingTask in the background on iOS 26+, despite apparently similar resource-usage metrics? Are there internal iOS scheduling or hardware-allocation behaviors (e.g., falling back to lower-performing cores when backgrounded) that might explain this?
Any pointers to Apple technical notes, system logs, or instrumentation I might use to detect which cores or compute units are being used would be greatly appreciated. Thank you for your time and any guidance you can provide. Best regards,
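One thing that can be measured without Instruments is the end-to-end throughput of each pipeline stage, logged from both foreground and background runs: if items/sec drops 4-5× while CPU% barely moves, that localizes the slowdown to per-inference latency rather than total CPU time. A trivial Foundation-only probe (the function name is my own):

```swift
import Foundation

// Times a fixed batch of work and returns the processing rate.
// Wrap one pipeline stage (image load, CLIP embedding, post-processing)
// with this in foreground and again after backgrounding, then compare
// the logged rates per stage.
func itemsPerSecond(count: Int, work: () -> Void) -> Double {
    let start = Date()
    for _ in 0..<count { work() }
    let elapsed = Date().timeIntervalSince(start)
    return Double(count) / max(elapsed, 1e-9)
}
```

Comparing the per-stage numbers also reveals whether all three stages degrade uniformly (suggesting core scheduling) or only the Neural Engine stages do (suggesting ANE throttling).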
Replies: 1 · Boosts: 0 · Views: 545 · Activity: Nov ’25
Not Getting Realistic Camera Output Even After Capturing RAW (.dng) Images on iOS
Hi everyone, I’m working on a custom camera implementation in iOS using native code. My goal is to capture unprocessed, realistic images directly from the camera, without any filters or post-processing applied by the system. I’ve implemented RAW image capture using the native camera APIs (AVFoundation) and successfully received .dng files. However, even the RAW outputs don’t look like the real environment; the colors, tone, and exposure still seem processed or corrected in some way. I’ve tried various configurations such as photoSettings.rawPhotoPixelFormatType, experimenting with AVCaptureDevice and AVCapturePhotoOutput settings, and reviewing ProRAW and standard RAW behavior, but I’m still not getting truly unprocessed results that reflect the actual sensor data. Has anyone experienced similar results when capturing RAW images on iOS, or found a way to bypass Apple’s image signal processing (ISP) pipeline for more realistic captures? Any insights or references on Apple’s camera framework behavior would be greatly appreciated. Thank you!
Replies: 0 · Boosts: 0 · Views: 340 · Activity: Oct ’25
Present .icon in app for AppIcon Picker feature
I'm working on an app-icon selector and would like to do something like UIImage(named: "AppIcon-Alternate") to present the icon for the user to choose, using the new Icon Composer icons. I've done a fair bit of research on this, and it looks like this used to be possible (prior to .icon) with workarounds that were later 'fixed' / removed (appending 60x60 to the icon name). The only 'solution' seems to be bundling the exported images into the app itself, but this seems like a terrible idea as it massively bloats the app: assuming we export from the new Icon Composer tool and want to include dark mode, that's roughly 3 MB per icon, which is shocking bloat and so a terrible solution. Looking into the app, the Assets.car actually contains PNG files for these alternate icons; they appear in the JSON as "MultiSized Image" assets. Interestingly, UIImage(named:) actually attempts to load these but fails to resolve a kCSIElementSignature. Also, the OS alert shown when switching to an alternate icon displays a preview of the icon, so this must be possible privately, and using Asset Catalog Tinkerer I'm able to see these PNGs. This feels like broken API; I'd guess the new icon format is not correctly generating the entry in the Assets.car to link the generated PNGs for use with the UIImage(named:) API. Does anyone have pointers on this? It feels like a developer-API afterthought or bug, but is it intentional? Edit: I've submitted feedback for this, FB20341182.
Replies: 0 · Boosts: 0 · Views: 364 · Activity: Sep ’25
Images with unusual color spaces not correctly loaded by Core Image
Some users reported that their images are not loading correctly in our app. After a lot of debugging, we identified the following:

- This only happens when the app is built for Mac Catalyst. Not on iOS, iPadOS, or “real” macOS (AppKit).
- The images in question have unusual color spaces. We observed the issue for uRGB and eciRGB v2. Those images are rendered correctly in Photos and Preview on all platforms.
- When displaying the image inside a UIImageView or a SwiftUI Image, they render correctly. The issue only occurs when loading the image via Core Image.
- When comparing the Core Image render graphs between the AppKit (working) and Catalyst (faulty) builds, they look identical, except for the result. (Screenshots of the Mac (AppKit) and Catalyst results were attached here.)

Something seems to be off when Core Image tries to load an image with a foreign color space in Catalyst. We identified a workaround: by using a CGImageDestination to transcode the image with the kCGImageDestinationOptimizeColorForSharing option, Image I/O will convert the image to sRGB (or similar), and Core Image is then able to load it correctly. However, one potentially loses fidelity this way. Is there a better workaround?
Replies: 2 · Boosts: 3 · Views: 233 · Activity: Aug ’25
Making an app preview
I have a small .mov I created using Screenshot, and I want to use it as an app preview. I managed to resize it to the required 1920x1080, added a soundtrack using ffmpeg (Homebrew), dropped it into an iMovie app-preview project, shared it as a file, and dragged that file to App Store Connect / Apps / myApp / "App previews and Screenshots", only to have it rejected for "frame rate too high"; 30 fps is required. There appears to be no way to specify the frame rate in Screenshot, nor in iMovie during "share". Aside from using a third-party app (HandBrake) to re-encode the file, what can be done? Maybe more importantly, why is 30 fps required when it isn't a standard output of Screenshot or an iMovie app-preview project? By the way, iMovie's app-preview Help shows submitting non-1920x1080 files to App Store Connect.
Replies: 0 · Boosts: 0 · Views: 214 · Activity: Jul ’25
Image cropping
Currently, I’m working on developing a small macOS utility tool for my photography. In my camera, I have a digital zoom feature. I prefer using this feature when I shoot both JPEG and DNG files. While the JPEG is already cropped to the desired format, the DNG file contains metadata (DefaultUserCrop: 0.22, 0.22, 0.78, 0.78). For instance, when I open that DNG file in Lightroom, it pre-crops the image non-destructively. However, I prefer using Pixelmator Pro for editing. Unfortunately, Pixelmator Pro doesn’t have this feature. So, I thought I could create an app that allows me to pre-crop the image for editing in Pixelmator Pro afterward. Does someone have a better idea or some hints on how I could solve it?
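For the cropping math itself: DefaultUserCrop stores a normalized rectangle, and matching the values quoted above, the four numbers read naturally as top, left, bottom, right fractions (worth verifying against the DNG specification for real files). A sketch of the conversion to pixel coordinates, with names of my own choosing:

```swift
import Foundation

// Pixel-space crop rectangle derived from a normalized DefaultUserCrop.
struct PixelCrop: Equatable {
    let x: Int, y: Int, width: Int, height: Int
}

// Converts normalized (top, left, bottom, right) crop fractions into
// pixel coordinates for the decoded image.
func pixelCrop(defaultUserCrop c: (top: Double, left: Double, bottom: Double, right: Double),
               imageWidth: Int, imageHeight: Int) -> PixelCrop {
    let x = Int((c.left * Double(imageWidth)).rounded())
    let y = Int((c.top * Double(imageHeight)).rounded())
    let w = Int(((c.right - c.left) * Double(imageWidth)).rounded())
    let h = Int(((c.bottom - c.top) * Double(imageHeight)).rounded())
    return PixelCrop(x: x, y: y, width: w, height: h)
}
```

The resulting rectangle could then drive a CGImage cropping(to:) call on the decoded DNG before handing the file to Pixelmator Pro.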
Replies: 1 · Boosts: 0 · Views: 361 · Activity: Jul ’25
Why does converting HEIC/HEIF to JPEG using UIImage.jpegData(compressionQuality: 1.0) significantly increase file size?
I'm working with images selected from the iOS Photos app using PHPickerViewController. Some images appear as HEIF in the Photos info panel — which I understand are stored in the HEIC format, i.e., HEIF containers with HEVC-compressed images, commonly used on iOS when "High Efficiency" is enabled. To convert these images to JPEG, I'm using the standard UIKit approach:

if let image = UIImage(data: heicData) {
    let jpegData = image.jpegData(compressionQuality: 1.0)
}

However, I’ve noticed that this conversion often increases the image size significantly:

- Original HEIC/HEIF: ~3 MB
- Converted JPEG (quality: 1.0): ~8–12 MB

There’s no resolution change or image editing; it’s just a direct conversion. I understand that HEIC is more efficient than JPEG, but the increase in file size feels disproportionate. Is this kind of jump expected, or are there any recommended workarounds to avoid it?
Replies: 1 · Boosts: 0 · Views: 336 · Activity: Jul ’25
ExcUserFault and corrupted data when using UIImage#heicData
Over the last two weeks, I’ve had sporadic reports from users who suddenly have a corrupt image in the database for my app. It’s only affecting a few users and may possibly have been fixed with iOS 26.4.1 (I’m not sure). In any case, this started suddenly with the release of iOS 26.4 - our app has not been changed in several months. But I wanted to share what’s happening in case others are experiencing this. My app lets users import photos from the camera roll, photo albums, etc. Once the user has selected an image, the app saves this to a SQLite3 database using “image.heicData()”. For the four or five users who have been affected by this problem, the heicData call returns successfully, with a non-nil Data value. But the image itself is corrupt and unreadable. When the user tries to later open a screen containing the image, the app crashes. I’ve had to manually guide each user through tracking down and removing the affected item or items to resolve it, which is a bad experience for them and time-consuming for us. Our app crashes when it tries to read the image (using “UIImage(data: heicData)”). All users who have this problem have had an ExcUserFault file in their crash reports with our app name in it. It's not possible to symbolicate this file but i've included an excerpt at the bottom of this post: I was able to extract some raw data saved when this error occurs. When you run “file corrupt_image.heic”, you get: AmigaOS bitmap font "rtypheic", fc_YSize 0, 35001 elements which definitely doesn’t seem right. On a valid HEIC file, i get: ISO Media, HEIF Image HEVC Main or Main Still Picture Profile Is anyone else experience this? Or does anyone else have any suggestions about what could be happening? I submitted feedback FB22667639 about this. 
ExcUserFault example

Exception Type: EXC_GUARD
Exception Subtype: GUARD_TYPE_USER
Exception Message: namespc 7 reason_code 0x0000000000000009
Exception Codes: 0x6000000000000007, 0x0000000000000009
Termination Reason: Namespace LIBXPC, Code 9, XPC_EXIT_REASON_FAULT

Thread 0:
0 ??? 0x231fa997c 0x180000000 + 2985990524
1 ??? 0x197eb98b4 0x180000000 + 401316020
2 ??? 0x197ec4e04 0x180000000 + 401362436
3 ??? 0x197ec5ea0 0x180000000 + 401366688
4 ??? 0x19066bdb8 0x180000000 + 275168696
5 ??? 0x19066b968 0x180000000 + 275167592
6 ??? 0x19b5c19a4 0x180000000 + 459020708
7 ??? 0x19b5cfa2c 0x180000000 + 459078188
8 ??? 0x19b5cf838 0x180000000 + 459077688
9 ??? 0x197ec7c74 0x180000000 + 401374324
10 ??? 0x197ec991c 0x180000000 + 401381660
11 ??? 0x1bd74222c 0x180000000 + 1031021100
12 ??? 0x1bd744ba4 0x180000000 + 1031031716
13 ??? 0x1bd730e18 0x180000000 + 1030950424
14 ??? 0x1bd7458f8 0x180000000 + 1031035128
15 ??? 0x1bd730e18 0x180000000 + 1030950424
16 ??? 0x1bd731ae4 0x180000000 + 1030953700
17 ??? 0x1bd73bdac 0x180000000 + 1030995372
18 ??? 0x1bd73b6ac 0x180000000 + 1030993580
19 ??? 0x1e23283b0 0x180000000 + 1647477680
20 ??? 0x1e23278c0 0x180000000 + 1647474880
Replies
1
Boosts
0
Views
167
Activity
1d
Supported public API to open containing iOS app from Share Extension for image/PDF share sheet imports
Hello Apple Developer Forums,

We are building an iOS app that needs to receive images and PDFs shared from the system share sheet. The sources include Screenshots, Photos, Files, and third-party apps. The desired user experience is similar to apps such as ChatGPT or Claude: when the user taps our app in the share sheet, the main containing app opens and starts importing or uploading the shared image or PDF. We are trying to understand the supported public API for this behavior.

Why opening the containing app is important

For our use case, it is important that the containing app opens during the share flow. The import/upload operation depends on the user’s authenticated session. If the Share Extension attempts to upload the file directly, the auth token available to the extension could be missing, expired, or invalid. We would prefer not to make the Share Extension responsible for authentication-dependent behavior such as:

validating the user session
refreshing tokens
handling expired credentials
presenting login or re-authentication UI
owning upload retry logic tied to auth state

In our architecture, authentication and token refresh are owned by the containing app. The Share Extension should ideally only receive the shared file, persist it in an app group container, and hand off to the main app. The main app would then validate auth state, refresh tokens if needed, and perform the import/upload.

So the desired flow is:

Share Extension receives image/PDF → Share Extension stores file in app group container → Containing app opens → Containing app validates auth/session state → Containing app imports/uploads the file

The alternative flow is problematic for us:

Share Extension receives image/PDF → Share Extension attempts upload directly → Upload may fail if auth token is expired or unavailable → Share Extension would need auth/session responsibilities

We are trying to avoid having an authentication dependency inside the Share Extension implementation.
What we have tried

CFBundleDocumentTypes

We added document type support for: public.image, public.png, public.jpeg, public.heic, public.heif, com.adobe.pdf.

This works for some document-open flows, such as opening files from Files or Photos in certain cases. However, it does not make the app appear reliably as a share target from Screenshot Share or from some third-party app share sheets.

App Intents

We tried using App Intents with IntentFile and:

static var openAppWhenRun: Bool = true

However, this does not seem to create a general-purpose share-sheet receiver for arbitrary image or PDF NSItemProvider payloads.

Share Extension

We also implemented a Share Extension that:

Receives the shared NSItemProvider.
Stores the image or PDF in an app group container.
Attempts to open the containing app.

However, NSExtensionContext.open(_:completionHandler:) does not appear to foreground the containing app from a Share Extension in the way we need. We also tested responder-chain openURL: trampoline approaches, but those do not work reliably and appear to be unsupported as a public API contract.

Questions

Is there a supported public API for an iOS app to appear as a share target for arbitrary image/PDF NSItemProvider payloads and then directly open the containing app?
If apps such as ChatGPT or Claude appear to switch directly into the main app from the share sheet, is that behavior achievable using public APIs available to third-party developers?
If directly opening the containing app is not supported, what is the recommended architecture when the import/upload depends on authenticated app state?
Is Apple’s recommended design that the Share Extension itself must perform the full import/upload operation, even when that operation depends on auth validation or token refresh?
Is there a supported handoff mechanism where the Share Extension can persist the file in an app group container and then ask the system to open the containing app to continue the flow?
Are App Intents intended to support this kind of share-sheet attachment import flow, either currently or in a future iOS version?

Reproduction Steps

We created a focused sample project to reproduce the issue.

Build and run the app on a physical iPhone.
Leave the app installed.
Capture a screenshot.
Tap the screenshot thumbnail.
Tap the Share button.
Choose the app’s Share Extension from the share sheet.
Observe that the Share Extension receives the image payload.
Attempt to open the containing app from the extension.

Expected Result

The Share Extension receives the shared image or PDF, stores it in an app group container, and the containing app foregrounds. The containing app then validates the user’s authenticated session, refreshes tokens if needed, and performs the import/upload.

Actual Result

The Share Extension receives the image payload and logs the provider type identifiers, but the containing app does not reliably foreground. NSExtensionContext.open does not provide the desired transition, and responder-chain URL-opening workarounds do not appear to be supported or reliable.

Minimal Question

For image/PDF imports from the iOS share sheet, where upload/import requires authenticated app state, what is the supported implementation? Is it expected to be:

Share Extension receives the file → Share Extension performs auth-dependent upload/import itself

or is there a supported way to implement:

Share Extension receives the file → Share Extension stores the file in app group container → Share Extension opens or hands off to containing app → Main app performs auth validation and upload/import

Any guidance on the supported architecture would be appreciated. Thank you.
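For what it's worth, the persistence half of the flow described above is straightforward with public API; it is only the foregrounding step that lacks a supported mechanism. A sketch of the extension side, where the app group identifier "group.com.example.shared" is a placeholder for illustration:

```swift
import UIKit
import UniformTypeIdentifiers

// Inside the Share Extension: copy the incoming file into the shared
// app group container so the main app can pick it up on next launch.
func persistSharedItem(_ provider: NSItemProvider,
                       completion: @escaping (URL?) -> Void) {
    let type = UTType.image.identifier
    provider.loadFileRepresentation(forTypeIdentifier: type) { url, _ in
        guard let url,
              let container = FileManager.default.containerURL(
                  forSecurityApplicationGroupIdentifier: "group.com.example.shared")
        else { completion(nil); return }
        let destination = container.appendingPathComponent(url.lastPathComponent)
        try? FileManager.default.removeItem(at: destination)
        do {
            // The source URL is only valid inside this callback, so copy now.
            try FileManager.default.copyItem(at: url, to: destination)
            completion(destination)
        } catch {
            completion(nil)
        }
    }
}
```

The main app can then scan the container on foreground (scenePhase/applicationDidBecomeActive) and import anything it finds; what this sketch cannot do is force that foregrounding from the extension.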
Replies
1
Boosts
0
Views
77
Activity
4d
Supported public API to open containing iOS app from Share Extension for image/PDF share sheet imports
Hello Apple Developer Forums,

We are building an iOS app that needs to receive images and PDFs shared from the system share sheet. The sources include Screenshots, Photos, Files, and third-party apps. The desired user experience is similar to apps such as ChatGPT or Claude: when the user taps our app in the share sheet, the main containing app opens and starts importing or uploading the shared image or PDF. We are trying to understand the supported public API for this behavior.

What we have tried

CFBundleDocumentTypes

We added document type support for: public.image, public.png, public.jpeg, public.heic, public.heif, com.adobe.pdf. This works for some document-open flows, such as opening files from Files or Photos in certain cases. However, it does not make the app appear reliably as a share target from Screenshot Share or from some third-party app share sheets.

App Intents

We tried using App Intents with IntentFile and:

static var openAppWhenRun: Bool = true

However, this does not seem to create a general-purpose share-sheet receiver for arbitrary image or PDF NSItemProvider payloads.

Share Extension

We also implemented a Share Extension that:

Receives the shared NSItemProvider.
Stores the image or PDF in an app group container.
Attempts to open the containing app.

However, NSExtensionContext.open(_:completionHandler:) does not appear to foreground the containing app from a Share Extension in the way we need. We also tested responder-chain openURL: trampoline approaches, but those do not work reliably and appear to be unsupported as a public API contract.

Questions

Is there a supported public API for an iOS app to appear as a share target for arbitrary image/PDF NSItemProvider payloads and then directly open the containing app?
If apps such as ChatGPT or Claude appear to switch directly into the main app from the share sheet, is that behavior achievable using public APIs available to third-party developers?
If directly opening the containing app is not supported, is the recommended design to perform all upload/import work inside the Share Extension itself?
Are App Intents intended to support this kind of share-sheet attachment import flow, either currently or in a future iOS version?

Reproduction Steps

We created a focused sample project to reproduce the issue.

Build and run the app on a physical iPhone.
Leave the app installed.
Capture a screenshot.
Tap the screenshot thumbnail.
Tap the Share button.
Choose the app’s Share Extension from the share sheet.
Observe that the Share Extension receives the image payload.
Attempt to open the containing app from the extension.

Expected Result

The containing app should foreground and receive a URL or other handoff signal indicating that a shared file is available for import.

Actual Result

The Share Extension receives the image payload and logs the provider type identifiers, but the containing app does not reliably foreground. NSExtensionContext.open does not provide the desired transition, and responder-chain URL-opening workarounds do not appear to be supported or reliable.

Minimal Question

For image/PDF imports from the iOS share sheet, should the supported implementation be:

Share Extension receives the file → Share Extension performs the upload/import itself

rather than:

Share Extension receives the file → Share Extension opens containing app → Main app performs upload/import

Any guidance on the supported architecture would be appreciated. Thank you.
Replies
0
Boosts
0
Views
44
Activity
4d
EXC_BAD_ACCESS in drawHierarchy(in:afterScreenUpdates:) on iOS 26.3.1+ — IOSurface CIF10 decompression crash
We're experiencing an EXC_BAD_ACCESS (SIGSEGV) crash in UIView.drawHierarchy(in:afterScreenUpdates: false) that occurs only on iOS 26.3.1 and later. It does not reproduce on iOS 26.3.0 or earlier.

Crash Stack

Thread 0 (Main Thread) — EXC_BAD_ACCESS KERN_INVALID_ADDRESS
0 libvDSP.dylib vConvert_XRGB2101010ToARGB8888_vec
1 ImageIO IIOIOSurfaceWrapper_CIF10::CopyImageBlockSetWithOptions
2 ImageIO IIOImageProviderInfo::CopyImageBlockSetWithOptions
3 ImageIO CGImageReadGetBytesAtOffset
4 CoreGraphics CGAccessSessionGetBytes
5 CoreGraphics img_data_lock
6 CoreGraphics CGSImageDataLock
7 CoreGraphics ripc_AcquireImage
8 CoreGraphics ripc_DrawImage
9 CoreGraphics CGContextDrawImage
10 UIKitCore -[UIView(Rendering) drawHierarchy:afterScreenUpdates:]

The crash occurs during 10-bit CIF10 → 8-bit ARGB8888 pixel conversion when the IOSurface backing a UIImageView in the view hierarchy is deallocated mid-render.

How to Reproduce

Display a scrollable list with multiple UIImageViews loaded via an async image library.
Call drawHierarchy(in: bounds, afterScreenUpdates: false) on visible cells periodically.
Scroll to trigger image recycling.
Crash occurs sporadically; more likely under memory pressure or rapid image recycling.

What We've Tried

Both UIKit off-screen rendering approaches crash on iOS 26.3.1:

drawHierarchy(afterScreenUpdates: false): EXC_BAD_ACCESS in CIF10 IOSurface decompression
view.layer.render(in:): EXC_BAD_ACCESS in Metal (agxaAssertBufferIsValid)

iOS Version Correlation

iOS 26.3.0 and earlier: No crash
iOS 26.3.1 (23D8133)+: Crash occurs (~5 events per 7 days)

We suspect the ImageIO security patches in iOS 26.3 (CVE-2026-20675, CVE-2026-20634) may have changed IOSurface lifecycle timing, exposing a race condition between drawHierarchy's composited buffer read and asynchronous IOSurface reclamation by the OS.
Crash Data

We sampled 3 crash events:

Event 1 (iOS 26.3.1): 71 MB free memory — memory pressure
Event 2 (iOS 26.3.1): 88 MB free memory — memory pressure
Event 3 (iOS 26.3.2): 768 MB free memory — NOT memory pressure

Event 3 shows this isn't purely a low-memory issue. The IOSurface can be reclaimed even with ample free memory, likely due to async image recycling.

Questions

Is this a known regression in iOS 26.3.1?
Is there a safe way to snapshot a view hierarchy containing IOSurface-backed images without risking EXC_BAD_ACCESS?
Should drawHierarchy gracefully handle the case where an IOSurface backing store is reclaimed during the render?

Any guidance or workarounds would be appreciated. We've also filed this as Feedback (will update with FB number after submission).
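One avenue that might sidestep the CIF10 path (an assumption on my part, not a confirmed fix): force-decode downloaded images into plain 8-bit bitmap-backed CGImages at load time, so the snapshot never has to page in a lazily-decoded, compressed IOSurface. A minimal sketch:

```swift
import UIKit
import ImageIO

// Decode image data eagerly into a bitmap-backed CGImage. Lazily-decoded
// images can be backed by compressed surfaces that the system manages;
// a pre-decoded bitmap is memory the app itself owns for the image's
// lifetime, which may avoid the mid-render reclamation described above.
func predecodedImage(from data: Data) -> UIImage? {
    let options = [kCGImageSourceShouldCacheImmediately: true] as CFDictionary
    guard let source = CGImageSourceCreateWithData(data as CFData, nil),
          let cgImage = CGImageSourceCreateImageAtIndex(source, 0, options)
    else { return nil }
    return UIImage(cgImage: cgImage)
}
```

This trades memory for safety, so it is best combined with downsampling for list thumbnails; I have not verified it against this specific regression.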
Replies
2
Boosts
0
Views
278
Activity
2w
CoreMotion QuickTake Video Data iOS 17.5.1
I am conducting a forensic examination of a video, and it would be helpful if anyone knew how to decode the LiveTrackInfo in the metadata of a QuickTake .MOV recorded on iOS 17.5.1. There are 27 different fields, and I am not sure what each one represents. Any help would be appreciated! Thank you!

log-file
Replies
0
Boosts
0
Views
373
Activity
Mar ’26
Display 17 PRO series
Dear colleagues, when will you add the ability to manually adjust the display's color temperature? Everyone is familiar with the color rendering issue on the 17 series. Many complain about the display's yellowish tint. I'm one of those people, with a G9N panel, but I can't get whites right; there's a persistent yellow tint. TrueTone only solves this problem under cool, white lighting conditions. So, the display might work as intended, but how can I make it work consistently? If there were a way to manually adjust TrueTone, many users wouldn't be so upset when buying a new device, fearing the display would be yellow. I'd like users to be able to choose their preferred color, warm or cool, and have a scale to adjust it! This would solve the yellowish tint issue on 17 series displays! Thank you.
Replies
0
Boosts
0
Views
155
Activity
Feb ’26
App Store Connect rejects screenshot upload: “incorrect size” (subscription purchase flow) — tried all documented sizes
Hello Apple Developer Forums, I’m preparing to submit an app update that includes an in-app subscription. As part of the submission, I need to provide screenshots showing where the user initiates and completes the subscription purchase flow. The issue is that App Store Connect keeps rejecting my screenshot upload with an “incorrect size” (or size invalid) error. I have already tried exporting the screenshot in all sizes and resolutions described in Apple’s documentation, but none of them are being accepted so far. Could you please advise: What exact pixel dimensions / format requirements App Store Connect currently enforces for these screenshots (including file type and color profile, if relevant)? Whether there are any known issues or common causes for this error (e.g., metadata, alpha channel, scaling, or export settings)? Any recommended workflow/tools to generate a compliant screenshot that reliably uploads? Thank you in advance for your help.
Replies
3
Boosts
0
Views
175
Activity
Jan ’26
ProRAW: Demystify 48MP vs 12MP binning based on lighting?
Hi everyone, does anybody have any resources I could check out regarding the 48MP → 12MP binning behavior on supported sensors? I know the 48MP sensor on iPhone can automatically bin pixels for better low-light performance, but I'm not sure how to reliably make this happen in practice.

On iPhone 14 Pro and later with a 48MP sensor, I want the best of both worlds for ProRAW:

Bright light: 48MP full resolution
Low light: 12MP pixel-binned for better noise

photoOutput.maxPhotoDimensions = CMVideoDimensions(width: 8064, height: 6048)
let settings = AVCapturePhotoSettings(rawPixelFormatType: proRawFormat, processedFormat: [...])
settings.photoQualityPrioritization = .quality
// NOT setting settings.maxPhotoDimensions — always get 12MP

When I omit maxPhotoDimensions, iOS always returns 12MP regardless of lighting. When I set it to 48MP, I always get 48MP. Is there an API to let iOS automatically choose the optimal resolution based on conditions, or should I detect low light myself (via device.iso / exposureDuration) and set maxPhotoDimensions accordingly? Any help or direction would be much appreciated!
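As far as I know there is no "auto" mode, so the per-shot decision has to be made by the app. A sketch of the manual route described above, where the ISO threshold of 800 and the 1/30s exposure cutoff are arbitrary values I picked for illustration, not Apple-recommended numbers:

```swift
import AVFoundation

// Choose ProRAW output resolution per shot based on current exposure.
// Thresholds are illustrative; tune them against real captures.
func configuredSettings(for device: AVCaptureDevice,
                        rawFormat: OSType) -> AVCapturePhotoSettings {
    let settings = AVCapturePhotoSettings(rawPixelFormatType: rawFormat)
    settings.photoQualityPrioritization = .quality
    let lowLight = device.iso > 800
        || device.exposureDuration.seconds > 1.0 / 30.0
    if !lowLight {
        // Bright scene: request the full 48MP sensor resolution.
        settings.maxPhotoDimensions = CMVideoDimensions(width: 8064, height: 6048)
    }
    // In low light, leaving maxPhotoDimensions at its default yields the
    // 12MP binned output, per the behavior described in the post above.
    return settings
}
```

Reading device.iso and exposureDuration just before capture (or observing them via KVO) keeps the decision in sync with the scene at shutter time.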
Replies
0
Boosts
0
Views
701
Activity
Jan ’26
Cannot make my app appear in “Share with App” action in Shortcuts – How to allow receiving images from Shortcuts?
Hi, I’m trying to integrate my iOS app with Shortcuts. My goal is: in the Shortcuts app, create a shortcut, select an image, and share the image directly to my app for analysis. However, when I try to add the “Share with App” / “Open in App” / “Send to App” action in Shortcuts, my app does NOT appear in the list of available apps. I want my app to be selectable so that Shortcuts can send an image (UIImage / file) to my app.

What I have tried

My app supports receiving images using UIActivityViewController and a Share Extension.
I created an App Intents extension (AppIntent + @Parameter(file)...) but the app still does not appear in Shortcuts “Share with App”.
I also checked the Info.plist but didn’t find any permission related to Shortcuts.
The app is installed on the device and works normally.

My question

What permission, Info.plist entry, or capability is required so that my app becomes visible in the Shortcuts app as a target for image sharing? More specifically:

Which extension type should be used for receiving images from Shortcuts? App Intents extension? Share Extension? Intents extension?
Do I need a specific NSExtensionPointIdentifier for Shortcuts integration?
Do I need to declare a custom Uniform Type Identifier (UTI) or add supported content types so Shortcuts knows my app can handle images?
Are there any required entitlements / capabilities to make the app appear inside the “Share with App” action?

Goal Summary

I simply want: Shortcuts → Pick Image → Send to My App → App receives the image and processes it. But currently my app cannot be selected in Shortcuts. Thanks in advance for any guidance!
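For the App Intents side, a minimal action that accepts an image file looks like the sketch below. I can't confirm this alone surfaces the app in "Share with App" (the share sheet itself is typically a Share Extension's job), and the intent and parameter names here are illustrative:

```swift
import AppIntents

// A Shortcuts action ("Analyze Image") that accepts an image file and
// opens the app to process it. Names are placeholders for illustration.
struct AnalyzeImageIntent: AppIntent {
    static var title: LocalizedStringResource = "Analyze Image"
    static var openAppWhenRun: Bool = true

    // Restricting the parameter to image content lets Shortcuts offer
    // image variables/files when the user configures the action.
    @Parameter(title: "Image", supportedContentTypes: [.image])
    var file: IntentFile

    @MainActor
    func perform() async throws -> some IntentResult {
        let imageData = file.data  // raw bytes of the shared image
        // Hand imageData off to the app's analysis pipeline here.
        _ = imageData
        return .result()
    }
}
```

The intent appears as an action users can insert into a shortcut after "Select Photos", which covers the "pick image → send to my app" flow even if the share-sheet entry itself requires a separate Share Extension.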
Replies
3
Boosts
0
Views
398
Activity
Dec ’25
Since iOS 18.3, icons are no longer generated correctly with QLThumbnailGenerator
Since iOS 18.3, icons are no longer generated correctly with QLThumbnailGenerator. No error is returned either, but this error message now appears in the console:

Error returned from iconservicesagent image request: <ISTypeIcon: 0x3010f91a0>,Type: com.adobe.pdf - <ISImageDescriptor: 0x302f188c0> - (36.00, 36.00)@3x v:1 l:5 a:0:0:0:0 t:() b:0 s:2 ps:0 digest: B19540FD-0449-3E89-AC50-38F92F9760FE error: Error Domain=NSOSStatusErrorDomain Code=-609 "Client is disallowed from making such an icon request" UserInfo={NSLocalizedDescription=Client is disallowed from making such an icon request}

Does anyone know this error? Is there a workaround? Are there new permissions to consider? Here is how the icons are generated:

let request = QLThumbnailGenerator.Request(fileAt: url, size: size, scale: scale, representationTypes: self.thumbnailType)
request.iconMode = true
let generator = QLThumbnailGenerator.shared
generator.generateRepresentations(for: request) { [weak self] thumbnail, _, error in
}
Replies
16
Boosts
5
Views
1.8k
Activity
Nov ’25
“iOS 26 + BGContinuedProcessingTask: Why does a CPU/ML-intensive job run 4-5× slower in background?”
Hello All,

I’m a mobile-app developer working with iOS 26+ and I’m using BGContinuedProcessingTask to perform background work. My app’s workflow includes the following business logic:

Loading images via PHImageRequest.
Using a CLIP model to extract image embeddings.
Using an .mlmodel-based model to further process those embeddings.

For both model inferences I set computeUnits = .cpuAndNeuralEngine. When the app is moved to the background, I observe that the same workload (all three steps) becomes on average 4–5× slower than when the app is in the foreground.

In an attempt to diagnose the slowdown, I tried to profile with Xcode Instruments, but since a debugger was attached, the performance in background appeared nearly identical to foreground. Even when I detached the debugger, the measured system resource metrics (process CPU usage, system CPU usage, memory, QoS class, thermal state) showed no meaningful difference. Below are some of the metrics I captured:

Process CPU: 177% (foreground) → 153% (background), ~-24.1%; still >1.5 cores of work
System CPU: 56.1% → 38.4%, ~-17.7%
Process memory: 244.8 MB → 218.1 MB
QoS class: userInitiated in both cases
Thermal state: nominal in both cases

Given these results, I’m finding it hard to pinpoint why the overall latency is so much worse when the app is backgrounded, even though the obvious metrics show little variation. I suspect the cause may involve P-core vs E-core scheduling, or internal throttling/limiting of Neural Engine usage, but I cannot find clear documentation or logging to confirm this.

My question is: does anyone know why a CPU- (and Neural Engine-) intensive job like this would slow down so dramatically when using BGContinuedProcessingTask in the background on iOS 26+, despite apparently similar resource-usage metrics? Are there internal iOS scheduling/hardware-allocation behaviors (e.g., falling back to lower-performing cores when backgrounded) that might explain this?
Any pointers to Apple technical notes, system logs, or instrumentation I might use to detect which cores or compute units are being used would be greatly appreciated. Thank you for your time and any guidance you can provide. Best regards,
Replies
1
Boosts
0
Views
545
Activity
Nov ’25
how to create GIF files?
I want to create a GIF file and then display it with UIImage.
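Image I/O can write animated GIFs directly via CGImageDestination. A minimal sketch, where the frame delay and loop count are illustrative values:

```swift
import UIKit
import ImageIO
import UniformTypeIdentifiers

// Write an array of frames out as an animated GIF file.
func writeGIF(frames: [UIImage], to url: URL, frameDelay: Double = 0.1) -> Bool {
    guard let destination = CGImageDestinationCreateWithURL(
        url as CFURL, UTType.gif.identifier as CFString, frames.count, nil)
    else { return false }
    // File-level properties: loop count 0 means "loop forever".
    let fileProperties = [kCGImagePropertyGIFDictionary:
        [kCGImagePropertyGIFLoopCount: 0]] as CFDictionary
    CGImageDestinationSetProperties(destination, fileProperties)
    // Per-frame properties: delay between frames, in seconds.
    let frameProperties = [kCGImagePropertyGIFDictionary:
        [kCGImagePropertyGIFDelayTime: frameDelay]] as CFDictionary
    for frame in frames {
        guard let cgImage = frame.cgImage else { continue }
        CGImageDestinationAddImage(destination, cgImage, frameProperties)
    }
    return CGImageDestinationFinalize(destination)
}
```

Note that UIImage(contentsOfFile:) shows only the first frame of a GIF; to animate the frames in a UIImageView, use UIImage.animatedImage(with:duration:) with the same frame array, or a third-party GIF view.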
Replies
0
Boosts
0
Views
449
Activity
Nov ’25
Unity game rendering freezes on newer iPhone models

During normal gameplay, calling the assetBundle.Unload API very frequently causes the game's rendering to freeze while the background music keeps playing normally. This only happens on iPhone 16 and iPhone 17; older models show no problems at all. How can this issue be resolved?
Replies
1
Boosts
0
Views
827
Activity
Nov ’25
Not Getting Realistic Camera Output Even After Capturing RAW (.dng) Images on iOS
Hi everyone, I’m working on a custom camera implementation in iOS using native code. My goal is to capture unprocessed, realistic images directly from the camera — without any filters or post-image processing applied by the system. I’ve implemented RAW image capture using the native camera APIs (AVFoundation) and successfully received .dng files. However, even the RAW outputs don’t look like the real environment — the colors, tone, and exposure still seem processed or corrected in some way. I’ve tried various configurations such as photoSettings.rawPhotoPixelFormatType, experimenting with AVCaptureDevice and AVCapturePhotoOutput settings, and reviewing ProRAW and standard RAW behavior, but I’m still not getting truly unprocessed results that reflect the actual sensor data. Has anyone experienced similar results when capturing RAW images on iOS, or found a way to bypass Apple’s image signal processing (ISP) pipeline for more realistic captures? Any insights or references from Apple’s camera framework behavior would be greatly appreciated. Thank you!
Replies
0
Boosts
0
Views
340
Activity
Oct ’25
Present .icon in app for AppIcon Picker feature
I'm working on an AppIcon selector and would like to do something like UIImage(named: "AppIcon-Alternate") to present the icon for the user to choose, using the new Icon Composer icons. I've done a fair bit of research on this and it looks like this used to be possible (prior to .icon) with workarounds that were later 'fixed'/removed (appending 60x60 to the icon name). The only 'solution' seems to be bundling the exported images into the app itself, but this seems like a terrible idea as it massively bloats the app. Assuming we export from the new Icon Composer tool and want to include dark mode, that's roughly 3 MB per icon, which is shocking bloat and so a terrible solution. Looking into the app, the Assets.car actually generates PNG files for these alternate icons. These are in the JSON as "MultiSized Image" assets. Interestingly, UIImage(named:) is actually attempting to load these but fails to resolve a kCSIElementSignature. Also, the OS alert shown when switching to an alternate icon displays a preview of the icon, so this must be privately possible, and using Asset Catalog Tinkerer I'm able to see these PNGs. This feels like a broken API; I'd guess the new icon format is not correctly generating the entry in the Assets.car to link the generated PNGs for use with the UIImage(named:) API. Does anyone have pointers for this? This feels like a developer API afterthought or bug, but is it intentional?

Edit: I've submitted feedback for this, FB20341182.
Replies
0
Boosts
0
Views
364
Activity
Sep ’25
Images with unusual color spaces not correctly loaded by Core Image
Some users reported that their images are not loading correctly in our app. After a lot of debugging we identified the following:

This only happens when the app is built for Mac Catalyst, not on iOS, iPadOS, or “real” macOS (AppKit).
The images in question have unusual color spaces. We observed the issue for uRGB and eciRGB v2.
Those images are rendered correctly in Photos and Preview on all platforms.
When displaying the image inside of a UIImageView or in a SwiftUI Image, they render correctly. The issue only occurs when loading the image via Core Image.

When comparing the different Core Image render graphs between AppKit (working) and Catalyst (faulty) builds, they look identical, except for the result.

[screenshots: Mac (AppKit) render graph vs. Catalyst render graph]

Something seems to be off when Core Image tries to load an image with a foreign color space in Catalyst. We identified a workaround: by using a CGImageDestination to transcode the image with the kCGImageDestinationOptimizeColorForSharing option, Image I/O will convert the image to sRGB (or similar) and Core Image is able to load the image correctly. However, one potentially loses fidelity this way. Or might there be a better workaround?
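For reference, the transcoding workaround mentioned above can be sketched like this (writing HEIC here is my arbitrary choice of output format; any writable format should work):

```swift
import ImageIO
import UniformTypeIdentifiers

// Transcode an image with the "optimize color for sharing" option so
// Image I/O rewrites it into sRGB (or similar); the result then loads
// correctly in Core Image on Catalyst, at some cost in color fidelity.
func transcodeForSharing(source url: URL, to output: URL) -> Bool {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil),
          let destination = CGImageDestinationCreateWithURL(
              output as CFURL, UTType.heic.identifier as CFString, 1, nil)
    else { return false }
    let options = [kCGImageDestinationOptimizeColorForSharing: true] as CFDictionary
    CGImageDestinationAddImageFromSource(destination, source, 0, options)
    return CGImageDestinationFinalize(destination)
}
```

Because the conversion is lossy with respect to the original color space, it is best applied to a temporary copy used only for the Core Image pipeline, keeping the original file untouched.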
Replies
2
Boosts
3
Views
233
Activity
Aug ’25
Proper printing page from my app design
My app uses VStack and HStack instead of the normal table format. When I try to print, everything works perfectly, but it will not print the cell outline. What is the correct line of code or instruction terminology? Thanks, Hal
Replies
1
Boosts
0
Views
137
Activity
Aug ’25
making preview for app
I have a small .mov I created using Screenshot that I want to use as an app preview. I managed to resize it to the required 1920x1080, added a soundtrack using ffmpeg (Homebrew), dropped it into an iMovie app preview project, shared it as a file, and dragged that file to App Store Connect → Apps → myApp → “App Previews and Screenshots”, only to have it rejected for “frame rate too high”; 30 fps is required. There appears to be no way to specify frame rate in Screenshot or in iMovie during “share”. Aside from using a third-party app (HandBrake) to edit the file, what can be done? Maybe more importantly, why is 30 fps required when it isn't a standard output of Screenshot or an iMovie app preview project? By the way, iMovie's app preview help shows submitting non-1920x1080 files to App Store Connect.
Replies
0
Boosts
0
Views
214
Activity
Jul ’25
Image cropping
Currently, I’m working on developing a small macOS utility tool for my photography. In my camera, I have a digital zoom feature. I prefer using this feature when I shoot both JPEG and DNG files. While the JPEG is already cropped to the desired format, the DNG file contains metadata (DefaultUserCrop: 0.22, 0.22, 0.78, 0.78). For instance, when I open that DNG file in Lightroom, it pre-crops the image non-destructively. However, I prefer using Pixelmator Pro for editing. Unfortunately, Pixelmator Pro doesn’t have this feature. So, I thought I could create an app that allows me to pre-crop the image for editing in Pixelmator Pro afterward. Does someone have a better idea or some hints on how I could solve it?
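In case it helps: Image I/O exposes DNG metadata through CGImageSource, so one approach is to read the crop values from the file's {DNG} dictionary and apply them with CGImage.cropping(to:). The exact key names inside that dictionary vary by file, so treat the lookup below as a starting point and inspect the dictionary on a real file first:

```swift
import Foundation
import ImageIO
import CoreGraphics

// Read the DNG metadata dictionary so the default-crop values can be
// located. Dump the dictionary and inspect it to find the crop keys.
func dngProperties(of url: URL) -> [String: Any]? {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil),
          let props = CGImageSourceCopyPropertiesAtIndex(source, 0, nil)
              as? [String: Any]
    else { return nil }
    return props[kCGImagePropertyDNGDictionary as String] as? [String: Any]
}

// Given normalized crop bounds such as (0.22, 0.22) to (0.78, 0.78),
// convert to pixels and crop non-destructively for display/export.
func crop(_ image: CGImage, normalizedRect r: CGRect) -> CGImage? {
    let pixelRect = CGRect(x: r.origin.x * CGFloat(image.width),
                           y: r.origin.y * CGFloat(image.height),
                           width: r.width * CGFloat(image.width),
                           height: r.height * CGFloat(image.height))
    return image.cropping(to: pixelRect)
}
```

A small utility could then write the cropped result out as a TIFF or DNG-adjacent file for editing in Pixelmator Pro, leaving the original DNG untouched.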
Replies
1
Boosts
0
Views
361
Activity
Jul ’25
Why does converting HEIC/HEIF to JPEG using UIImage.jpegData(compressionQuality: 1.0) significantly increase file size?
I'm working with images selected from the iOS Photos app using PHPickerViewController. Some images appear as HEIF in the Photos info panel — which I understand are stored in the HEIC format, i.e., HEIF containers with HEVC-compressed images, commonly used on iOS when "High Efficiency" is enabled. To convert these images to JPEG, I'm using the standard UIKit approach: if let image = UIImage(data: heicData) { let jpegData = image.jpegData(compressionQuality: 1.0) } However, I’ve noticed that this conversion often increases the image size significantly: Original HEIC/HEIF: ~3 MB Converted JPEG (quality: 1.0): ~8–12 MB There’s no resolution change or image editing — it’s just a direct conversion. I understand that HEIC is more efficient than JPEG, but the increase in file size feels disproportionate. Is this kind of jump expected, or are there any recommended workarounds to avoid it?
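For what it's worth, quality 1.0 produces near-maximum-size JPEGs and almost always inflates re-encoded HEIC; a moderate quality usually lands near the original size. Two hedged options, where 0.8 is an arbitrary example value rather than a recommendation:

```swift
import UIKit
import ImageIO
import UniformTypeIdentifiers

// Simplest fix: re-encode at a moderate quality instead of 1.0.
// Sizes comparable to the HEIC original usually need roughly 0.7-0.8.
func jpegData(from image: UIImage, quality: CGFloat = 0.8) -> Data? {
    image.jpegData(compressionQuality: quality)
}

// Equivalent via Image I/O, which exposes the same compression knob
// explicitly through a destination property.
func jpegDataViaImageIO(from image: UIImage, quality: Double = 0.8) -> Data? {
    guard let cgImage = image.cgImage else { return nil }
    let data = NSMutableData()
    guard let dest = CGImageDestinationCreateWithData(
        data, UTType.jpeg.identifier as CFString, 1, nil) else { return nil }
    let options = [kCGImageDestinationLossyCompressionQuality: quality] as CFDictionary
    CGImageDestinationAddImage(dest, cgImage, options)
    guard CGImageDestinationFinalize(dest) else { return nil }
    return data as Data
}
```

If the goal is only compatibility with a server that can't read HEIC, re-encoding from the original HEIC data (rather than from a decoded UIImage) with a tuned quality keeps the size penalty smallest.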
Replies
1
Boosts
0
Views
336
Activity
Jul ’25