Photos & Camera


Explore technical aspects of capturing high-quality photos and videos, including exposure control, focus modes, and RAW capture options.

Posts under Photos & Camera subtopic


AVCaptureVideoPreviewLayer layoutSublayers invoked on background thread
Opening this question after discussing the issue in the AVCapture lab, in the hope that we can track it down here. We've been noticing some crashes in App Store Connect caused by layoutSublayers being called on a background thread. After debugging the issue a bit, we found that all calls that modified the AVCaptureSession or preview layer were indeed made on the main thread. It would be useful to know what causes AVCaptureVideoPreviewLayer.updateFormatDescription to be called. I've attached the crash log below. Crash log.ips - https://developer.apple.com/forums/content/attachment/800b0dba-3477-4c5a-b56c-f4cc393b384f
Replies: 1 · Boosts: 1 · Views: 1.1k · Activity: Jun ’25
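For anyone hitting a similar crash: a common mitigation (not confirmed as the root cause of this particular report) is to do session work on a dedicated serial queue while touching the preview layer only on the main thread, since it is a CALayer. A minimal sketch, with `session` and `previewLayer` as assumed names:

```swift
import AVFoundation
import CoreGraphics

final class CameraController {
    let session = AVCaptureSession()
    private let sessionQueue = DispatchQueue(label: "camera.session") // session work off-main
    lazy var previewLayer = AVCaptureVideoPreviewLayer(session: session)

    func start() {
        sessionQueue.async {
            self.session.beginConfiguration()
            // ... add inputs/outputs here ...
            self.session.commitConfiguration()
            self.session.startRunning()
        }
    }

    func updatePreviewFrame(_ frame: CGRect) {
        // The preview layer is a CALayer: mutate it on the main thread only,
        // otherwise layoutSublayers can fire on a background thread.
        DispatchQueue.main.async {
            self.previewLayer.frame = frame
        }
    }
}
```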
iOS 14, App crash when calling presentLimitedLibraryPickerFromViewController
iPhone 7: iOS 14.0 beta 5, Xcode beta. macOS: 10.15.5 (19F101). Crash info:

*** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '-[PHPhotoLibrary presentLimitedLibraryPickerFromViewController:]: unrecognized selector sent to instance xxxxxx' terminating with uncaught exception of type NSException

My code:

```objc
- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    if (@available(iOS 14, *)) {
        [[PHPhotoLibrary sharedPhotoLibrary] presentLimitedLibraryPickerFromViewController:self];
    }
}
```
Replies: 9 · Boosts: 0 · Views: 4.4k · Activity: Nov ’25
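Not a confirmed fix for this report, but a defensive pattern when an SDK/runtime mismatch (common on betas) makes a new selector unavailable: check for it at runtime before calling, rather than crashing. A sketch in Swift:

```swift
import PhotosUI
import UIKit

func presentLimitedPickerIfAvailable(from viewController: UIViewController) {
    let library = PHPhotoLibrary.shared()
    if #available(iOS 14, *) {
        let selector = #selector(PHPhotoLibrary.presentLimitedLibraryPicker(from:))
        if library.responds(to: selector) {
            library.presentLimitedLibraryPicker(from: viewController)
            return
        }
    }
    // Selector missing at runtime (e.g., mismatched beta): skip rather than crash.
    print("presentLimitedLibraryPicker(from:) is unavailable")
}
```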
Does PhotoKit provide access to People, Places, and Shared Albums?
I know how to search for Smart Albums (favorites, selfies, etc.) containing photos:

```objc
// Get smart albums
PHFetchResult *smartAlbums = [PHAssetCollection fetchAssetCollectionsWithType:PHAssetCollectionTypeSmartAlbum
                                                                      subtype:PHAssetCollectionSubtypeAlbumRegular
                                                                      options:nil];
```

I have the following questions:

- Is there a way to enumerate the People Smart Albums and access the photos in a specific People Smart Album?
- Is there a way to enumerate the Places Smart Albums and access the photos in a specific Places Smart Album?
- Is there a way to enumerate Shared Albums (shared to the current iCloud user) and access the photos in a specific Shared Album?
Replies: 1 · Boosts: 2 · Views: 807 · Activity: Mar ’26
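For the third question, shared iCloud albums are reachable through a public collection subtype; a sketch (People and Places smart albums have, to my knowledge, no public PhotoKit equivalent):

```swift
import Photos

// Fetch iCloud Shared Albums visible to the current user.
let shared = PHAssetCollection.fetchAssetCollections(
    with: .album,
    subtype: .albumCloudShared,
    options: nil
)
shared.enumerateObjects { collection, _, _ in
    let assets = PHAsset.fetchAssets(in: collection, options: nil)
    print("\(collection.localizedTitle ?? "Untitled"): \(assets.count) assets")
}
```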
Understanding CMIO Extension
Hello, I am getting the following errors when building a Mac Camera Extension with web sockets. I am using URLSessionWebSocketTask as my web socket library. I built a test program for my code, and there I can see my web sockets are working properly, but when I run it from the System Extension I get the following errors. The socket opens for two to three messages and then crashes. I couldn't find any documentation online for the following errors:

CMIOExtensionProvider.m:1975:-[CMIOExtensionProvider removeProviderContext:]_block_invoke Unregistered provider context <CMIOExtensionProviderContext: ->, don't be surprised if things go badly
CMIOExtensionProviderContext.m:64:-[CMIOExtensionProviderContext initWithConnection:]_block_invoke [391] received Connection invalid
Replies: 7 · Boosts: 0 · Views: 2.5k · Activity: Mar ’26
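The errors above look like the extension's XPC connection to the provider host being torn down rather than a fault in the socket code itself (a guess, not a diagnosis). For reference, a minimal URLSessionWebSocketTask receive loop; note that each receive call delivers a single message and must be re-armed:

```swift
import Foundation

final class SocketClient: NSObject, URLSessionWebSocketDelegate {
    private var task: URLSessionWebSocketTask?

    func connect(to url: URL) {
        let session = URLSession(configuration: .default, delegate: self, delegateQueue: nil)
        task = session.webSocketTask(with: url)
        task?.resume()
        receive()
    }

    private func receive() {
        task?.receive { [weak self] result in
            switch result {
            case .failure(let error):
                print("socket error: \(error)")   // reconnect/backoff would go here
            case .success(let message):
                print("received: \(message)")
                self?.receive()                   // re-arm: receive delivers one message
            }
        }
    }
}
```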
PHPicker fails to load RAW images
We observed that the PHPicker is unable to load RAW images captured on an iPhone in some scenarios, and it is also somehow related to iCloud. Here is the setup: The PHPickerViewController is configured with preferredAssetRepresentationMode = .current to avoid transcoding. The image is loaded from the item provider like this:

```swift
if itemProvider.hasItemConformingToTypeIdentifier(kUTTypeImage as String) {
    itemProvider.loadFileRepresentation(forTypeIdentifier: kUTTypeImage as String) { url, error in
        // work
    }
}
```

This usually works, also for RAW images. However, when trying to load a RAW image that has just been captured with the iPhone, the loading fails with the following errors on the console:

[claims] 43A5D3B2-84CD-488D-B9E4-19F9ED5F39EB grantAccessClaim reply is an error: Error Domain=NSCocoaErrorDomain Code=4097 "Couldn’t communicate with a helper application." UserInfo={NSUnderlyingError=0x2804a8e70 {Error Domain=NSCocoaErrorDomain Code=4097 "connection from pid 19420 on anonymousListener or serviceListener" UserInfo={NSDebugDescription=connection from pid 19420 on anonymousListener or serviceListener}}}

Error copying file type public.image. Error: Error Domain=NSItemProviderErrorDomain Code=-1000 "Cannot load representation of type public.image" UserInfo={NSLocalizedDescription=Cannot load representation of type public.image, NSUnderlyingError=0x280480540 {Error Domain=NSCocoaErrorDomain Code=4097 "Couldn’t communicate with a helper application." UserInfo={NSUnderlyingError=0x2804a8e70 {Error Domain=NSCocoaErrorDomain Code=4097 "connection from pid 19420 on anonymousListener or serviceListener" UserInfo={NSDebugDescription=connection from pid 19420 on anonymousListener or serviceListener}}}}}

We observed that on some devices, loading the image will actually work after a short time (~30 sec), but on others it will always fail. We think it is related to iCloud Photos: on a device that has iCloud Photos sync enabled, the picker is able to load the image right after it was synced to the cloud. On devices that don't sync the image, loading always fails. It seems that the sync process does some processing (?) of the image that later enables the picker to load it successfully, but that's just guessing.

Additional observations:

- This seems to only occur for images that were taken with the stock Camera app. When using Halide to capture RAW (either ProRAW or RAW), the picker is able to load the image.
- When trying to load the image as kUTTypeRawImage instead of kUTTypeImage, it also fails.
- The picker also can't load RAW images that were AirDropped from another device, unless they synced to iCloud first.
- This is reproducible using the Selecting Photos and Videos in iOS sample code project.
- We observed this happening in other apps that use the PHPicker, not just ours.

Is this a bug, or is there something that we are missing?
Replies: 7 · Boosts: 1 · Views: 3k · Activity: Sep ’25
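No workaround for the iCloud-dependent failure itself, but as a baseline, here is the loading pattern with the temporary-file caveat handled and the error surfaced (so users can at least retry after sync); UTType.image replaces the deprecated kUTTypeImage:

```swift
import Foundation
import UniformTypeIdentifiers

func loadImage(from itemProvider: NSItemProvider) {
    let type = UTType.image.identifier
    guard itemProvider.hasItemConformingToTypeIdentifier(type) else { return }
    itemProvider.loadFileRepresentation(forTypeIdentifier: type) { url, error in
        guard let url else {
            // Surfacing the error makes the picker failure visible to users.
            print("load failed: \(error?.localizedDescription ?? "unknown")")
            return
        }
        // The file at `url` is deleted after this closure returns: copy it out.
        let dest = FileManager.default.temporaryDirectory
            .appendingPathComponent(url.lastPathComponent)
        try? FileManager.default.removeItem(at: dest)
        do { try FileManager.default.copyItem(at: url, to: dest) }
        catch { print("copy failed: \(error)") }
    }
}
```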
iPadOS 17 external camera exposure
I'm developing an iPad app that will be mostly dedicated to a certain external camera for visually impaired people. The Linux UVC API (e.g., using guvcview) allows enabling automatic exposure for the camera. The iOS API isExposureModeSupported unfortunately returns false for all of the exposure modes. Is this a bug? Or perhaps AVFoundation doesn't support UVC exposure yet?
Replies: 1 · Boosts: 2 · Views: 748 · Activity: Jul ’25
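For reference, the usual exposure-configuration pattern on devices where the mode is supported (which, per the report, returns false for this UVC camera):

```swift
import AVFoundation

func enableAutoExposure(on device: AVCaptureDevice) {
    guard device.isExposureModeSupported(.continuousAutoExposure) else {
        print("continuous auto exposure unsupported on \(device.localizedName)")
        return
    }
    do {
        try device.lockForConfiguration()           // required before mutating
        device.exposureMode = .continuousAutoExposure
        device.unlockForConfiguration()
    } catch {
        print("lockForConfiguration failed: \(error)")
    }
}
```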
I need a way to permanently disable Reactions from my app, ideally the universe too
So I've spent the last five years optimizing my video AI system so that it runs with less than 5% CPU while processing a 30fps video feed on a MacBook Pro M2, and everything is great, until Sonoma comes out, and I find myself consuming 40% CPU for the exact same workload. So I fire up Instruments, and the "heaviest stack trace" (see screenshot) turns out to be Espresso doing some completely unasked-for and absolutely useless processing on my video frames. I turn off Reactions, but nothing helps - the CPU consumption stays at 40%. "Reactions" is nothing but a useless toy to please some WWDC keynote fanboys; I don't want it anywhere near my app or my users, and I especially do not want to take the blame for it pissing away the user's CPU cycles and battery. Now, how do I make it go away, for ever? Best regards, Jacob
Replies: 10 · Boosts: 6 · Views: 1.5k · Activity: Jul ’25
PHLivePhotoEditingContext.saveLivePhoto results in AVFoundation error -11800 "The operation could not be completed" reason An unknown error occurred (-12815)
When trying to edit some Live Photos, calling PHLivePhotoEditingContext.saveLivePhoto results in the following error:

Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo={NSLocalizedFailureReason=An unknown error occurred (-12815), NSLocalizedDescription=The operation could not be completed, NSUnderlyingError=0x300d05380 {Error Domain=NSOSStatusErrorDomain Code=-12815 "(null)"}}

I was able to replicate it on my device by taking a new Live Photo. Not sure what's wrong with that one specifically; not all Live Photos replicate the issue. I've submitted FB15880825 with a sysdiagnose and Photos diagnostics as well. Any ideas what's going on here? It's impacting multiple customers. Thanks!
Replies: 1 · Boosts: 0 · Views: 624 · Activity: Jun ’25
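For context, the editing flow in question, sketched end to end (assuming a PHContentEditingInput already obtained from the asset; this is the call path that fails, not a fix for the -12815 error). The "com.example.edit" identifier is a hypothetical placeholder:

```swift
import Photos

func saveEditedLivePhoto(input: PHContentEditingInput, asset: PHAsset) {
    guard let context = PHLivePhotoEditingContext(livePhotoEditingInput: input) else { return }
    let output = PHContentEditingOutput(contentEditingInput: input)
    output.adjustmentData = PHAdjustmentData(
        formatIdentifier: "com.example.edit",   // hypothetical identifier
        formatVersion: "1.0",
        data: Data()
    )
    context.saveLivePhoto(to: output, options: nil) { success, error in
        guard success else {
            // The -11800/-12815 failure surfaces here.
            print("saveLivePhoto failed: \(String(describing: error))")
            return
        }
        PHPhotoLibrary.shared().performChanges({
            let request = PHAssetChangeRequest(for: asset)
            request.contentEditingOutput = output
        }, completionHandler: nil)
    }
}
```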
Unable to Capture 24MP Photos
Hello, I'm wondering how to capture 24MP photos. I'm currently testing on an iPhone 16 Pro Max. By default, the device's activeFormat supports 24MP (photo dimensions: {4032x3024, 5712x4284}). For the photoOutput, I'm setting maxPhotoDimensions to videoDevice.activeFormat.supportedMaxPhotoDimensions.lastObject, and setting maxPhotoQualityPrioritization to quality. When capturing, I'm applying the same maxPhotoDimensions and photoQualityPrioritization settings from the photoOutput directly to the AVCapturePhotoSettings. What could be the issue?

```objc
// setup
[self.photoOutput setMaxPhotoQualityPrioritization:AVCapturePhotoQualityPrioritizationQuality];
CMVideoDimensions maxPhotoDimensions = [(NSValue *)videoDevice.activeFormat.supportedMaxPhotoDimensions.lastObject CMVideoDimensionsValue];
[self.photoOutput setMaxPhotoDimensions:maxPhotoDimensions];

// capturing
AVCapturePhotoSettings *photoSettings = [AVCapturePhotoSettings photoSettings];
photoSettings.maxPhotoDimensions = self.photoOutput.maxPhotoDimensions;
photoSettings.photoQualityPrioritization = self.photoOutput.maxPhotoQualityPrioritization;
[self.photoOutput capturePhotoWithSettings:photoSettings delegate:photoCaptureDelegate];
...
```
Replies: 2 · Boosts: 0 · Views: 1.2k · Activity: Jan ’26
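One common gotcha here (a hedged suggestion, not a confirmed diagnosis of this report): ordering. Changing the session preset, the device's activeFormat, or the session topology after configuring the output can reset maxPhotoDimensions to the default, so it is safest to set it last, inside the same configuration pass. A Swift sketch:

```swift
import AVFoundation

func configureFor24MP(session: AVCaptureSession,
                      device: AVCaptureDevice,
                      photoOutput: AVCapturePhotoOutput) {
    session.beginConfiguration()
    if session.canAddOutput(photoOutput) { session.addOutput(photoOutput) }
    photoOutput.maxPhotoQualityPrioritization = .quality
    // Set dimensions last: format/preset changes can reset this property.
    if let largest = device.activeFormat.supportedMaxPhotoDimensions.last {
        photoOutput.maxPhotoDimensions = largest   // e.g. 5712x4284
    }
    session.commitConfiguration()
}

func capture24MP(photoOutput: AVCapturePhotoOutput,
                 delegate: AVCapturePhotoCaptureDelegate) {
    let settings = AVCapturePhotoSettings()
    settings.maxPhotoDimensions = photoOutput.maxPhotoDimensions
    settings.photoQualityPrioritization = photoOutput.maxPhotoQualityPrioritization
    photoOutput.capturePhoto(with: settings, delegate: delegate)
}
```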
Torch Freezes Ultra-Wide Camera When Switching Between Wide & Ultra-Wide Lenses (AVFoundation Bug?)
I'm developing an iOS app using AVFoundation for real-time video capture and object detection. While implementing torch functionality with camera switching (between Wide and Ultra-Wide lenses), I encountered a critical issue where the camera freezes when toggling the torch while the Ultra-Wide camera is active.

Issue:
- If the torch is ON and I switch from Wide to Ultra-Wide, the camera freezes.
- If the Ultra-Wide camera is active and I try to turn the torch ON, the camera freezes.
- The iPhone Camera app allows using the torch while recording video with the Ultra-Wide lens, so this should be possible via AVFoundation as well.

Code snippet:

```swift
DispatchQueue.global(qos: .userInitiated).async { [weak self] in
    guard let self = self else { return }
    let isSwitchingToUltraWide = !self.isUsingFisheyeCamera
    let cameraType: AVCaptureDevice.DeviceType = isSwitchingToUltraWide ? .builtInUltraWideCamera : .builtInWideAngleCamera
    let cameraName = isSwitchingToUltraWide ? "Ultra Wide" : "Wide"

    guard let selectedCamera = AVCaptureDevice.default(cameraType, for: .video, position: .back) else {
        DispatchQueue.main.async {
            self.showAlert(title: "Camera Error", message: "\(cameraName) camera is not available on this device.")
        }
        return
    }

    do {
        let currentInput = self.videoCapture.captureSession.inputs.first as? AVCaptureDeviceInput
        self.videoCapture.captureSession.beginConfiguration()
        if isSwitchingToUltraWide && self.isFlashlightOn {
            self.forceEnableTorchThroughWide()
        }
        if let currentInput = currentInput {
            self.videoCapture.captureSession.removeInput(currentInput)
        }
        let videoInput = try AVCaptureDeviceInput(device: selectedCamera)
        self.videoCapture.captureSession.addInput(videoInput)
        self.videoCapture.captureSession.commitConfiguration()
        self.videoCapture.updateVideoOrientation()

        DispatchQueue.main.async {
            if let barButton = sender as? UIBarButtonItem {
                barButton.title = isSwitchingToUltraWide ? "Wide" : "Ultra Wide"
                barButton.tintColor = isSwitchingToUltraWide ? UIColor.systemGreen : UIColor.white
            }
            print("Switched to \(cameraName) camera.")
        }
        self.isUsingFisheyeCamera.toggle()
    } catch {
        DispatchQueue.main.async {
            self.showAlert(title: "Camera Error", message: "Failed to switch to \(cameraName) camera: \(error.localizedDescription)")
        }
    }
}
```

Expected Behavior:
- The torch should work when Ultra-Wide is active, just like in the iPhone Camera app.
- The camera should not freeze when switching between Wide and Ultra-Wide with the torch ON.
- AVCaptureSession should not crash when toggling the torch while Ultra-Wide is active.

Questions & Help Needed:
- Is this a known issue with AVFoundation?
- How does the iPhone Camera app allow using the torch while recording in Ultra-Wide?
- What's the correct way to switch between Wide and Ultra-Wide cameras without freezing when the torch is active?

Info:
- Devices tested: iPhone 13 Pro / iPhone 15 Pro / iPhone 15
- iOS version: iOS 17.3 / iOS 18.0
- Xcode version: 16.2
Replies: 2 · Boosts: 1 · Views: 751 · Activity: Oct ’25
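A hedged suggestion rather than a confirmed answer: on many iPhones the ultra-wide module has no torch of its own, so one approach is to keep a single dual-wide virtual device in the session (selecting the ultra-wide field of view via videoZoomFactor) instead of swapping physical devices; the torch then stays bound to a device that has one. Sketch:

```swift
import AVFoundation

// Use the dual-wide *virtual* device; switch fields of view with
// videoZoomFactor (see virtualDeviceSwitchOverVideoZoomFactors)
// instead of removing/adding inputs while the torch is on.
func configureDualWide(session: AVCaptureSession) throws -> AVCaptureDevice? {
    guard let device = AVCaptureDevice.default(.builtInDualWideCamera,
                                               for: .video, position: .back)
    else { return nil }
    let input = try AVCaptureDeviceInput(device: device)
    session.beginConfiguration()
    if session.canAddInput(input) { session.addInput(input) }
    session.commitConfiguration()
    return device
}

func setTorch(_ on: Bool, device: AVCaptureDevice) {
    guard device.hasTorch else { return }   // false on many ultra-wide modules
    do {
        try device.lockForConfiguration()
        device.torchMode = on ? .on : .off
        device.unlockForConfiguration()
    } catch {
        print("torch configuration failed: \(error)")
    }
}
```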
DockKit gimbal reported yaw drifts by upwards of 45 degrees after running for a while
This is an issue with the Insta360 Flow Pro 2. My iOS app uses DockKit to control the gimbal; in particular, my app disables tracking and sends angular velocity commands to control the gimbal's orientation. I only try to modify the yaw (rotation around the vertical axis), never the pitch or roll. Note that I don't send the gimbal to a particular orientation directly; I modify the velocity. Everything works great for a long period of time: typically for a continuous run of 4-6 hours; in the most recent case, I managed about 36 hours of continuous operation before the following problem occurred. I came back to check on the system, and because no visual activity had occurred in the camera's field of view for a while, the phone had commanded the gimbal to rotate back to a yaw angle of 0 degrees. So the phone in the gimbal should have been looking straight ahead (i.e. the 0 degree yaw position), but it was definitely looking off at an angle. I've seen this twice now. The first time, when it should have been looking straight ahead, it was in fact looking 60 degrees off center. This time (caught on video, see below), it was off by 22 degrees from center. Here's the weird part: the gimbal reports this way-off-center positioning as zero degrees (well, close enough to zero, like 0.2, which is fine). But, mechanically, the gimbal still knows where zero degrees is: if we double-click the trigger of the Flow Pro 2, which is supposed to reset the gimbal to 0 degrees yaw and pitch, the gimbal responds correctly and reorients to a 0 degree position. However, the yaw values it reports are not zero but, as shown in my video, about 22 degrees off axis. Power cycling the gimbal and restarting immediately fixes the problem. Also, I switched from my app to the Insta360 app, which caused the phone to flip from landscape to portrait; then, when I returned to my app and switched back to landscape, the gimbal started reporting correct yaw angles again. Is there a possibility this is a bug in the DockKit framework? Has anyone seen this? I have a case open with Insta360, but although it's clearly a software issue, it's not clear whether it's in Insta360's code or the DockKit layer. Any ideas for how I can get out of this mode? My concern is that the phone is on a tripod about 10' off the floor and not very accessible. Also, if all goes well, we may have about 50 of these systems running, and having to fix them one by one after a few hours is not good. For a demonstration of this bug, see the following video: https://octoparry.com/offset.MOV Any help greatly appreciated.
Replies: 4 · Boosts: 0 · Views: 602 · Activity: Jun ’25
DockKit custom tracking does not work on iOS 18.3
Hi all, I'm using the Apple sample code below to create an application using DockKit: "Controlling a DockKit accessory using your camera app" https://developer.apple.com/documentation/dockkit/controlling-a-dockkit-accessory-using-your-camera-app?changes=_8 I used Vision hand recognition and passed the observation data to dockAccessory.track, but Belkin and Insta360 devices never move on an iPhone 16 Pro Max with iOS 18.3. If I use other functions like face search (system tracking) in the app, those work OK. I used a Belkin and an Insta360 Flow 2 Pro to reproduce the problem. My friend also says that the custom tracking feature was working fine on the iOS 18 beta, but on the recent iOS 18.3 that feature does not work. If I could get the iOS 18.0 beta, we could test that feature, but I cannot revert my iOS from 18.3 to the 18.0 beta. Regards, TO
Replies: 1 · Boosts: 1 · Views: 359 · Activity: Dec ’25
Images with unusual color spaces not correctly loaded by Core Image
Some users reported that their images are not loading correctly in our app. After a lot of debugging we identified the following:

- This only happens when the app is built for Mac Catalyst, not on iOS, iPadOS, or "real" macOS (AppKit).
- The images in question have unusual color spaces. We observed the issue for uRGB and eciRGB v2.
- Those images are rendered correctly in Photos and Preview on all platforms.
- When displaying the image inside a UIImageView or a SwiftUI Image, they render correctly. The issue only occurs when loading the image via Core Image.
- When comparing the Core Image render graphs between the AppKit (working) and Catalyst (faulty) builds, they look identical, except for the result (screenshots of the Mac (AppKit) and Catalyst graphs omitted here).

Something seems to be off when Core Image tries to load an image with a foreign color space in Catalyst. We identified a workaround: by using a CGImageDestination to transcode the image with the kCGImageDestinationOptimizeColorForSharing option, Image I/O will convert the image to sRGB (or similar), and Core Image is able to load it correctly. However, one potentially loses fidelity this way. Might there be a better workaround?
Replies: 2 · Boosts: 3 · Views: 233 · Activity: Aug ’25
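A sketch of the workaround described above; CGImageSourceGetType keeps the original container format where possible (falling back to JPEG is an assumption for untyped data):

```swift
import Foundation
import ImageIO
import UniformTypeIdentifiers

// Transcodes image data so Image I/O converts it to sRGB (or similar),
// after which Core Image loads it correctly under Catalyst.
func transcodeForSharing(_ data: Data) -> Data? {
    guard let source = CGImageSourceCreateWithData(data as CFData, nil) else { return nil }
    let type = CGImageSourceGetType(source) ?? (UTType.jpeg.identifier as CFString)
    let out = NSMutableData()
    guard let dest = CGImageDestinationCreateWithData(out as CFMutableData, type, 1, nil)
    else { return nil }
    let options = [kCGImageDestinationOptimizeColorForSharing: true] as CFDictionary
    CGImageDestinationAddImageFromSource(dest, source, 0, options)
    return CGImageDestinationFinalize(dest) ? (out as Data) : nil
}
```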
PHFetchOptions: Full List of Supported Predicate Keys
Hi, could anybody share the full list of supported predicate keys for PHFetchOptions? I'm aware of the list posted in the documentation: https://developer.apple.com/documentation/photos/phfetchoptions However, I have reason to believe that this is not an exhaustive list, and there also seem to be mistakes in this doc, e.g. isFavorite does not work but favorite does. Through some experimentation I also found that this works:

NSPredicate(format: "adjustmentFormatIdentifier == 'com.pixelmatorteam.touch.x.photo.PhotosAdjustmentData.EmbeddedSlimSidecarFileInfo.compressed'")

even though adjustmentFormatIdentifier is not listed as a supported key. Are there other secret keys that you are aware of? Specifically, I want to filter a fetch result for edited items. Something like this:

NSPredicate(format: "hasAdjustments == true")

(I tried this; it doesn't work.) The native Photos app has such a filter, which leads me to believe that there probably is a key for this. If one of the framework developers reads this: could you please update the documentation page with this information? Finally, if there really aren't any more secret keys, is there a way to achieve this with adjustmentFormatIdentifier? I have tried a bunch of stuff already, like adjustmentFormatIdentifier != nil, but for some reason that gives me the exact opposite of what I want: all the photos without edits. 🫠 Any tips on the correct syntax here would be much appreciated.
Replies: 1 · Boosts: 0 · Views: 163 · Activity: Jun ’25
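For reference, the working variant mentioned above in fetch form (mediaType and creationDate are documented keys; favorite is the spelling that works in practice per the post, despite the docs listing isFavorite):

```swift
import Photos

let options = PHFetchOptions()
// "favorite" works even though the docs list "isFavorite".
options.predicate = NSPredicate(format: "favorite == YES && mediaType == %d",
                                PHAssetMediaType.image.rawValue)
options.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
let favorites = PHAsset.fetchAssets(with: options)
print("favorite images: \(favorites.count)")
```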
AVCapturePhotoOutput crashes at delegate callback on macOS 13.7.5
A functioning Multiplatform app, which includes use of Continuity Camera on an M1 Mac Mini running Sequoia 15.5, works correctly capturing photos with AVCapturePhoto. However, that app (and a test app just for Continuity Camera) crashes at the delegate callback when run on a 2017 MacBook Pro under macOS 13.7.5. The app was created with Xcode 16 (various releases) using Swift 6 (but tried with 5). Compiling and running the test app with Xcode 15.2 on the 13.7.5 machine also crashes at the delegate callback. The iPhone 15 Continuity Camera gets detected and set up correctly, and preview video works correctly. It's when the capture-photo code runs that the crash occurs. The relevant capture code is:

```swift
func capturePhoto() {
    let captureSettings = AVCapturePhotoSettings()
    captureSettings.flashMode = .auto
    photoOutput.maxPhotoQualityPrioritization = .quality
    photoOutput.capturePhoto(with: captureSettings, delegate: PhotoDelegate.shared)
    print("**** CameraManager: capturePhoto")
}
```

and the delegate callbacks are:

```swift
class PhotoDelegate: NSObject, AVCapturePhotoCaptureDelegate {
    nonisolated(unsafe) static let shared = PhotoDelegate()

    // MARK: - Delegate callbacks
    func photoOutput(
        _ output: AVCapturePhotoOutput,
        didFinishProcessingPhoto photo: AVCapturePhoto,
        error: (any Error)?
    ) {
        print("**** CameraManager: didFinishProcessingPhoto")
        guard let pData = photo.fileDataRepresentation() else {
            print("**** photoOutput is empty")
            return
        }
        print("**** photoOutput data is \(pData.count) bytes")
    }

    func photoOutput(
        _ output: AVCapturePhotoOutput,
        willBeginCaptureFor resolvedSettings: AVCaptureResolvedPhotoSettings
    ) {
        print("**** CameraManager: willBeginCaptureFor")
    }

    func photoOutput(_ output: AVCapturePhotoOutput, willCapturePhotoFor resolvedSettings: AVCaptureResolvedPhotoSettings) {
        print("**** CameraManager: willCapturePhotoFor")
    }
}
```

The significant parts of the crash report are:

Crashed Thread: 3 Dispatch queue: com.apple.cmio.CMIOExtensionProviderHostContext
Exception Type: EXC_BAD_ACCESS (SIGSEGV)
Exception Codes: KERN_INVALID_ADDRESS at 0x0000000000000000
Exception Codes: 0x0000000000000001, 0x0000000000000000
Termination Reason: Namespace SIGNAL, Code 11 Segmentation fault: 11
Terminating Process: exc handler [30850]
VM Region Info: 0 is not in any region. Bytes before following region: 4296495104
      REGION TYPE   START - END         [ VSIZE] PRT/MAX SHRMOD REGION DETAIL
      UNUSED SPACE AT START
--->  __TEXT        100175000-10017f000 [  40K]  r-x/r-x SM=COW ...tinuityCamera

Thread 0:: Dispatch queue: com.apple.main-thread
0  libsystem_kernel.dylib 0x7ff803aed552 mach_msg2_trap + 10
1  libsystem_kernel.dylib 0x7ff803afb6cd mach_msg2_internal + 78
2  libsystem_kernel.dylib 0x7ff803af4584 mach_msg_overwrite + 692
3  libsystem_kernel.dylib 0x7ff803aed83a mach_msg + 19
4  CoreFoundation 0x7ff803c07f8f __CFRunLoopServiceMachPort + 145
5  CoreFoundation 0x7ff803c06a10 __CFRunLoopRun + 1365
6  CoreFoundation 0x7ff803c05e51 CFRunLoopRunSpecific + 560
7  HIToolbox 0x7ff80d694f3d RunCurrentEventLoopInMode + 292
8  HIToolbox 0x7ff80d694d4e ReceiveNextEventCommon + 657
9  HIToolbox 0x7ff80d694aa8 _BlockUntilNextEventMatchingListInModeWithFilter + 64
10 AppKit 0x7ff806ca59d8 _DPSNextEvent + 858
11 AppKit 0x7ff806ca4882 -[NSApplication(NSEvent) _nextEventMatchingEventMask:untilDate:inMode:dequeue:] + 1214
12 AppKit 0x7ff806c96ef7 -[NSApplication run] + 586
13 AppKit 0x7ff806c6b111 NSApplicationMain + 817
14 SwiftUI 0x7ff90e03a9fb 0x7ff90dfb4000 + 551419
15 SwiftUI 0x7ff90f0778b4 0x7ff90dfb4000 + 17578164
16 SwiftUI 0x7ff90e9906cf 0x7ff90dfb4000 + 10340047
17 ContinuityCamera 0x10017b49e 0x100175000 + 25758
18 dyld 0x7ff8037d1418 start + 1896

Thread 1:
0 libsystem_pthread.dylib 0x7ff803b27bb0 start_wqthread + 0

Thread 2:
0 libsystem_pthread.dylib 0x7ff803b27bb0 start_wqthread + 0

Thread 3 Crashed:: Dispatch queue: com.apple.cmio.CMIOExtensionProviderHostContext
0  ??? 0x0 ???
1  AVFCapture 0x7ff82045996c StreamAsyncStillCaptureCallback + 61
2  CoreMediaIO 0x7ff813a4358f __94-[CMIOExtensionProviderHostContext captureAsyncStillImageWithStreamID:uniqueID:options:reply:]_block_invoke + 498
3  libxpc.dylib 0x7ff803875b33 _xpc_connection_reply_callout + 36
4  libxpc.dylib 0x7ff803875ab2 _xpc_connection_call_reply_async + 69
5  libdispatch.dylib 0x7ff80398b099 _dispatch_client_callout3 + 8
6  libdispatch.dylib 0x7ff8039a6795 _dispatch_mach_msg_async_reply_invoke + 387
7  libdispatch.dylib 0x7ff803991088 _dispatch_lane_serial_drain + 393
8  libdispatch.dylib 0x7ff803991d6c _dispatch_lane_invoke + 417
9  libdispatch.dylib 0x7ff80399c3fc _dispatch_workloop_worker_thread + 765
10 libsystem_pthread.dylib 0x7ff803b28c55 _pthread_wqthread + 327
11 libsystem_pthread.dylib 0x7ff803b27bbf start_wqthread + 15

Of course, the MacBook Pro is an old device, but Continuity Camera works with the installed Photo Booth app, so it's possible. Any thoughts on solving this situation would be appreciated. Regards, Michaela
Replies: 1 · Boosts: 0 · Views: 517 · Activity: Nov ’25
Format of 14-bit RAW bayer data from lower bit camera sensor?
I'm working on an application that uses the iPhone camera for scientific purposes, and as a result I would like to receive the sensor data in as unprocessed a format as possible. I'm using AVCapturePhotoOutput to take Bayer RAW stills and receiving data in kCVPixelFormatType_14Bayer_RGGB format. However, I'm puzzled as to the content of the bits. I simply demosaic the image by taking each 2x2 square:

RG
GB

and use R, (G+G)/2, B to get 16-bit RGB values - and this indeed works. However, I am puzzled by the values we are getting, as they seem to be approximately in the range 2048 - 16383. The top value is understandable: it is the maximum that you can fit in 14 bits (as implied by the pixel format type). However, we don't seem to be able to get lower than ~2048, no matter how black/dark we make the sensor. I'm aware that the sensor is probably not 14-bit (we're using the iPhone 16e camera) and that this may have to do with the way the sensor data is packaged. The Advances in iOS Photography video (https://developer.apple.com/videos/play/wwdc2016/501/) describes it as "10-bit sensor RAW packaged in 14 bits per pixel instead of eight." Is there any documentation describing what is going on here? It's vital for our use that we get as close to the raw camera sensor light readings as possible, so any pointers as to the mapping (e.g. decompanding?) being used would be extremely useful. Many thanks in advance for your help.
Replies: 3 · Boosts: 0 · Views: 202 · Activity: May ’25
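A sketch of the demosaic described above, treating the observed ~2048 floor as a black-level pedestal to subtract; that interpretation is an assumption (the actual packing/companding is exactly what the question asks about):

```swift
import Foundation

// Hypothetical 2x2 RGGB demosaic matching the post, with the observed
// ~2048 minimum treated as a black level (assumed, not documented).
func demosaicRGGB(_ bayer: [UInt16], width: Int, height: Int) -> [(r: Int, g: Int, b: Int)] {
    let blackLevel = 2048           // observed floor; assumed pedestal
    var out: [(Int, Int, Int)] = []
    out.reserveCapacity((width / 2) * (height / 2))
    for y in stride(from: 0, to: height - 1, by: 2) {
        for x in stride(from: 0, to: width - 1, by: 2) {
            let r  = Int(bayer[y * width + x])           - blackLevel
            let g1 = Int(bayer[y * width + x + 1])       - blackLevel
            let g2 = Int(bayer[(y + 1) * width + x])     - blackLevel
            let b  = Int(bayer[(y + 1) * width + x + 1]) - blackLevel
            out.append((max(r, 0), max((g1 + g2) / 2, 0), max(b, 0)))
        }
    }
    return out
}
```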
[iOS 18.0] PHImageManager request image crash
I am seeing a crash on iOS 18.0 in my app due to PHImageManager.default().requestImage. The same crash is seen even while using requestImageDataAndOrientation. Not sure why this is happening only on iOS 18.0. Any help on this would be appreciated.

Exception Type: EXC_BAD_ACCESS (SIGSEGV)
Exception Subtype: KERN_INVALID_ADDRESS at 0x0000000000000010
Exception Codes: 0x0000000000000001, 0x0000000000000010
VM Region Info: 0x10 is not in any region. Bytes before following region: 4310777840
      REGION TYPE   START - END   [ VSIZE] PRT/MAX SHRMOD REGION DETAIL
      UNUSED SPACE AT START
--->
Termination Reason: SIGNAL 11 Segmentation fault: 11
Terminating Process: exc handler [695]
Triggered by Thread: 14

Thread 14 name: Dispatch queue: */file=33) .Generic
Thread 14 Crashed:
0  libobjc.A.dylib 0x1944d7020 objc_msgSend + 32
1  PhotoLibraryServices 0x1b080262c +[PLUniformTypeIdentifier utiWithCompactRepresentation:conformanceHint:] + 40
2  Photos 0x1b0081354 _resourceInfoFromResultDict + 1048
3  Photos 0x1b0080a7c fetchResourcesForChoosing + 584
4  Photos 0x1b0080714 ___fetchNonHintResources_block_invoke.227 + 152
5  PhotoLibraryServices 0x1b026b3c4 __53-[PLManagedObjectContext _directPerformBlockAndWait:]_block_invoke + 48
6  CoreData 0x19f171b00 developerSubmittedBlockToNSManagedObjectContextPerform + 228
7  libdispatch.dylib 0x19eeff584 _dispatch_client_callout + 16
8  libdispatch.dylib 0x19eef5728 _dispatch_lane_barrier_sync_invoke_and_complete + 56
9  CoreData 0x19f1c2a0c -[NSManagedObjectContext performBlockAndWait:] + 308
10 PhotoLibraryServices 0x1b026cea4 -[PLManagedObjectContext _directPerformBlockAndWait:] + 144
11 PhotoLibraryServices 0x1b026cdf8 -[PLManagedObjectContext performBlockAndWait:] + 196
12 Photos 0x1b00804a4 _fetchNonHintResources + 292
13 Photos 0x1afeedb38 PHChooserListContinueEnumerating + 144
14 Photos 0x1afeed9f8 -[PHImageResourceChooser presentNextQualifyingResource] + 412
15 Photos 0x1afeed1d0 -[PHImageRequest startRequest] + 2424
16 libdispatch.dylib 0x19eee5aac _dispatch_call_block_and_release + 32
17 libdispatch.dylib 0x19eeff584 _dispatch_client_callout + 16
18 libdispatch.dylib 0x19eeee2d0 _dispatch_lane_serial_drain + 740
19 libdispatch.dylib 0x19eeeede0 _dispatch_lane_invoke + 440
20 libdispatch.dylib 0x19eef91dc _dispatch_root_queue_drain_deferred_wlh + 292
21 libdispatch.dylib 0x19eef8a60 _dispatch_workloop_worker_thread + 540
22 libsystem_pthread.dylib 0x2214d5660 _pthread_wqthread + 292
23 libsystem_pthread.dylib 0x2214d29f8 start_wqthread + 8
Replies: 6 · Boosts: 0 · Views: 213 · Activity: May ’25
PhotosPicker obtains the result information of the selected video
In an Apple Vision Pro (AVP) project, a picker pops up, and we only want to filter for spatial videos. When selecting one of the spatial videos, the selection result returns empty. How can we obtain the video selected by the user and get the path and URL of the file? The code is as follows:

```swift
PhotosPicker(selection: $selectedItem, matching: .videos) {
    Text("Choose a spatial photo or video")
}

func loadTransferable(from imageSelection: PhotosPickerItem) -> Progress {
    return imageSelection.loadTransferable(type: URL.self) { result in
        DispatchQueue.main.async {
            // guard imageSelection == self.imageSelection else { return }
            print("Loaded selection: \(result)")
            switch result {
            case .success(let url?):
                self.selectSpatialVideoURL = url
                print("Video URL: \(url)")
            case .success(nil):
                break // Handle the success case with an empty value.
            case .failure(let error):
                print("Spatial error: \(error)")
                // Handle the failure case with the provided error.
            }
        }
    }
}
```
Replies: 4 · Boosts: 0 · Views: 204 · Activity: May ’25
Customized IOKit extension does not work
Hi guys, how can I achieve the following feature on macOS when a USB device (Camera/Mic/Speaker) is connected:

- When a third-party video conferencing app is not in a meeting, ensure the app defaults to using the USB device (Camera/Mic/Speaker).
- When a third-party conferencing app is in a meeting, ensure the app automatically switches to the USB device (Camera/Mic/Speaker).

I want to use an IOKit extension to hide or ignore the built-in camera to realize this requirement. However, the extension can't be loaded due to invalid permissions on macOS 15.4.1 (build 24E263). I also tried running it on macOS 14.4.1, where it can be loaded, but it can't auto-load when the laptop restarts because the KDK version doesn't match. Could you please give me some suggestions? Is it possible to hide the built-in camera on M-series Macs? Is there any other method to realize this feature? Thanks a lot.
Replies: 7 · Boosts: 0 · Views: 442 · Activity: Jun ’25
Does CMIO support "hiding" the built-in camera?
Hi guys, can I use CMIO to achieve the following feature on macOS when a USB device (Camera/Mic/Speaker) is connected:

- When a third-party video conferencing app is not in a meeting, ensure the app defaults to using the USB device (Camera/Mic/Speaker).
- When a third-party conferencing app is in a meeting, ensure the app automatically switches to the USB device (Camera/Mic/Speaker).
Replies: 2 · Boosts: 0 · Views: 398 · Activity: Jul ’25