Apple Watch app closes when changing photo permissions in the iPhone app.
App A is installed on both a paired iPhone and Apple Watch, and I'm running it on both devices.
When I change App A's photo permissions on my iPhone, App A running on my Apple Watch automatically closes.
At first, I assumed App A on my iPhone was terminating abnormally and taking the watch app down with it.
However, I've determined that changing the photo permissions itself is what closes the app.
I don't think this behavior existed before watchOS/iOS 26.
Is this behavior an intentional addition in watchOS/iOS 26?
To reproduce: while App A is running on my iPhone, I go to the home screen and change its photo permissions in the Settings app; App A running on my Apple Watch then closes automatically.
Photos & Camera
Explore technical aspects of capturing high-quality photos and videos, including exposure control, focus modes, and RAW capture options.
Recently Apple gave us the ability to upload asset resources in the background. We implemented our background upload extension, but when our CI tried to upload the app to TestFlight we got an error that the extension point identifier (in our case com.apple.photos.backgound-upload) is not an official one. Any idea when it will become official so we can release a working background upload?
Hi,
I’m trying to implement the new PhotoKit PHBackgroundResourceUploadExtension. I created the extension, enabled full photo library access in the host app, and registered the extension point using the string: com.apple.photos.background-upload.
However, when I attempted to enable the extension with:
try library.setUploadJobExtensionEnabled(true)
I received the following error:
Error Domain=PHPhotosErrorDomain Code=-1 "(null)"
This happens when running the app on Xcode 26.1 and 26.2 Beta, using the iPhone 17 Pro Max simulator (iOS 26.1 and 26.2).
My question is: Is this extension supported on the simulator?
I’m asking because at the moment it’s difficult for me to test this on a physical device.
Also, what does this error mean?
Thanks.
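For context on the enable call, here is a minimal sketch built only from what is quoted in this thread: setUploadJobExtensionEnabled(_:) is taken verbatim from the post above and its exact signature is an assumption, while the authorization check uses the standard PhotoKit API.

import Photos

func enableBackgroundUpload() async {
    // Full library access is a stated prerequisite for the upload extension.
    let status = await PHPhotoLibrary.requestAuthorization(for: .readWrite)
    guard status == .authorized else {
        print("Need full photo library access, got \(status)")
        return
    }
    let library = PHPhotoLibrary.shared()
    do {
        try library.setUploadJobExtensionEnabled(true)  // as quoted above; assumed API
    } catch {
        // PHPhotosErrorDomain Code=-1 is generic; log the full error to inspect it.
        print("Enabling the upload extension failed: \(error)")
    }
}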
I'm trying to benchmark a Core Image filter chain's memory footprint and noticed a weird quirk in Instruments.
On a real device, even with a simple Core Image chain, memory balloons each time I run the filter. See the attached screenshots.
Running on iPhone 17 Pro:
Running on a simulator (M2 MacBook Pro):
As you can see, there's a huge buildup of 4 MB "VM: IOSurface" allocations on the real device, but the simulator seems to clean them up correctly.
Here's my basic code:
func processImage() {
    guard let inputImage = ContentViewModel.loadImageFromBundle(name: "kitty.HEIC") else {
        print("Failed to load kitty.HEIC from bundle")
        return
    }
    // Build the filter chain (lazy; nothing renders yet).
    var outputImage = inputImage
    outputImage = outputImage.applyingFilter("CIBloom", parameters: [
        kCIInputRadiusKey: 20,
        kCIInputIntensityKey: 0.8
    ])
    // Rendering actually happens here, off the main thread.
    DispatchQueue.global(qos: .userInitiated).async {
        let data = self.context.jpegRepresentation(of: outputImage,
                                                   colorSpace: CGColorSpace(name: CGColorSpace.sRGB)!)
        if let data = data, let uiImage = UIImage(data: data) {
            DispatchQueue.main.async {
                self.displayImage = Image(uiImage: uiImage)
            }
        }
    }
}
Why is this happening? It seems like a bug to me, or maybe I need to release an object somewhere. At the very least it makes it challenging to measure memory usage.
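A mitigation worth trying, offered as a sketch rather than a confirmed fix: reuse a single CIContext created with cacheIntermediates disabled (a real CIContext option), and render inside an autoreleasepool so IOSurface-backed buffers can be reclaimed between runs. Whether this clears this specific "VM: IOSurface" growth is an assumption.

import CoreImage

// Create the context once; per-call contexts each build their own caches.
let context = CIContext(options: [
    .cacheIntermediates: false  // don't keep per-filter intermediates alive
])

func render(_ image: CIImage) -> Data? {
    // The pool lets Core Image's temporary surfaces be released promptly.
    autoreleasepool {
        context.jpegRepresentation(
            of: image,
            colorSpace: CGColorSpace(name: CGColorSpace.sRGB)!
        )
    }
}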
Any help is greatly appreciated.
Alex
Device: iPhone 17 Pro
iOS Version: iOS 26.1
Camera: Ultra-wide (0.5x) using AVCaptureSession
Our camera app freezes on iPhone 17 when switching frame rates (30fps ↔ 60fps). This works fine on iPhone 16 Pro and earlier.
What We've Observed:
The freeze happens on frame rate change, particularly when stabilization is enabled
Thread.sleep is used to allow the camera hardware to settle before re-enabling stabilization
It works on older iPhones; only iPhone 17 exhibits this behavior
Console shows these errors before the freeze (the first value is -17281 rendered as an unsigned 64-bit integer):
<<<< FigXPCUtilities >>>> signalled err=18446744073709534335
<<<< FigCaptureSourceRemote >>>> err=-17281
Is Thread.sleep on the main thread causing the freeze? Should all camera configuration be on a background queue?
Is there something specific about iPhone 17 ultra-wide camera that requires different handling?
Should we use session.beginConfiguration() / session.commitConfiguration() instead of direct device configuration?
Is calling setFrameRate from a property's didSet (which runs synchronously) problematic?
Are the FigCaptureSourceRemote errors (-17281) indicative of the problem, and what do they mean?
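For reference, a sketch of the pattern these questions point toward: do all configuration on one serial session queue, bracket it with beginConfiguration()/commitConfiguration(), and avoid Thread.sleep entirely. The function and queue names are illustrative; only the AVFoundation calls are real API.

import AVFoundation

let sessionQueue = DispatchQueue(label: "camera.session")  // owns all session work

func setFrameRate(_ fps: Int32, device: AVCaptureDevice, session: AVCaptureSession) {
    sessionQueue.async {
        // Refuse rates the active format can't deliver instead of sleeping.
        guard device.activeFormat.videoSupportedFrameRateRanges.contains(where: {
            $0.minFrameRate <= Double(fps) && Double(fps) <= $0.maxFrameRate
        }) else {
            print("\(fps) fps not supported by the active format")
            return
        }
        session.beginConfiguration()
        do {
            try device.lockForConfiguration()
            let duration = CMTime(value: 1, timescale: fps)
            device.activeVideoMinFrameDuration = duration
            device.activeVideoMaxFrameDuration = duration
            device.unlockForConfiguration()
        } catch {
            print("lockForConfiguration failed: \(error)")
        }
        session.commitConfiguration()
    }
}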
PHPhotoLibrary.authorizationStatus(for: .readWrite) == .authorized
Info.plist Privacy - Photo Library Usage Description set
I check authorization before attempting to get photoPickerItem.itemIdentifier, but the return value from itemIdentifier is nil every time. It seems I'm missing some permission, but I'm unsure why the system still keeps _shouldExposeItemIdentifier set to false.
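For context, the identifier behavior is tied to how the picker is created: itemIdentifier stays nil unless the picker is constructed with an explicit photo library, regardless of authorization status. A minimal sketch assuming the SwiftUI PhotosPicker (the view name is illustrative):

import SwiftUI
import PhotosUI

struct PickerView: View {
    @State private var selection: PhotosPickerItem?

    var body: some View {
        PhotosPicker(
            selection: $selection,
            matching: .images,
            photoLibrary: .shared()  // without this, itemIdentifier is nil
        ) {
            Text("Pick a photo")
        }
        .onChange(of: selection) { _, item in  // iOS 17 onChange signature
            print("identifier:", item?.itemIdentifier ?? "nil")
        }
    }
}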
I am new to Swift and iOS development, and I have a question about video capture performance.
Is it possible to capture video at a resolution of 4032×3024 while simultaneously running a vision/ML model on the video stream (e.g., using Vision or CoreML)?
I want to know:
whether iOS devices support capturing video at that resolution,
whether the frame rate drops significantly at that scale,
and whether it is practical to run a Vision/ML model in real-time while recording at such a high resolution.
If anyone has experience with high-resolution AVCaptureSession setups or combining them with real-time ML processing, I would really appreciate guidance or sample code.
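A discovery sketch that may help frame the question: 4032×3024 is a 4:3 photo-style dimension, so first check whether any video format on the target device actually offers it, and at what frame rate, before assuming real-time ML is feasible there. Only the AVFoundation calls are real API; the function itself is illustrative.

import AVFoundation

func findFormats(width: Int32, height: Int32) {
    guard let device = AVCaptureDevice.default(.builtInWideAngleCamera,
                                               for: .video, position: .back) else { return }
    for format in device.formats {
        let dims = CMVideoFormatDescriptionGetDimensions(format.formatDescription)
        guard dims.width == width, dims.height == height else { continue }
        let maxFPS = format.videoSupportedFrameRateRanges.map(\.maxFrameRate).max() ?? 0
        print("Found \(dims.width)x\(dims.height) @ up to \(maxFPS) fps")
    }
}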
Hi everyone,
I’m seeing recurring internal AVFoundation camera logs on iOS 26.2 and I’m trying to understand whether this is expected behavior or a regression in the capture pipeline.
These logs appear shortly after starting an AVCaptureSession, while video frames are being delivered, and also when the camera is stopped or the capture session is torn down.
<<<< FigXPCUtilities >>>> signalled err=-17281 at <>:302
<<<< FigCaptureSourceRemote >>>> Fig assert: "err == 0 " at bail (FigCaptureSourceRemote.m:569) - (err=-17281)
To rule out issues caused by my own code, I had GPT generate a minimal SwiftUI example from scratch. Even in this clean, minimal setup, the same logs appear on iOS 26.2, while the exact same logic did not produce these logs on iOS 18.x.
My primary interest is real-time processing of the video frames delivered by the camera (via AVCaptureVideoDataOutput), for tasks such as analysis, computer vision, or custom frame handling, while simultaneously displaying the live preview.
Thanks in advance for any insight.
I'm adopting Liquid Glass in iOS 26. When I test document scanning with VNDocumentCameraViewController after enabling Liquid Glass, there's a crash just after a photo is taken in VNDocumentCameraViewController. Here's the screenshot from when it crashed:
The exception output in the Xcode console is:
*** Terminating app due to uncaught exception 'NSInternalInconsistencyException', reason: 'Layout requested for visible navigation bar, <UINavigationBar: 0x1240bde00; frame = (0 117; 390 54); opaque = NO; tintColor = UIExtendedSRGBColorSpace 1 1 0 1; layer = <CALayer: 0x120c21e60>> standardAppearance=0x12407b900 scrollEdgeAppearance=0x12407bb80 compactAppearance=0x12407b880 no-scroll-edge-support, when the top item belongs to a different navigation bar. topItem = <UINavigationItem: 0x1240bd800> style=navigator leftBarButtonItems=0x123d4e5f0 rightBarButtonItems=0x123d4d5a0, navigation bar = <UINavigationBar: 0x107b9ad00; frame = (0 47; 390 54); opaque = NO; autoresize = W; tintColor = UIExtendedSRGBColorSpace 1 1 0 1; layer = <CALayer: 0x120c20150>> delegate=0x10a805200 standardAppearance=0x107b2c300 scrollEdgeAppearance=0x107b2c280 compactAppearance=0x107b2c100, possibly from a client attempt to nest wrapped navigation controllers.'
*** First throw call stack:
(0x18e1db994 0x18b0f5814 0x18c092aa0 0x193b18660 0x193a7d540 0x193a7e020 0x1953ec4a0 0x1943b7d78 0x18ed83420 0x18ed82f74 0x18eb83134 0x18eb44c10 0x18eb70bc4 0x18eb7e74c 0x193ac8cd0 0x193ac8c04 0x193ad6afc 0x193ad5f8c 0x27b456560 0x18e12c4cc 0x18e15c0b0 0x18e15bfd8 0x18e133c1c 0x18e132a6c 0x22ed54498 0x193af6ba4 0x193a9fa78 0x193bcb68c 0x102cc2718 0x102cc2688 0x102cc2794 0x18b14ae28)
libc++abi: terminating due to uncaught exception of type NSException
I’m writing to report a serious usability regression in the iOS 26 Photos app. Folders can still be created and albums can still be assigned to them, but folders can no longer be opened to view the albums they contain. A container that cannot be opened is not a container, and this breaks a fundamental information architecture model that has existed in Photos for well over a decade.
This change disproportionately harms users who maintain large, intentional photo libraries—travel archives, projects, professional work, or long-term personal documentation—where hierarchy and ordering are essential. Search and automated surfacing are not substitutes for deliberate structure. Removing the ability to browse folder → album hierarchy on iOS strips users of control while still exposing the UI for folder creation, which is internally inconsistent.
If this behavior is intentional, it should be clearly documented and the folder UI removed to avoid misleading users. If it is not intentional, it needs urgent correction. At minimum, iOS should retain parity with macOS Photos for basic navigation of folders and albums. This is not a niche request; it is a regression in core functionality.
Environment
Device: iPhone 15 Pro
iOS: iOS 18.0
Framework: AVFoundation
App type: Custom camera app using AVCaptureSession + AVCaptureVideoPreviewLayer
I’m seeing an intermittent but frequent issue where the camera preview layer briefly flashes empty after certain interruptions, even though the capture session reports itself as running and no errors are emitted.
This happens most often after:
Locking and unlocking the device
Switching cameras (back ↔ front)
The issue is not 100% reproducible, but occurs often enough to be noticeable in normal usage.
What happens
The preview layer briefly flashes as empty (sometimes just a “micro-frame”)
Duration: typically ~0.5–2 seconds before frames resume
session.isRunning == true throughout
No crash, no runtime error, no interruption end failure
Focus/exposure restore correctly once frames resume
Visually it looks like the preview layer loses frames temporarily, even though the session appears healthy.
Repro
Intermittent but frequent after:
Lock → unlock device
Switching camera (front/back)
Timing-dependent and non-deterministic
Happens multiple times per session, but not every time
Key observation
AVCaptureSession.isRunning == true does not guarantee that frames are actually flowing.
To verify this, I added an AVCaptureVideoDataOutput temporarily:
During the blank period, no sample buffers are delivered
Frames resume after ~1–2s without any explicit restart
Session state remains “running” the entire time
What I’ve tried (did NOT fix it)
Adding delays before/after startRunning() (0.1–0.5s)
Calling startRunning() on different queues
Restarting the session in AVCaptureSessionInterruptionEnded
Verifying session.connections (all show isActive == true)
Rebuilding inputs/outputs during interruption recovery
Ensuring startRunning() is never called between beginConfiguration() / commitConfiguration()
(Hit the expected runtime warning when attempted)
None of the above removed the brief blank preview.
Workaround (works visually but expensive)
Keeping an AVCaptureVideoDataOutput attached and using its delivered frames to cover the gap visually fixes the issue, but:
Energy impact jumps from Low → High in Xcode Energy Gauge
AVCaptureVideoDataOutput processes 30–60 FPS continuously
The gap only lasts ~1–2s, but toggling the delegate on/off cleanly is difficult
Overall CPU and energy cost is not acceptable for production
Additional notes
CPU usage is already relatively high even without the workaround (this app is camera-heavy by nature)
With the workaround enabled, energy impact becomes noticeably worse
The issue feels like a timing/state desync between session state and actual frame delivery, not a UI issue
Questions
Is this a known behavior where AVCaptureSession.isRunning == true but frames are temporarily unavailable after interruptions?
Is there a recommended way to detect actual frame flow resumption (not just session state)?
Should the AVCaptureVideoPreviewLayer.connection (isActive / isEnabled) be explicitly checked or reset after interruptions?
Is there a lightweight, energy-efficient way to bridge this short “no frames” gap without using AVCaptureVideoDataOutput?
Is rebuilding the entire session the only reliable solution here, or is there a better pattern Apple recommends?
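On the frame-flow detection question, one lightweight sketch (names illustrative, thresholds guesses): keep a video data output attached but only record a timestamp per buffer, so the per-frame cost stays near zero, and compare that timestamp against the clock to detect a stall.

import AVFoundation
import QuartzCore

final class FrameWatchdog: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    // In production, synchronize access to this; omitted here for brevity.
    private(set) var lastFrameTime = CACurrentMediaTime()

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        lastFrameTime = CACurrentMediaTime()  // cheap: never touches pixel data
    }

    var framesAreFlowing: Bool {
        CACurrentMediaTime() - lastFrameTime < 0.5  // stall threshold is a guess
    }
}

Setting alwaysDiscardsLateVideoFrames = true on the output should keep the energy cost well below a full processing pipeline, though whether that registers as "Low" in the Energy Gauge is untested here.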
We have a very strange issue that I am trying to solve or find the best practice for.
We have a SwiftUI View that uses the Camera to preview. So as suggested in Apples Docs we check authorisation status and then if it's not determined we request authorisation.
We also have the privacy entry in the Info.plist.
switch AVCaptureDevice.authorizationStatus(for: .video) {
case .notDetermined:
    AVCaptureDevice.requestAccess(for: .video) { accessStatusAuthorised in
        if !accessStatusAuthorised {
            self.cameraStatus = .notAuthorised
        } else {
            self.isAuthorized = true
            self.cameraStatus = .authorised
            self.startCameraSession(cameraPosition: cameraPosition)
        }
    }
case .restricted:
    cameraStatus = .notAuthorised
    isAuthorized = false
case .denied:
    cameraStatus = .notAuthorised
    isAuthorized = false
case .authorized:
    cameraStatus = .authorised
    isAuthorized = true
    startCameraSession(cameraPosition: cameraPosition)
@unknown default:
    // Treat unknown future cases as not authorised.
    isAuthorized = false
    cameraStatus = .notAuthorised
}
However, when we call this code, it freezes the camera feed, even after Allow has been tapped.
Here is the confusing part: if we do not call the code above, we still get the camera-permission popup, and the camera works fine after allowing.
What I'm concerned about is removing the explicit check, only for a possible Apple bug to be fixed later, at which point none of our apps would get camera access.
I cannot see anywhere that the process has changed for iOS 26 / Xcode 26.
Can anyone shed any light on this, or has anyone had a similar experience?
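One thing worth isolating, shown as a sketch: requestAccess(for:) invokes its completion handler on an arbitrary queue, so the state updates and session start in the snippet above should hop back to the main queue explicitly. If the freeze disappears with this wrapper, the original code was likely racing the main thread.

import AVFoundation

func requestCameraAccess(onMain completion: @escaping (Bool) -> Void) {
    AVCaptureDevice.requestAccess(for: .video) { granted in
        // The callback arrives on an arbitrary queue; hop to main before
        // touching view state or starting the capture session.
        DispatchQueue.main.async { completion(granted) }
    }
}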
At which point in the image processing pipeline does iOS apply the white balance gains which can be set via AVCaptureDevice.setWhiteBalanceModeLocked(with:completionHandler:)?
Are those gains applied in the analog part of the camera pipeline, before the pixel voltage gets converted via the ADC to digital values? Or does the camera first convert the pixel voltages to digital values and then the gains are applied to the digital values?
Is this consistent across devices or can the behavior vary from device to device?
Hi Apple Developer Support Team,
We are developing an iOS application using a camera package within a hybrid (cross-platform) framework, and we would like to confirm whether it is possible to disable the camera shutter sound programmatically.
As per our understanding, the shutter sound on iOS is system-controlled and depends on the device’s silent/ring mode, and there is no App Store–approved API available to force-disable this sound. Kindly confirm whether this understanding is correct or if any supported alternative approach exists for hybrid or native implementations.
Thank you for your clarification.
Best regards,
ParkhyaSolutions
Hello,
I am getting the following error while attempting to run my LockedCameraCapture compatible app on an iOS 15 device:
dyld[434]: Library not loaded: '/System/Library/Frameworks/LockedCameraCapture.framework/LockedCameraCapture'
Referenced from: '/private/var/containers/Bundle/Application/.../MyApp.app/MyApp.debug.dylib'
Reason: tried: '/System/Library/Frameworks/LockedCameraCapture.framework/LockedCameraCapture' (no such file)
Of course iOS 15 doesn't have the LockedCameraCapture library, but I have had no issue including Lock Screen widgets (which require iOS 16), so I am not sure why the error is popping up.
Thank you!
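For what it's worth, the usual remedy for this class of dyld failure is weak-linking the newer framework so its symbols resolve to nil on older systems, combined with availability guards around any use. A sketch of the build setting (verify against your target; Lock Screen widgets likely avoid this because they live in a separate extension target with its own deployment target):

OTHER_LDFLAGS = -weak_framework LockedCameraCapture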
Hello, I'm wondering how to capture 24MP photos.
I'm currently testing on an iPhone 16 Pro Max. By default, the device's activeFormat supports 24MP (photo dimensions: {4032x3024, 5712x4284}). For the photoOutput, I'm setting maxPhotoDimensions to videoDevice.activeFormat.supportedMaxPhotoDimensions.lastObject, and setting maxPhotoQualityPrioritization to quality.
When capturing, I'm applying the same maxPhotoDimensions and photoQualityPrioritization settings from the photoOutput directly to the AVCapturePhotoSettings.
Despite this, the captures don't come out at 24MP. What could be the issue?
// Objective-C
// Setup: prefer quality and the largest supported photo dimensions.
[self.photoOutput setMaxPhotoQualityPrioritization:AVCapturePhotoQualityPrioritizationQuality];
CMVideoDimensions maxPhotoDimensions =
    [(NSValue *)videoDevice.activeFormat.supportedMaxPhotoDimensions.lastObject CMVideoDimensionsValue];
[self.photoOutput setMaxPhotoDimensions:maxPhotoDimensions];

// Capturing: mirror the output's settings onto the per-capture settings.
AVCapturePhotoSettings *photoSettings = [AVCapturePhotoSettings photoSettings];
photoSettings.maxPhotoDimensions = self.photoOutput.maxPhotoDimensions;
photoSettings.photoQualityPrioritization = self.photoOutput.maxPhotoQualityPrioritization;
[self.photoOutput capturePhotoWithSettings:photoSettings delegate:photoCaptureDelegate];
...
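A Swift sketch of one ordering that may matter, offered as an assumption rather than a confirmed fix: dimension settings can be reset when the session's inputs or format change, so attach the input and output first and apply maxPhotoDimensions last.

import AVFoundation

func configure(session: AVCaptureSession, device: AVCaptureDevice,
               photoOutput: AVCapturePhotoOutput) throws {
    session.beginConfiguration()
    let input = try AVCaptureDeviceInput(device: device)
    if session.canAddInput(input) { session.addInput(input) }
    if session.canAddOutput(photoOutput) { session.addOutput(photoOutput) }

    // Apply the largest supported dimensions only after the output is attached.
    if let dims = device.activeFormat.supportedMaxPhotoDimensions.last {
        photoOutput.maxPhotoDimensions = dims  // e.g. 5712x4284 (24MP)
    }
    photoOutput.maxPhotoQualityPrioritization = .quality
    session.commitConfiguration()
}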
I'm developing an iOS app using AVFoundation for real-time video capture and object detection.
While implementing torch functionality with camera switching (between Wide and Ultra-Wide lenses), I encountered a critical issue where the camera freezes when toggling the torch while the Ultra-Wide camera is active.
Issue
If the torch is ON and I switch from Wide to Ultra-Wide, the camera freezes
If the Ultra-Wide camera is active and I try to turn the torch ON, the camera freezes
The iPhone Camera app allows using the torch while recording video with the Ultra-Wide lens, so this should be possible via AVFoundation as well.
Code snippet
DispatchQueue.global(qos: .userInitiated).async { [weak self] in
    guard let self = self else { return }
    let isSwitchingToUltraWide = !self.isUsingFisheyeCamera
    let cameraType: AVCaptureDevice.DeviceType = isSwitchingToUltraWide ? .builtInUltraWideCamera : .builtInWideAngleCamera
    let cameraName = isSwitchingToUltraWide ? "Ultra Wide" : "Wide"
    guard let selectedCamera = AVCaptureDevice.default(cameraType, for: .video, position: .back) else {
        DispatchQueue.main.async {
            self.showAlert(title: "Camera Error", message: "\(cameraName) camera is not available on this device.")
        }
        return
    }
    do {
        let currentInput = self.videoCapture.captureSession.inputs.first as? AVCaptureDeviceInput
        self.videoCapture.captureSession.beginConfiguration()
        // Torch is forced on via the Wide camera before switching to Ultra-Wide.
        if isSwitchingToUltraWide && self.isFlashlightOn {
            self.forceEnableTorchThroughWide()
        }
        if let currentInput = currentInput {
            self.videoCapture.captureSession.removeInput(currentInput)
        }
        let videoInput = try AVCaptureDeviceInput(device: selectedCamera)
        self.videoCapture.captureSession.addInput(videoInput)
        self.videoCapture.captureSession.commitConfiguration()
        self.videoCapture.updateVideoOrientation()
        DispatchQueue.main.async {
            if let barButton = sender as? UIBarButtonItem {
                barButton.title = isSwitchingToUltraWide ? "Wide" : "Ultra Wide"
                barButton.tintColor = isSwitchingToUltraWide ? UIColor.systemGreen : UIColor.white
            }
            print("Switched to \(cameraName) camera.")
        }
        self.isUsingFisheyeCamera.toggle()
    } catch {
        DispatchQueue.main.async {
            self.showAlert(title: "Camera Error", message: "Failed to switch to \(cameraName) camera: \(error.localizedDescription)")
        }
    }
}
Expected Behavior
The torch should work when Ultra-Wide is active, just as it does in the iPhone Camera app.
The camera should not freeze when switching between Wide and Ultra-Wide with the torch ON.
AVCaptureSession should not crash when toggling the torch while Ultra-Wide is active.
Questions & Help Needed
Is this a known issue with AVFoundation?
How does the iPhone Camera app allow using the torch while recording in Ultra-Wide?
What’s the correct way to switch between Wide and Ultra-Wide cameras without freezing when the torch is active?
Info
Device tested: iPhone 13 Pro / iPhone 15 Pro / iPhone 15
iOS Version: iOS 17.3 / iOS 18.0
Xcode Version: 16.2
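A defensive sketch for the torch path (function name illustrative, AVFoundation calls real): check hasTorch and isTorchAvailable on the currently attached device before toggling, do the toggle inside lockForConfiguration(), and keep it on the queue that owns the session rather than in the middle of beginConfiguration()/commitConfiguration(). Note that isTorchAvailable may legitimately report false for some camera/device combinations.

import AVFoundation

func setTorch(_ on: Bool, for device: AVCaptureDevice) {
    guard device.hasTorch, device.isTorchAvailable else {
        print("Torch unavailable on \(device.localizedName)")
        return
    }
    do {
        try device.lockForConfiguration()
        if on {
            try device.setTorchModeOn(level: AVCaptureDevice.maxAvailableTorchLevel)
        } else {
            device.torchMode = .off
        }
        device.unlockForConfiguration()
    } catch {
        print("Torch toggle failed: \(error)")
    }
}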
I am trying to use the AVCaptureDevice.RotationCoordinator API to observe angles for preview and capture, and it seems there is an issue with the API when it is used with an arbitrary CALayer (one that is not an AVCaptureVideoPreviewLayer) and cameras are switched.
Here is my setup: the function below is defined in an actor class called CameraManager that sets up the rotationCoordinator.
func updateRotationCoordinator(_ callback: @escaping @MainActor (CGFloat) -> Void) {
    guard let device = sessionConfiguration.activeVideoInput?.device, let displayLayer = displayLayer else { return }
    cancellables.removeAll()
    rotationCoordinator = AVCaptureDevice.RotationCoordinator(device: device, previewLayer: displayLayer)
    guard let coordinator = rotationCoordinator else { return }
    coordinator.publisher(for: \.videoRotationAngleForHorizonLevelPreview)
        .receive(on: DispatchQueue.main)
        .sink { degrees in
            let radians = degrees * .pi / 180
            MainActor.assumeIsolated {
                callback(radians)
            }
        }
        .store(in: &cancellables)
}
This works the very first time, but when I switch cameras and call this function again, it throws a runtime warning that the view's layer is being modified from a non-main thread. This happens on the very line where the rotation coordinator is recreated. It's not clear why initializing a rotation coordinator should modify CALayer properties right in its init method.
Modifying properties of a view's layer off the main thread is not allowed: view <MyApp.DisplayLayerView: 0x102ffaf40> with nearest ancestor view controller <_TtGC7SwiftUI19UIHostingControllerGVS_15ModifiedContentVS_7AnyViewVS_12RootModifier__: 0x101f7fb80>; backtrace:
(
0 UIKitCore 0x0000000194a977b4 575E5140-FA6A-37C2-B00B-A4EACEDFDA53 + 22509492
1 UIKitCore 0x000000019358594c 575E5140-FA6A-37C2-B00B-A4EACEDFDA53 + 416076
2 QuartzCore 0x00000001927f5bd8 D8E8E86D-85AC-3C90-B2E1-940235ECAA18 + 43992
3 QuartzCore 0x00000001927f5a4c D8E8E86D-85AC-3C90-B2E1-940235ECAA18 + 43596
4 QuartzCore 0x000000019283a41c D8E8E86D-85AC-3C90-B2E1-940235ECAA18 + 324636
5 QuartzCore 0x000000019283a0a8 D8E8E86D-85AC-3C90-B2E1-940235ECAA18 + 323752
6 AVFCapture 0x00000001af072a18 09192166-E0B6-346C-B1C2-7C95C3EFF7F7 + 420376
7 MyApp.debug.dylib 0x0000000105fa3914 $s10MyApp15CapturePipelineC25updateRotationCoordinatoryyy12CoreGraphics7CGFloatVScMYccF + 972
8 MyApp.debug.dylib 0x00000001063ade40 $s10MyApp11CameraModelC18switchVideoDevicesyyYaFTY3_ + 72
9 MyApp.debug.dylib 0x0000000105fe3cbd $s10MyApp11ContentViewV4bodyQrvg7SwiftUI6VStackVyAE05TupleE0VyAE6HStackVyAIyAE6SpacerV_AE6ButtonVyAE0E0PAEE5frame5width6height9alignmentQr12CoreGraphics7CGFloatVSg_AyE9AlignmentVtFQOyAqEE11scaledToFitQryFQOyAqEE10imageScaleyQrAE5ImageV0Z0OFQOyA3__Qo__Qo__Qo_GtGG_AmKyAIyAKyAIyAqEE7paddingyQrAE4EdgeO3SetV_AYtFQOyAA07CaptureM0V_Qo__AOyAE4TextVGAmKyAIyA9__AqEEArstUQrAY_AYA_tFQOyAM_Qo_A9_tGGtGG_AmqEE10background_AUQrqd___A_tAePRd__lFQOyAqEEArstUQrAY_AYA_tFQOyA21__Qo__AqEEArstUQrAY_AYA_tFQOyAE06_ShapeE0VyAE9RectangleVAE5ColorVG_Qo_Qo_SgtGGtGGyXEfU0_A42_yXEfU_A10_yXEfU_yyScMYccfU_yyYacfU_TQ1_ + 1
10 MyApp.debug.dylib 0x0000000105ff06d9 $s10MyApp11ContentViewV4bodyQrvg7SwiftUI6VStackVyAE05TupleE0VyAE6HStackVyAIyAE6SpacerV_AE6ButtonVyAE0E0PAEE5frame5width6height9alignmentQr12CoreGraphics7CGFloatVSg_AyE9AlignmentVtFQOyAqEE11scaledToFitQryFQOyAqEE10imageScaleyQrAE5ImageV0Z0OFQOyA3__Qo__Qo__Qo_GtGG_AmKyAIyAKyAIyAqEE7paddingyQrAE4EdgeO3SetV_AYtFQOyAA07CaptureM0V_Qo__AOyAE4TextVGAmKyAIyA9__AqEEArstUQrAY_AYA_tFQOyAM_Qo_A9_tGGtGG_AmqEE10background_AUQrqd___A_tAePRd__lFQOyAqEEArstUQrAY_AYA_tFQOyA21__Qo__AqEEArstUQrAY_AYA_tFQOyAE06_ShapeE0VyAE9RectangleVAE5ColorVG_Qo_Qo_SgtGGtGGyXEfU0_A42_yXEfU_A10_yXEfU_yyScMYccfU_yyYacfU_TATQ0_ + 1
11 MyApp.debug.dylib 0x0000000105f9c595 $sxIeAgHr_xs5Error_pIegHrzo_s8SendableRzs5NeverORs_r0_lTRTQ0_ + 1
12 MyApp.debug.dylib 0x0000000105f9fb3d $sxIeAgHr_xs5Error_pIegHrzo_s8SendableRzs5NeverORs_r0_lTRTATQ0_ + 1
13 libswift_Concurrency.dylib 0x000000019c49fe39 E15CC6EE-9354-3CE5-AF91-F641CA8283E0 + 433721
)
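A workaround sketch, under the assumption that the warning comes from RotationCoordinator touching the layer inside its initializer: construct the coordinator on the main actor, then hand it to the capture actor. The function is illustrative; the initializer signature is real.

import AVFoundation
import QuartzCore

@MainActor
func makeRotationCoordinator(device: AVCaptureDevice,
                             layer: CALayer) -> AVCaptureDevice.RotationCoordinator {
    // Layer access from the initializer now happens on the main thread.
    AVCaptureDevice.RotationCoordinator(device: device, previewLayer: layer)
}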
Hi, I’m working on a photo backup app.
I track the PHAsset localIdentifier to determine which photos have been backed up and which haven’t.
Recently, I’ve noticed that two users seem to have experienced the localIdentifier changes after transferring data to a new iPhone using Quick Start.
Additionally, others on StackOverflow have mentioned that the localIdentifier sometimes changes after updating the iOS version.
https://stackoverflow.com/questions/40094728/phobject-localidentifier-reliability
I’d like to confirm the reliability of the localIdentifier after an iOS version upgrade or device transfer.
Can I continue using these locally stored localIdentifiers?
Or is there another recommended approach, such as using PHCloudIdentifier?
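A sketch of the PHCloudIdentifier round-trip (iOS 15+), which is the documented way to keep references stable across migrations; error handling here is reduced to logging.

import Photos

func cloudIdentifiers(for localIDs: [String]) -> [String: PHCloudIdentifier] {
    let mappings = PHPhotoLibrary.shared()
        .cloudIdentifierMappings(forLocalIdentifiers: localIDs)
    var result: [String: PHCloudIdentifier] = [:]
    for (localID, mapping) in mappings {
        switch mapping {
        case .success(let cloudID):
            result[localID] = cloudID
        case .failure(let error):
            print("No cloud identifier for \(localID): \(error)")
        }
    }
    return result
}

// After a transfer, convert back with:
// PHPhotoLibrary.shared().localIdentifierMappings(for: cloudIdentifiers)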
I would appreciate help with code, or an explanation of what to use in Swift, for an app that can capture LiDAR scans and RGB data from photos, generate a 3D mesh, and create an .OBJ, .MTL, and .JPEG file set for further manipulation of the 3D model.
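A starting-point sketch using RealityKit's PhotogrammetrySession (iOS 17+/macOS 12+ on supported hardware): feed a folder of captured images and request a model file. USDZ is the documented output format; producing the .OBJ/.MTL/.JPEG set would need a conversion step (for example via Model I/O), which is an assumption here.

import RealityKit

func reconstruct(imagesFolder: URL, outputModel: URL) async throws {
    let session = try PhotogrammetrySession(input: imagesFolder)
    try session.process(requests: [.modelFile(url: outputModel)])
    for try await output in session.outputs {
        if case .processingComplete = output {
            print("Model written to \(outputModel.path)")
        }
    }
}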