Take a builtInUltraWideCamera photo during ARKit world tracking?
I need to capture a still image from builtInUltraWideCamera during an ARKit session running ARWorldTrackingConfiguration (not face tracking). It doesn't matter if taking the photo adds a few seconds of delay; I just need the image, and I need ARKit's camera position/orientation to keep working as usual afterwards.

The only approach I can see is to pause the ARKit session, grab the image with AVFoundation, then immediately re-run the session with .initialWorldMap set and hope relocalization succeeds (roughly as sketched below). This seems like a poor solution, given that ARKit in ARWorldTrackingConfiguration is already using builtInUltraWideCamera internally for tracking and could share a frame from it if a suitable API existed. Is there a better way forward?
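For context, the pause/capture/resume workaround would look roughly like this. This is only a sketch: the helper name is made up, threading and error handling are simplified, and relocalization via initialWorldMap is not guaranteed to succeed.

import ARKit
import AVFoundation

// Hypothetical helper illustrating the pause -> capture -> resume workaround.
final class UltraWidePhotoCapture: NSObject, AVCapturePhotoCaptureDelegate {
    private let captureSession = AVCaptureSession()
    private let photoOutput = AVCapturePhotoOutput()
    private var completion: ((AVCapturePhoto?) -> Void)?
    private var pendingResume: (ARSession, ARWorldMap)?

    func captureStill(pausing arSession: ARSession,
                      completion: @escaping (AVCapturePhoto?) -> Void) {
        self.completion = completion

        // 1. Save the current world map while tracking is still running,
        //    so the session can try to relocalize after the photo.
        arSession.getCurrentWorldMap { [weak self] worldMap, _ in
            guard let self, let worldMap else { completion(nil); return }

            // 2. Pause ARKit so it releases the camera.
            arSession.pause()

            // 3. Configure a plain AVFoundation session on the ultra-wide camera.
            guard let device = AVCaptureDevice.default(.builtInUltraWideCamera,
                                                       for: .video, position: .back),
                  let input = try? AVCaptureDeviceInput(device: device) else {
                completion(nil)
                return
            }
            self.captureSession.beginConfiguration()
            self.captureSession.sessionPreset = .photo
            if self.captureSession.canAddInput(input) { self.captureSession.addInput(input) }
            if self.captureSession.canAddOutput(self.photoOutput) { self.captureSession.addOutput(self.photoOutput) }
            self.captureSession.commitConfiguration()
            self.captureSession.startRunning()

            // 4. Take the photo; the delegate callback resumes ARKit.
            self.pendingResume = (arSession, worldMap)
            self.photoOutput.capturePhoto(with: AVCapturePhotoSettings(), delegate: self)
        }
    }

    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        captureSession.stopRunning()
        completion?(error == nil ? photo : nil)

        // 5. Resume world tracking, seeding it with the saved map so the
        //    previous coordinate system and anchors can (hopefully) be recovered.
        if let (arSession, worldMap) = pendingResume {
            let config = ARWorldTrackingConfiguration()
            config.initialWorldMap = worldMap
            arSession.run(config, options: [.resetTracking, .removeExistingAnchors])
        }
    }
}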
Replies: 1 · Boosts: 1 · Views: 777 · Apr ’23
builtInLiDARDepthCamera doesn't work on the 2020 iPad Pro on iOS 26
On iOS 26.1, this throws on a 2020 iPad Pro (4th gen) but works fine on an M4 iPad Pro or an iPhone 15 Pro:

guard let device = AVCaptureDevice.default(.builtInLiDARDepthCamera, for: .video, position: .back) else {
    throw ConfigurationError.lidarDeviceUnavailable
}

It's the standard code from Apple's own sample project, so it obviously used to work: https://developer.apple.com/documentation/AVFoundation/capturing-depth-using-the-lidar-camera

Does it fail because Apple has silently dropped support for the older LiDAR sensor used before the M4 iPad Pro, or is there another reason? And does it still work on the 5th and 6th generation iPad Pro?
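In case it helps anyone reproduce this, a minimal diagnostic along these lines (the function name and the extra device types are just illustrative, not from Apple's sample) should show whether the LiDAR device type is reported at all on a given device/OS combination:

import AVFoundation

// Lists the depth-capable capture devices AVFoundation actually reports,
// to distinguish "device type no longer offered" from a configuration error.
func logDepthCaptureDevices() {
    let discovery = AVCaptureDevice.DiscoverySession(
        deviceTypes: [.builtInLiDARDepthCamera, .builtInTrueDepthCamera, .builtInDualWideCamera],
        mediaType: .video,
        position: .unspecified)

    for device in discovery.devices {
        let depthFormats = device.activeFormat.supportedDepthDataFormats
        print("\(device.localizedName) (\(device.deviceType.rawValue)): \(depthFormats.count) depth formats")
    }

    // The sample-code lookup, unchanged, for comparison.
    if AVCaptureDevice.default(.builtInLiDARDepthCamera, for: .video, position: .back) == nil {
        print("builtInLiDARDepthCamera is not reported on this device/OS")
    }
}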
Replies: 2 · Boosts: 0 · Views: 374 · 4w