Deleted Planes in ARKit Immediately Reappear
Hello,

In my app I'm trying to delete all but one chosen plane and do some raycasting on that plane. I noticed that whenever I tried to delete the other planes, they would instantly reappear. Here is some sample code from the ARViewController I'm using that demonstrates the problem:

```swift
import ARKit
import RealityKit

class ARViewController: UIViewController, ARSessionDelegate {
    var arView: ARView!
    var plane_count = 0

    // *** Bunch of stuff ***

    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        // Iterate through the detected anchors
        for anchor in anchors {
            // Check if the detected anchor is an ARPlaneAnchor
            if let planeAnchor = anchor as? ARPlaneAnchor {
                plane_count += 1
                print("Plane added. Number of plane anchors = \(plane_count)")
            }
        }
    }

    func session(_ session: ARSession, didRemove anchors: [ARAnchor]) {
        for anchor in anchors {
            if let planeAnchor = anchor as? ARPlaneAnchor {
                plane_count -= 1
                print("SESSION CALLED: Plane Removed. Number of plane anchors = \(plane_count)")
            }
        }
    }

    func deletePlanes() {
        for anchor in arView.session.currentFrame?.anchors ?? [] {
            arView.session.remove(anchor: anchor)
        }
    }
}
```

When deletePlanes() is called, I see the following output appear instantly:

```
SESSION CALLED: Plane Removed. Number of plane anchors = 2
SESSION CALLED: Plane Removed. Number of plane anchors = 1
SESSION CALLED: Plane Removed. Number of plane anchors = 0
Plane added. Number of plane anchors = 1
Plane added. Number of plane anchors = 2
Plane added. Number of plane anchors = 3
```

This even occurs when the phone is face down after detecting a few planes. It appears that the planes are not actually being removed from the session. Please let me know if I'm doing anything wrong here! Thanks.
0 replies · 0 boosts · 366 views · Jun ’23
ARFrame.sceneDepth not correctly registered with ARFrame.capturedImage on iPad Pro (6th gen) during high-resolution capture
Hi team,

I believe I’ve found a registration issue between ARFrame.sceneDepth and ARFrame.capturedImage when using high-resolution frame capture on a 2022 iPad Pro (6th gen). When enabling high-resolution capture:

```swift
if let highResFormat = ARWorldTrackingConfiguration.recommendedVideoFormatForHighResolutionFrameCapturing {
    config.videoFormat = highResFormat
}
// …
arView.session.captureHighResolutionFrame { ... }
```

the depth map provided by ARFrame.sceneDepth no longer aligns correctly with the corresponding high-resolution capturedImage. This misalignment results in consistently over-estimated distance measurements in my app, which relies on mapping depth to 2D pixel coordinates.

- iPad Pro (6th gen): misalignment occurs only when capturing high-resolution frames.
- iPhone 16 Pro: depth is correctly registered for both standard and high-resolution captures.

It appears the camera intrinsics, specifically the FOV, change between the “regular” resolution stream and the high-resolution capture on the iPad. My suspicion is that the depth data continues using the intrinsics of the lower-resolution stream, resulting in an unregistered depth-to-RGB mapping. Once I have the iPad in hand again, I will confirm whether camera.intrinsics or the FOV differ between the low-res and high-res frames.

Is this a known issue with high-resolution frame capture on the 2022 iPad Pro? If not, I’m happy to provide some more thorough sample code. Thanks for your time!
0 replies · 0 boosts · 135 views · 4w
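The depth-to-2D-pixel mapping the second post relies on can be sketched as a standard pinhole-camera round trip: unproject a depth sample with the depth stream's intrinsics, then reproject with the RGB stream's intrinsics. This is a minimal, self-contained illustration, not ARKit API code; the `Intrinsics` type and all numbers are hypothetical placeholders (ARKit exposes the real values as a 3×3 `camera.intrinsics` matrix).

```swift
// Pinhole-camera sketch of a depth-to-RGB mapping. If the RGB intrinsics
// change for high-resolution capture (different FOV / focal length) but the
// stale intrinsics are still assumed, the reprojected pixel lands in the
// wrong place — the kind of misalignment described in the post.

struct Intrinsics {
    let fx: Double  // focal length in pixels, x
    let fy: Double  // focal length in pixels, y
    let cx: Double  // principal point, x
    let cy: Double  // principal point, y
}

/// Unproject pixel (u, v) with depth z (metres) into camera space.
func unproject(u: Double, v: Double, z: Double, k: Intrinsics) -> (x: Double, y: Double, z: Double) {
    ((u - k.cx) * z / k.fx, (v - k.cy) * z / k.fy, z)
}

/// Project a camera-space point back into pixel coordinates.
func project(_ p: (x: Double, y: Double, z: Double), k: Intrinsics) -> (u: Double, v: Double) {
    (p.x / p.z * k.fx + k.cx, p.y / p.z * k.fy + k.cy)
}

// Illustrative values: a 256×192 depth map and a 4032×3024 high-res capture.
let depthK = Intrinsics(fx: 212.0, fy: 212.0, cx: 128.0, cy: 96.0)
let rgbK   = Intrinsics(fx: 3184.0, fy: 3184.0, cx: 2016.0, cy: 1512.0)

let point = unproject(u: 160.0, v: 120.0, z: 2.0, k: depthK)
let pixel = project(point, k: rgbK)
print(pixel)  // where that depth sample should land in the high-res image
```

Running the same round trip with the wrong `rgbK` (e.g. the low-resolution stream's intrinsics) shifts `pixel` and systematically biases any distance read back from the depth map, which matches the over-estimation the post describes.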