
Reply to iPhone 12 Pro vs iPad Pro- which has better LiDAR?
I have the iPhone 12 Pro Max and my wife has the most recent iPad Pro. I use both for testing AR apps, and I have generally found the iPhone 12 Pro Max to be the snappier experience. I don't know how much of that is due to the difference in CPUs (my iPhone has an A14 while the iPad Pro has an A12Z), vs. the number of pixels each has to push to the screen (iPhone 2778-by-1284, iPad 2388-by-1668), vs. different LiDAR generations (Apple doesn't specify this). While the iPhone is qualitatively snappier to me, the iPad provides a more immersive experience because of the larger screen. So it comes down to snappier vs. more immersive.

One warning: there have been pretty steady rumors that Apple will release a new iPad Pro very soon.

Another warning: I haven't found an elegant way to convert the ARMeshGeometry Apple provides in an ARMeshAnchor (which lives in ARKit land) to a MeshResource for a ModelComponent (which lives in RealityKit land). I am hoping Apple provides a richer API for MeshResource in the next release; a sketch of the manual route follows below.

Another warning: turning on occlusion with LiDAR-equipped iOS devices is awesome! You won't want to go back to non-LiDAR life afterwards.
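For anyone stuck on the same conversion, here is a rough sketch of the manual route, assuming RealityKit's MeshDescriptor API (added in a later RealityKit release, iOS 15+). The helper name and the buffer-walking details are my own illustration, and it assumes the usual ARMeshGeometry layout (packed float3 vertex positions, 32-bit triangle indices):

```swift
import ARKit
import RealityKit

// Rough sketch: copy an ARMeshAnchor's geometry into a MeshResource.
// Assumes packed float3 positions and UInt32 triangle indices, which is
// what ARMeshGeometry provides in practice. Positions stay in the
// anchor's local space.
func meshResource(from geometry: ARMeshGeometry) throws -> MeshResource {
    // Walk the vertex buffer, reading three floats per vertex.
    let vertices = geometry.vertices
    var positions: [SIMD3<Float>] = []
    positions.reserveCapacity(vertices.count)
    for i in 0..<vertices.count {
        let pointer = vertices.buffer.contents()
            .advanced(by: vertices.offset + vertices.stride * i)
        let floats = pointer.assumingMemoryBound(to: Float.self)
        positions.append(SIMD3<Float>(floats[0], floats[1], floats[2]))
    }

    // Copy the triangle indices out of the face buffer.
    let faces = geometry.faces
    assert(faces.bytesPerIndex == MemoryLayout<UInt32>.size)
    let indexCount = faces.count * faces.indexCountPerPrimitive
    let rawIndices = faces.buffer.contents().assumingMemoryBound(to: UInt32.self)
    let indices = (0..<indexCount).map { rawIndices[$0] }

    var descriptor = MeshDescriptor(name: "anchorMesh")
    descriptor.positions = MeshBuffers.Positions(positions)
    descriptor.primitives = .triangles(indices)
    return try MeshResource.generate(from: [descriptor])
}
```

Since the positions come back in the anchor's local space, the resulting model still needs the ARMeshAnchor's transform applied before it lines up with the world.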
Topic: Spatial Computing, SubTopic: ARKit
Mar ’21
Reply to RealityKit Transform rotation: choosing clockwise vs. anti-clockwise
Update: I checked the Entity's initial rotation (simd_quatf) and the destination rotation passed to move(to:relativeTo:duration:timingFunction:), and then I checked the Entity's rotation after the animation completed. The Y component of the axis flipped after the animation completed (and so the angle changed accordingly).

Here is what I get "normally" (or at least what I expected); the final angle and axis are the same as the destination I gave to the animation function:

Initial Angle: 171.89, Vector: 0.00, 1.00, 0.00
Destination Angle: 206.26, Vector: 0.00, 1.00, 0.00
Final Angle: 206.26, Vector: 0.00, 1.00, 0.00

Here is what I get just prior to the unexpected behavior:

Initial Angle: 206.26, Vector: 0.00, 1.00, 0.00
Destination Angle: 240.64, Vector: 0.00, 1.00, 0.00
Final Angle: 119.36, Vector: 0.00, -1.00, 0.00

Note the Y component of the axis has flipped, and the angle is 119.36 = 360 - 240.64. I had not counted on the Y axis flipping. I guess the lesson is not to assume the axis (and angle) after the animation completes are the same as the axis (and angle) supplied as the destination Transform in the move(to:relativeTo:duration:timingFunction:) function.
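For what it's worth, the two readouts in the second case describe the same physical rotation: a turn of θ about +Y is identical to a turn of (360 − θ)° about −Y, and the corresponding quaternions are component-wise negatives (q and −q act identically on vectors). RealityKit has simply handed back the other representative. A quick check in plain simd, using the numbers from my log:

```swift
import simd

// 240.64° about +Y and 119.36° about -Y are the same rotation; their
// quaternions differ only by an overall sign, and q and -q rotate
// vectors identically.
func radians(_ degrees: Float) -> Float { degrees * .pi / 180 }

let destination   = simd_quatf(angle: radians(240.64), axis: SIMD3<Float>(0, 1, 0))
let finalRotation = simd_quatf(angle: radians(119.36), axis: SIMD3<Float>(0, -1, 0))

let v = SIMD3<Float>(1, 0, 0)
print(destination.act(v), finalRotation.act(v))   // identical rotated vectors
print(destination.vector, finalRotation.vector)   // component-wise negatives
```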
Topic: Graphics & Games SubTopic: RealityKit Tags:
Mar ’21
Reply to RealityKit iPhone camera lens
I don't know if Apple will answer this, but from my experience using a number of devices, I suspect Apple uses a combination of:

- Standard wide lens
- Ultra wide lens
- LiDAR sensor

and merges all this data to understand the world around the device and the device's location in that world. The more of these sensors Apple can leverage on a given device, the more effective the tracking, and so the better the experience. A device with a single standard wide lens will do (but you do need to move the device side to side to help it triangulate depth). Wide + Ultra wide is better. Wide + Ultra wide + LiDAR is pretty sweet.
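If you want to branch on this at runtime, checking for scene-reconstruction support works as a stand-in for "does this device have LiDAR," since Apple only offers it on LiDAR hardware. A minimal sketch (the configureSession(for:) wrapper is just my illustration), which also turns on the occlusion I mentioned in an earlier reply:

```swift
import ARKit
import RealityKit

// Scene reconstruction is only available on LiDAR-equipped devices, so
// this check doubles as a LiDAR capability test.
func configureSession(for arView: ARView) {
    let configuration = ARWorldTrackingConfiguration()
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        // LiDAR present: build the world mesh and let real-world
        // surfaces occlude virtual content.
        configuration.sceneReconstruction = .mesh
        arView.environment.sceneUnderstanding.options.insert(.occlusion)
    }
    arView.session.run(configuration)
}
```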
Mar ’21