During testing of my app, the frames per second (shown either in the Xcode debug navigator or via ARView's .showStatistics debug option) sometimes drops by half and stays down there.
This low FPS persists even after I kill the app completely and restart it.
However, after giving my phone a break, the frame rate returns to 60 fps.
Does ARKit automatically throttle down FPS when the device gets too hot?
If so, is there a signal my program can catch from ARKit or the OS that can tell me this is happening?
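The only OS-level signal I can think of is ProcessInfo's thermal state; whether ARKit's throttling is actually tied to it is just my assumption. A minimal sketch of observing it:

import Foundation

// Sketch (assumption): the system thermal state may correlate with the FPS drop.
NotificationCenter.default.addObserver(
    forName: ProcessInfo.thermalStateDidChangeNotification,
    object: nil,
    queue: .main
) { _ in
    switch ProcessInfo.processInfo.thermalState {
    case .nominal, .fair:
        print("Thermal state OK")
    case .serious, .critical:
        print("Device is hot; the system may be throttling rendering")
    @unknown default:
        break
    }
}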
Does RealityKit have an API to test if a ModelEntity (or its CollisionComponent) is currently visible on the screen?
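The closest approximation I can sketch myself is projecting the entity's world position into view coordinates and testing it against the view's bounds; this ignores occlusion by other content, so it's only a rough visibility test:

// Rough "is on screen" test: project the entity's world position into view space.
// Does NOT account for the entity being occluded by other content.
func isOnScreen(_ entity: ModelEntity, in arView: ARView) -> Bool {
    let worldPosition = entity.position(relativeTo: nil)
    guard let screenPoint = arView.project(worldPosition) else { return false }
    return arView.bounds.contains(screenPoint)
}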
In ARKit+RealityKit I do a raycast from the ARView's center, then create an AnchorEntity at the result and add a target ModelEntity (a flattened cube) to the AnchorEntity.
guard let result = session.raycast(query).first else { return }
let newAnchor = AnchorEntity(raycastResult: result)
newAnchor.addChild(placementTargetEntity)
arView.scene.addAnchor(newAnchor)
I repeat this on each frame update via the ARSessionDelegate method session(_:didUpdate:), removing the previous AnchorEntity first.
I use this as a target to let the user know where the full model will be placed when they tap the screen.
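Condensed, the per-frame update looks roughly like this (currentAnchor and placementTargetEntity are my own properties, and the .estimatedPlane query target is a stand-in):

func session(_ session: ARSession, didUpdate frame: ARFrame) {
    // Raycast from the center of the view on every frame.
    let center = CGPoint(x: arView.bounds.midX, y: arView.bounds.midY)
    guard let query = arView.makeRaycastQuery(from: center,
                                              allowing: .estimatedPlane,
                                              alignment: .any),
          let result = session.raycast(query).first else { return }
    // Remove last frame's target before placing the new one.
    if let previous = currentAnchor { arView.scene.removeAnchor(previous) }
    let newAnchor = AnchorEntity(raycastResult: result)
    newAnchor.addChild(placementTargetEntity)
    arView.scene.addAnchor(newAnchor)
    currentAnchor = newAnchor
}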
This works fine under iOS 14, but I get strange results with iPadOS 15: two different placements are created on different screen updates, offset and slightly rotated from each other.
Has anyone else had issues with raycast() or creating an AnchorEntity from the result?
Is the use of session(_:didUpdate:) via ARSessionDelegate to update virtual content considered bad style now? (I noticed that in the WWDC21 sessions they used a different mechanism to update their virtual content.)
(If any Apple engineers read this, I filed a feedback with sample code and video of the issue at FB9535616)
I am experiencing a single video frame glitch when transitioning from one RealityKit Entity animation to another when transitionDuration is non-zero.
This is with the current RealityKit and iOS 14.6 (i.e., not the betas).
Is this a known issue?
Have people succeeded in transitioning from one animation to another with a non-zero transition time and no strange blink?
Background:
I loaded two USDZ models, each with a different animation. One model will be shown, but the AnimationResource from the second model will (at some point) be applied to the first model.
I originally created the models with Adobe's Mixamo site (they are animated characters), downloaded the .fbx files, and then converted them to USDZ with Apple's Reality Converter.
I start the first model (robot) with its animation, then at some point I apply the animation from the second model (nextAnimationToPlay) to the original model (robot).
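Condensed, my setup looks roughly like this (file names are placeholders, error handling omitted):

// Load the visible model and a second model that only donates its animation.
let robot = try ModelEntity.loadModel(named: "robot")
let donor = try ModelEntity.loadModel(named: "dance")
let nextAnimationToPlay = donor.availableAnimations[0]

// Start the robot's own animation, looping indefinitely.
if let own = robot.availableAnimations.first {
    robot.playAnimation(own.repeat())
}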
If transitionDuration is set to anything other than 0, a single video frame glitch (or blink) appears before the animation transition occurs (that single frame may be the model's original T-pose, but I'm not certain).
robot.playAnimation(nextAnimationToPlay, transitionDuration: 1.0, startsPaused: false)
If transitionDuration is set to 0, there is no glitch, but then I lose the smooth transition.
I have tried variations. For example, setting startsPaused to true and then calling resume() on the playback controller; also, waiting until the current animation completes before calling playAnimation() with the next animation. Either way, I still get the quick blink.
Any suggestions or pointers would be appreciated.
Thanks,
I'm using Entity's move(to:relativeTo:duration:timingFunction:) with a target Transform to rotate an Entity around the Y axis in an animated fashion (e.g., over a 2-second duration).
Unfortunately, when I rotate from 6 radians (343.7°) to 6.6 radians (378.2°), the rotation does not continue anti-clockwise past 2π (360°) but instead goes backwards to 0.317 radians (18.2°).
Is there a way to force a rotation about an axis to go in a clockwise or anti-clockwise direction when animating?
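For reference, a minimal reproduction of what I'm doing (the timing function is arbitrary):

// My understanding: rotations are stored as quaternions, which have no notion of
// "more than 360 degrees" (6.6 rad is the same orientation as 0.317 rad), so
// move(to:...) interpolates along the shortest path between the two orientations.
let start = Transform(rotation: simd_quatf(angle: 6.0, axis: [0, 1, 0]))
let end = Transform(rotation: simd_quatf(angle: 6.6, axis: [0, 1, 0]))
entity.transform = start
entity.move(to: end, relativeTo: entity.parent, duration: 2.0, timingFunction: .linear)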
The ARMeshGeometry documentation - https://developer.apple.com/documentation/arkit/armeshgeometry - references ARMeshClassification - https://developer.apple.com/documentation/arkit/armeshclassification - but I cannot find any obvious way to get classification information for the mesh data.
I found the classificationOf(faceWithIndex: index) function in the Xcode sample project Visualizing and Interacting with a Reconstructed Scene - https://developer.apple.com/documentation/arkit/content_anchors/visualizing_and_interacting_with_a_reconstructed_scene, but it seems pretty complex.
Is there something simpler that I am missing?
It also seems from the code that the mesh as a whole doesn't have a classification; only the individual geometry faces within the mesh are classified.
Is it common for a single mesh to represent many different objects (e.g., a chair, floor, and wall) all at the same time?
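For reference, as far as I can tell the sample's approach boils down to reading one byte per face out of the classification buffer; a condensed sketch (error handling omitted):

func classification(ofFaceWithIndex index: Int, in meshAnchor: ARMeshAnchor) -> ARMeshClassification {
    // classification is an optional per-face ARGeometrySource: one UInt8 per face.
    guard let source = meshAnchor.geometry.classification else { return .none }
    let address = source.buffer.contents()
        .advanced(by: source.offset + source.stride * index)
    let rawValue = Int(address.assumingMemoryBound(to: UInt8.self).pointee)
    return ARMeshClassification(rawValue: rawValue) ?? .none
}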
Thanks,
Given an AnchorEntity from, say, RealityKit's Scene anchors collection, is it possible to retrieve the ARAnchor that was used when creating the AnchorEntity?
Looking through the AnchorEntity documentation - https://developer.apple.com/documentation/realitykit/anchorentity - it seems that while you can create an AnchorEntity using an ARAnchor, there is no way to retrieve that ARAnchor afterwards.
Alternatively, the ARSession delegate functions receive a list of ARAnchors or an ARFrame that has ARAnchors, but I could not find an approach to retrieve AnchorEntities that might be associated with any of these ARAnchors.
Given an ARAnchor, is there a way to get an AnchorEntity associated with it?
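The only workaround I can sketch is maintaining the mapping myself in the ARSessionDelegate, keyed by the ARAnchor's identifier:

// My own bookkeeping: ARAnchor.identifier -> AnchorEntity.
var anchorEntities: [UUID: AnchorEntity] = [:]

func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
    for arAnchor in anchors {
        let anchorEntity = AnchorEntity(anchor: arAnchor)
        anchorEntities[arAnchor.identifier] = anchorEntity
        arView.scene.addAnchor(anchorEntity)
    }
}

func session(_ session: ARSession, didRemove anchors: [ARAnchor]) {
    for arAnchor in anchors {
        if let anchorEntity = anchorEntities.removeValue(forKey: arAnchor.identifier) {
            arView.scene.removeAnchor(anchorEntity)
        }
    }
}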
Thanks,
Is it possible to turn occlusion on and off for different parts of the reconstructed mesh when using Scene Understanding with LiDAR and RealityKit?
For example, if ARKit identifies a wall, I don't want that mesh used for occlusion (but I do want occlusion for other things, like the couch or the floor).
If I could do this, it would essentially make my walls transparent, and I could see the RealityKit objects that extend beyond the room I am in.
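So far I have only found the global toggle, which affects all reconstructed mesh at once:

// Scene-understanding occlusion appears to be all-or-nothing at this level.
arView.environment.sceneUnderstanding.options.insert(.occlusion)   // enable occlusion mesh
arView.environment.sceneUnderstanding.options.remove(.occlusion)   // disable occlusion mesh

My guess (untested) is that I would have to disable this and build my own occlusion entities with OcclusionMaterial from the ARMeshAnchor data, skipping faces classified as .wall.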
Thanks,