Post · Replies · Boosts · Views · Activity

RealityKit Transform rotation: choosing clockwise vs. anti-clockwise
I'm using Transform's move(to:relativeTo:duration:timingFunction:) to rotate an Entity around the Y axis in an animated fashion (e.g., a duration of 2 seconds). Unfortunately, when I rotate from 6 radians (343.7°) to 6.6 radians (378.2°), the rotation does not continue anti-clockwise past 2π (360°); instead it runs backwards to 0.317 radians (18.2°). Is there a way to force an animated rotation about an axis to go in a clockwise or anti-clockwise direction?

Replies: 2 · Boosts: 0 · Views: 1.7k · Apr ’21
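One common workaround (my own sketch, not an official RealityKit API) is to split the rotation into keyframes of less than 180° each and chain move(to:) calls, so that each quaternion interpolation's shortest path is forced along the intended direction. The helper name rotationSteps is hypothetical:

```swift
import Foundation

// Split a signed rotation delta into steps each smaller than 180 degrees,
// so per-step "shortest path" interpolation follows the intended direction.
func rotationSteps(from start: Double, by delta: Double,
                   maxStep: Double = .pi / 2) -> [Double] {
    let count = max(1, Int(ceil(abs(delta) / maxStep)))
    return (1...count).map { start + delta * Double($0) / Double(count) }
}

// RealityKit side (sketched in comments; chain each step from an
// animation-completion event rather than all at once):
// for angle in rotationSteps(from: 6.0, by: 0.6) {
//     var transform = entity.transform
//     transform.rotation = simd_quatf(angle: Float(angle), axis: [0, 1, 0])
//     entity.move(to: transform, relativeTo: entity.parent,
//                 duration: stepDuration, timingFunction: .linear)
// }
```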
RealityKit playAnimation with transitionDuration causes a blink/glitch frame
I am experiencing a single-video-frame glitch when transitioning from one RealityKit Entity animation to another when transitionDuration is non-zero. This is with the current RealityKit and iOS 14.6 (i.e., not the betas). Is this a known issue? Has anyone succeeded in transitioning between animations with a non-zero transition time and no strange blink?

Background: I loaded two USDZ models, each with a different animation. One model is shown, and the AnimationResource from the second model is (at some point) applied to the first. I originally created the models with Adobe's Mixamo site (they are characters moving), downloaded the .fbx files, and converted them to USDZ with Apple's Reality Converter. I start the first model (robot) with its own animation, then later apply the animation from the second model (nextAnimationToPlay) to it. If transitionDuration is set to anything other than 0, a single video frame glitches (or blinks) before the transition occurs (that frame may be the model's original T-pose, but I'm not certain):

robot.playAnimation(nextAnimationToPlay, transitionDuration: 1.0, startsPaused: false)

If transitionDuration is 0, there is no glitch, but I lose the smooth transition. I have tried variations, for example setting startsPaused to true and then calling resume() on the playback controller, and waiting until the current animation completes before calling playAnimation() with the next one. I still get the quick blink. Any suggestions or pointers would be appreciated. Thanks,

Replies: 1 · Boosts: 0 · Views: 1.3k · Aug ’21
flickering, double vision with raycast on iPadOS 15.0
In ARKit+RealityKit I do a raycast from the ARView's center, then create an AnchorEntity at the result and add a target ModelEntity (a flattened cube) to the AnchorEntity:

guard let result = session.raycast(query).first else { return }
let newAnchor = AnchorEntity(raycastResult: result)
newAnchor.addChild(placementTargetEntity)
arView.scene.addAnchor(newAnchor)

I repeat this for each frame update via the ARSessionDelegate method session(_:didUpdate:), removing the previous AnchorEntity first. I use this as a target to show the user where the full model will be placed when they tap the screen. This works fine under iOS 14, but I get strange results with iPadOS 15: two different placements are created on different screen updates, offset from each other and slightly rotated. Has anyone else had issues with raycast() or with creating an AnchorEntity from its result? Is using session(_:didUpdate:) via ARSessionDelegate to update virtual content considered bad style now? (I noticed that in WWDC21 they used a different mechanism to update their virtual content.) (If any Apple engineers read this: I filed feedback with sample code and video of the issue at FB9535616.)

Replies: 5 · Boosts: 0 · Views: 1.2k · Aug ’21
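One mitigation worth trying (an assumption on my part, not a confirmed fix for the iPadOS 15 behavior) is to keep a single persistent AnchorEntity alive and move it each frame, rather than removing and re-adding one, so the scene never briefly contains two competing placements:

```swift
import ARKit
import RealityKit

// Sketch: reuse one AnchorEntity and move it per frame instead of
// tearing anchors down and recreating them in session(_:didUpdate:).
final class PlacementCoordinator: NSObject, ARSessionDelegate {
    let arView: ARView
    let placementTargetEntity: ModelEntity
    private let targetAnchor = AnchorEntity(world: .zero)

    init(arView: ARView, placementTargetEntity: ModelEntity) {
        self.arView = arView
        self.placementTargetEntity = placementTargetEntity
        super.init()
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let query = arView.makeRaycastQuery(from: arView.center,
                                                  allowing: .estimatedPlane,
                                                  alignment: .horizontal),
              let result = session.raycast(query).first else { return }
        if targetAnchor.parent == nil {        // add once, then reuse
            targetAnchor.addChild(placementTargetEntity)
            arView.scene.addAnchor(targetAnchor)
        }
        // Move the existing anchor instead of replacing it each frame.
        targetAnchor.setTransformMatrix(result.worldTransform, relativeTo: nil)
    }
}
```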
ARKit FPS drop when device gets hot?
During testing of my app, the frames per second, shown either in the Xcode debug navigator or via ARView's .showStatistics, sometimes drops by half and stays down there. The low FPS continues even if I kill the app completely and restart it. However, after giving my phone a break, the rate returns to 60 fps. Does ARKit automatically throttle FPS when the device gets too hot? If so, is there a signal my program can catch from ARKit or the OS to tell me this is happening?

Replies: 2 · Boosts: 0 · Views: 1.7k · Sep ’21
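The OS does expose a thermal signal you can observe: ProcessInfo.thermalState and its change notification. Note this reports system-wide thermal pressure, not an ARKit-specific decision, so the link between .serious/.critical and the halved frame rate is my inference rather than documented behavior:

```swift
import Foundation

// Observe system thermal pressure; ARKit itself does not announce
// throttling, but elevated thermal states tend to coincide with it.
let observer = NotificationCenter.default.addObserver(
    forName: ProcessInfo.thermalStateDidChangeNotification,
    object: nil,
    queue: .main
) { _ in
    switch ProcessInfo.processInfo.thermalState {
    case .nominal:  print("thermal: nominal")
    case .fair:     print("thermal: fair")
    case .serious:  print("thermal: serious — expect reduced performance")
    case .critical: print("thermal: critical — shed work now")
    @unknown default: break
    }
}
```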
PHAsset of older photos is missing GPS info in Swift
I've been working in Swift on iOS to access images via UIImagePickerController, pulling the PHAsset from the picker delegate's info dictionary and then pulling GPS information from the PHAsset. For newer photos, asset.location is populated with GPS information; likewise, CIImage's property dictionary has {GPS} information. So all is good with newer photos. But when I go back to images taken in 2017, asset.location is nil and there is no {GPS} information in the CIImage. However, if I export the photo from the Photos app on my Mac and then view it in Preview, there *is* GPS information. Am I missing some setting needed to find the GPS information in older photos using PHAsset on iOS? Thanks,

Replies: 2 · Boosts: 0 · Views: 1.3k · Sep ’21
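As a fallback (my assumption, not a confirmed explanation of why asset.location is nil for the older photos), you can request the asset's original image data and read the embedded {GPS} dictionary directly with ImageIO, which reflects whatever is actually stored in the file:

```swift
import Photos
import ImageIO

// Sketch: read the {GPS} dictionary from the asset's original image data.
func fetchGPSDictionary(for asset: PHAsset,
                        completion: @escaping ([String: Any]?) -> Void) {
    let options = PHImageRequestOptions()
    options.isNetworkAccessAllowed = true  // allow iCloud originals to download
    options.version = .original            // unedited data, with full metadata

    PHImageManager.default().requestImageDataAndOrientation(
        for: asset, options: options
    ) { data, _, _, _ in
        guard let data = data,
              let source = CGImageSourceCreateWithData(data as CFData, nil),
              let props = CGImageSourceCopyPropertiesAtIndex(source, 0, nil)
                as? [String: Any] else {
            completion(nil)
            return
        }
        completion(props[kCGImagePropertyGPSDictionary as String] as? [String: Any])
    }
}
```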
StoreKit 2 demo buy buttons flip state
I've been playing with Apple's StoreKit 2 demo code (buying the cars, subscriptions, etc.), and sometimes when I purchase a car, one or more of the other buttons visually flip state (e.g., a purchased checkmark changes back to the price). Leaving the StoreView and returning to it shows the correct state for each of the buttons. I am using the StoreKit configuration file Products.storekit (set in the scheme), so I'm testing in Xcode. I see this both in the simulator and on my actual phone. The issue is random; the vast majority of the time everything works perfectly. Is anyone else seeing this? Does anyone know how to address it?

Dev environment: Xcode 13.0 beta 5 (13A5212g), macOS 12.0 beta (21A5534d), Mac mini (M1, 2020)

Replies: 1 · Boosts: 0 · Views: 802 · Oct ’21
AnchorEntity(plane:) ground shadow without the plane?
When I create an AnchorEntity like this:

let entityAnchor = AnchorEntity(plane: [.horizontal], classification: [.floor], minimumBounds: [0.2, 0.2])

and add a USDZ model to it, I get a nice ground shadow. But if I create an AnchorEntity using an ARAnchor like this:

let entityAnchor = AnchorEntity(anchor: anchor)

I do not get that ground shadow. Is there a way to get the shadow a plane anchor provides, but with an AnchorEntity whose placement I specify or that is attached to an ARAnchor? [Note: on LiDAR devices I can get a nice shadow using

config.sceneReconstruction = .mesh
arView.environment.sceneUnderstanding.options.insert(.occlusion)
arView.environment.sceneUnderstanding.options.insert(.receivesLighting)

but creating the environment mesh is computationally expensive, and I'd like to avoid it if possible.]

Replies: 0 · Boosts: 0 · Views: 882 · Oct ’21
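One approach worth trying (hedged: I have not confirmed it reproduces the plane-anchor shadow exactly) is to attach your own shadow-casting DirectionalLight under the ARAnchor-based AnchorEntity, since RealityKit lets you configure DirectionalLightComponent.Shadow explicitly:

```swift
import RealityKit
import UIKit

// Sketch: add an explicit shadow-casting directional light to the anchor.
let entityAnchor = AnchorEntity(world: .zero)  // or AnchorEntity(anchor: anchor)

let light = DirectionalLight()
light.light = DirectionalLightComponent(color: .white,
                                        intensity: 2000,
                                        isRealWorldProxy: false)
light.shadow = DirectionalLightComponent.Shadow(maximumDistance: 5,
                                                depthBias: 1.0)
light.look(at: .zero, from: [0, 2, 0], relativeTo: nil)  // aim straight down
entityAnchor.addChild(light)
```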
Guidance on USDZ model sizes
I'm looking for documentation/guidance on USDZ and scene model sizes. My focus is on RealityKit-based apps. I found the 2018 WWDC presentation Integrating Apps and Content with AR Quick Look, which mentions a rule of thumb for a USDZ model of: 100K polygons, one set of 2048x2048 textures, and 10 seconds of animation. Are these numbers still recommended in 2021? Are they just for AR Quick Look, or do they apply to RealityKit-based apps too? If a RealityKit scene loads several USDZ models, should the cumulative number of polygons across all models stay under 100K, or is 100K a per-model budget? The talk mentioned that AR Quick Look dynamically downsamples textures for devices with less memory. Does RealityKit do this as well? If so, can I err on the side of providing a larger texture (e.g., 4096x4096) and trust RealityKit to downsample as appropriate? (I am hoping there is documentation covering questions like this.)

Replies: 1 · Boosts: 0 · Views: 2k · Nov ’21
usdzconvert and metersPerUnit
I've been creating USDA files manually and converting them to USDZ via Apple's usdzconvert tool (version 0.64). In the file I set the unit size to 1 meter:

metersPerUnit = 1.0

but the USDZ keeps the unit size at 1 cm. Apple's Reality Converter does process the metersPerUnit metadata, so that is a viable workaround for me, but sometimes I'd prefer the command-line tool. Is there an updated usdzconvert? I couldn't find one.

Replies: 0 · Boosts: 0 · Views: 900 · Nov ’21
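For reference, this is where that metadata sits in a USDA file: metersPerUnit belongs in the stage-level parenthesized block at the top of the file (a minimal sketch; the prim name "Root" is arbitrary). In my experience usdzconvert 0.64 appears to ignore it while Reality Converter honors it:

```usda
#usda 1.0
(
    metersPerUnit = 1.0
    upAxis = "Y"
)

def Xform "Root"
{
}
```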
OcclusionMaterial filter?
RealityKit has a CollisionFilter to determine which entities can collide with which others. Perchance, is there something similar for OcclusionMaterial? In effect, I'd like a model with an OcclusionMaterial to "occlude this entity but not that entity".

Replies: 0 · Boosts: 0 · Views: 546 · Nov ’21
AR Quick Look additional controls?
I've recently added some USDZ files to a web page, and I can download and display them fine via AR Quick Look on an iPhone or iPad. I've noticed that full occlusion is active in the AR view; over time the device appears to heat up and the frame rate drops. Are there any properties I can set in the <a rel="ar" ...> HTML tag to control things like occlusion or autofocus (i.e., turn them off)?

Replies: 0 · Boosts: 0 · Views: 938 · Nov ’21
polygon count vs. triangle count?
In a previous post I asked whether 100,000 polygons is still the recommended size for USDZ Quick Look models on the web. (The answer is yes.) But I realize my polygons are 4-sided and not planar, so each has to be broken into 2 triangles when rendered. Given that, should I shoot for 50,000 polygons (i.e., 100,000 triangles)? Or does the 100,000-polygon figure already assume polygons will be subdivided into triangles? (The models are generated from digital terrain (GeoTIFF) data, not a 3D modeling tool.)

Replies: 2 · Boosts: 0 · Views: 2.5k · Nov ’21
Are RealityKit lights expensive?
I am seeing some unexpected behavior with lights I've been adding to a RealityKit scene. For example, I created 14 PointLights, but only 8 appeared to be used to illuminate the scene. In another example, I created 7 PointLights and 7 SpotLights, and the frame rate dropped quite a bit. Are lights computationally expensive, causing some adaptive behavior by RealityKit? Should I be judicious in my use of lights in a scene? (Note: I set arView.environment.lighting.resource to a skybox with a black image; my goal was to completely control the lighting. I don't know if that added to the computational load.)

Replies: 1 · Boosts: 1 · Views: 1.2k · Dec ’21
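For context, this is the kind of setup in question (a minimal sketch; the observed 8-light cap reads like a renderer limit, which is my guess rather than documented fact). Keeping each light's attenuationRadius small bounds both its reach and, presumably, its cost:

```swift
import RealityKit
import UIKit

// Sketch: a point light with a bounded attenuation radius.
let pointLight = PointLight()
pointLight.light = PointLightComponent(color: .white,
                                       intensity: 10_000,
                                       attenuationRadius: 2.0)  // limit reach
pointLight.position = [0, 1, 0]

let lightAnchor = AnchorEntity(world: .zero)
lightAnchor.addChild(pointLight)
```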
API to test if ModelEntity is visible?
Does RealityKit have an API to test whether a ModelEntity (or its CollisionComponent) is currently visible on the screen?

Replies: 1 · Boosts: 0 · Views: 709 · Sep ’21
In-App Purchase with TestFlight: many prompts
During my first external TestFlight test of an In-App Purchase (iPadOS), the user was (1) prompted for their Apple ID and password, (2) prompted for their password a second time, and (3) (the user believes) prompted for their password a third time. Are these multiple password prompts expected behavior, or have I done something wrong?

Replies: 2 · Boosts: 0 · Views: 2.1k · Nov ’21