Reply to RealityKit MeshResource Memory Leak
Hi, I've noticed similar behaviour on the iOS 16 beta and was able to isolate it in a really simple project. This happens on iOS 16.0 (20A5312j) with Xcode 14.0 beta 3 (14A5270f); on iOS 15 everything appears to work fine. Apparently it only occurs with dynamically generated meshes, not, for example, when loading a model from a USDZ. Nevertheless it seems like a pretty critical bug and causes constant memory ramp-up for me. FB number with sample project: FB10806403

import UIKit
import RealityKit

class ViewController: UIViewController {

    // if I change this to `let arView = ARView()` there is no memory leak
    lazy var arView = ARView()

    override func loadView() {
        view = arView
    }

    override func viewDidLoad() {
        super.viewDidLoad()

        let boxSize: SIMD3<Float> = .init(repeating: 0.3)

        // ↓ just allocating this MeshResource creates a leak on iOS 16
        let _: MeshResource = .generateBox(
            width: boxSize.x,
            height: boxSize.y,
            depth: boxSize.z
        )
    }
}

Setup: (screenshot)
Leak: (screenshot)
Topic: Graphics & Games SubTopic: RealityKit Tags:
Jul ’22
Reply to RealityKit MeshResource Memory Leak
Some new observations from today: In my project with the steady memory increase on iOS 16, I assign environmentTexturing = .automatic on my ARWorldTrackingConfiguration and subscribe to SceneEvents.DidAddEntity and SceneEvents.WillRemoveEntity. Even if I leave the subscription handlers empty, I get that memory increase, which quickly climbs into the 1 GB range. As soon as I either remove the subscriptions (the handlers are empty) or set environmentTexturing = .none, memory consumption improves drastically and stays in manageable ranges. I understand that enabling environment texturing is expected to require more memory, but here I'm seeing constant growth.
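For reference, a minimal sketch of the setup described above (function name is mine; the handlers are deliberately empty, which is already enough to trigger the growth on iOS 16):

```swift
import ARKit
import Combine
import RealityKit

// Sketch: environment texturing on, plus empty scene-event subscriptions.
// Setting environmentTexturing = .none instead stops the constant growth.
func startSession(on arView: ARView) -> [Cancellable] {
    let config = ARWorldTrackingConfiguration()
    config.environmentTexturing = .automatic
    arView.session.run(config)

    // Even empty handlers like these reproduce the memory increase.
    return [
        arView.scene.subscribe(to: SceneEvents.DidAddEntity.self) { _ in },
        arView.scene.subscribe(to: SceneEvents.WillRemoveEntity.self) { _ in }
    ]
}
```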
Topic: Graphics & Games SubTopic: RealityKit Tags:
Jul ’22
Reply to RealityKit MeshResource Memory Leak
I'm experiencing the same issues on Beta 4. Disabling environment texturing prevents the memory from constantly growing. The memory debugger still reports the same leaks as in the screenshots above. Whether they are false positives or not, it's at least irritating during the development process. Additionally I found that if I run the default ARKit + RealityKit Xcode template project and enable sceneReconstruction = .mesh, lots of CFData leaks are reported. No critical issue, but again I find it irritating during development and debugging. Screenshot attached.
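A minimal sketch of the change to the template project that surfaces the reported CFData leaks (assuming the stock ARKit + RealityKit template; the capability check is standard practice):

```swift
import ARKit

// Enable mesh scene reconstruction on devices that support it.
func makeConfiguration() -> ARWorldTrackingConfiguration {
    let config = ARWorldTrackingConfiguration()
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        config.sceneReconstruction = .mesh
    }
    return config
}
```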
Topic: Graphics & Games SubTopic: RealityKit Tags:
Jul ’22
Reply to Updating blending property to .transparent(opacity: …) removes material color texture (iOS 16 Beta)
Hi, thanks for the reply! I tried that, but unfortunately it doesn't really work in my case (or only works once): when I create a new UnlitMaterial for an animation step, assign the opacity value, and set it on the model, the texture is already gone by the time I create the next UnlitMaterial for the following animation step. The only approach that worked was permanently storing the texture resource somewhere and assigning it manually. But I would really like to avoid that, because then my code has to make a lot of assumptions about the material my fade system is working on. A more generic approach would be good. Maybe you can check out the sample project from my bug report in case I overlooked something? Appreciate the support!
Topic: Graphics & Games SubTopic: RealityKit Tags:
Sep ’22
Reply to Updating blending property to .transparent(opacity: …) removes material color texture (iOS 16 Beta)
Alright, so my workaround for now is that I store the initial materials for my model in a dedicated animation component, define a target opacity, and then always derive a new material for each animation step from the initial material and assign that to the model. This way the texture persists. If I instead dynamically query the current material from the model itself and adjust that, the texture is purged after assigning a new blending value. Nevertheless I feel like this is a bug that should be fixed. Another observation: on both iOS 15 and 16 the order of assignment when setting blending is crucial.

Has no effect:

var unlitMaterial = UnlitMaterial()
unlitMaterial.blending = .transparent(opacity: 0.1)
unlitMaterial.color = .init(texture: .init(textureResource))

Works:

var unlitMaterial = UnlitMaterial()
unlitMaterial.color = .init(texture: .init(textureResource))
unlitMaterial.blending = .transparent(opacity: 0.1)
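A minimal sketch of the workaround described above (the component and function names are mine, not an official API):

```swift
import RealityKit

// Hypothetical component: keep the model's initial materials so each
// animation step can derive fresh materials from them.
struct FadeComponent: Component {
    var initialMaterials: [UnlitMaterial]
    var targetOpacity: Float
}

// Derive new materials from the stored originals for the current step.
// Because we never re-read the (texture-purged) live materials, the
// color texture persists across opacity changes.
func applyFadeStep(to model: ModelEntity, opacity: Float) {
    guard let fade = model.components[FadeComponent.self] else { return }
    model.model?.materials = fade.initialMaterials.map { original in
        var material = original
        material.blending = .transparent(opacity: .init(floatLiteral: opacity))
        return material
    }
}
```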
Topic: Graphics & Games SubTopic: RealityKit Tags:
Sep ’22
Reply to How to get Euler angles in RealityKit?
Possible solutions:

import RealityKit
import SceneKit

public extension Transform {

    // SceneKit detour
    var eulerAnglesQuickAndDirty: SIMD3<Float> {
        let node = SCNNode()
        node.simdTransform = matrix
        return node.simdEulerAngles
    }

    // From: https://stackoverflow.com/questions/50236214/arkit-eulerangles-of-transform-matrix-4x4
    var eulerAngles: SIMD3<Float> {
        let matrix = matrix
        return .init(
            x: asin(-matrix[2][1]),
            y: atan2(matrix[2][0], matrix[2][2]),
            z: atan2(matrix[0][1], matrix[1][1])
        )
    }
}
Topic: Spatial Computing SubTopic: ARKit Tags:
Sep ’22
Reply to Reality_Face Tracking_USDZ_HDRI Question
Looks to me like you either haven't configured environmentTexturing in your ARConfiguration (https://developer.apple.com/documentation/arkit/arworldtrackingconfiguration/2977509-environmenttexturing) or you've set disableAREnvironmentLighting in the ARView's renderOptions (https://developer.apple.com/documentation/realitykit/arview/renderoptions-swift.struct/disablearenvironmentlighting).
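A minimal sketch of a setup that covers both points (function name is mine):

```swift
import ARKit
import RealityKit

// Enable environment texturing and make sure the corresponding
// render option is not disabled on the ARView.
func configureEnvironmentLighting(on arView: ARView) {
    let config = ARWorldTrackingConfiguration()
    config.environmentTexturing = .automatic

    // If this option is set, reflective materials won't pick up
    // the generated environment probes.
    arView.renderOptions.remove(.disableAREnvironmentLighting)

    arView.session.run(config)
}
```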
Topic: Spatial Computing SubTopic: ARKit Tags:
Mar ’23
Reply to Mixing CIKernel with RealityKit Geometry Modifier - Error
Hi, at first glance that looks to me like either you misspelled the function name of your geometry modifier or your Metal file isn't being compiled. If you are using the default Metal library, make sure your geometry modifier function is declared like so:

#include <metal_stdlib>
#include <RealityKit/RealityKit.h>

using namespace metal;

[[visible]]
void nameOfYourGeometryModifier(realitykit::geometry_parameters params)
{
    // … your shader
}
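For completeness, a sketch of the Swift side that loads the modifier from the default Metal library ("yourSurfaceShader" is a placeholder; the function name must match the Metal declaration above exactly):

```swift
import Metal
import RealityKit

// Build a CustomMaterial from shaders in the default Metal library.
func makeCustomMaterial() throws -> CustomMaterial {
    guard let device = MTLCreateSystemDefaultDevice(),
          let library = device.makeDefaultLibrary() else {
        fatalError("Metal is not available on this device")
    }

    let surfaceShader = CustomMaterial.SurfaceShader(
        named: "yourSurfaceShader", in: library
    )
    // A typo here (or a Metal file excluded from the target) fails at load.
    let geometryModifier = CustomMaterial.GeometryModifier(
        named: "nameOfYourGeometryModifier", in: library
    )

    return try CustomMaterial(
        surfaceShader: surfaceShader,
        geometryModifier: geometryModifier,
        lightingModel: .lit
    )
}
```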
Topic: Graphics & Games SubTopic: RealityKit Tags:
May ’23
Reply to How to store/hold some data in ModelEntity/Entity?
You could create a custom component containing your data (https://developer.apple.com/documentation/realitykit/component) and assign it to your entity. Then on tap check if such a component is available and read/display the data. Alternatively you could also subclass Entity and store some data within that custom class.
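A minimal sketch of the component approach (the component name and fields are mine, purely illustrative):

```swift
import RealityKit

// Hypothetical component holding per-entity data.
struct MetadataComponent: Component {
    var title: String
    var info: String
}

// Attach data to an entity:
// entity.components[MetadataComponent.self] =
//     MetadataComponent(title: "Box", info: "Tap me")

// On tap, check whether the component exists and read its data.
func handleTap(on entity: Entity) {
    if let metadata = entity.components[MetadataComponent.self] {
        print(metadata.title, metadata.info)
    }
}
```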
Topic: Spatial Computing SubTopic: ARKit Tags:
May ’23
Reply to RealityKit visionOS anchor to POV
I just found this: https://developer.apple.com/documentation/arkit/worldtrackingprovider/4218774-querypose, which sounds promising if I enable ARKit tracking. Will give it a go. Can someone from the RealityKit team confirm that this would be the way to go? There is also https://developer.apple.com/documentation/realitykit/anchoringcomponent/target-swift.enum/head — does this only work when ARKit is enabled as well? So far I wasn't able to run it successfully in the Simulator.
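For reference, a sketch of the AnchoringComponent route mentioned above, via an AnchorEntity with the .head target (visionOS; the offset value is an assumption so the content sits in front of the viewer):

```swift
import RealityKit

// Anchor content to the user's head pose.
func makeHeadAnchoredEntity() -> AnchorEntity {
    let headAnchor = AnchorEntity(.head)
    let sphere = ModelEntity(mesh: .generateSphere(radius: 0.05))
    // Place the sphere half a meter in front of the viewer.
    sphere.position = [0, 0, -0.5]
    headAnchor.addChild(sphere)
    return headAnchor
}
```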
Topic: Graphics & Games SubTopic: RealityKit Tags:
Jun ’23
Reply to Detecting Tap Location on detected "PlaneAnchor"? Replacement for Raycast?
Hi, I've found that you can retrieve the RealityKit Scene from an entity once it has been added (subscribe to the event). Then you can use the Scene's raycast methods, e.g.: https://developer.apple.com/documentation/realitykit/scene/raycast(from:to:query:mask:relativeto:). For that to work, though, you would need to create collision shapes for either the detected planes or the reconstructed world mesh geometry.
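A minimal sketch of that raycast call (assumes collision shapes already exist for the geometry you want to hit):

```swift
import RealityKit

// Cast a ray between two world-space points and return all hits
// against entities that have collision shapes.
func raycastHits(in scene: RealityKit.Scene,
                 from origin: SIMD3<Float>,
                 to target: SIMD3<Float>) -> [CollisionCastHit] {
    scene.raycast(from: origin,
                  to: target,
                  query: .all,
                  mask: .all,
                  relativeTo: nil)
}
```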
Topic: Spatial Computing SubTopic: ARKit Tags:
Aug ’23
Reply to PortalComponent – allow world content to peek out
Hi, thanks so much for the quick reply! I was afraid that would be the case, seems kinda wasteful with the mesh copy :/ Will definitely file an enhancement request! I got it working now: https://streamable.com/7nakkb But instead of using the ShaderGraphMaterial I put a plane behind the portal with an occlusion material. AFAIK ShaderGraphMaterial does not allow me to discard fragments though, correct? And I should probably be able to still use PBR materials instead of Unlit if I give both models the same ImageBasedLightReceiverComponent? Regarding ClippingPlane I'm curious: what would be a sample use case for the clipping inside of the portal world?
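For anyone trying the same trick, a sketch of the occlusion plane described above (dimensions are placeholders; the plane goes behind the portal so world content is hidden except where it peeks out):

```swift
import RealityKit

// A plane with an OcclusionMaterial masks anything rendered behind it.
func makeOcclusionPlane(width: Float, height: Float) -> ModelEntity {
    ModelEntity(mesh: .generatePlane(width: width, height: height),
                materials: [OcclusionMaterial()])
}
```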
Topic: Graphics & Games SubTopic: General Tags:
Feb ’24