Reply to Get coordinates of pivot point in ModelEntity
Hi, as far as I know there is currently no way to retrieve the pivot point directly. What you can do though is query the bounding box, calculate a Y offset based on the min and max positions of the bounding box's corners, and set that as the position of the entity. If you then wrap it in an empty parent entity, you get the desired effect.

```swift
public extension Entity {
    enum PivotPosition {
        case top
        case center
        case bottom
    }

    func wrapEntityAndSetPivotPosition(to targetPosition: PivotPosition) -> Entity {
        setPivotPosition(to: targetPosition, animated: false)
        let entity = Entity()
        entity.addChild(self)
        return entity
    }

    func setPivotPosition(to targetPosition: PivotPosition, animated: Bool = false) {
        let boundingBox = visualBounds(relativeTo: nil)
        let min = boundingBox.min
        let max = boundingBox.max
        let yTranslation: Float

        switch targetPosition {
        case .top:
            yTranslation = -max.y
        case .center:
            yTranslation = -(min.y + (max.y - min.y) / 2)
        case .bottom:
            yTranslation = -min.y
        }

        let targetPosition = simd_float3(
            x: boundingBox.center.x * -1,
            y: yTranslation,
            z: boundingBox.center.z * -1
        )

        guard animated else {
            position = targetPosition
            return
        }

        guard isAnchored, parent != nil else {
            print("Warning: to animate the entity's pivot position, make sure it is already anchored and has a parent set.")
            return
        }

        var translationTransform = transform
        translationTransform.translation = targetPosition
        move(to: translationTransform, relativeTo: parent, duration: 0.3, timingFunction: .easeOut)
    }
}
```

And the whole thing in action:

```swift
let boxModel = ModelEntity(mesh: .generateBox(size: 0.3))
let wrappedBoxEntity = boxModel.wrapEntityAndSetPivotPosition(to: .bottom)
let boxAnchor = AnchorEntity(plane: .horizontal)
boxAnchor.addChild(wrappedBoxEntity)
arView.scene.anchors.append(boxAnchor)
```

I agree though that there should be a dedicated API for this – as we have in SceneKit. I filed feedback a while ago.
Topic: Graphics & Games SubTopic: RealityKit Tags:
Apr ’22
Reply to RealityKit MeshResource Memory Leak
Hi, I've noticed similar behaviour on the iOS 16 Beta and was able to isolate it down into a really simple project. This is happening on iOS 16.0 (20A5312j) and Xcode Version 14.0 beta 3 (14A5270f). On iOS 15 things appear to be working fine. Apparently this only happens with dynamically generated meshes, not for example when loading a model from a USDZ. Nevertheless it seems like a pretty critical bug and causes constant memory ramp-up for me. FB number with sample project: FB10806403

```swift
import UIKit
import RealityKit

class ViewController: UIViewController {
    // If I change this to `let arView = ARView()` there is no memory leak.
    lazy var arView = ARView()

    override func loadView() {
        view = arView
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        let boxSize: SIMD3<Float> = .init(repeating: 0.3)
        // ↓ just allocating this MeshResource creates a leak on iOS 16
        let _: MeshResource = .generateBox(
            width: boxSize.x,
            height: boxSize.y,
            depth: boxSize.z
        )
    }
}
```

Screenshots: setup and leak (attached).
Topic: Graphics & Games SubTopic: RealityKit Tags:
Jul ’22
Reply to RealityKit MeshResource Memory Leak
Some new observations from today: In my project with the steady memory increase on iOS 16 I assign environmentTexturing = .automatic on my ARWorldTrackingConfiguration and subscribe to SceneEvents.DidAddEntity and SceneEvents.WillRemoveEntity. Even if I leave the subscription handler empty, I get that memory increase which quickly goes into the range of 1GB. As soon as I either remove the subscriptions (the handler is empty) or set environmentTexturing = .none memory consumption improves drastically and stays in manageable ranges. I understand that enabling environment texturing is expected to require more memory, but here I'm seeing constant growth.
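For reference, a minimal sketch of the setup described above (assuming an existing `arView`; the handler bodies are intentionally left empty, which is enough to reproduce the growth for me):

```swift
import ARKit
import RealityKit
import Combine

var subscriptions = Set<AnyCancellable>()

// Environment texturing enabled, as in my project.
let configuration = ARWorldTrackingConfiguration()
configuration.environmentTexturing = .automatic
arView.session.run(configuration)

// Even with empty handlers, these subscriptions combined with
// environment texturing show the steady memory increase on iOS 16.
arView.scene.subscribe(to: SceneEvents.DidAddEntity.self) { _ in }
    .store(in: &subscriptions)
arView.scene.subscribe(to: SceneEvents.WillRemoveEntity.self) { _ in }
    .store(in: &subscriptions)
```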
Topic: Graphics & Games SubTopic: RealityKit Tags:
Jul ’22
Reply to RealityKit MeshResource Memory Leak
I'm experiencing the same issues on Beta 4. Disabling environment texturing prevents the memory from constantly growing. The memory debugger still reports the same leaks as in the screenshots above. Whether they are false positives or not, it's at least irritating during the development process. Additionally I found that if I run the default ARKit+RealityKit Xcode template project and enable sceneReconstruction = .mesh, there are lots of CFData leaks reported. Not a critical issue, but again I find it irritating during development and debugging. Screenshot attached.
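For anyone trying to reproduce the CFData reports, this is roughly the change to the template project (a sketch; `arView` comes from the template itself):

```swift
import ARKit
import RealityKit

// Enabling scene reconstruction in the default ARKit + RealityKit
// template is what surfaced the CFData leak reports for me.
let configuration = ARWorldTrackingConfiguration()
if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
    configuration.sceneReconstruction = .mesh
}
arView.session.run(configuration)
```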
Topic: Graphics & Games SubTopic: RealityKit Tags:
Jul ’22
Reply to Updating blending property to .transparent(opacity: …) removes material color texture (iOS 16 Beta)
Hi, thanks for the reply! I tried that, but unfortunately it doesn't really work in my case (or only once): when I create a new UnlitMaterial for an animation step, assign the opacity value, and set it on the model, the texture is already gone by the time I create the next UnlitMaterial for another animation step. The only approach that worked was permanently storing the texture resource somewhere and assigning it manually. I would really like to avoid that, because then my fade system has to make a lot of assumptions about the material it is working on. A more generic approach would be good. Maybe you can check out the sample project from my bug report in case I overlooked something? Appreciate the support!
Topic: Graphics & Games SubTopic: RealityKit Tags:
Sep ’22
Reply to Updating blending property to .transparent(opacity: …) removes material color texture (iOS 16 Beta)
Alright, so my workaround for now: I store the initial materials for my model in a dedicated animation component, define a target opacity, and then always derive a new material for each animation step from the initial material and assign that to the model. This way the texture persists. If I instead dynamically query the current material from the model itself and adjust that, the texture is purged after assigning a new blending value. Nevertheless I feel like this is a bug that should be fixed.

Another observation: on both iOS 15 and 16 the order of assignments when setting blending is crucial.

Has no effect:

```swift
var unlitMaterial = UnlitMaterial()
unlitMaterial.blending = .transparent(opacity: 0.1)
unlitMaterial.color = .init(texture: .init(textureResource))
```

Works:

```swift
var unlitMaterial = UnlitMaterial()
unlitMaterial.color = .init(texture: .init(textureResource))
unlitMaterial.blending = .transparent(opacity: 0.1)
```
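In code, the workaround looks roughly like this (a sketch with illustrative names – `FadeComponent` and `applyFadeStep` are mine, not an API):

```swift
import RealityKit

// Store the model's initial materials once, then derive each animation
// step's materials from those originals instead of reading the
// (texture-purged) current materials back from the model.
struct FadeComponent: Component {
    var initialMaterials: [UnlitMaterial]
}

func applyFadeStep(to model: ModelEntity, opacity: Float) {
    guard let fade = model.components[FadeComponent.self] else { return }
    model.model?.materials = fade.initialMaterials.map { initial in
        var material = initial // derive from the stored original
        material.blending = .transparent(opacity: .init(floatLiteral: opacity))
        return material
    }
}
```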
Topic: Graphics & Games SubTopic: RealityKit Tags:
Sep ’22
Reply to How to get Euler angles in RealityKit?
Possible solutions:

```swift
import RealityKit
import SceneKit

public extension Transform {
    // SceneKit detour
    var eulerAnglesQuickAndDirty: SIMD3<Float> {
        let node = SCNNode()
        node.simdTransform = matrix
        return node.simdEulerAngles
    }

    // From: https://stackoverflow.com/questions/50236214/arkit-eulerangles-of-transform-matrix-4x4
    var eulerAngles: SIMD3<Float> {
        let matrix = matrix
        return .init(
            x: asin(-matrix[2][1]),
            y: atan2(matrix[2][0], matrix[2][2]),
            z: atan2(matrix[0][1], matrix[1][1])
        )
    }
}
```
Topic: Spatial Computing SubTopic: ARKit Tags:
Sep ’22
Reply to Reality_Face Tracking_USDZ_HDRI Question
Looks to me like you either haven't configured environmentTexturing in your ARWorldTrackingConfiguration (https://developer.apple.com/documentation/arkit/arworldtrackingconfiguration/2977509-environmenttexturing) or you've set disableAREnvironmentLighting in the ARView's renderOptions (https://developer.apple.com/documentation/realitykit/arview/renderoptions-swift.struct/disablearenvironmentlighting).
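Roughly like this, assuming an existing `arView` (a sketch, not taken from the original question):

```swift
import ARKit
import RealityKit

// Enable environment texturing so PBR materials pick up reflections
// and lighting from the camera feed.
let configuration = ARWorldTrackingConfiguration()
configuration.environmentTexturing = .automatic
arView.session.run(configuration)

// And make sure AR environment lighting isn't disabled on the view.
arView.renderOptions.remove(.disableAREnvironmentLighting)
```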
Topic: Spatial Computing SubTopic: ARKit Tags:
Mar ’23
Reply to Mixing CIKernel with RealityKit Geometry Modifier - Error
Hi, at first glance that looks to me like either you misspelled the function name of your geometry modifier or your metal file isn't being compiled. If you are using the default metal library, make sure your geometry modifier function is declared like so:

```metal
#include <metal_stdlib>
#include <RealityKit/RealityKit.h>

using namespace metal;

[[visible]]
void nameOfYourGeometryModifier(realitykit::geometry_parameters params)
{
    // … your shader
}
```
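For completeness, the Swift side then loads the modifier by that exact function name – a hedged sketch (the device handling and base material are illustrative):

```swift
import Metal
import RealityKit

// Load the geometry modifier from the default metal library.
// The name must match the [[visible]] function in the .metal file.
guard let device = MTLCreateSystemDefaultDevice(),
      let library = device.makeDefaultLibrary() else {
    fatalError("Default metal library unavailable")
}

let geometryModifier = CustomMaterial.GeometryModifier(
    name: "nameOfYourGeometryModifier",
    in: library
)

// Derive a CustomMaterial from an existing material and use it on a model.
let material = try CustomMaterial(
    from: SimpleMaterial(),
    geometryModifier: geometryModifier
)
```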
Topic: Graphics & Games SubTopic: RealityKit Tags:
May ’23
Reply to How to store/hold some data in ModelEntity/Entity?
You could create a custom component containing your data (https://developer.apple.com/documentation/realitykit/component) and assign it to your entity. Then on tap check if such a component is available and read/display the data. Alternatively you could also subclass Entity and store some data within that custom class.
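A minimal sketch of the component approach (names like `PayloadComponent` are illustrative, not an API):

```swift
import RealityKit

// A custom component carrying arbitrary data.
struct PayloadComponent: Component {
    var title: String
    var score: Int
}

// Attach it to an entity …
let entity = ModelEntity(mesh: .generateBox(size: 0.1))
entity.components.set(PayloadComponent(title: "Box", score: 42))

// … and read it back later, e.g. in a tap handler.
if let payload = entity.components[PayloadComponent.self] {
    print(payload.title, payload.score)
}
```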
Topic: Spatial Computing SubTopic: ARKit Tags:
May ’23
Reply to RealityKit visionOS anchor to POV
I just found this: https://developer.apple.com/documentation/arkit/worldtrackingprovider/4218774-querypose which sounds promising if I enable ARKit tracking. Will give it a go. Can someone from the RealityKit team confirm that this would be the way to go? There is also https://developer.apple.com/documentation/realitykit/anchoringcomponent/target-swift.enum/head – does this also only work when ARKit is enabled? So far I wasn't able to run it successfully in the Simulator.
Topic: Graphics & Games SubTopic: RealityKit Tags:
Jun ’23