
Why does this entity appear behind spatial tap collision location?
I am trying to make a world anchor where a user taps a detected plane. How am I trying this? First, I add an entity to a RealityView like so:

    let anchor = AnchorEntity(.plane(.vertical, classification: .wall, minimumBounds: [2.0, 2.0]), trackingMode: .continuous)
    anchor.transform.rotation *= simd_quatf(angle: -.pi / 2, axis: SIMD3<Float>(1, 0, 0))

    let interactionEntity = Entity()
    interactionEntity.name = "PLANE"

    let collisionComponent = CollisionComponent(shapes: [ShapeResource.generateBox(width: 2.0, height: 2.0, depth: 0.02)])
    interactionEntity.components.set(collisionComponent)
    interactionEntity.components.set(InputTargetComponent())

    anchor.addChild(interactionEntity)
    content.add(anchor)

This:

- Declares an anchor that requires a 2 m by 2 m wall to appear in the scene, with continuous tracking
- Makes an empty entity and gives it a 2 m by 2 m by 2 cm collision box
- Attaches the collision entity to the anchor
- Finally adds the anchor to the scene

It appears in the scene like this: Great! It appears to sit right on the wall.

I then add a tap gesture recognizer like this:

    SpatialTapGesture()
        .targetedToAnyEntity()
        .onEnded { value in
            guard value.entity.name == "PLANE" else { return }

            let worldPosition: SIMD3<Float> = value.convert(value.location3D, from: .local, to: .scene)
            let pose = Pose3D(position: worldPosition, rotation: value.entity.transform.rotation)
            let worldAnchor = WorldAnchor(transform: simd_float4x4(pose))

            let model = ModelEntity(mesh: .generateBox(size: 0.1, cornerRadius: 0.03),
                                    materials: [SimpleMaterial(color: .blue, isMetallic: true)])
            model.transform = Transform(matrix: worldAnchor.transform)
            realityViewContent?.add(model)
        }

I assume this:

- Makes a world position from where the tap connects with the collision entity
- Combines that position with the collision plane's rotation to create a Pose3D
- Makes a world anchor from that pose (so it can be persisted in a world-tracking provider)

Then I make a basic cube entity and give it that transform.

Weird stuff: it doesn't appear on the plane, it appears behind it. What have I done wrong? The X and Y of the tap location appear spot on, but something is "off" about the Z position.

Also, is there a recommended way to debug this with the available tools? I'm guessing I'll have to file a DTS about this, because feedback on the forum has been pretty low since the labs started.
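For reference, this is the kind of minimal debugging step I've been considering, placed inside the .onEnded closure right after computing worldPosition. It's only a sketch: it drops a small sphere at the raw converted tap position, with no Pose3D or WorldAnchor involved, so I can tell whether the offset comes from the coordinate conversion or from the anchor/pose math. `realityViewContent` is the same RealityView content the cube is added to.

```swift
// Sketch: place a small marker at the raw converted tap position, with no
// Pose3D / WorldAnchor involved, to isolate where the Z offset creeps in.
let marker = ModelEntity(
    mesh: .generateSphere(radius: 0.01),
    materials: [SimpleMaterial(color: .red, isMetallic: false)]
)
marker.position = worldPosition   // scene-space position from value.convert(...)
realityViewContent?.add(marker)
```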
Replies: 2 · Boosts: 0 · Views: 1.5k · Oct ’23
Debugging Max Volume size - GeometryReader3D units?
Hello, I'm curious if anyone has some useful debug tools for out-of-bounds issues with Volumes. I am opening a volume with a size of 1 m x 1 m x 10 cm. I am adding a RealityView with a ModelEntity that is 0.5 m tall, and I am seeing the model clip at the top and bottom. I find this odd, because I feel like it should be within the size of the Volume.

I was curious what size SwiftUI says the Volume is, so I tried using a GeometryReader3D to tell me:

    GeometryReader3D { proxy in
        VStack {
            Text("\(proxy.size.width)")
            Text("\(proxy.size.height)")
            Text("\(proxy.size.depth)")
        }
        .padding()
        .glassBackgroundEffect()
    }

Unfortunately I get 680, 1360, and 68. I'm guessing these units are in points, but that's not very helpful. The documentation says to use real-world units for Volumes, but none of the SwiftUI frame setters and getters appear to support different units. Is there a way to convert between the two? I'm not clear if this is a bug or a feature suggestion.
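The closest thing I've found so far is the PhysicalMetric property wrapper, which (as I understand it) converts a real-world length into points for the current display context, so the proxy's point sizes can be divided back into meters. A sketch of what I mean, assuming PhysicalMetric behaves that way:

```swift
import SwiftUI

struct VolumeSizeReadout: View {
    // Assumption: this yields the number of points that correspond to
    // 1 meter in the current display context.
    @PhysicalMetric(from: .meters) private var pointsPerMeter = 1.0

    var body: some View {
        GeometryReader3D { proxy in
            VStack {
                Text("width: \(proxy.size.width / pointsPerMeter, specifier: "%.2f") m")
                Text("height: \(proxy.size.height / pointsPerMeter, specifier: "%.2f") m")
                Text("depth: \(proxy.size.depth / pointsPerMeter, specifier: "%.2f") m")
            }
            .padding()
            .glassBackgroundEffect()
        }
    }
}
```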
Replies: 1 · Boosts: 0 · Views: 903 · Oct ’23
Create a World Anchor from a Spatial Tap Gesture?
World Anchor from SpatialTapGesture? At 19:56 in the video, it's mentioned that we can use a SpatialTapGesture to "identify a position in the world" to make a world anchor. Which API calls are used to make this happen? World anchors are created with 4x4 matrices, and a SpatialTapGesture doesn't seem to generate one of those. Any ideas?
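My best guess so far, sketched below for a gesture attached to a RealityView: target the gesture at an entity that has collision and input-target components, convert the tap location into scene space, and wrap that position in a translation-only matrix for the anchor. The WorldAnchor initializer spelling (`originFromAnchorTransform:`) is my assumption here; check the current ARKit headers.

```swift
SpatialTapGesture()
    .targetedToAnyEntity()
    .onEnded { value in
        // Convert the tap's 3D location into scene (world) space.
        let position: SIMD3<Float> = value.convert(value.location3D, from: .local, to: .scene)

        // Wrap it in a translation-only 4x4 matrix.
        var transform = matrix_identity_float4x4
        transform.columns.3 = SIMD4<Float>(position.x, position.y, position.z, 1)

        let anchor = WorldAnchor(originFromAnchorTransform: transform)
        // ... hand the anchor to a WorldTrackingProvider, attach content, etc.
    }
```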
Replies: 0 · Boosts: 0 · Views: 838 · Aug ’23
Detecting Tap Location on detected "PlaneAnchor"? Replacement for Raycast?
While PlaneAnchors are still not generated by the PlaneDetectionProvider in the simulator, I am still brainstorming how to detect a tap on one of the planes. In an iOS ARKit application I could use a raycastQuery on existingPlaneGeometry to make an anchor with the raycast result's world transform. I've not yet found the visionOS replacement for this. A possible hunch is that I need to install my own mesh-less PlaneModelEntities for each planeAnchor that's returned by the PlaneDetectionProvider. From there I can use a TapGesture targeted to those models, and then I could build a WorldAnchor from the tap location on those entities. Anyone have any ideas?
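To make the hunch concrete, this is roughly what I have in mind. It's only a sketch: `planeDetection` and `rootEntity` are placeholders, and I'm assuming PlaneAnchor's geometry/extent properties behave the way I'd expect.

```swift
// Give every detected plane an invisible, tappable stand-in entity.
for await update in planeDetection.anchorUpdates {
    guard update.event == .added else { continue }
    let planeAnchor = update.anchor

    let planeEntity = Entity()
    planeEntity.name = "plane-\(planeAnchor.id)"
    planeEntity.transform = Transform(matrix: planeAnchor.originFromAnchorTransform)

    // A thin box roughly matching the plane's extent acts as the tap target.
    let shape = ShapeResource.generateBox(
        width: planeAnchor.geometry.extent.width,
        height: 0.01,
        depth: planeAnchor.geometry.extent.height
    )
    planeEntity.components.set(CollisionComponent(shapes: [shape]))
    planeEntity.components.set(InputTargetComponent())

    rootEntity.addChild(planeEntity)
}
```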
Replies: 1 · Boosts: 0 · Views: 1.2k · Aug ’23
[Newbie] Why does my ShaderGraphMaterial appear distorted?
Disclaimer: I am new to all things 3D. There could be a variety of things wrong with what I'm doing that are not unique to RealityKit. Any domain info would be appreciated.

So, I'm following what I think are the recommended steps to import a shader-graph material from Reality Composer Pro and apply it to another ModelEntity. I do the following:

    guard let entity = try? Entity.load(named: "Materials", in: RealityKitContent.realityKitContentBundle) else { return model }
    let materialEntity = entity.findEntity(named: "materialModel") as? ModelEntity
    guard let materialEntity else { return model }

I then configure a property on it like so:

    guard var material = materialEntity.model?.materials[0] as? ShaderGraphMaterial else { return model }
    try material.setParameter(name: "BaseColor", value: .color(matModel.matCoreUIColor))

I then apply it. This is what my texture looks like in Reality Composer Pro:

I notice that my rendered object has distortions in the actual RealityView. Note the diagonal lines that appear "stretched". What could be doing this? I thought node shaders were supposed to be more resilient to distortions like this? I'm not sure if I've got a bug or if I'm using it wrong. FWIW, this is a shader based on Apple's felt material shader. My graph looks like this:

Thanks
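One detail worth noting for anyone reading along: since ShaderGraphMaterial is a value type, my understanding is that the mutated copy has to be written back onto the entity that should render it, roughly like the sketch below (`model` stands for the destination ModelEntity in my code above).

```swift
// ShaderGraphMaterial is a struct, so setParameter mutates a local copy;
// assign it back to the destination entity's materials to see the change.
try material.setParameter(name: "BaseColor", value: .color(matModel.matCoreUIColor))
model.model?.materials = [material]
```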
Replies: 2 · Boosts: 0 · Views: 1.1k · Aug ’23
Is it possible to disable Widget Accenting on the Lock Screen?
I'd like an Image subview of a Lock Screen widget to render as itself, and not with the multiply-like effect it gets today. I've tried .widgetAccentable(true) and .widgetAccentable(false), but neither gives the appearance I'm looking for. Is there maybe a new modifier that lets me "force" the rendering mode? Hoping there is and it's just not jumping out at me. Thanks for your help.
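For context, the closest I've found is reading the active mode from the environment and swapping assets, which adapts to the tinting rather than disabling it. A sketch, with hypothetical asset names:

```swift
import SwiftUI
import WidgetKit

struct LockScreenImageView: View {
    // Read-only: reports which rendering mode the system applied to the widget.
    @Environment(\.widgetRenderingMode) private var renderingMode

    var body: some View {
        if renderingMode == .accented {
            // Fall back to an asset drawn for the tinted/multiplied look.
            Image("GlyphOutline")      // hypothetical asset
                .widgetAccentable()
        } else {
            Image("FullColorArtwork")  // hypothetical asset
        }
    }
}
```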
Replies: 2 · Boosts: 3 · Views: 2.1k · Jan ’23
ARView "showAnchorGeometry" interpretation help?
When using showAnchorGeometry, I see lots of green surface anchors in my scene, and it has been really helpful for debugging the placement of objects when I tap the screen of my device. But I also get some blue shapes, and I'm not quite sure what those mean. Is there a document that explains what showAnchorGeometry is actually... showing? Googling "showAnchorGeometry blue" wasn't helpful! (I promise I tried.)
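For context, this is how I'm turning the visualization on (a RealityKit ARView); the anchor-origins option is there just to help me tell origins apart from their estimated extents:

```swift
// RealityKit ARView debug visualizations used while investigating placement.
arView.debugOptions.insert(.showAnchorGeometry)
arView.debugOptions.insert(.showAnchorOrigins)
```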
Replies: 1 · Boosts: 0 · Views: 1.4k · Dec ’22
Opening a SwiftUI app window to a particular place via intent?
It is possible via the new AppIntents framework to open your app from a Shortcuts intent, but I am currently very confused about how to ensure that a particular window is opened in a SwiftUI-lifecycle app. If the user says "Open View A" via a shortcut or Siri, I'd like to make sure it opens the window for "View A", though a duplicate window could be acceptable too. The WWDC22 presentation has the following:

    @MainActor
    func perform() async throws -> some IntentResult {
        Navigator.shared.openShelf(.currentlyReading)
        return .result()
    }

Where, from the perform method of the Intent structure, they tell an arbitrary Navigator (code not provided) to just open a view of the app. (How convenient!) But for a multi-window SwiftUI app, I'm not sure how to make this work. @Environment variables are not available within the Intent struct, and even if I did have a "Navigator singleton", I'm not sure how it could get the @Environment for openWindow, since it's a View environment. AppIntents exist outside the View environment tree, AFAIK. Any ideas? I'd be a little shocked if this is a UIKit-only sort of thing, but at the same time... ya never know.
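The best workaround I can sketch so far: have the intent write to a shared observable model, and let a view that does live in the environment tree react and call openWindow. Everything below (NavigationModel, the "view-a" ID, OpenViewAIntent, ContentView standing in for the app's real root content) is illustrative, not an official pattern:

```swift
import AppIntents
import SwiftUI

// Shared model the intent can reach from outside the view hierarchy.
final class NavigationModel: ObservableObject {
    static let shared = NavigationModel()
    @Published var requestedWindowID: String?
}

struct OpenViewAIntent: AppIntent {
    static var title: LocalizedStringResource = "Open View A"
    static var openAppWhenRun: Bool = true

    @MainActor
    func perform() async throws -> some IntentResult {
        NavigationModel.shared.requestedWindowID = "view-a"
        return .result()
    }
}

// A view inside a WindowGroup that *does* have access to openWindow.
struct RootView: View {
    @Environment(\.openWindow) private var openWindow
    @ObservedObject private var navigation = NavigationModel.shared

    var body: some View {
        ContentView()
            .onChange(of: navigation.requestedWindowID) { id in
                guard let id else { return }
                openWindow(id: id)
                navigation.requestedWindowID = nil
            }
    }
}
```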
Replies: 2 · Boosts: 0 · Views: 2.6k · Aug ’22
AppIntent with Input from share sheet or previous step?
I have the following parameter:

    @Parameter(title: "Image",
               description: "Image to copy",
               supportedTypeIdentifiers: ["com.image"],
               inputConnectionBehavior: .connectToPreviousIntentResult)
    var imageFile: IntentFile?

When I drop my AppIntent into a shortcut, though, I am unable to connect this parameter to the output of the previous step. Given the documentation, I have no idea how to achieve this if the above is not the correct way to do so.
Replies: 2 · Boosts: 0 · Views: 1.8k · Aug ’22
Consistent spacing on a grid of ContainerRelativeShapes?
Hello, I'm trying to make a grid of container-relative shapes where the outside gutters match the gutters in between the items. The stickiest part of this problem is the fact that calling .inset on a ContainerRelativeShape doubles the gutter in between the items. I've tried LazyVGrid and an HStack of VStacks, and they all have this double gutter in between. I think I could move forward with some gnarly frame math, but I was curious if I'm missing some SwiftUI layout feature that could make this easier and more maintainable.
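For reference, the least-gnarly workaround I've sketched so far: zero out the grid spacing, inset each shape by half the desired gutter, then pad the whole grid by the other half so the outer gutters end up equal to the inner ones. The gutter value and the 3-column layout below are illustrative:

```swift
import SwiftUI

struct ShapeGrid: View {
    let gutter: CGFloat = 8

    var body: some View {
        LazyVGrid(
            columns: Array(repeating: GridItem(.flexible(), spacing: 0), count: 3),
            spacing: 0
        ) {
            ForEach(0..<9) { _ in
                ContainerRelativeShape()
                    .inset(by: gutter / 2)   // half a gutter on every side of each cell
                    .fill(.blue)
                    .aspectRatio(1, contentMode: .fit)
            }
        }
        .padding(gutter / 2)                 // outer half-gutter so edges match
    }
}
```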
Replies: 1 · Boosts: 0 · Views: 1k · Apr ’22
Disable occlusion for collaboration debugging?
Hello, I’m noticing that during a collaborative session, anchors created on the host device are appearing in a different location on client devices, and it’s making it challenging to test other collaboration logic. For example, when the client places a textured plane mesh at an anchor placed by the host, this placement can sometimes be considered as behind a surface, and it gets clipped by RealityKit's mesh occlusion. I’d prefer to see it floating in space when testing, so I can tell that something is happening. I’m drawing a blank on whether there are any debugging options to help me out. Nothing in the render or debug options jumped out at me. Thoughts?
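The closest candidates I've found so far are the scene-understanding options on ARView, though I haven't confirmed they cover the occlusion I'm seeing in a collaborative session. A sketch of what I'd try:

```swift
// If the clipping comes from scene-understanding occlusion, removing the
// option should stop meshes from being hidden behind reconstructed surfaces.
arView.environment.sceneUnderstanding.options.remove(.occlusion)

// Visualize the reconstructed mesh and anchor origins while debugging.
arView.debugOptions.insert(.showSceneUnderstanding)
arView.debugOptions.insert(.showAnchorOrigins)
```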
Replies: 1 · Boosts: 0 · Views: 1.2k · Feb ’22