Post

Replies

Boosts

Views

Activity

Reply to Can't center entity on AnchorEntity(.plane)
I felt like this should be obvious because it's such an important use case for dropping RealityKit scenes into the world without user interaction, but I tried a few things with translation that failed. For some reason what worked for me was calling box.setPosition(_:relativeTo:) with nil. I'm not sure why it works given that the documentation says nil means "world space", when it appears to behave as if nil means "parent space" in this case?

```swift
class Model: ObservableObject {
    var wall: AnchorEntity?
    var child: ModelEntity?
}

struct ImmersiveView: View {
    @StateObject var model = Model()

    var body: some View {
        RealityView { content in
            let wall = AnchorEntity(.plane(.vertical,
                                           classification: .wall,
                                           minimumBounds: [2.0, 1.5]),
                                    trackingMode: .continuous)
            model.wall = wall

            let mesh = MeshResource.generateBox(size: 0.3)
            let box = ModelEntity(mesh: mesh,
                                  materials: [SimpleMaterial(color: .green, isMetallic: false)])
            model.child = box

            wall.addChild(box, preservingWorldTransform: false)
            content.add(wall)
            box.setPosition([0, 0, 0], relativeTo: wall)
        } update: { content in
            if let box = model.child, let wall = model.wall {
                // box.setPosition([0, 0, 0], relativeTo: wall) // <---- DOES NOT WORK
                box.setPosition([0, 0, 0], relativeTo: nil)     // <---- DOES WORK even though nil means "world space"????
            }
        }
    }
}
```
Topic: Graphics & Games SubTopic: RealityKit Tags:
Sep ’23
Reply to How can I pinch to open a menu in VisionOS simulator?
I don’t see code in your recent post that invokes the openWindow action. However, without an entity to receive the tap, it may not be possible to do what you want: an empty immersive space doesn’t have anything to receive the tap. You may need to use the scene reconstruction provider to generate meshes of the world so those meshes can receive the taps. That provider is not available in the simulator at the moment.
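A rough sketch of that scene-reconstruction approach, assuming visionOS's ARKitSession and SceneReconstructionProvider APIs (names checked against the docs, but I haven't run this; it requires a device, not the simulator):

```swift
import ARKit
import RealityKit

// Sketch: turn scene-reconstruction mesh anchors into invisible
// collision entities so spatial taps on real surfaces have a target.
func runSceneReconstruction(root: Entity) async throws {
    let session = ARKitSession()
    let provider = SceneReconstructionProvider()
    try await session.run([provider])

    for await update in provider.anchorUpdates {
        let meshAnchor = update.anchor
        // Build a static collision shape from the reconstructed mesh.
        guard let shape = try? await ShapeResource.generateStaticMesh(from: meshAnchor) else { continue }
        let entity = Entity()
        entity.transform = Transform(matrix: meshAnchor.originFromAnchorTransform)
        entity.components.set(CollisionComponent(shapes: [shape], isStatic: true))
        // InputTargetComponent lets SpatialTapGesture hit this entity.
        entity.components.set(InputTargetComponent())
        root.addChild(entity)
    }
}
```

In a real app you would also handle `.updated` and `.removed` anchor events rather than only adding entities; this sketch shows just the add path.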
Topic: UI Frameworks SubTopic: SwiftUI Tags:
Sep ’23
Reply to Send Pointer to Device
Speculation: Vision Pro allows a stationary user to interact with their application windows via keyboard + mouse/trackpad. I suspect this setting lets the cursor function as a mouse/trackpad pointer instead of the user’s eyes in the VisionOS simulator.
Oct ’23
Reply to Creating an immersive space using UIKit?
Not possible per Apple reply here: Link
Topic: Graphics & Games SubTopic: General Tags:
Sep ’23
Reply to How to launch a Volume or ImmersiveSpace from UIKit?
ImmersiveSpaces are not possible per Apple reply here: Link
Topic: UI Frameworks SubTopic: UIKit Tags:
Sep ’23
Reply to VisionOS Simulator and ARKit Features
Correct, in betas 1–3 of VisionOS, ARKit data providers like plane detection and scene reconstruction do not work in the simulator. You can make an AnchorEntity targeting a plane, though, and it will find a surface.
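For example, a minimal plane-targeting anchor looks like this (a sketch only; `content` is assumed to be the RealityViewContent inside a RealityView's make closure):

```swift
// A plane-targeting AnchorEntity: RealityKit finds a matching
// surface for you, with no ARKit data provider required.
let floorAnchor = AnchorEntity(
    .plane(.horizontal, classification: .floor, minimumBounds: [0.5, 0.5]),
    trackingMode: .continuous
)
content.add(floorAnchor)
// Children of floorAnchor appear once a matching plane is found.
```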
Topic: Spatial Computing SubTopic: ARKit Tags:
Sep ’23
Reply to QR / Image marker recognition with Vision Pro
You can construct an AnchorEntity targeting an image in the app's bundle: https://developer.apple.com/documentation/realitykit/anchorentity/init(_:trackingmode:)
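A hedged sketch of that initializer, assuming an AR Resource Group named "Posters" containing a reference image named "poster" in your asset catalog (both names are placeholders for your own assets):

```swift
// Anchor content to a reference image bundled in an AR Resource Group.
let imageAnchor = AnchorEntity(
    .image(group: "Posters", name: "poster"),
    trackingMode: .continuous
)
// markerEntity is whatever entity you want pinned to the image.
imageAnchor.addChild(markerEntity)
content.add(imageAnchor)
```

Note that this is the AnchorEntity route only; it tells you where the image is, but it does not decode QR code payloads for you.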
Topic: Programming Languages SubTopic: Swift Tags:
Sep ’23
Reply to PlaneDetection, ImageTracking and Scene Reconstruction support on VisionOS Simulator NOT WORKING
At the moment there is no workaround other than to receive a dev kit or to purchase a device when it releases. I am in the same boat as you.
Topic: Spatial Computing SubTopic: General Tags:
Sep ’23
Reply to PlaneDetection, ImageTracking and Scene Reconstruction support on VisionOS Simulator NOT WORKING
Still not working in beta 4. Still haven't heard back on devkit application.
Topic: Spatial Computing SubTopic: General Tags:
Oct ’23
Reply to PlaneDetection, ImageTracking and Scene Reconstruction support on VisionOS Simulator NOT WORKING
My feedback says no recent similar reports. FB12639395.
Topic: Spatial Computing SubTopic: General Tags:
Oct ’23
Reply to Vision Pro: UX: how to close an immersive view?
Control Center also offers a way to exit immersive scenes.
Topic: App & System Services SubTopic: Core OS Tags:
Oct ’23
Reply to Xcode 15.1 beta and vision os 1 beta 4
Intel or Apple silicon?
Topic: App & System Services SubTopic: Core OS Tags:
Oct ’23
Reply to Why does this entity appear behind spatial tap collision location?
This works in beta 3
Topic: Spatial Computing SubTopic: ARKit Tags:
Oct ’23
Reply to vision OS development - does voice search work without extra effort
Search of what? Are you talking about Spotlight support via voice?
Topic: App & System Services SubTopic: Core OS Tags:
Oct ’23