Reply to Placing Item on plane, position is great, but trouble with rotation
My Solution
You can get the location of the tap gesture on an entity with:

    let worldPosition: SIMD3<Float> = value.convert(value.location3D, from: .local, to: .scene)

You can grab the orientation of the tapped plane with:

    let rotation = value.entity.orientation(relativeTo: nil)

I want to track this location so it can be persisted with a world anchor via the WorldTrackingProvider (whenever that starts working). I do so by making a world anchor with:

    let pose = Pose3D(position: worldPosition, rotation: rotation)
    let worldAnchor = WorldAnchor(originFromAnchorTransform: simd_float4x4(pose))

When you get the anchor from your world tracking listener, you can apply its transform (which contains both position and orientation) to an entity like so:

    entityForAnchor.transform = Transform(matrix: anchor.originFromAnchorTransform)

If you've got another way, or a way that makes more intuitive use of the API, please share. This was cobbled together with a ton of trial and error.
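Putting those pieces together, a minimal sketch of the whole tap handler might look like this. This is an assumption about how the snippets fit into a gesture, not verbatim from the post; `worldTracking` stands in for a running WorldTrackingProvider.

```swift
// Sketch: spatial tap -> world anchor. Assumes an ARKitSession is already
// running a WorldTrackingProvider named `worldTracking`.
RealityView { content in
    // ... load tappable entities with collision + input target components ...
}
.gesture(
    SpatialTapGesture()
        .targetedToAnyEntity()
        .onEnded { value in
            // Tap location converted into world (scene) space.
            let worldPosition: SIMD3<Float> = value.convert(value.location3D, from: .local, to: .scene)
            // World-space orientation of the tapped entity.
            let rotation = value.entity.orientation(relativeTo: nil)
            let pose = Pose3D(position: worldPosition, rotation: rotation)
            let worldAnchor = WorldAnchor(originFromAnchorTransform: simd_float4x4(pose))
            Task {
                // Persist the pose as a world anchor.
                try? await worldTracking.addAnchor(worldAnchor)
            }
        }
)
```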
Topic: Graphics & Games SubTopic: RealityKit Tags:
Oct ’23
Reply to Send Pointer to Device
Speculation: Vision Pro allows a stationary user to interact with their application windows via a keyboard and mouse/trackpad. I suspect this setting lets the cursor function as a mouse/trackpad pointer instead of following the user's eyes in the visionOS simulator.
Oct ’23
Reply to How can I pinch to open a menu in VisionOS simulator?
I don’t see code in your recent post that invokes the open window command. However, without an entity to receive the tap, it may not be possible to do what you want. An empty Immersive View doesn’t have anything to receive the tap. You may need to use the scene reconstruction provider to generate meshes of the world so those meshes can receive the taps. This is not available in the simulator at the moment.
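On a device, generating tappable meshes from scene reconstruction could be sketched roughly like this. The session setup and entity wiring here are my assumptions, not code from the thread; only added anchors are handled, for brevity.

```swift
// Sketch: turn scene-reconstruction meshes into entities that can
// receive spatial taps. Requires a real device; the provider does
// not run in the simulator at the moment.
let session = ARKitSession()
let sceneReconstruction = SceneReconstructionProvider()

func runReconstruction(root: Entity) async throws {
    try await session.run([sceneReconstruction])
    for await update in sceneReconstruction.anchorUpdates {
        guard update.event == .added,
              let shape = try? await ShapeResource.generateStaticMesh(from: update.anchor)
        else { continue }
        let meshEntity = Entity()
        meshEntity.transform = Transform(matrix: update.anchor.originFromAnchorTransform)
        // Collision + input target components let the mesh receive taps.
        meshEntity.components.set(CollisionComponent(shapes: [shape], isStatic: true))
        meshEntity.components.set(InputTargetComponent())
        root.addChild(meshEntity)
    }
}
```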
Topic: UI Frameworks SubTopic: SwiftUI Tags:
Sep ’23
Reply to vision OS development - does voice search work without extra effort
Search of what? Are you talking about Spotlight support via voice?
Topic: App & System Services SubTopic: Core OS Tags:
Oct ’23
Reply to Why does this entity appear behind spatial tap collision location?
This works in beta 3
Topic: Spatial Computing SubTopic: ARKit Tags:
Oct ’23
Reply to Xcode 15.1 beta and vision os 1 beta 4
Intel or Apple silicon?
Topic: App & System Services SubTopic: Core OS Tags:
Oct ’23
Reply to Vision Pro: UX: how to close an immersive view?
Control Center also offers a way to exit immersive scenes.
Topic: App & System Services SubTopic: Core OS Tags:
Oct ’23
Reply to PlaneDetection, ImageTracking and Scene Reconstruction support on VisionOS Simulator NOT WORKING
My feedback report, FB12639395, shows no recent similar reports.
Topic: Spatial Computing SubTopic: General Tags:
Oct ’23
Reply to PlaneDetection, ImageTracking and Scene Reconstruction support on VisionOS Simulator NOT WORKING
Still not working in beta 4. Still haven't heard back on devkit application.
Topic: Spatial Computing SubTopic: General Tags:
Oct ’23
Reply to PlaneDetection, ImageTracking and Scene Reconstruction support on VisionOS Simulator NOT WORKING
At the moment there is no workaround other than receiving a dev kit or purchasing a device when it releases. I am in the same boat as you.
Topic: Spatial Computing SubTopic: General Tags:
Sep ’23
Reply to QR / Image marker recognition with Vision Pro
You can construct an AnchorEntity targeting an image in the app's bundle: https://developer.apple.com/documentation/realitykit/anchorentity/init(_:trackingmode:)
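For example, roughly like this. The resource-group and image names are placeholders; the image must live in an AR Resource Group in your asset catalog.

```swift
// Sketch: anchor content to a reference image bundled with the app.
// "AR Resources" and "poster" are placeholder names for this example.
let imageAnchor = AnchorEntity(
    .image(group: "AR Resources", name: "poster"),
    trackingMode: .continuous
)
imageAnchor.addChild(markerEntity) // markerEntity: whatever you want pinned to the image
content.add(imageAnchor)
```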
Topic: Programming Languages SubTopic: Swift Tags:
Sep ’23
Reply to VisionOS Simulator and ARKit Features
Correct, in betas 1-3 of visionOS, ARKit data providers like plane detection and scene reconstruction do not work in the simulator. You can make an AnchorEntity targeting a plane, though, and it will find a surface.
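A plane-targeted AnchorEntity could be sketched like this; the alignment, classification, and bounds are placeholder choices for illustration.

```swift
// Sketch: RealityKit finds a matching surface for a plane-targeted
// AnchorEntity without any ARKit data providers (works in the simulator).
let planeAnchor = AnchorEntity(
    .plane(.horizontal, classification: .table, minimumBounds: [0.3, 0.3])
)
planeAnchor.addChild(placedEntity) // placedEntity: content to sit on the surface
content.add(planeAnchor)
```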
Topic: Spatial Computing SubTopic: ARKit Tags:
Sep ’23
Reply to How to launch a Volume or ImmersiveSpace from UIKit?
Launching an ImmersiveSpace from UIKit is not possible, per Apple's reply here: Link
Topic: UI Frameworks SubTopic: UIKit Tags:
Sep ’23
Reply to Creating an immersive space using UIKit?
Not possible per Apple reply here: Link
Topic: Graphics & Games SubTopic: General Tags:
Sep ’23