Reply to visionOS 3D tap location offset by ~0.35m?
No SwiftUI views inside the ImmersiveSpace. I only have Entity and ModelEntity instances created manually with RealityKit. I'll try some additional experiments. I think I will programmatically place a small object 1.5 meters in front of me (but not make it tappable), look at it, and tap (I assume the gaze will pass through it and hit the floor below), and then compare the event's location3D with the object's known position.
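A minimal sketch of that comparison step (plain Swift, no RealityKit; the helper name and the positions are made-up for illustration):

```swift
/// Hypothetical helper: difference between the reported tap location
/// (already converted to scene space) and the object's known position.
func tapOffset(reported: SIMD3<Float>, expected: SIMD3<Float>) -> SIMD3<Float> {
    reported - expected
}

// Made-up example: object placed 1.5 m in front of the origin.
let expected = SIMD3<Float>(0, 0, -1.5)
let reported = SIMD3<Float>(0.01, 0.02, -1.15)  // hypothetical reported tap
let offset = tapOffset(reported: reported, expected: expected)
// A positive Z offset means the tap was reported closer to the viewer
// than the object actually is.
print(offset)
```

SIMD3 is part of the Swift standard library, so this runs without any frameworks.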
Topic: Spatial Computing SubTopic: ARKit Tags:
Apr ’24
Reply to visionOS 3D tap location offset by ~0.35m?
I made the code as simple as possible: a single 5 cm target square placed at (0, 0, -1.5). I then tapped the target while standing in several locations, and the reported tap location was consistently off along the Z axis by roughly 0.4 m. Here are several tap locations (the Z component is the last value):

tap location: SIMD3(0.0067811073, 0.019996116, -1.1157947), name: target
tap location: SIMD3(-0.00097223074, 0.019996116, -1.1036792), name: target
tap location: SIMD3(0.0008024718, 0.019995179, -1.1074299), name: target
tap location: SIMD3(-0.009804221, 0.019996116, -1.0694565), name: target
tap location: SIMD3(-0.0037206858, 0.019995492, -1.0778457), name: target
tap location: SIMD3(-0.009298846, 0.019996116, -1.0772702), name: target

Here is the code to set up the RealityView:

```swift
import SwiftUI
import RealityKit
import RealityKitContent

struct ImmersiveView: View {
    @StateObject var model = MyModel()

    /// Spatial tap gesture that tells the model the tap location.
    var myTapGesture: some Gesture {
        SpatialTapGesture()
            .targetedToAnyEntity()
            .onEnded { event in
                let location3D = event.convert(event.location3D, from: .global, to: .scene)
                let entity = event.entity
                model.handleTap(location: location3D, entity: entity)
            }
    }

    var body: some View {
        RealityView { content in
            model.setupContentEntity(content: content)
        }
        .gesture(myTapGesture)
    }
}
```

Here is the model code:

```swift
import Foundation
import SwiftUI
import RealityKit
import RealityKitContent
import ARKit
import os.log

@MainActor
class MyModel: ObservableObject {
    private var realityViewContent: RealityViewContent?

    /// Capture RealityViewContent and create target
    ///
    /// - Parameter content: container for all RealityView content
    func setupContentEntity(content: RealityViewContent) {
        self.realityViewContent = content
        placeTargetObject()
    }

    /// Place a small red target at position (0, 0, -1.5)
    ///
    /// I will look at this position and tap my fingers. The tap location
    /// should be near the same position (0, 0, -1.5).
    func placeTargetObject() {
        guard let realityViewContent else { return }

        let width: Float = 0.05
        let height: Float = 0.02
        let x: Float = 0
        let y: Float = 0
        let z: Float = -1.5

        // Create red target square
        let material = SimpleMaterial(color: .red, isMetallic: false)
        let mesh = MeshResource.generateBox(width: width, height: height, depth: width)
        let target = ModelEntity(mesh: mesh, materials: [material])

        // Add collision and input-target components to make it tappable
        let shapeBox = ShapeResource.generateBox(width: width, height: height, depth: width)
        let collision = CollisionComponent(shapes: [shapeBox], isStatic: true)
        target.collision = collision
        target.components.set(InputTargetComponent())

        // Set name, position, and add it to scene
        target.name = "target"
        target.setPosition(SIMD3<Float>(x, y + height / 2, z), relativeTo: nil)
        realityViewContent.add(target)
    }

    /// Respond to the user tapping on an object by printing name of entity and tap location
    ///
    /// - Parameters:
    ///   - location: location of tap gesture
    ///   - entity: entity that was tapped
    func handleTap(location: SIMD3<Float>, entity: Entity) {
        os_log("tap location: \(location), name: \(entity.name, privacy: .public)")
    }
}
```

Example of the small red target:
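Averaging the six reported Z values against the target's Z of -1.5 quantifies the discrepancy; a quick sanity check in plain Swift (no RealityKit needed):

```swift
// The six reported Z components from the taps above.
let reportedZ: [Float] = [-1.1157947, -1.1036792, -1.1074299,
                          -1.0694565, -1.0778457, -1.0772702]
let targetZ: Float = -1.5

let meanZ = reportedZ.reduce(0, +) / Float(reportedZ.count)
let offset = meanZ - targetZ
print(meanZ)   // ≈ -1.0919
print(offset)  // ≈ 0.408: taps land roughly 0.4 m closer to the viewer than the target
```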
Topic: Spatial Computing SubTopic: ARKit Tags:
Apr ’24
Reply to Xbox controller and visionOS 2
I've largely solved it by following the information in this discussion: https://forums.developer.apple.com/forums/thread/746728

Side note: when I look at my virtual Mac screen while in immersive mode (a new feature) to check the console messages and press the 'A' button, I lose the ability to detect the button presses in my app. Does the input focus switch?
Topic: Spatial Computing SubTopic: General Tags:
Jun ’24
Reply to Getting the Wi-Fi's SSID on macOS
I figured it out. I just needed to request permission for Core Location services. Added this code to one of my objects:

```swift
locationManager = CLLocationManager()
locationManager?.delegate = myLocationDelegate
locationManager?.requestWhenInUseAuthorization()
```

For my app sandbox, I also enabled:

Outgoing Connections (Client)
Location

(I'm not certain the second one is needed.) I now get the SSID and BSSID. Also, the app now shows up in System Settings' Location Services, where I guess the user can turn it on or off.
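For completeness, the SSID read itself (not shown above) typically goes through CoreWLAN on macOS. A minimal sketch, assuming the Location permission above has already been granted (the `currentSSID` helper name is made up):

```swift
#if canImport(CoreWLAN)
import CoreWLAN

/// Returns the SSID of the default Wi-Fi interface, or nil if unavailable
/// (e.g. Location permission not granted, or no Wi-Fi connection).
func currentSSID() -> String? {
    CWWiFiClient.shared().interface()?.ssid()
}
#endif
```

Without the Core Location authorization, `ssid()` returns nil on recent macOS versions even when Wi-Fi is connected.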
Jul ’24
Reply to RealityView in macOS, Skybox, and lighting issue
I have found a different approach that works for me. I abandoned setting the content.environment property; instead I use a sky dome model for the background, with ImageBasedLightComponent and ImageBasedLightReceiverComponent to choose the lighting for the cube. For more information on this approach, see the WWDC session Optimize your 3D assets for spatial computing, and jump to these sections:

15:07 - Sky dome setup
16:03 - Image-based lighting

I did everything programmatically (instead of using Reality Composer Pro), but it works pretty much the same. Sample code (caveat: I have no idea if this is the preferred approach, but it works for me):

```swift
import SwiftUI
import RealityKit
import os.log

struct MyRealityView: View {
    @Binding var useNebulaForLighting: Bool
    @Binding var showNebula: Bool

    @State private var nebulaIbl: ImageBasedLightComponent?
    @State private var indoorIbl: ImageBasedLightComponent?
    @State private var iblEntity: Entity?
    @State private var litCube: Entity?
    @State private var skydome: Entity?

    var body: some View {
        RealityView { content in
            // Create a red cube 1m on a side
            let mesh = MeshResource.generateBox(size: 1.0)
            let simpleMaterial = SimpleMaterial(color: .red, isMetallic: false)
            let model = ModelComponent(mesh: mesh, materials: [simpleMaterial])
            let redBoxEntity = Entity()
            redBoxEntity.components.set(model)
            content.add(redBoxEntity)
            litCube = redBoxEntity

            // Get hi-res texture to show as background
            let immersion_name = "BlueNebula"
            guard let resource = try? await TextureResource(named: immersion_name) else {
                fatalError("Unable to load texture.")
            }
            var material = UnlitMaterial()
            material.color = .init(texture: .init(resource))

            // Create sky dome sphere
            let sphereMesh = MeshResource.generateSphere(radius: 1000)
            let sphereModelComponent = ModelComponent(mesh: sphereMesh, materials: [material])

            // Create an entity and set its model component
            let sphereEntity = Entity()
            sphereEntity.components.set(sphereModelComponent)

            // Trick/hack to make the texture image point inward to the viewer.
            sphereEntity.scale *= .init(x: -1, y: 1, z: 1)

            // Add sky dome to the scene
            skydome = sphereEntity
            skydome?.isEnabled = showNebula
            content.add(skydome!)

            // Create Image Based Lighting entity for scene
            iblEntity = Entity()
            content.add(iblEntity!)

            // Load low-res nebula resource for image based lighting
            if let environmentResource = try? await EnvironmentResource(named: "BlueNeb2") {
                let iblSource = ImageBasedLightComponent.Source.single(environmentResource)
                let iblComponent = ImageBasedLightComponent(source: iblSource)
                nebulaIbl = iblComponent
            }

            // Load low-res indoor light resource for image based lighting
            if let environmentResource = try? await EnvironmentResource(named: "IndoorLights") {
                let iblSource = ImageBasedLightComponent.Source.single(environmentResource)
                let iblComponent = ImageBasedLightComponent(source: iblSource)
                indoorIbl = iblComponent
            }

            // Set initial settings
            applyModelSettings()
        } update: { content in
            applyModelSettings()
        }
        .realityViewCameraControls(CameraControls.orbit)
    }

    func applyModelSettings() {
        // Set image based lighting
        if (useNebulaForLighting == true) && (litCube != nil) && (nebulaIbl != nil) {
            iblEntity!.components.set(nebulaIbl!)
            let iblrc = ImageBasedLightReceiverComponent(imageBasedLight: iblEntity!)
            litCube?.components.set(iblrc)
        } else if (useNebulaForLighting == false) && (litCube != nil) && (indoorIbl != nil) {
            iblEntity!.components.set(indoorIbl!)
            let iblrc = ImageBasedLightReceiverComponent(imageBasedLight: iblEntity!)
            litCube?.components.set(iblrc)
        }

        // Set skydome's status
        skydome?.isEnabled = showNebula
    }
}
```
Topic: Spatial Computing SubTopic: General Tags:
Aug ’24