
Portals and ImmersiveSpace?
I've added a simple visionOS Portal to an app's initial WindowGroup (a window with an attached portal is all that is displayed), but I've had trouble adding a portal to an ImmersiveSpace. For example, using the boilerplate code that Xcode creates for a mixed spatial experience, I'd like to turn the ImmersiveSpace, which has a portal in it, on and off. So far, the portal isn't showing up. Is it possible to add a portal to an ImmersiveSpace? Are there any restrictions on where portals can be added?
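For reference, this is roughly the setup I'm attempting inside the ImmersiveSpace's RealityView. It's only a minimal sketch of the usual WorldComponent / PortalComponent / PortalMaterial pattern; the entity positions, sizes, and colors are placeholders I made up:

import SwiftUI
import RealityKit

struct PortalImmersiveView: View {
    var body: some View {
        RealityView { content in
            // World the portal looks into.
            let world = Entity()
            world.components.set(WorldComponent())

            // Some content inside the portal world (placeholder).
            let box = ModelEntity(mesh: .generateBox(size: 0.2),
                                  materials: [SimpleMaterial(color: .blue, isMetallic: false)])
            box.position = [0, 0, -0.5]
            world.addChild(box)

            // Portal surface that renders the world.
            let portal = Entity()
            portal.components.set(ModelComponent(
                mesh: .generatePlane(width: 0.5, height: 0.5, cornerRadius: 0.05),
                materials: [PortalMaterial()]))
            portal.components.set(PortalComponent(target: world))
            portal.position = [0, 1.5, -1.0]

            content.add(world)
            content.add(portal)
        }
    }
}

This pattern works for me when the RealityView lives in a WindowGroup; the question is whether the same approach should also work when the view is the content of an ImmersiveSpace.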
Replies: 1 · Boosts: 0 · Views: 760 · Feb ’24
SpatialTapGesture and collision surface's normal?
I see example code converting the results of a SpatialTap to a SIMD3 location. For example, from the WWDC session Meet ARKit for spatial computing:

let location3D = value.convert(value.location3D, from: .global, to: .scene)

What I really want is a simd_float4x4 that includes the orientation of the surface that the tap gesture/raycast collided with. My goal is to place an object with its Y-axis along the normal of the surface that was tapped. For example, in the referenced WWDC session, they create a CollisionComponent from the MeshAnchor data. If that mesh data is covering a curved couch cushion, I would like the normal from that curved cushion (i.e., from the closest triangle approximating it). Is this possible?

My planned fallback is to only use planes as collision surfaces for tap gestures, extract the tapped entity from the gesture value (which I am hoping is the plane), and grab its transform for the orientation information. I am hoping Apple has a simple function call that is more general than my fallback approach.
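For concreteness, here is a minimal sketch of that planned fallback, assuming plane-only collision entities; the marker geometry and the root-entity bookkeeping are placeholders of mine:

import SwiftUI
import RealityKit

struct TapNormalFallbackView: View {
    // Root entity so the tap handler can add content outside the RealityView closures.
    @State private var root = Entity()

    var body: some View {
        RealityView { content in
            content.add(root)
            // Plane entities with CollisionComponent and InputTargetComponent are added to `root` elsewhere.
        }
        .gesture(
            SpatialTapGesture()
                .targetedToAnyEntity()
                .onEnded { value in
                    // Tap position converted into the RealityKit scene.
                    let location = value.convert(value.location3D, from: .local, to: .scene)

                    // Fallback: assume the tapped entity is a plane and borrow its orientation.
                    var transform = Transform(matrix: value.entity.transformMatrix(relativeTo: nil))
                    transform.translation = location

                    // Place a marker using the plane's orientation (its Y-axis follows the plane's normal).
                    let marker = ModelEntity(mesh: .generateSphere(radius: 0.02),
                                             materials: [SimpleMaterial(color: .green, isMetallic: false)])
                    marker.transform = transform
                    root.addChild(marker)
                }
        )
    }
}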
Replies: 1 · Boosts: 0 · Views: 775 · Mar ’24
Xbox controller and visionOS 2
I am having problems getting button input from an Xbox game controller. I have the visionOS 2 beta on my Apple Vision Pro, and I am trying to use an Xbox game controller with a RealityView, following the instructions from the WWDC session Explore game input in visionOS. The game controller connection notification is picking up the controller and finds GCInputButtonA, and I am setting closures for touchedChangedHandler, pressedChangedHandler, and valueChangedHandler that just print an os_log statement:

buttonA.valueChangedHandler = { button, value, pressed in
    os_log("Got valueChangedHandler")
}

At the end of the RealityView, I have the modifier:

RealityView { content in
    // stuff
}
.handlesGameControllerEvents(matching: .gamepad)

But I never see the log message appear in the console when I press the 'A' button (or any other button). Any ideas what I might be doing wrong? The Xbox controller is pretty old; Settings reports it as version 9.0.3.
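For context, the connection handling looks roughly like this; it's a condensed sketch of what I described above, not my exact code:

import GameController
import os

func startWatchingForControllers() {
    NotificationCenter.default.addObserver(
        forName: .GCControllerDidConnect, object: nil, queue: .main) { note in
        guard let controller = note.object as? GCController,
              let gamepad = controller.extendedGamepad else { return }

        // Log whenever the 'A' button is touched, pressed, or its value changes.
        gamepad.buttonA.touchedChangedHandler = { _, _, _, _ in os_log("Got touchedChangedHandler") }
        gamepad.buttonA.pressedChangedHandler = { _, _, _ in os_log("Got pressedChangedHandler") }
        gamepad.buttonA.valueChangedHandler   = { _, _, _ in os_log("Got valueChangedHandler") }
    }
}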
Replies: 1 · Boosts: 1 · Views: 1.2k · Jun ’24
Getting the Wi-Fi's SSID on macOS
I want to extend an existing macOS app distributed through the Mac App Store with the capability to track the Wi-Fi's noise and signal strength along with the SSID it is connected to over time. Using CWWiFiClient.shared().interface(), I can get noiseMeasurement() and rssiValue() fine, but ssid() always returns nil. I am assuming this is a privacy issue (?). Are there specific entitlements I can request or ways to prompt the user to grant the app privilege to access the SSID values?
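For reference, the sampling code is essentially the following; the CLLocationManager part reflects my assumption (not something I've confirmed) that SSID access is tied to Location Services authorization on recent macOS releases:

import CoreWLAN
import CoreLocation

final class WiFiSampler: NSObject, CLLocationManagerDelegate {
    private let locationManager = CLLocationManager()

    override init() {
        super.init()
        locationManager.delegate = self
        // Assumption: the SSID may be withheld until the user grants location access.
        locationManager.requestWhenInUseAuthorization()
    }

    func sample() {
        guard let interface = CWWiFiClient.shared().interface() else { return }
        print("Noise: \(interface.noiseMeasurement()) dBm")
        print("RSSI:  \(interface.rssiValue()) dBm")
        print("SSID:  \(interface.ssid() ?? "nil")")   // always nil for me so far
    }
}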
Replies: 1 · Boosts: 0 · Views: 1.2k · Jul ’24
App Environment SkyDome's UV values
I started a visionOS app using Apple's new "App Environment" template, and when I looked at the UV mapping for the half SkyDome, the bottom edge had a UV 'Y' value of 0.318. Naively, I had assumed the bottom edge of a half dome would have a UV 'Y' value of 0.5 (halfway up the texture map). Is this the standard UV mapping for half a SkyDome? It has caused some issues when I've applied some HDRIs.
Replies: 1 · Boosts: 0 · Views: 699 · Sep ’24
RealityView in macOS, Skybox, and lighting issue
I am testing RealityView on a Mac, and I am having trouble controlling the lighting. I initially add a red cube, and everything is fine (see Figure 1). I then activate a skybox with a star field; the star field appears, and the red cube is then lit only by the star field. Then I deactivate the skybox, expecting the original lighting to return, but the cube continues to be lit by the skybox. The background no longer shows the skybox, but the cube is never lit the way it originally was. Is there a way to return the lighting of the model to the original lighting I had before adding the skybox? I seem to recall ARView's environment property had both a lighting.resource and a background, but I don't see both of those properties in RealityViewCameraContent's environment.

Sample code for macOS 15.1 Beta (24B5024e), Xcode 16.0 beta (16A5171c):

struct MyRealityView: View {
    @Binding var isSwitchOn: Bool
    @State private var blueNebulaSkyboxResource: EnvironmentResource?

    var body: some View {
        RealityView { content in
            // Create a red cube 10 cm on a side
            let mesh = MeshResource.generateBox(size: 0.1)
            let simpleMaterial = SimpleMaterial(color: .red, isMetallic: false)
            let model = ModelComponent(
                mesh: mesh,
                materials: [simpleMaterial]
            )
            let redBoxEntity = Entity()
            redBoxEntity.components.set(model)
            content.add(redBoxEntity)

            // Load skybox
            let blueNeb2Name = "BlueNeb2"
            blueNebulaSkyboxResource = try? await EnvironmentResource(named: blueNeb2Name)
        } update: { content in
            if (blueNebulaSkyboxResource != nil) && (isSwitchOn == true) {
                content.environment = .skybox(blueNebulaSkyboxResource!)
            } else {
                content.environment = .default
            }
        }
        .realityViewCameraControls(CameraControls.orbit)
    }
}

Figure 1: default lighting before adding the skybox.
Figure 2: after activating the skybox with the star field; the cube is lit by / reflects the skybox.
Figure 3: after removing the skybox by setting content.environment to .default, the cube still reflects the skybox (it is hard to see).
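One avenue I'm considering, purely as a guess on my part rather than anything confirmed by documentation: give the cube its own image-based light so its appearance no longer depends on content.environment. A rough sketch that would go in the make closure after the cube is created (the "StudioLighting" asset name is a placeholder):

// Hypothetical: light the cube with an explicit IBL so toggling the skybox doesn't change it.
if let ibl = try? await EnvironmentResource(named: "StudioLighting") {   // placeholder asset
    let lightEntity = Entity()
    lightEntity.components.set(ImageBasedLightComponent(source: .single(ibl)))
    content.add(lightEntity)
    redBoxEntity.components.set(ImageBasedLightReceiverComponent(imageBasedLight: lightEntity))
}

I have not yet verified that this overrides the environment lighting on macOS.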
Replies: 1 · Boosts: 0 · Views: 765 · Aug ’24
Casting shadows on the ground
In the visionOS 2 beta, I have a character loaded from a Reality Composer Pro scene standing on the floor, but he isn't casting a shadow on the floor. I added a GroundingShadowComponent in RealityView, and he does cast shadows on himself (e.g., his hands cast shadows on his shoes), but I don't see any shadow on the floor. Do I need to enable something to have my character cast a shadow on the real-world floor?
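For reference, this is roughly how I'm attaching the component, walking the hierarchy and setting it on every entity that has a ModelComponent (the characterRoot name is a placeholder for my loaded scene entity):

import RealityKit

// Add grounding shadows to every model entity under the character.
func addGroundingShadows(to characterRoot: Entity) {
    var queue: [Entity] = [characterRoot]
    while let entity = queue.popLast() {
        if entity.components.has(ModelComponent.self) {
            entity.components.set(GroundingShadowComponent(castsShadow: true))
        }
        queue.append(contentsOf: entity.children)
    }
}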
Replies: 1 · Boosts: 0 · Views: 672 · Sep ’24
Turn off camera in RealityView for iOS?
I am using RealityView for an iOS program. Is it possible to turn off the camera passthrough so only my virtual content is showing? I am looking to create a VR experience. I have a workaround where I turn off occlusion and then create a sphere around me (e.g., with a black texture), but in the pre-RealityView days, I think I used something like this:

arView.environment.background = .color(.black)

Is there something similar in RealityView for iOS? Here are some snippets of my current workaround inside RealityView. First, create the sphere to surround the user:

// Create sphere
let blackMaterial = UnlitMaterial(color: .black)
let sphereMesh = MeshResource.generateSphere(radius: 100)
let sphereModelComponent = ModelComponent(mesh: sphereMesh, materials: [blackMaterial])
let sphereEntity = Entity()
sphereEntity.components.set(sphereModelComponent)
sphereEntity.scale *= .init(x: -1, y: 1, z: 1)
content.add(sphereEntity)

Then turn off occlusion:

// Turn off occlusion
let configuration = SpatialTrackingSession.Configuration(
    tracking: [],
    sceneUnderstanding: [],
    camera: .back)
let session = SpatialTrackingSession()
await session.run(configuration)
Replies: 1 · Boosts: 0 · Views: 670 · Sep ’24
Getting ARMeshClassification information
The ARMeshGeometry documentation (https://developer.apple.com/documentation/arkit/armeshgeometry) references ARMeshClassification (https://developer.apple.com/documentation/arkit/armeshclassification), but I cannot find any obvious way to get classification information for the mesh data. I found the classificationOf(faceWithIndex:) function in the Xcode sample project Visualizing and Interacting with a Reconstructed Scene (https://developer.apple.com/documentation/arkit/content_anchors/visualizing_and_interacting_with_a_reconstructed_scene), but it seems pretty complex. Is there something simpler that I am missing? It also seems from the code that a mesh doesn't have a classification; only individual geometry faces in the mesh have one. Is it common for a single mesh to represent many different objects (e.g., a chair, floor, and wall) all at the same time? Thanks,
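For reference, the per-face lookup in that sample boils down to roughly the following; this is my condensation of the sample's approach, so treat it as a sketch rather than verified code:

import ARKit

// Read the classification of one face of an ARMeshGeometry, following the approach
// used in the "Visualizing and Interacting with a Reconstructed Scene" sample.
func classification(of geometry: ARMeshGeometry, faceIndex: Int) -> ARMeshClassification {
    guard let classifications = geometry.classification else { return ARMeshClassification.none }

    // One classification byte per face, at offset + stride * faceIndex.
    let address = classifications.buffer.contents()
        .advanced(by: classifications.offset + classifications.stride * faceIndex)
    let rawValue = Int(address.assumingMemoryBound(to: UInt8.self).pointee)
    return ARMeshClassification(rawValue: rawValue) ?? ARMeshClassification.none
}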
Replies: 0 · Boosts: 0 · Views: 716 · Mar ’21
Is the scene geometry OcclusionMaterial accessible in RealityKit?
Is it possible to turn different occlusion materials on and off when using Scene Understanding with LiDAR and RealityKit? For example, if ARKit identifies a wall, I don't want that mesh to be used during occlusion (but I do want occlusion for other things, like the couch or the floor). If I could do this, it would essentially make my walls transparent, and I could see the RealityKit objects that extend beyond the room I am in. Thanks,
Replies: 0 · Boosts: 0 · Views: 629 · Mar ’21
AnchorEntity(plane:) ground shadow without the plane?
When I create an AnchorEntity like this:

let entityAnchor = AnchorEntity(plane: [.horizontal], classification: [.floor], minimumBounds: [0.2, 0.2])

and add a USDZ model to it, I get a nice ground shadow. But if I create an AnchorEntity using an ARAnchor like this:

let entityAnchor = AnchorEntity(anchor: anchor)

I do not get that nice ground shadow. Is there a way to get the ground shadow I get from a plane anchor, but with an AnchorEntity that I can place where I want or attach to an ARAnchor?

[Note: for LiDAR devices, I can get a nice shadow using

config.sceneReconstruction = .mesh
arView.environment.sceneUnderstanding.options.insert(.occlusion)
arView.environment.sceneUnderstanding.options.insert(.receivesLighting)

but creating the environment mesh is computationally expensive. I'd like to avoid that if possible.]
Replies: 0 · Boosts: 0 · Views: 870 · Oct ’21
usdzconvert and metersPerUnit
I've been creating USDA files manually and converting them to USDZ via Apple's usdzconvert tool (version 0.64). In the file I set the unit size to be 1 meter:

metersPerUnit = 1.0

but the USDZ keeps the unit size at 1 cm. Apple's Reality Converter does process the metersPerUnit metadata, so that is a viable workaround for me, but sometimes I'd prefer the command-line tool. Is there an update to the usdzconvert tool? I couldn't find one.
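For context, the layer metadata at the top of my .usda files looks roughly like this (a minimal sketch; the stage contents below it are omitted):

#usda 1.0
(
    metersPerUnit = 1.0
)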
Replies: 0 · Boosts: 0 · Views: 881 · Nov ’21
OcclusionMaterial filter?
RealityKit has a CollisionFilter to determine which entities can collide with which other ones. Perchance, is there something similar for OcclusionMaterial? In effect, I'd like a model with an OcclusionMaterial to be able to "occlude this entity but not that entity".
Replies: 0 · Boosts: 0 · Views: 540 · Nov ’21
AR Quick Look additional controls?
I've recently added some USDZ files to a web page, and I can download and display them fine via AR Quick Look on an iPhone or iPad. I've noticed full occlusion is active in the AR view. Over time, the device appears to heat up and the frame rate drops. Are there any properties I can set in the <a rel="ar" ...> HTML tag to control things like occlusion or autofocus (i.e., turn them off)?
Replies: 0 · Boosts: 0 · Views: 928 · Nov ’21
Body tracking robot and Blender
Has anyone successfully imported Apple's (FBX) robot from Apple's CapturingBodyMotionIn3D demo into Blender, exported it back out (glTF or another format), converted it back to USDZ via Reality Converter, and gotten it to work in Apple's demo app again? I have run into numerous problems, and each effort to fix a problem leads to new ones. For example, importing Apple's FBX robot results in the bones pointing in funny directions (see attachment). When I try to correct this on import by aligning the bones, the robot in the Apple app looks like it went through a Star Trek transporter accident, with limbs at weird angles.
Replies: 0 · Boosts: 0 · Views: 906 · Jan ’23