Xbox controller and visionOS 2
I am having problems getting button input from an Xbox game controller. I have the visionOS 2 beta on my Apple Vision Pro, and I am trying to use an Xbox game controller with a RealityView, following the instructions from the WWDC session "Explore game input in visionOS". The game-controller connection notification picks up the controller, I find GCInputButtonA, and I set closures for touchedChangedHandler, pressedChangedHandler, and valueChangedHandler that just print an os_log statement:

```swift
buttonA.valueChangedHandler = { button, value, pressed in
    os_log("Got valueChangedHandler")
}
```

At the end of the RealityView, I have the modifier:

```swift
RealityView { content in
    // stuff
}
.handlesGameControllerEvents(matching: .gamepad)
```

But I never see the log message appear in the console when I press the 'A' button (or any other button). Any ideas what I might be doing wrong? The Xbox controller is pretty old; Settings reports it as version 9.0.3.
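For context, this is roughly how the discovery and wiring is set up, as a minimal sketch (the log text is a placeholder, and I'm assuming the standard GCControllerDidConnect flow):

```swift
import GameController
import os

// Minimal sketch: observe controller connections, then attach handlers
// to button A on the extended gamepad profile. Keep the observer token alive.
let connectObserver = NotificationCenter.default.addObserver(
    forName: .GCControllerDidConnect,
    object: nil,
    queue: .main
) { notification in
    guard let controller = notification.object as? GCController,
          let gamepad = controller.extendedGamepad else { return }
    gamepad.buttonA.valueChangedHandler = { _, value, pressed in
        os_log("buttonA value: %{public}f pressed: %{public}d", value, pressed ? 1 : 0)
    }
}
```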
Replies: 1 · Boosts: 1 · Views: 1.2k · Jun ’24
Getting the Wi-Fi's SSID on macOS
I want to extend an existing macOS app, distributed through the Mac App Store, with the capability to track the Wi-Fi's noise and signal strength over time, along with the SSID it is connected to. Using CWWiFiClient.shared().interface(), I can get noiseMeasurement() and rssiValue() fine, but ssid() always returns nil. I assume this is a privacy issue. Are there specific entitlements I can request, or ways to prompt the user to grant the app the privilege to access the SSID?
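My working theory, sketched below, is that recent macOS versions treat the SSID as location data, so Location Services authorization (plus the location entitlement and a usage-description string) may be required before ssid() returns a value. This is an assumption, not something I've confirmed:

```swift
import CoreWLAN
import CoreLocation

// Sketch (assumption): request Location Services authorization first,
// because macOS may gate the SSID behind location privacy.
final class WiFiMonitor: NSObject, CLLocationManagerDelegate {
    private let locationManager = CLLocationManager()

    func start() {
        locationManager.delegate = self
        locationManager.requestWhenInUseAuthorization()
    }

    func locationManagerDidChangeAuthorization(_ manager: CLLocationManager) {
        guard manager.authorizationStatus == .authorizedAlways,
              let interface = CWWiFiClient.shared().interface() else { return }
        print("SSID:", interface.ssid() ?? "nil",
              "RSSI:", interface.rssiValue(),
              "noise:", interface.noiseMeasurement())
    }
}
```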
Replies: 1 · Boosts: 0 · Views: 1.2k · Jul ’24
App Environment SkyDome's UV values
I started a visionOS app using Apple's new "App Environment" template, and when I looked at the UV mapping for the half SkyDome, the bottom edge had a UV 'Y' value of 0.318. Naively, I had assumed the bottom edge of a half dome would have a UV 'Y' value of 0.5 (halfway up the texture map). Is this the standard UV mapping for half a SkyDome? It has caused some issues when I've applied HDRIs.
Replies: 1 · Boosts: 0 · Views: 699 · Sep ’24
RealityView in macOS, Skybox, and lighting issue
I am testing RealityView on a Mac, and I am having trouble controlling the lighting. I initially add a red cube, and everything is fine (see Figure 1). I then activate a skybox with a star field; the star field appears, but now the red cube is lit only by the star field. I then deactivate the skybox, expecting the original lighting to return, but the cube continues to be lit by the skybox: the background no longer shows the skybox, yet the cube is never lit the way it originally was.

Is there a way to return the model's lighting to the original lighting I had before adding the skybox? I seem to recall ARView's environment property had both a lighting.resource and a background, but I don't see both of those properties in RealityViewCameraContent's environment.

Sample code for macOS 15.1 Beta (24B5024e), Xcode 16.0 beta (16A5171c):

```swift
struct MyRealityView: View {
    @Binding var isSwitchOn: Bool
    @State private var blueNebulaSkyboxResource: EnvironmentResource?

    var body: some View {
        RealityView { content in
            // Create a red cube 10 cm on a side
            let mesh = MeshResource.generateBox(size: 0.1)
            let simpleMaterial = SimpleMaterial(color: .red, isMetallic: false)
            let model = ModelComponent(mesh: mesh, materials: [simpleMaterial])
            let redBoxEntity = Entity()
            redBoxEntity.components.set(model)
            content.add(redBoxEntity)

            // Load the skybox resource
            let blueNeb2Name = "BlueNeb2"
            blueNebulaSkyboxResource = try? await EnvironmentResource(named: blueNeb2Name)
        } update: { content in
            if let skybox = blueNebulaSkyboxResource, isSwitchOn {
                content.environment = .skybox(skybox)
            } else {
                content.environment = .default
            }
        }
        .realityViewCameraControls(.orbit)
    }
}
```

Figure 1: default lighting before adding the skybox.
Figure 2: after activating the skybox with the star field; the cube is lit by / reflects the skybox.
Figure 3: after removing the skybox by setting content.environment to .default; the cube still reflects the skybox (it is hard to see).
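One direction I'm considering, sketched under assumptions (that ImageBasedLightComponent behaves the same on macOS 15 as on visionOS, and that a "StudioLight" environment resource exists in my bundle): light the cube with an explicit image-based light so its look never depends on the skybox.

```swift
import RealityKit

// Sketch (assumptions in the text above): attach an explicit image-based
// light to the scene and make the entity receive it, instead of letting
// the entity inherit lighting from whatever skybox is active.
@MainActor
func applyStudioLight(to entity: Entity, in content: RealityViewCameraContent) async {
    guard let ibl = try? await EnvironmentResource(named: "StudioLight") else { return } // assumed resource name
    let lightEntity = Entity()
    lightEntity.components.set(ImageBasedLightComponent(source: .single(ibl)))
    content.add(lightEntity)
    entity.components.set(ImageBasedLightReceiverComponent(imageBasedLight: lightEntity))
}
```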
Replies: 1 · Boosts: 0 · Views: 765 · Aug ’24
Casting shadows on the ground
In the visionOS 2 beta, I have a character loaded from a Reality Composer Pro scene standing on the floor, but he isn't casting a shadow on the floor. I added a GroundingShadowComponent in RealityView, and he does cast shadows on himself (e.g., his hands cast shadows on his shoes), but I don't see any shadow on the floor. Do I need to enable something to have my character cast a shadow on the real-world floor?
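For reference, this is roughly how I am applying the component, as a minimal sketch (the entity passed in stands in for my loaded character's root):

```swift
import RealityKit

// Minimal sketch: recursively apply GroundingShadowComponent to every
// model entity in a hierarchy, since the shadow is cast by mesh geometry.
func enableGroundingShadows(on entity: Entity) {
    if entity.components.has(ModelComponent.self) {
        entity.components.set(GroundingShadowComponent(castsShadow: true))
    }
    for child in entity.children {
        enableGroundingShadows(on: child)
    }
}
```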
Replies: 1 · Boosts: 0 · Views: 674 · Sep ’24
Turn off camera in RealityView for iOS?
I am using RealityView for an iOS program. Is it possible to turn off the camera passthrough, so only my virtual content is showing? I am looking to create a VR experience. I have a workaround where I turn off occlusion and then create a sphere around the user (e.g., with a black texture), but in the pre-RealityView days, I think I used something like this:

```swift
arView.environment.background = .color(.black)
```

Is there something similar in RealityView for iOS? Here are some snippets of my current workaround inside RealityView. First, create the sphere to surround the user:

```swift
// Create an inside-out black sphere around the user
let blackMaterial = UnlitMaterial(color: .black)
let sphereMesh = MeshResource.generateSphere(radius: 100)
let sphereModelComponent = ModelComponent(mesh: sphereMesh, materials: [blackMaterial])
let sphereEntity = Entity()
sphereEntity.components.set(sphereModelComponent)
sphereEntity.scale *= .init(x: -1, y: 1, z: 1)  // negative X scale flips the normals inward
content.add(sphereEntity)
```

Then turn off occlusion:

```swift
// Turn off occlusion by running a SpatialTrackingSession with no scene understanding
let configuration = SpatialTrackingSession.Configuration(
    tracking: [],
    sceneUnderstanding: [],
    camera: .back)
let session = SpatialTrackingSession()
await session.run(configuration)
```
Replies: 1 · Boosts: 0 · Views: 670 · Sep ’24
Zombie System Extensions
I had a weird case today where an endpoint system extension remained running even after I deleted the .app bundle. If I tried killing the process with "sudo kill -9 <pid>", the extension respawned. If I tried "sudo launchctl remove <name>", I was told I didn't have the privilege. Searching my hard drive, I found a copy of the system extension in /Macintosh HD/Library/System Extensions/...

I rebooted into recovery mode, deleted the extension bundle, and restarted. Everything initially looked fine, and the process did not come back. But then, when I tried to re-build, re-package, re-install, and re-launch the application, the operating system complained that it could not find the system extension, even though it was there in the .app bundle.

The operating system seems to (A) create a cache/copy of the system extension bundle and (my guess) (B) maintain a link to that cache location somewhere, trying to launch that cached system extension bundle. (My hacked solution was to rename the extension, including creating a new bundle ID and associated provisioning profile.)

Has anyone encountered a system extension that would not die? Did you figure out how to kill it and clear out any caches of it? Thanks,
Replies: 10 · Boosts: 2 · Views: 7.9k · Feb ’22
PHAsset of older photos is missing GPS info in Swift
I've been working in Swift on iOS to access images via UIImagePickerController, pulling the PHAsset from the picker delegate's "info" dictionary and then pulling GPS information from the PHAsset. For newer photos, asset.location is populated with GPS information, and CIImage's property dictionary has {GPS} information, so all is good with newer photos. But when I go back to images taken in 2017, asset.location is nil and there is no {GPS} information in the CIImage. However, if I export the photo from the Photos app on my Mac and then view it in Preview, there *is* GPS information. So am I missing some setting to find the GPS information in older photos using PHAsset on iOS? Thanks,
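In case it helps to diagnose, here is a rough sketch of how one could inspect the original image data for a {GPS} dictionary (my assumption being that the original resource, possibly still in iCloud, carries metadata the local copy lacks):

```swift
import Photos
import ImageIO

// Sketch: request the *original* image data (allowing an iCloud fetch)
// and read its {GPS} dictionary directly via ImageIO.
func printGPSInfo(for asset: PHAsset) {
    let options = PHImageRequestOptions()
    options.version = .original
    options.isNetworkAccessAllowed = true
    PHImageManager.default().requestImageDataAndOrientation(for: asset, options: options) { data, _, _, _ in
        guard let data = data,
              let source = CGImageSourceCreateWithData(data as CFData, nil),
              let props = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [CFString: Any]
        else { return }
        print(props[kCGImagePropertyGPSDictionary] ?? "no {GPS} dictionary found")
    }
}
```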
Replies: 2 · Boosts: 0 · Views: 1.3k · Sep ’21
RealityKit Transform rotation: choosing clockwise vs. anti-clockwise
I'm using Entity's move(to:relativeTo:duration:timingFunction:) to rotate an Entity around the Y axis in an animated fashion (e.g., with a duration of 2 seconds). Unfortunately, when I rotate from 6 radians (343.7°) to 6.6 radians (378.2°), the rotation does not continue anti-clockwise past 2π (360°) but goes backwards to 0.317 radians (18.2°). Is there a way to force a rotation about an axis to go in a clockwise or anti-clockwise direction when animating?
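A workaround I've been experimenting with, as an unverified sketch: break a large rotation into steps smaller than π, so each quaternion interpolation has no shorter path to take.

```swift
import Foundation
import RealityKit

// Sketch: rotate past 2π by chaining animation legs smaller than π each,
// so the quaternion slerp cannot flip to the "short way" around.
func rotateY(_ entity: Entity, from start: Float, to end: Float, duration: TimeInterval) {
    let steps = 4
    let stepDuration = duration / Double(steps)
    for i in 1...steps {
        let angle = start + (end - start) * Float(i) / Float(steps)
        var transform = entity.transform
        transform.rotation = simd_quatf(angle: angle, axis: [0, 1, 0])
        // Start each leg when the previous leg finishes.
        DispatchQueue.main.asyncAfter(deadline: .now() + stepDuration * Double(i - 1)) {
            entity.move(to: transform, relativeTo: entity.parent,
                        duration: stepDuration, timingFunction: .linear)
        }
    }
}
```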
Replies: 2 · Boosts: 0 · Views: 1.7k · Apr ’21
ARKit FPS drop when device gets hot?
During testing of my app, the frames per second (shown either in the Xcode debug navigator or via ARView's .showStatistics option) sometimes drops by half and stays down there. The low FPS continues even when I kill the app completely and restart it. However, after giving my phone a break, the FPS returns to 60. Does ARKit automatically throttle the FPS down when the device gets too hot? If so, is there a signal my program can catch from ARKit or the OS that tells me this is happening?
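From what I can tell, the closest OS-level signal is ProcessInfo's thermal state; a minimal sketch (not ARKit-specific, and only my assumption that it correlates with the FPS drop):

```swift
import Foundation

// Minimal sketch: watch the system thermal state, the public signal
// closest to "the device is hot and may be throttling".
let thermalObserver = NotificationCenter.default.addObserver(
    forName: ProcessInfo.thermalStateDidChangeNotification,
    object: nil,
    queue: .main
) { _ in
    switch ProcessInfo.processInfo.thermalState {
    case .nominal, .fair:
        print("Thermal state OK")
    case .serious, .critical:
        print("Device is hot; expect reduced frame rates")
    @unknown default:
        break
    }
}
```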
Replies: 2 · Boosts: 0 · Views: 1.7k · Sep ’21
polygon count vs. triangle count?
In a previous post I asked if 100,000 polygons is still the recommended size for USDZ Quick Look models on the web. (The answer is yes.) But I realize my polygons are 4-sided and not planar, so each has to be broken into 2 triangles when rendered. Given that, should I shoot for 50,000 polygons (i.e., 100,000 triangles)? Or does the 100,000-polygon statistic already assume polygons will be subdivided into triangles? (The models are generated from digital terrain (GeoTIFF) data, not a 3D modeling tool.)
Replies: 2 · Boosts: 0 · Views: 2.4k · Nov ’21
iOS 16.2: Cannot load underlying module for 'ARKit'
I have a strange warning in Xcode associated with ARKit. It isn't a big issue because there is a workaround, but I am curious why this is happening and how I can avoid it in the future.

I opened up an old AR project in Xcode, and the editor gave a strange error message on the "import ARKit" line saying:

Cannot load underlying module for 'ARKit'

Despite the error message, the code continues to build and run. I've quit and restarted Xcode, rebooted the Mac, and even deleted and redownloaded Xcode, but the error/warning was still there. Upon some additional testing, I discovered that I only get this message when targeting iOS 16.2, but not when targeting 16.0, 16.1, 16.3, or 16.4. (I did not try pre-iOS 16.) Any idea why my Xcode no longer likes ARKit on iOS 16.2 on my Mac?

Development platform:
- Xcode: Version 14.3.1 (14E300c)
- macOS: 13.4 (22F66)
- Mac: Mac Studio 2022
- iOS on iPhone 14 Pro Max: 16.5 (20F66)

Screenshot:
Replies: 2 · Boosts: 1 · Views: 1.4k · Aug ’23
Xcode Beta RC didn't have an option for vision simulator
I just downloaded the latest Xcode beta, Version 15.0 (15A240d), and ran into some issues:

- On startup, I was not given an option to download the Vision simulator.
- I cannot create a project targeted at visionOS.
- I cannot build/run a hello-world app for Vision.

In my previous Xcode beta (Version 15.0 beta 8 (15A5229m)), there was an option to download the Vision simulator, and I could create projects for visionOS and run the code in the Vision simulator.

The Xcode file downloaded was named "Xcode" instead of "Xcode-beta". I didn't want to get rid of the existing Xcode, so I selected Keep Both. Now I have 3 Xcodes in the Applications folder:

- Xcode
- Xcode copy
- Xcode-beta

That is the only thing I see that might have been different about my install.

Hardware: Mac Studio 2022 with M1 Max, macOS Ventura 13.5.2

Any idea what I did wrong?
Replies: 2 · Boosts: 0 · Views: 1.8k · Sep ’23
Placement of model inside volumetric window?
I am having trouble placing a model inside a volumetric window. I have a model (just a simple cube created in Reality Composer Pro that is 0.2 m on a side and centered at the origin), and I want to display it in a volumetric window that is 1.0 m on a side while preserving the cube's original 0.2 m size. The small cube seems to sit flush against the back and top of the larger volumetric window. Is it possible to initially position the model inside the volume? For example, can the model be placed flush against the bottom and front of the volumetric window? (Note: the actual use case is wanting to place 3D terrain, which tends to be mostly flat like a pizza box, flush against the bottom of the volumetric window.)
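Here is the kind of arithmetic I expect to need, sketched under the assumption that the RealityView origin sits at the center of the 1.0 m volume (the view name and materials are placeholders):

```swift
import SwiftUI
import RealityKit

// Sketch (assumption above): with the origin at the volume's center, a
// 0.2 m cube centered on its own origin sits flush against the bottom-front
// when offset by half the volume size minus half the cube size.
struct TerrainVolume: View {
    var body: some View {
        RealityView { content in
            let cube = ModelEntity(
                mesh: .generateBox(size: 0.2),
                materials: [SimpleMaterial(color: .gray, isMetallic: false)]
            )
            cube.position = [0, -0.5 + 0.1, 0.5 - 0.1]  // bottom-front: [0, -0.4, 0.4]
            content.add(cube)
        }
    }
}

// In the App body:
// WindowGroup { TerrainVolume() }
//     .windowStyle(.volumetric)
//     .defaultSize(width: 1.0, height: 1.0, depth: 1.0, in: .meters)
```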
Replies: 2 · Boosts: 2 · Views: 740 · Dec ’24