
Zombie System Extensions
I had a weird case today where an endpoint system extension remained running even after I deleted the .app bundle. If I tried killing the process with "sudo kill -9 <pid>", the extension respawned. If I tried "sudo launchctl remove <name>", I was told I didn't have the privilege.

Searching my hard drive, I found a copy of the system extension in /Macintosh HD/Library/System Extensions/... I rebooted into recovery mode, deleted the extension bundle, and restarted. Everything initially looked fine, and the process did not come back.

But then when I re-built, re-packaged, re-installed, and re-launched the application, the operating system complained that it could not find the system extension even though it was there in the .app bundle. The operating system seems to (A) create a cache/copy of the system extension bundle and (my guess) (B) maintain a link to that cache location somewhere, then try to launch that cached system extension bundle.

[My hacked solution was to rename the extension, including creating a new bundle ID and associated provisioning profile.]

Has anyone encountered a system extension that would not die? Did you figure out how to kill it and clear out any caches of it?

Thanks,
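For anyone hitting the same thing: the supported removal path, which I should probably have tried before deleting the .app, is a deactivation request via the SystemExtensions framework, submitted while the host app still exists. A minimal sketch, assuming "com.example.app.extension" stands in for the real bundle ID (I can't say whether this would have cleared the cached copy in my case):

    import Foundation
    import SystemExtensions

    final class DeactivationDelegate: NSObject, OSSystemExtensionRequestDelegate {
        func request(_ request: OSSystemExtensionRequest,
                     actionForReplacingExtension existing: OSSystemExtensionProperties,
                     withExtension ext: OSSystemExtensionProperties)
                     -> OSSystemExtensionRequest.ReplacementAction {
            .replace
        }
        func requestNeedsUserApproval(_ request: OSSystemExtensionRequest) {}
        func request(_ request: OSSystemExtensionRequest,
                     didFinishWithResult result: OSSystemExtensionRequest.Result) {
            print("Deactivation finished: \(result)")
        }
        func request(_ request: OSSystemExtensionRequest, didFailWithError error: Error) {
            print("Deactivation failed: \(error)")
        }
    }

    let delegate = DeactivationDelegate()
    // Placeholder bundle identifier; use the extension's real one.
    let request = OSSystemExtensionRequest.deactivationRequest(
        forExtensionWithIdentifier: "com.example.app.extension",
        queue: .main)
    request.delegate = delegate
    OSSystemExtensionManager.shared.submitRequest(request)

Running "systemextensionsctl list" in Terminal will also show what the OS currently thinks is activated.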
10 replies · 2 boosts · 7.9k views · Feb ’22
usdzconvert and metersPerUnit
I've been creating USDA files manually and converting them to USDZ via Apple's usdzconvert tool (version 0.64). In the file I set the unit size to 1 meter (metersPerUnit = 1.0), but the resulting USDZ keeps the unit size at 1 cm. Apple's Reality Converter does process the metersPerUnit metadata, so that is a viable workaround for me, but sometimes I'd prefer the command-line tool. Is there an update to the usdzconvert tool? I couldn't find one.
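In case it's relevant, here is roughly what the top of my USDA file looks like (a minimal sketch, not my actual asset):

    #usda 1.0
    (
        metersPerUnit = 1.0
        upAxis = "Y"
    )

    def Xform "Root"
    {
    }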
0 replies · 0 boosts · 881 views · Nov ’21
iOS 16.2: Cannot load underlying module for 'ARKit'
I have a strange warning in Xcode associated with ARKit. It isn't a big issue because there is a workaround, but I am curious why this is happening and how I can avoid it in the future.

I opened up an old AR project in Xcode, and the editor gave a strange error message on the "import ARKit" line saying:

    Cannot load underlying module for 'ARKit'

Despite the error message, the code continues to build and run. I've quit and restarted Xcode, rebooted the Mac, and even deleted and redownloaded Xcode, but the error/warning was still there. Upon some additional testing, I discovered that I only get this message when targeting iOS 16.2, but not when targeting 16.0, 16.1, 16.3, or 16.4. (I did not try pre-iOS 16.)

Any idea why my Xcode no longer likes ARKit on iOS 16.2 on my Mac?

Development platform:
Xcode: Version 14.3.1 (14E300c)
macOS: 13.4 (22F66)
Mac: Mac Studio 2022
iOS on iPhone 14 Pro Max: 16.5 (20F66)
2 replies · 1 boost · 1.4k views · Aug ’23
Are RealityKit lights expensive?
I am finding some unexpected behavior with the lights I've been adding to a RealityKit scene. For example, I created 14 PointLights, but only 8 appeared to be used to illuminate the scene. In another example, I created 7 PointLights and 7 SpotLights, and the frame rate dropped quite a bit. Are lights computationally expensive, causing some adaptive behavior by RealityKit? Should I be judicious in my use of lights in a scene? (Note: I set arView.environment.lighting.resource to a skybox with a black image; my goal was to completely control the lighting. I don't know if that added to the computational load.)
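For reference, each light was created roughly like this (a sketch; the intensity and radius values are illustrative, and `anchor` is an existing AnchorEntity in my scene):

    import RealityKit

    // One of the point lights; values are illustrative, not my exact setup.
    let light = PointLight()
    light.light.color = .white
    light.light.intensity = 10_000        // lumens
    light.light.attenuationRadius = 2.0   // meters
    light.position = [0, 1.5, 0]
    anchor.addChild(light)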
1 reply · 1 boost · 1.2k views · Dec ’21
Can exposureCompensation affect SLAM?
When setting ARView's environment camera feed exposure to a negative value to make the camera feed dimmer, for example

    arView.environment.background = .cameraFeed(exposureCompensation: -3)

can this negatively affect ARKit's localization and mapping? That is, is the device's use of the camera for SLAM purposes independent of the exposureCompensation value?
0 replies · 1 boost · 714 views · Feb ’23
Placement of model inside volumetric window?
I am having trouble placing a model inside a volumetric window. I have a model (just a simple cube created in Reality Composer Pro that is 0.2m on a side and centered at the origin), and I want to display it in a volumetric window that is 1.0m on a side while preserving the cube's original 0.2m size. The small cube ends up flush against the back and top of the larger volumetric window. Is it possible to set the model's initial position inside the volume? For example, can the model be placed flush against the bottom and front of the volumetric window? (Note: the actual use case is wanting to place 3D terrain, which tends to be mostly flat like a pizza box, flush against the bottom of the volumetric window.)
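One workaround I've been experimenting with is offsetting the entity manually, assuming the volume's origin sits at its center with +y up and +z toward the viewer. The "Cube" name and RealityKitContent bundle below are placeholders for my Reality Composer Pro package:

    import SwiftUI
    import RealityKit
    import RealityKitContent

    struct CubeVolume: View {
        var body: some View {
            RealityView { content in
                // "Cube" is a hypothetical entity name from the RCP package.
                if let cube = try? await Entity(named: "Cube", in: realityKitContentBundle) {
                    // 1.0 m volume with origin at center; 0.2 m cube.
                    // y = -(0.5 - 0.1) rests the cube's bottom face on the floor;
                    // z = +(0.5 - 0.1) brings its front face to the front plane.
                    cube.position = [0, -0.4, 0.4]
                    content.add(cube)
                }
            }
        }
    }

with the window declared as WindowGroup { CubeVolume() }.windowStyle(.volumetric).defaultSize(width: 1.0, height: 1.0, depth: 1.0, in: .meters).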
2 replies · 2 boosts · 736 views · Dec ’24
Getting to MeshAnchor.MeshClassification from MeshAnchor?
I am working with MeshAnchors, and I am having trouble getting to the classification of the triangles/faces. This post references MeshAnchor.Geometry, and that struct does have a property named "classifications", but it is of type GeometrySource, and I cannot find any classification information in GeometrySource. Am I missing something there? I think I am looking for something of type MeshAnchor.MeshClassification, but I cannot find any struct that exposes it as a property.
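The direction I've been poking at, in case it helps: treating the classifications GeometrySource as a buffer of one raw value per face. This is my assumption; I have not confirmed the buffer layout against any documentation:

    import ARKit

    // Assumes one UInt8 classification value per face (unverified).
    func classification(of geometry: MeshAnchor.Geometry,
                        faceIndex: Int) -> MeshAnchor.MeshClassification? {
        guard let source = geometry.classifications,
              faceIndex < source.count else { return nil }
        // Step into the Metal buffer at this face's slot.
        let address = source.buffer.contents()
            .advanced(by: source.offset + source.stride * faceIndex)
        let rawValue = address.assumingMemoryBound(to: UInt8.self).pointee
        return MeshAnchor.MeshClassification(rawValue: Int(rawValue))
    }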
3 replies · 0 boosts · 1.3k views · Feb ’25
Triangle count and texture size budget for RealityKit on visionOS
In the past, Apple recommended restricting USDZ models to a maximum of 100,000 triangles and a texture size of 2048x2048 for Apple QuickLook (and, I think, for RealityKit on iOS in general). Does Apple have a recommended maximum polygon count for visionOS? Is it the same for models running in a volumetric window in the Shared Space and in an ImmersiveSpace? What is the recommended texture size for visionOS? (I seem to recall 8192x8192, but I can't find it now.)
2 replies · 0 boosts · 1.6k views · Jun ’24
Xbox controller and visionOS 2
I am having problems getting button input from an Xbox game controller. I have the visionOS 2 beta on my Apple Vision Pro, and I am trying to use an Xbox game controller with a RealityView, following the instructions from the WWDC session "Explore game input in visionOS".

The game controller connection notification does pick up the controller and finds GCInputButtonA, and I am setting closures for touchedChangedHandler, pressedChangedHandler, and valueChangedHandler that just print an os_log statement:

    buttonA.valueChangedHandler = { button, value, pressed in
        os_log("Got valueChangedHandler")
    }

At the end of the RealityView, I have the modifier:

    RealityView { content in
        // stuff
    }
    .handlesGameControllerEvents(matching: .gamepad)

But I never see the log message appear in the console when I press the 'A' button (or any other button). Any ideas what I might be doing wrong? The Xbox controller is pretty old; Settings reports it as version 9.0.3.
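For completeness, here is the controller-discovery side as I have it wired up, condensed into a sketch:

    import GameController
    import os

    // Observe controller connections and attach a handler to button A.
    NotificationCenter.default.addObserver(
        forName: .GCControllerDidConnect, object: nil, queue: .main
    ) { note in
        guard let controller = note.object as? GCController,
              let gamepad = controller.extendedGamepad else { return }
        gamepad.buttonA.valueChangedHandler = { _, value, pressed in
            os_log("Got valueChangedHandler")
        }
    }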
1 reply · 1 boost · 1.2k views · Jun ’24
AnchorEntity to ARAnchor (or vice versa)
Given an AnchorEntity from, say, RealityKit's Scene anchors collection, is it possible to retrieve the ARAnchor that was used when creating the AnchorEntity? Looking through the AnchorEntity documentation (https://developer.apple.com/documentation/realitykit/anchorentity), it seems that while you can create an AnchorEntity using an ARAnchor, there is no way to retrieve that ARAnchor afterwards. Alternatively, the ARSession delegate functions receive a list of ARAnchors, or an ARFrame that has ARAnchors, but I could not find a way to retrieve the AnchorEntities that might be associated with any of these ARAnchors. Given an ARAnchor, is there a way to get an AnchorEntity associated with it? Thanks,
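The workaround I've landed on for now is keeping my own mapping from ARAnchor identifiers to the AnchorEntities I create. A sketch, assuming the entities are created in the session delegate and `arView` is the ARView:

    import ARKit
    import RealityKit

    var entityForAnchor: [UUID: AnchorEntity] = [:]

    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for anchor in anchors {
            let entity = AnchorEntity(anchor: anchor)
            // Remember the pairing so either side can be looked up later.
            entityForAnchor[anchor.identifier] = entity
            arView.scene.addAnchor(entity)
        }
    }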
1 reply · 0 boosts · 1.4k views · Mar ’21
flickering, double vision with raycast on iPadOS 15.0
In ARKit+RealityKit I do a raycast from the ARView's center, then create an AnchorEntity at the result and add a target ModelEntity (a flattened cube) to the AnchorEntity:

    guard let result = session.raycast(query).first else { return }
    let newAnchor = AnchorEntity(raycastResult: result)
    newAnchor.addChild(placementTargetEntity)
    arView.scene.addAnchor(newAnchor)

I repeat this for each frame update via the ARSessionDelegate method session(_:didUpdate:), removing the previous AnchorEntity first. I use this as a target to let the user know where the full model will be placed when they tap the screen.

This works fine under iOS 14, but I get strange results with iPadOS 15: two different placements are created on different screen updates, offset and slightly rotated from each other. Has anyone else had issues with raycast() or with creating an AnchorEntity from the result? Is the use of session(_:didUpdate:) via ARSessionDelegate to update virtual content considered bad style now? (I noticed in WWDC21 they used a different mechanism to update their virtual content.)

(If any Apple engineers read this: I filed a feedback with sample code and video of the issue at FB9535616.)
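Here is the condensed per-frame flow, in case something jumps out. placementTargetEntity and currentAnchor are my own properties, and the query parameters stand in for my actual ones:

    import ARKit
    import RealityKit

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let query = arView.makeRaycastQuery(from: arView.center,
                                                  allowing: .estimatedPlane,
                                                  alignment: .any),
              let result = session.raycast(query).first else { return }
        if let previous = currentAnchor {
            arView.scene.removeAnchor(previous)   // drop last frame's target
        }
        let anchor = AnchorEntity(raycastResult: result)
        anchor.addChild(placementTargetEntity)
        arView.scene.addAnchor(anchor)
        currentAnchor = anchor
    }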
5 replies · 0 boosts · 1.2k views · Aug ’21