Post | Replies | Boosts | Views | Activity
RealityView in macOS, Skybox, and lighting issue
I am testing RealityView on a Mac, and I am having trouble controlling the lighting. I initially add a red cube, and everything is fine (see figure 1). I then activate a skybox with a star field; the star field appears, and the red cube is then lit only by the star field. Then I deactivate the skybox, expecting the original lighting to return, but the cube continues to be lit by the skybox. The background no longer shows the skybox, but the cube is never lit the way it originally was. Is there a way to return the model's lighting to what it was before I added the skybox? I seem to recall that ARView's environment property had both a lighting.resource and a background, but I don't see both of those properties in RealityViewCameraContent's environment.

Sample code for macOS 15.1 Beta (24B5024e), Xcode 16.0 beta (16A5171c):

    struct MyRealityView: View {
        @Binding var isSwitchOn: Bool
        @State private var blueNebulaSkyboxResource: EnvironmentResource?

        var body: some View {
            RealityView { content in
                // Create a red cube 10 cm on a side
                let mesh = MeshResource.generateBox(size: 0.1)
                let simpleMaterial = SimpleMaterial(color: .red, isMetallic: false)
                let model = ModelComponent(mesh: mesh, materials: [simpleMaterial])
                let redBoxEntity = Entity()
                redBoxEntity.components.set(model)
                content.add(redBoxEntity)

                // Load the skybox
                let blueNeb2Name = "BlueNeb2"
                blueNebulaSkyboxResource = try? await EnvironmentResource(named: blueNeb2Name)
            } update: { content in
                if let skybox = blueNebulaSkyboxResource, isSwitchOn {
                    content.environment = .skybox(skybox)
                } else {
                    content.environment = .default
                }
            }
            .realityViewCameraControls(CameraControls.orbit)
        }
    }

Figure 1 (default lighting before adding the skybox):
Figure 2 (after activating the skybox with the star field; the cube is lit by / reflects the skybox):
Figure 3 (after removing the skybox by setting content.environment to .default; the cube still reflects the skybox, though it is hard to see):
1
0
813
Aug ’24
Casting shadows on the ground
In the visionOS 2 beta, I have a character loaded from a Reality Composer Pro scene standing on the floor, but he isn't casting a shadow on the floor. I added a GroundingShadowComponent in RealityView, and he does cast shadows on himself (e.g., his hands cast shadows on his shoes), but I don't see any shadow on the floor. Do I need to enable something to have my character cast a shadow on the real-world floor?
1
0
692
Sep ’24
Turn off camera in RealityView for iOS?
I am using RealityView in an iOS program. Is it possible to turn off the camera passthrough, so only my virtual content is showing? I am looking to create a VR experience. I have a workaround where I turn off occlusion and then create a sphere around me (e.g., with a black texture), but in the pre-RealityView days, I think I used something like this:

    arView.environment.background = .color(.black)

Is there something similar in RealityView for iOS? Here are some snippets of my current workaround inside RealityView. First, create the sphere to surround the user:

    // Create the sphere
    let blackMaterial = UnlitMaterial(color: .black)
    let sphereMesh = MeshResource.generateSphere(radius: 100)
    let sphereModelComponent = ModelComponent(mesh: sphereMesh, materials: [blackMaterial])
    let sphereEntity = Entity()
    sphereEntity.components.set(sphereModelComponent)
    sphereEntity.scale *= .init(x: -1, y: 1, z: 1)  // invert so the inside of the sphere is visible
    content.add(sphereEntity)

Then turn off occlusion:

    // Turn off occlusion
    let configuration = SpatialTrackingSession.Configuration(
        tracking: [],
        sceneUnderstanding: [],
        camera: .back)
    let session = SpatialTrackingSession()
    await session.run(configuration)
1
0
688
Sep ’24
Zombie System Extensions
I had a weird case today where an endpoint system extension remained running even after I deleted the .app bundle. If I tried killing the process with "sudo kill -9 <pid>", the extension respawned. If I tried "sudo launchctl remove <name>", I was told I didn't have the privilege. Searching my hard drive, I found a copy of the system extension in /Macintosh HD/Library/System Extensions/... I rebooted into recovery mode, deleted the extension bundle, and restarted. Everything initially looked fine, and the process did not come back. But then when I tried to re-build, re-package, re-install, and re-launch the application, the operating system complained that it could not find the system extension even though it was there in the .app bundle. The operating system seems to (A) create a cache/copy of the system extension bundle and (my guess) (B) maintain a link to that cache location somewhere and try to launch that cached system extension bundle. [My hacked solution was to rename the extension, including creating a new bundle ID and associated provisioning profile.] Has anyone encountered a system extension that would not die? Did you figure out how to kill it and clear out any caches of it? Thanks,
10
2
7.9k
Feb ’22
PHAsset of older photos is missing GPS info in Swift
I've been working in Swift on iOS to access images via UIImagePickerController, pulling the PHAsset from the picker delegate's "info" dictionary and then pulling GPS information from the PHAsset. For newer photos, asset.location is populated with GPS information. Also, with newer photos, CIImage's property dictionary has {GPS} information. So all is good with newer photos. But when I go back to images taken in 2017, asset.location is nil and there is no {GPS} information in the CIImage. However, if I export the photo from the Photos app on my Mac and then view it in Preview, there *is* GPS information. So am I missing some setting to find the GPS information in older photos using PHAsset on iOS? Thanks,
2
0
1.3k
Sep ’21
RealityKit Transform rotation: choosing clockwise vs. anti-clockwise
I'm using Transform's move(to:relativeTo:duration:timingFunction:) to rotate an Entity around the Y axis in an animated fashion (e.g., a duration of 2 seconds). Unfortunately, when I rotate from 6 radians (343.7°) to 6.6 radians (378.2°), the rotation does not continue anti-clockwise past 2π (360°) but goes backwards to 0.317 radians (18.2°). Is there a way to force a rotation about an axis to go in a clockwise or anti-clockwise direction when animating?
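The workaround I'm experimenting with is to break the rotation into waypoints and chain a move(to:relativeTo:duration:) call per segment, so the interpolation is forced to travel in the direction of the overall delta. The waypoint math itself is plain arithmetic (the helper below is my own, not a RealityKit API):

```swift
import Foundation

/// Split a rotation from `start` to `end` radians into waypoint angles.
/// Each waypoint is then animated in sequence, so shortest-path
/// interpolation between consecutive waypoints always travels in the
/// direction of `end - start`.
func rotationWaypoints(from start: Double, to end: Double,
                       maxStep: Double = .pi / 2) -> [Double] {
    let delta = end - start
    guard delta != 0 else { return [end] }
    let steps = max(1, Int(ceil(abs(delta) / maxStep)))
    return (1...steps).map { start + delta * Double($0) / Double(steps) }
}

// Chaining short hops keeps the motion anti-clockwise across the 2π boundary.
let waypoints = rotationWaypoints(from: 6.0, to: 6.6, maxStep: 0.25)
print(waypoints)  // three short hops, all moving anti-clockwise
```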
2
0
1.7k
Apr ’21
ARKit FPS drop when device gets hot?
During testing of my app, the frames per second (shown either in the Xcode debug navigator or via ARView's .showStatistics) sometimes drop by half and stay down there. The low FPS continues even when I kill the app completely and restart it. However, after giving my phone a break, the FPS returns to 60. Does ARKit automatically throttle down the FPS when the device gets too hot? If so, is there a signal my program can catch from ARKit or the OS to tell me this is happening?
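For reference, here is the kind of adaptive policy I have in mind if this turns out to be thermal throttling. The ThermalLevel enum mirrors the cases of ProcessInfo.ThermalState, and the frame-rate numbers are placeholders of my own, not anything documented; on a device you would observe ProcessInfo.thermalStateDidChangeNotification and feed the current state in:

```swift
import Foundation

/// Placeholder thermal levels, mirroring the cases of
/// ProcessInfo.ThermalState on Apple platforms.
enum ThermalLevel {
    case nominal, fair, serious, critical
}

/// A guess at an adaptive policy: pick a target frame rate based on
/// thermal pressure. The specific numbers are invented for illustration.
func targetFPS(for level: ThermalLevel) -> Int {
    switch level {
    case .nominal, .fair: return 60
    case .serious:        return 30
    case .critical:       return 15
    }
}

print(targetFPS(for: .serious))
```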
2
0
1.7k
Sep ’21
polygon count vs. triangle count?
In a previous post I asked whether 100,000 polygons is still the recommended size for USDZ Quick Look models on the web (the answer is yes). But I realize my polygons are 4-sided and not planar, so each has to be broken down into two triangles when rendered. Given that, should I shoot for 50,000 polygons (i.e., 100,000 triangles)? Or does the 100,000-polygon figure already assume polygons will be subdivided into triangles? (The models are generated from digital terrain (GeoTIFF) data, not a 3D modeling tool.)
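To make the two ways of counting concrete, here is the arithmetic I'm assuming (each non-planar quad becomes exactly two triangles; the helpers are my own naming):

```swift
import Foundation

/// Triangle count for a mesh of non-planar quads, assuming each quad
/// is split into exactly two triangles at render time.
func triangleCount(forQuads quads: Int) -> Int {
    quads * 2
}

/// Largest quad count that stays within a given triangle budget.
func maxQuads(forTriangleBudget budget: Int) -> Int {
    budget / 2
}

print(triangleCount(forQuads: 50_000))      // 100000
print(maxQuads(forTriangleBudget: 100_000)) // 50000
```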
2
0
2.5k
Nov ’21
iOS 16.2: Cannot load underlying module for 'ARKit'
I have a strange warning in Xcode associated with ARKit. It isn't a big issue because there is a workaround, but I am curious why this is happening and how I can avoid it in the future. I opened up an old AR project in Xcode, and the editor gave a strange error message on the "import ARKit" line saying Cannot load underlying module for 'ARKit' Despite the error message, the code continues to build and run. I've quit & restarted Xcode, rebooted the Mac, and even deleted and redownloaded Xcode, but the error/warning was still there. Upon some additional testing, I discovered that I only get this message when targeting "iOS 16.2" but not when targeting 16.0, 16.1, 16.3, or 16.4. (I did not try pre-iOS 16.) Any idea why my Xcode no longer likes ARKit on iOS 16.2 on my Mac? Development platform: Xcode: Version 14.3.1 (14E300c) macOS: 13.4 (22F66) Mac: Mac Studio 2022 iOS on iPhone 14 Pro Max: 16.5 (20F66) Screenshot:
2
1
1.4k
Aug ’23
Xcode Beta RC didn't have an option for vision simulator
I just downloaded the latest Xcode beta, Version 15.0 (15A240d), and ran into some issues: (1) on start-up, I was not given an option to download the Vision simulator; (2) I cannot create a project targeted at visionOS; (3) I cannot build/run a hello-world app for Vision. In my previous Xcode beta (Version 15.0 beta 8 (15A5229m)), there was an option to download the Vision simulator, and I could create projects for visionOS and run the code in the Vision simulator. The Xcode file downloaded was named "Xcode" instead of "Xcode-beta". I didn't want to get rid of the existing Xcode, so I selected Keep Both. Now I have three Xcodes in the Applications folder: Xcode, Xcode copy, and Xcode-beta. That is the only thing I see that might have been different about my install. Hardware: Mac Studio 2022 with M1 Max, macOS Ventura 13.5.2. Any idea what I did wrong?
2
0
1.8k
Sep ’23
Placement of model inside volumetric window?
I am having trouble placing a model inside a volumetric window. I have a model - just a simple cube created in Reality Composer Pro that is 0.2m on a side and centered at the origin - and I want to display it in a volumetric window that is 1.0m on a side while preserving the cube's original 0.2m size. The small cube seems to be flush against the back and top of the larger volumetric window. Is it possible to initially position the model inside the volume? For example, can the model be placed flush against the bottom and front of the volumetric window? (Note: the actual use case is placing 3D terrain, which tends to be mostly flat like a pizza box, flush against the bottom of the volumetric window.)
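For reference, the offset I'm trying to achieve is simple box math. This sketch (my own helper, independent of any RealityKit API) computes the local translation that would push the model flush against the bottom and front, assuming both the model and the volume are axis-aligned boxes centered at their own origins:

```swift
import Foundation

struct Size3 { var x, y, z: Double }

/// Offset that moves a model (centered at its origin) so it sits flush
/// against the bottom (-Y) and front (+Z) faces of a volume (also
/// centered at its origin). X stays centered.
func flushBottomFrontOffset(model: Size3, volume: Size3) -> Size3 {
    Size3(
        x: 0,
        y: -(volume.y - model.y) / 2,  // drop to the floor of the volume
        z:  (volume.z - model.z) / 2   // push toward the front face
    )
}

let offset = flushBottomFrontOffset(
    model: Size3(x: 0.2, y: 0.2, z: 0.2),
    volume: Size3(x: 1.0, y: 1.0, z: 1.0))
print(offset)  // y = -0.4, z = 0.4
```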
2
2
765
Dec ’24
Triangle count and texture size budget for RealityKit on visionOS
In the past, Apple recommended restricting USDZ models to a maximum of 100,000 triangles and texture sizes of 2048x2048 for Apple Quick Look (and I think for RealityKit on iOS in general). Does Apple have any recommended maximum polygon count for visionOS? Is it the same for models running in a volumetric window in the Shared Space and in an ImmersiveSpace? What is the recommended texture size for visionOS? (I seem to recall 8192x8192, but I can't find it now.)
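For context, the uncompressed memory cost of a square RGBA8 texture grows quadratically with edge length, which is presumably part of why these caps matter (the calculation below ignores mipmaps and GPU texture compression):

```swift
import Foundation

/// Uncompressed size in bytes of a square RGBA8 texture
/// (4 bytes per pixel), ignoring mipmaps and texture compression.
func rgba8Bytes(edge: Int) -> Int {
    edge * edge * 4
}

let mib = 1024.0 * 1024.0
print(Double(rgba8Bytes(edge: 2048)) / mib)  // 16.0 (MiB)
print(Double(rgba8Bytes(edge: 8192)) / mib)  // 256.0 (MiB)
```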
2
0
1.6k
Jun ’24
Reality Converter scale issue
I don't know if this is an issue with Apple's Reality Converter app or with Blender (I'm using 3.0 on the Mac), but when I export a model as .obj and import it into Reality Converter, the scale is off by a factor of 100. That is, the following workflow creates tiny (1/100 scale) entities: Blender > [.obj] > Reality Converter > [USDZ]. But this workflow is OK: Blender > [.glb] > Reality Converter > [USDZ]. Two workarounds are: (1) export as .glb/.gltf, or (2) set the scale factor to 100 in Blender when exporting the .obj. Is this a known issue, or am I doing something wrong? If it is an issue, should I file a bug report?
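My working theory is a unit mismatch: .obj files carry no unit metadata, so if Blender writes meters and Reality Converter assumes centimeters, everything comes in at 1/100 scale. The correction factor is just the ratio of the two assumed units (this helper is mine, purely illustrative):

```swift
import Foundation

/// Factor needed to restore real-world size, assuming the exporter
/// wrote one unit and the importer assumed another. (.obj files carry
/// no unit metadata, so the importer has to guess.)
func scaleCorrection(exportUnitInMeters: Double,
                     importUnitInMeters: Double) -> Double {
    exportUnitInMeters / importUnitInMeters
}

// Meters written, centimeters assumed on import: scale everything by 100.
print(scaleCorrection(exportUnitInMeters: 1.0, importUnitInMeters: 0.01))
```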
3
0
1.9k
Dec ’21
In-App Purchase with TestFlight, many prompts.
During my first external test using TestFlight for an In-App Purchase (iPadOS), the user was (1) prompted for their Apple ID & password, (2) prompted for their password a second time, and (3) (the user believes) prompted for their password a third time. Are these multiple password prompts expected behavior, or have I done something wrong?
2
0
2.1k
Nov ’21
SynchronizationServices over the Internet?
Is there an equivalent to MultipeerConnectivityService that implements SynchronizationService over TCP/IP connections? I'd like to have two users in separate locations, each with a local ARAnchor but then have a synchronized RealityKit scene graph attached to their separate ARAnchors. Is this possible? Thanks,
3
0
1.8k
Feb ’22