RealityKit/ARKit Environment Texturing broken on iOS 18
Devices running iOS 18 with RealityKit do not seem to receive lighting supplied via ARKit Environment Texturing (https://developer.apple.com/documentation/arkit/arworldtrackingconfiguration/2977509-environmenttexturing). Instead, RealityKit just uses a default IBL. This happens with RealityView as well as ARView, and it also happens when I explicitly opt in to environment texturing:

let worldTrackingConfig = ARWorldTrackingConfiguration()
worldTrackingConfig.environmentTexturing = .automatic
arView.session.run(worldTrackingConfig)

Even the Xcode AR template has this issue. I'm attaching a screenshot of the sample app running on iOS 18, where it's broken, and one from iOS 17, where it works as expected. I hope this can get resolved quickly since I see it as a major regression.

Feedback ID: FB15091335

UPDATE: It works on my older iPhone XS (iOS 18 22A5282m) but is broken on an iPad Pro (11-inch) (3rd generation) (iPadOS 18.0 (22A5350a)). Maybe it's related to LiDAR?

Thank you!

iOS 17 (works):
iOS 18 (broken):
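For reference, a quick way to check whether ARKit generates any environment probes at all on an affected device is to log AREnvironmentProbeAnchor instances from the session delegate. A minimal diagnostic sketch (the delegate wiring is assumed, not part of the original report):

import ARKit

final class ProbeLogger: NSObject, ARSessionDelegate {
    // Logs every environment probe ARKit creates. If nothing is printed on
    // iOS 18 while iOS 17 prints probes, texturing fails at the ARKit level;
    // if probes do appear, RealityKit is ignoring them.
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for probe in anchors.compactMap({ $0 as? AREnvironmentProbeAnchor }) {
            print("Probe added:", probe.identifier, "extent:", probe.extent)
        }
    }
}

// Usage: arView.session.delegate = probeLogger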
Replies: 3 · Boosts: 1 · Views: 1k · Jan ’25
Blurred Background (RealityKit) Shader Graph Node not working on iOS/macOS
The Shader Graph node Blurred Background (RealityKit) – https://developer.apple.com/documentation/shadergraph/realitykit/blurred-background-(realitykit) – works fine within the Reality Composer Pro 2 editor but isn't working on iOS 18 or macOS 15. Instead of the blurred content it just renders as opaque in a single color (Screenshot 2).

Interestingly, it also fails to render within Reality Composer Pro when no other entities are in the scene, e.g. when only a background skybox is set.

Expected behavior: It would be great if this node worked the same way it does on visionOS, since this would allow for really interesting and nice effects in scenes.

Feedback ID: FB15081190
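In case it helps with reproducing: the material is authored in Reality Composer Pro and loaded on device roughly like this (a sketch – the material path, file name, and bundle are placeholders, not the actual project values):

import RealityKit
import RealityKitContent

// Load the Shader Graph material containing the Blurred Background node
// from the Reality Composer Pro package and apply it to a plane.
let material = try await ShaderGraphMaterial(named: "/Root/BlurredMaterial",
                                             from: "Scene.usda",
                                             in: realityKitContentBundle)
let plane = ModelEntity(mesh: .generatePlane(width: 0.4, height: 0.3),
                        materials: [material])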
Replies: 1 · Boosts: 1 · Views: 780 · Jan ’25
RealityKit visionOS anchor to POV
Hi, is there a way in visionOS to anchor an entity to the POV via RealityKit? I need an entity that is always fixed to the 'camera'. I'm aware that this is discouraged from a design perspective as it can be visually distracting. In my case though I want to use it to attach a fixed collider entity, so that the camera can collide with objects in the scene.

Edit: ARView on iOS has a lot of very useful helper properties and functions like cameraTransform (https://developer.apple.com/documentation/realitykit/arview/cameratransform). How would I get this information on visionOS? RealityView's content does not seem to offer anything comparable.

An example use case would be that I would like to add an entity to the scene at my user's eye level, basically depending on their height.

I found https://developer.apple.com/documentation/realitykit/realityrenderer, which has an activeCamera property, but so far it's unclear to me in which context RealityRenderer is used and how I could access it.

Appreciate any hints, thanks!
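Two directions that look promising here (sketches under the assumption that the documented visionOS APIs behave as described; not verified against the collider use case): a head-target AnchorEntity for content that should stay fixed to the POV, and WorldTrackingProvider for reading the device pose explicitly, e.g. to spawn something at eye level.

import ARKit
import QuartzCore
import RealityKit

// 1. Keep a collider fixed to the wearer's head.
let headAnchor = AnchorEntity(.head)
let collider = Entity()
collider.components.set(CollisionComponent(shapes: [.generateSphere(radius: 0.15)]))
headAnchor.addChild(collider)
// content.add(headAnchor) inside the RealityView closure

// 2. Query the device pose once, e.g. to place an entity at eye level.
func currentDevicePose() async throws -> simd_float4x4? {
    let session = ARKitSession()
    let worldTracking = WorldTrackingProvider()
    try await session.run([worldTracking])
    return worldTracking.queryDeviceAnchor(atTimestamp: CACurrentMediaTime())?
        .originFromAnchorTransform
}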
Replies: 9 · Boosts: 6 · Views: 5.2k · Dec ’24
PortalComponent – allow world content to peek out
Hello, I've been tinkering with PortalComponent on visionOS a bit and noticed that the content of the WorldComponent is always clipped to the mesh geometry of whatever entities have the PortalComponent applied.

Now I'm wondering if there is any way or trick to allow contents of the portal to peek out – similar to the Encounter Dinosaurs experience on Vision Pro (I assume it also uses PortalComponent?).

I saw that PortalComponent has a clippingPlane property (https://developer.apple.com/documentation/realitykit/portalcomponent/clippingplane-swift.property), but so far I haven't been able to achieve a perceptible visual difference with it.

If possible I would like to avoid hacky tricks using duplicate meshes or similar to achieve this. Thanks for any hints!
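For what it's worth, visionOS 2 introduced portal crossing, which sounds like exactly this: a crossing mode on PortalComponent plus a PortalCrossingComponent on the entities that should peek out. A rough sketch (the exact case names are from memory and worth double-checking against the current docs):

import RealityKit

// world: the Entity carrying the WorldComponent; portal: the portal plane.
let portalComponent = PortalComponent(target: world,
                                      clippingMode: .plane(.positiveZ),
                                      crossingMode: .plane(.positiveZ))
portal.components.set(portalComponent)

// Mark an entity inside the portal world as allowed to reach through
// the portal plane:
dinosaur.components.set(PortalCrossingComponent())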
Replies: 5 · Boosts: 0 · Views: 1.6k · Dec ’24
App Clip Support for visionOS
Hi, we've been leveraging App Clips on iOS for a while now to distribute native-app-quality AR experiences (utilising ARKit and RealityKit) with the accessibility of a website. This has been a crucial differentiator for us and is a core driver for our business.

Since our authoring tools also allow running the same AR experiences on Vision Pro, it would be amazing if they could be triggered by App Clips there as well. We've gotten this feedback from clients and users multiple times, and since some basic App Clip support already seems to be integrated into the system (e.g. when registering the custom lens inserts), we would immensely appreciate it if this feature could be opened up for third-party developers as well.

Associated feedback ID: FB13348462

Thank you!
Replies: 2 · Boosts: 1 · Views: 630 · Sep ’24
Xcode 16 – Symbol not found: ShapeResource generateConvex
Hi, I have a RealityKit app that I am building with Xcode 16. The app has a minimum deployment target of iOS 17. If I run it on an iOS 17 device the app crashes:

dyld[15716]: Symbol not found: _$s10RealityKit13ShapeResourceC14generateConvex4fromAcA04MeshD0C_tYaKFZ
Referenced from: …
Expected in: …/System/Library/Frameworks/RealityFoundation.framework/RealityFoundation

My code looks something like this:

@available(iOS, introduced: 13.0, obsoleted: 18.0)
@MainActor @preconcurrency
func generateNonAsyncConvexShapeResource(from meshResource: MeshResource) throws -> ShapeResource {
    ShapeResource.generateConvex(from: meshResource)
}

@available(iOS 18.0, *)
func generateConvexShapeAsync(from meshResource: MeshResource) async throws -> ShapeResource {
    // This will only be available for iOS 18 and above
    return try await ShapeResource.generateConvex(from: meshResource)
}

if let meshResource = try? modelEntity.model?.mesh.applying(transform: transform.matrix) {
    if #available(visionOS 1.0, iOS 18.0, *) {
        try? await generateConvexShapeAsync(from: meshResource) // await shapeResources.append(.generateConvex(from: meshResource))
    } else {
        try? generateNonAsyncConvexShapeResource(from: meshResource)
    }
}

So I actually do check for the system version and only call the async variant on iOS 18. Any hints how to fix that? Thanks!
Replies: 2 · Boosts: 0 · Views: 1.4k · Sep ’24
OcclusionMaterial renders plain black when custom skybox environment is set
Using OcclusionMaterial on macOS and iOS works fine in non-AR mode when I set the background to just a simple color (https://developer.apple.com/documentation/realitykit/arview/environment-swift.struct/color), but when I set a custom skybox (https://developer.apple.com/documentation/realitykit/arview/environment-swift.struct/background-swift.struct/skybox(_:)) the OcclusionMaterial renders as fully black. I would expect it to properly occlude the content and show the skybox behind it.

This happens with both ARView and RealityView, on the current iOS/macOS betas as well as on older systems, e.g. iOS 17 and macOS Sonoma.

Feedback ID: FB15081053
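A minimal repro sketch (assuming an existing non-AR ARView; the skybox resource name is a placeholder):

import RealityKit

// Setting a plain color background works:
// arView.environment.background = .color(.gray)

// Setting a skybox triggers the issue:
let skybox = try EnvironmentResource.load(named: "MySkybox")
arView.environment.background = .skybox(skybox)

// An entity with OcclusionMaterial now renders solid black instead of
// showing the skybox behind it:
let occluder = ModelEntity(mesh: .generateBox(size: 0.2),
                           materials: [OcclusionMaterial()])
let anchor = AnchorEntity(world: .zero)
anchor.addChild(occluder)
arView.scene.anchors.append(anchor)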
Replies: 0 · Boosts: 0 · Views: 450 · Sep ’24
macOS Sequoia – NSToolbar on Catalyst disappeared
Hello, I have a macOS Catalyst app that I now began updating and building against the iOS 18/macOS Sequoia SDKs. Most things appear to be working just fine as before, apart from my NSToolbar.

At the root of my app I am presenting a UISplitViewController which gets a custom SidebarViewController and a UITabBarController as its view controllers. Then at some point in the app's lifecycle the UITabBarController presents another view controller modally. I then associate the controller's window with a custom NSToolbar like this:

let toolbar = NSToolbar(identifier: "mainToolbar")
toolbar.displayMode = .iconAndLabel
toolbar.delegate = self
toolbar.allowsUserCustomization = false
// titleBar is the window scene's UITitlebar
titleBar.toolbarStyle = .automatic
titleBar.titleVisibility = .hidden
titleBar.toolbar = toolbar

I also disable automatic NSToolbar hosting via https://developer.apple.com/documentation/uikit/uinavigationbardelegate/3987959-navigationbarnstoolbarsection (returning .none).

Now all of this worked fine on macOS Sonoma and previous versions, but on Sequoia my custom toolbar refuses to show up. My suspicion is that it has something to do with the new tab and sidebar behaviour introduced with the new SDKs. For now, within my UITabBarController I was able to revert to the old look using:

if #available(iOS 18.0, *) {
    mode = .tabSidebar
    sidebar.isHidden = true
    isTabBarHidden = true
}

This results in a look similar to the previous macOS version, but my NSToolbar unfortunately remains hidden. Is there an easy fix for this? Since I am a solo developer I would prefer to spend my available resources on other features right now and adopt the new tabs/sidebars a couple months down the line.

Appreciate any help and hints, thanks!

There used to be a toolbar here on the right side. ↑
Replies: 2 · Boosts: 1 · Views: 927 · Sep ’24
Render Menu Items with Icon (e.g. SF Symbol) on Catalyst
Hello, is there a recommended way to render menu items, e.g. in a SwiftUI ContextMenu, with an icon (SF Symbols)? Let's say I have the setup shown below. Both buttons render fine on native macOS (e.g. Sonoma), but Catalyst refuses to render the symbol at all. I tried every possible combination I could think of. The only way I found was to directly copy and paste a symbol from the SF Symbols app and inline it with the label string as unicode. Unfortunately I have a couple of custom SF symbols, so this isn't really an option for me.

I feel like this is a perfectly valid use case, as it makes the menu visually a lot easier to scan. With UIKit and Ventura this at least worked for menu bar items, but it now also seems broken on Sonoma. I would greatly appreciate any hints. Thanks!
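The original snippet didn't survive the post; a hypothetical reconstruction of the setup in question (symbol names and labels are illustrative):

import SwiftUI

struct RowView: View {
    var body: some View {
        Text("Item")
            .contextMenu {
                // Icon shows on native macOS, but not on Catalyst:
                Button {
                    // duplicate action
                } label: {
                    Label("Duplicate", systemImage: "plus.square.on.square")
                }
                Button(role: .destructive) {
                    // delete action
                } label: {
                    Label("Delete", systemImage: "trash")
                }
            }
    }
}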
Replies: 2 · Boosts: 0 · Views: 1.7k · Nov ’23
Environment Texturing – Reflection Probe Management
Hello, I've done a couple of tests and noticed that when I run an ARKit session with, for example, a world- or geo-tracking configuration that has environment texturing set to either .manual or .automatic, and I walk around with the device for an extended distance, there is a pretty noticeable increase in memory usage. I'm not implying that it's a leak, but it seems like the system creates lots and lots of environment probes and does not remove older ones.

An example: this is a barebones Xcode RealityKit starter project that spawns a couple of cubes with an ARGeoTracking configuration. After ~7 minutes of walking around (roughly a distance of 100–200 meters) it uses an additional 300 MB of RAM. I've seen cases where users walked around for a while and the app eventually crashed because of this. When environment texturing is disabled, RAM usage pretty much stays the same no matter how far I walk.

Is there a recommended way to handle this? Should I remove probes manually after a while, or eventually disable environment texturing altogether at some point (and will that preserve the current cube map)? I would appreciate any guidance.

With the Xcode example project you can easily recreate the issue by modifying it just a little and then walking around for a while with statistics enabled:

class ViewController: UIViewController {
    @IBOutlet var arView: ARView!

    override func viewDidLoad() {
        super.viewDidLoad()
        arView.automaticallyConfigureSession = false
    }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)

        let config = ARWorldTrackingConfiguration()
        config.environmentTexturing = .manual
        arView.session.run(config)

        // Load the "Box" scene from the "Experience" Reality File
        let boxAnchor = try! Experience.loadBox()

        // Add the box anchor to the scene
        arView.scene.anchors.append(boxAnchor)
    }
}
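With .manual texturing the app owns the probe anchors, so one mitigation could be to cap their number and evict the oldest ones. A sketch of that idea (the cap of 8 and the 5 m extent are arbitrary, untested values):

import ARKit
import simd

final class ProbeManager {
    private var probes: [AREnvironmentProbeAnchor] = []
    private let maxProbes = 8

    // Place a probe and evict the oldest one once we exceed the cap.
    func placeProbe(in session: ARSession, at transform: simd_float4x4) {
        let probe = AREnvironmentProbeAnchor(transform: transform,
                                             extent: SIMD3<Float>(repeating: 5))
        probes.append(probe)
        session.add(anchor: probe)
        if probes.count > maxProbes {
            session.remove(anchor: probes.removeFirst())
        }
    }
}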
Replies: 0 · Boosts: 0 · Views: 1.3k · Dec ’22
Codable Conformance results in significant binary size increase
Hello, I recently converted from manual dictionary-based JSON serialisation to Codable and noticed that this resulted in a pretty significant growth in binary size (I looked that up via the App Thinning Size Report). The difference between Codable and non-Codable is ~800 KB. As our app also supports App Clips, I now can't fulfill the 10 MB universal bundle size limit anymore.

Is there any way I can make this a little leaner? Would it help to manually implement all Codable methods instead of relying on compiler synthesization? Thanks for any hints!
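For context, manually implementing the conformance the post asks about looks like this (the type and its properties are hypothetical; whether it actually claws back the ~800 KB would need re-measuring with the App Thinning Size Report):

import Foundation

struct Product: Codable {
    var id: Int
    var name: String

    private enum CodingKeys: String, CodingKey {
        case id, name
    }

    // Hand-written init/encode replace the compiler-synthesized versions,
    // which for many types can end up generating less code than synthesis.
    init(from decoder: Decoder) throws {
        let container = try decoder.container(keyedBy: CodingKeys.self)
        id = try container.decode(Int.self, forKey: .id)
        name = try container.decode(String.self, forKey: .name)
    }

    func encode(to encoder: Encoder) throws {
        var container = encoder.container(keyedBy: CodingKeys.self)
        try container.encode(id, forKey: .id)
        try container.encode(name, forKey: .name)
    }
}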
Replies: 0 · Boosts: 0 · Views: 872 · Nov ’22
Satellite Map Imagery Regression in Berlin Area (Germany)
Hello, after a recent Apple Maps update last week, the satellite imagery now looks a lot worse than before. What previously looked lifelike and lush now looks very sad and wintery. Also, the contrast seems way too extreme. Attached is a sample image.

FB: FB11716831

Any chance that this could be reverted to the old version?
Replies: 0 · Boosts: 0 · Views: 1.1k · Oct ’22
RealityKit Sample Project Issues; Performance and Crashes
Hi, please let me know if I should rather file feedback for this, but I figured it's worth flagging one way or another.

Test Xcode version: 14.0 beta 6 (14A5294g)

1. Project »Altering RealityKit Rendering with Shader Functions«
This project crashes right away when running it on a device (iOS 15 and 16). Screenshot:

2. Project »Using object capture assets in RealityKit«
Suffers from pretty bad performance when run on a device – barely scratching 20–25 fps on an iPhone 12 Pro, and even less on an iPhone XS. Screenshot:

As these are official sample projects, I feel like they should work flawlessly out of the box.

Best
Arthur
Replies: 3 · Boosts: 0 · Views: 1.1k · Sep ’22
Updating blending property to .transparent(opacity: …) removes material color texture (iOS 16 Beta)
Hello, on iOS 16, when I retrieve an existing material from a model entity and update its blending property to .transparent(opacity: …), the color or baseColor texture gets removed after reassigning the updated material.

My use case is that I want to fade in a ModelEntity through a custom System and therefore need to repeatedly reassign the opacity value. I've tested this with UnlitMaterial and PhysicallyBasedMaterial – both suffer from this issue. On iOS 15 this works as expected.

Please let me know if there is any workaround, as this seems to me like a major regression, and ideally I need this to work once iOS 16 gets released to the public.

The radar number, including a sample project, is FB11420976.

Thank you!
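A sketch of a possible workaround: stash the texture before changing the blending mode and re-apply it afterwards (untested against the FB11420976 sample project):

import RealityKit

// Fade helper: keep the baseColor texture alive across blending updates.
func setOpacity(_ opacity: Float, on modelEntity: ModelEntity) {
    guard var material = modelEntity.model?.materials.first as? PhysicallyBasedMaterial else {
        return
    }
    let texture = material.baseColor.texture                    // stash texture
    material.blending = .transparent(opacity: .init(scale: opacity))
    material.baseColor.texture = texture                        // restore it
    modelEntity.model?.materials = [material]                   // reassign
}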
Replies: 4 · Boosts: 0 · Views: 1.4k · Sep ’22