
ARView Frame Timestamp/Elapsed Time
Hello, I've been looking all over the place, but so far I haven't found a trivial way to grab ARView's current timestamp, i.e. the elapsed time since the scene started rendering. I can access that in the surface and geometry shaders, but I would like to pass a timestamp as a parameter in order to drive shader animations. That feels more efficient than injecting animation progress manually on every frame, especially if there are lots of objects with that shader.

So far I have subscribed to the scene's update event and used the delta time to accumulate the elapsed time myself. But this is quite error-prone and tends to break when I present the scene a second time (e.g. closing and reopening the AR experience). The only other option I found was using a render callback and grabbing the time property from the PostProcessContext. That works well, but do I really have to go that route? It would be great if there were an easier way to achieve this, pretty much an equivalent to this: https://developer.apple.com/documentation/scenekit/scnscenerenderer/1522680-scenetime

Note: I'm not looking for the timestamp of the ARSession's current frame. Thank you!
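For reference, here is the render-callback approach as a minimal sketch (the SceneTimeProvider wrapper is hypothetical; context.time is the elapsed render time RealityKit exposes in the PostProcessContext):

import Foundation
import RealityKit

final class SceneTimeProvider {
    private(set) var sceneTime: TimeInterval = 0

    func install(on arView: ARView) {
        arView.renderCallbacks.postProcess = { [weak self] context in
            // Elapsed time since RealityKit started rendering the scene.
            self?.sceneTime = context.time
            // A post-process callback must produce the output image itself,
            // so blit the rendered frame through unchanged.
            if let blit = context.commandBuffer.makeBlitCommandEncoder() {
                blit.copy(from: context.sourceColorTexture, to: context.targetColorTexture)
                blit.endEncoding()
            }
        }
    }
}

The captured value can then be pushed into a CustomMaterial once per frame, e.g. via its custom.value parameter vector.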
Replies: 0 · Boosts: 0 · Views: 996 · Jan ’22
Satellite Map Imagery Regression in Berlin Area (Germany)
Hello, after a recent Apple Maps update last week, the satellite imagery now looks a lot worse than before. What previously looked lifelike and lush now looks very sad and wintery. The contrast also seems way too extreme. Attached is a sample image. FB: FB11716831. Any chance that this could be reverted to the old version?
Replies: 0 · Boosts: 0 · Views: 1.1k · Oct ’22
Codable Conformance results in significant binary size increase
Hello, I recently converted from manual dictionary-based JSON serialisation to Codable and noticed that this resulted in a pretty significant growth in binary size (measured via the App Thinning Size Report). The difference between Codable and non-Codable is ~800 KB. As our app also supports App Clips, I now can't meet the 10 MB universal bundle size limit anymore. Is there any way I can make this a little leaner? Would it help to manually implement all Codable methods instead of relying on compiler synthesis? Thanks for any hints!
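For illustration, this is what the manual conformance would look like (a minimal sketch with a hypothetical Product type; the hand-written init(from:)/encode(to:) pair replaces the compiler-synthesized one for each type):

import Foundation

struct Product: Codable {
    let id: Int
    let name: String

    private enum CodingKeys: String, CodingKey {
        case id, name
    }

    // Hand-written decoding instead of the synthesized implementation.
    init(from decoder: Decoder) throws {
        let container = try decoder.container(keyedBy: CodingKeys.self)
        id = try container.decode(Int.self, forKey: .id)
        name = try container.decode(String.self, forKey: .name)
    }

    // Hand-written encoding instead of the synthesized implementation.
    func encode(to encoder: Encoder) throws {
        var container = encoder.container(keyedBy: CodingKeys.self)
        try container.encode(id, forKey: .id)
        try container.encode(name, forKey: .name)
    }
}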
Replies: 0 · Boosts: 0 · Views: 872 · Nov ’22
Environment Texturing – Reflection Probe Management
Hello, I've done a couple of tests and noticed that when I run an ARKit session with, for example, a world- or geo-tracking configuration that has environment texturing set to either .manual or .automatic, and I walk around with the device for an extended distance, there is a pretty noticeable increase in memory usage. I'm not implying that it's a leak, but it seems like the system creates lots and lots of environment probes and does not remove older ones.

An example: this is a barebones Xcode RealityKit starter project that spawns a couple of cubes with an ARGeoTracking configuration. After ~7 minutes of walking around (roughly a distance of 100-200 meters) it uses an additional 300 MB of RAM. I've seen cases where users walked around for a while and the app eventually crashed because of this. When environment texturing is disabled, RAM usage pretty much stays the same, no matter how far I walk.

Is there a recommended way to handle this? Should I remove probes manually after a while, or eventually disable environment texturing altogether at some point (and will that preserve the current cube map)? I would appreciate any guidance. With the Xcode example project you can easily recreate the issue by modifying it just a little and then walking around for a while with statistics enabled.

class ViewController: UIViewController {
    @IBOutlet var arView: ARView!

    override func viewDidLoad() {
        super.viewDidLoad()
        arView.automaticallyConfigureSession = false
    }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)

        let config = ARWorldTrackingConfiguration()
        config.environmentTexturing = .manual
        arView.session.run(config)

        // Load the "Box" scene from the "Experience" Reality File
        let boxAnchor = try! Experience.loadBox()

        // Add the box anchor to the scene
        arView.scene.anchors.append(boxAnchor)
    }
}
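For illustration, this is the kind of manual probe management I have in mind (a hypothetical sketch, assuming .manual environment texturing where the app creates AREnvironmentProbeAnchor instances itself; the ProbeBudget type and the budget of 8 probes are made up):

import ARKit

final class ProbeBudget {
    private var probes: [AREnvironmentProbeAnchor] = []
    private let maxProbes = 8 // hypothetical budget

    // Adds a new probe and evicts the oldest ones once the budget is exceeded.
    func add(_ probe: AREnvironmentProbeAnchor, to session: ARSession) {
        session.add(anchor: probe)
        probes.append(probe)
        while probes.count > maxProbes {
            session.remove(anchor: probes.removeFirst())
        }
    }
}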
Replies: 0 · Boosts: 0 · Views: 1.3k · Dec ’22
OcclusionMaterial renders plain black when custom skybox environment is set
Using OcclusionMaterial on macOS and iOS works fine in non-AR mode when I set the background to just a simple color (https://developer.apple.com/documentation/realitykit/arview/environment-swift.struct/color), but when I set a custom skybox (https://developer.apple.com/documentation/realitykit/arview/environment-swift.struct/background-swift.struct/skybox(_:)) the OcclusionMaterial renders as fully black. I would expect it to properly occlude the content and show the skybox behind it. This happens with both ARView and RealityView, on the current iOS/macOS betas as well as on older systems, e.g. iOS 17 and macOS Sonoma. Feedback ID: FB15081053
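For reference, a minimal non-AR repro sketch on iOS (the environment resource name "sky" is a placeholder for any skybox in the bundle):

import RealityKit

let arView = ARView(frame: .zero, cameraMode: .nonAR, automaticallyConfigureSession: false)
if let sky = try? EnvironmentResource.load(named: "sky") {
    // With a skybox background the occluder renders black instead of
    // showing the skybox behind it.
    arView.environment.background = .skybox(sky)
}
let occluder = ModelEntity(mesh: .generateSphere(radius: 0.1),
                           materials: [OcclusionMaterial()])
let anchor = AnchorEntity(world: .zero)
anchor.addChild(occluder)
arView.scene.anchors.append(anchor)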
Replies: 0 · Boosts: 0 · Views: 450 · Sep ’24
Building visionOS app with Firebase Source Distribution in Xcode Cloud
I have a working Xcode Cloud setup for my iOS and macOS targets, and I'm trying to add visionOS support. The issue is that Firebase requires using their source distribution for visionOS (instead of their default binary distribution). Locally, this works by launching Xcode with:

open -a Xcode --env FIREBASE_SOURCE_FIRESTORE project.xcodeproj

For Xcode Cloud, I've added a ci_post_clone.sh script that sets this environment variable for visionOS builds:

#!/bin/bash
if [[ $CI_PRODUCT_PLATFORM == "xrOS" ]]; then
    echo "Running setup for visionOS..."
    export FIREBASE_SOURCE_FIRESTORE=1
fi

However, I'm getting the following error during the build:

an out-of-date resolved file was detected at /.../project.xcworkspace/xcshareddata/swiftpm/Package.resolved, which is not allowed when automatic dependency resolution is disabled

The complication is that setting FIREBASE_SOURCE_FIRESTORE=1 changes which SPM dependencies are required:

The normal setup uses: abseil-cpp-binary, grpc-binary
The source distribution needs: abseil-cpp-swiftpm, grpc-ios, boringssl-swiftpm

What's the recommended way to handle this in Xcode Cloud while maintaining builds for all platforms? Should I be using separate workflows with different branches for different platforms? Or is there a better approach?

System:
Xcode 16.2
Using SPM for dependency management
Firebase iOS SDK 10.29.0
Building for iOS, macOS, and visionOS

Thanks in advance for any guidance!
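One workaround I'm considering (an untested sketch: it extends the post-clone script above, relies on the documented CI_PRIMARY_REPOSITORY_PATH variable, and uses "MyApp" as a placeholder project name) is to delete the resolved file for visionOS builds so dependencies get re-resolved against the source distribution:

#!/bin/bash
if [[ $CI_PRODUCT_PLATFORM == "xrOS" ]]; then
    export FIREBASE_SOURCE_FIRESTORE=1
    # Remove the stale pin file so SPM re-resolves the dependency graph
    # with the source-distribution packages. "MyApp" is a placeholder.
    rm -f "$CI_PRIMARY_REPOSITORY_PATH/MyApp.xcodeproj/project.xcworkspace/xcshareddata/swiftpm/Package.resolved"
fi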
Replies: 0 · Boosts: 1 · Views: 480 · Feb ’25
SPM Traits not working correctly with multiple targets in same Xcode workspace (App + App Clip)
Hi, I've just migrated to Swift Tools 6.2 and package traits, and I'm encountering an issue when using traits with multiple targets in the same Xcode workspace.

Setup:
Main iOS app target
App Clip target
Both consume the same local packages (e.g., UIComponents)

What I'm trying to achieve:
Main app imports packages without the COMPACT_BUILD trait
App Clip imports packages with the COMPACT_BUILD trait enabled

Package configuration (simplified):

// UIComponents/Package.swift
let package = Package(
    name: "UIComponents",
    platforms: [.iOS(.v18)],
    traits: [
        .trait(name: "COMPACT_BUILD", description: "Minimal build for App Clips"),
    ],
    // ...
    targets: [
        .target(
            name: "UIComponents",
            dependencies: [...],
            swiftSettings: [
                .define("COMPACT_BUILD", .when(traits: ["COMPACT_BUILD"])),
            ]
        ),
    ]
)

In the code:

#if COMPACT_BUILD
// Excluded from App Clip
#endif

The consumer packages:
The main app's package imports without the trait: .package(path: "../UIComponents")
The App Clip's package imports with the trait: .package(path: "../UIComponents", traits: ["COMPACT_BUILD"])

The problem: when building the main app target, the COMPACT_BUILD compiler condition is unexpectedly active, even though the main app's dependency chain never enables that trait. It seems like the trait enabled by the App Clip target is "leaking" into the main app build. I confirmed this by adding #error("COMPACT_BUILD is active"): it triggers when building the main app, which shouldn't happen. If I disable the App Clip target in the build scheme, the main app builds correctly with COMPACT_BUILD not defined. I am also able to build the App Clip separately.

Environment:
Xcode 26.2
swift-tools-version: 6.2
iOS 26.2

Questions:
Is this expected behavior with Xcode's SPM integration? Are traits resolved workspace-wide rather than per-target?
Is there a workaround to have different trait configurations for different targets consuming the same package?
Or do I need to fall back to separate package targets (e.g., UIComponents and UIComponentsCompact, sketched below) to achieve this?

Any guidance would be appreciated. Thanks!
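For completeness, the separate-targets fallback from the last question would look roughly like this (a sketch only; dependency lists are omitted, and each target would still need its own sources directory, e.g. shared via symlinks):

// swift-tools-version: 6.2
import PackageDescription

let package = Package(
    name: "UIComponents",
    platforms: [.iOS(.v18)],
    products: [
        .library(name: "UIComponents", targets: ["UIComponents"]),
        .library(name: "UIComponentsCompact", targets: ["UIComponentsCompact"]),
    ],
    targets: [
        // Full build for the main app.
        .target(name: "UIComponents"),
        // Compact build for the App Clip; the flag is always defined here.
        .target(
            name: "UIComponentsCompact",
            swiftSettings: [.define("COMPACT_BUILD")]
        ),
    ]
)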
Replies: 0 · Boosts: 0 · Views: 34 · 3d
dyld: Library not loaded: RealityFoundation – Error on iOS 14 with RealityKit 2
I've just implemented some necessary fixes in our app to ensure compatibility with iOS 15 and RealityKit 2. Now, when I deploy the app via Xcode 13 RC (Version 13.0 (13A233)), I get the following crash right at launch:

dyld: Library not loaded: /System/Library/Frameworks/RealityFoundation.framework/RealityFoundation
  Referenced from: /private/var/containers/Bundle/Application/…
  Reason: image not found
dyld: launch, loading dependent libraries
DYLD_LIBRARY_PATH=/usr/lib/system/introspection
DYLD_INSERT_LIBRARIES=/Developer/usr/lib/libBacktraceRecording.dylib:/Developer/usr/lib/libMainThreadChecker.dylib:/Developer/Library/PrivateFrameworks/DTDDISupport.framework/libViewDebuggerSupport.dylib

Is this a known issue? I've already tried deleting derived data and cleaning the project, but the problem persists. The minimum deployment target is iOS 13 for the main app and iOS 14 for the App Clip. All iOS 15 related fixes are wrapped in

if #available(iOS 15.0, *) { … }

This is a pretty major problem for us, as we now can't send out TestFlight builds or upload to App Store Connect for Monday. Thanks!
Replies: 1 · Boosts: 0 · Views: 4.1k · Jan ’22
Create MTLLibrary from raw String for use within RealityKit
Hello, I have a use case where I need to download and compile Metal shaders on demand, as strings or .metal files. These should then be used for CustomMaterials and/or post-processing within RealityKit. Essentially this boils down to having raw source code that needs to be compiled at runtime. My plan was to use the method makeLibrary(source:options:completionHandler:) to accomplish this. The problem is that I get the following error during compilation:

RealityKitARExperienceAssetProvider: An error occured while trying to compile shader library »testShaderLibrary« - Error Domain=MTLLibraryErrorDomain Code=3 "program_source:2:10: fatal error: 'RealityKit/RealityKit.h' file not found
#include <RealityKit/RealityKit.h>

My code for creating the library looks like this (simplified example):

let librarySourceString: String = """
#include <metal_stdlib>
#include <RealityKit/RealityKit.h>

using namespace metal;

[[visible]]
void mySurfaceShader(realitykit::surface_parameters params)
{
    params.surface().set_base_color(half3(1, 1, 1));
}
"""

mtlDevice.makeLibrary(source: librarySourceString, options: nil) { library, error in
    if let error = error {
        dump(error)
        return
    }
    // do something with library
}

So I'm wondering: is there a way to tell the Metal compiler how to resolve this reference to the RealityKit header file? Would I need to replace that part of the source string, maybe with an absolute path to the RealityKit framework (and if so, how would I get that at runtime)? Appreciate any hints - thanks!
Replies: 1 · Boosts: 0 · Views: 1.8k · Nov ’21
Precompile/prewarm shaders to avoid jank
Hi, is there a way to force RealityKit to compile/prewarm and cache all shaders that will be used within a scene in advance, ideally in the background? This would be useful for adding complex models to the scene, which can sometimes cause quite a few dropped frames even on the newest devices (at least I assume the initial delay when displaying them is caused by shader compilation), but also for CustomMaterials. Note that this also happens with models that are loaded asynchronously. Thanks!
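The closest thing to a workaround I can think of (a speculative sketch, not an official prewarm API; the helper and the tiny-scale trick are my own idea) is to add the loaded model to the scene at a near-zero scale so its shaders compile before the actual reveal:

import RealityKit

// Hypothetical helper: parks the entity in the scene, effectively invisible,
// so RealityKit compiles and caches its shaders ahead of time.
func prewarm(_ entity: Entity, in arView: ARView) {
    let anchor = AnchorEntity(world: .zero)
    entity.transform.scale = SIMD3<Float>(repeating: 0.0001)
    anchor.addChild(entity)
    arView.scene.anchors.append(anchor)
    // Later: restore the scale and reparent the entity where it belongs.
}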
Replies: 1 · Boosts: 0 · Views: 1.2k · Nov ’21
Blurred Background (RealityKit) Shader Graph Node not working on iOS/macOS
The ShaderGraph node Blurred Background (RealityKit) – https://developer.apple.com/documentation/shadergraph/realitykit/blurred-background-(realitykit) – works fine within the Reality Composer Pro 2 editor but isn't working on iOS 18 or macOS 15. Instead of the blurred content, it just renders as opaque in a single color (screenshot 2). Interestingly, it also fails to render within Reality Composer Pro when no other entities are in the scene, e.g. when only a background skybox is set.

Expected behavior: it would be great if this node worked the same way as it does on visionOS, since this would allow for really interesting and nice effects for scenes.

Feedback ID: FB15081190
Replies: 1 · Boosts: 1 · Views: 781 · Jan ’25
Loading USDZ with particle system crashes on Intel Macs
Hello, we have a RealityKit app that also runs on macOS via Catalyst. For specific USD assets containing particle systems we have observed a reproducible crash.

Steps to reproduce:
1. Open Reality Composer Pro
2. Create a new file
3. Create a simple particle system (the default one is fine)
4. Export as USDZ
5. Create a project in Xcode
6. Call Entity.load(… and pass in your USD

Running this on an Intel iMac with macOS Sequoia 15.3 leads to a crash with the following console log (the assertion fires several times with the output interleaved):

validateWithDevice:4704: failed assertion `Render Pipeline Descriptor Validation depthAttachmentPixelFormat (MTLPixelFormatDepth32Float) and stencilAttachmentPixelFormat (MTLPixelFormatStencil8) must match.'

Xcode version: 16.2.0
iMac 2020, 3.8 GHz Intel Core i7
macOS Sequoia 15.3
FB16477373

It would be great if this could be fixed quickly or a workaround provided, since it affects our production app. Thank you!
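For completeness, the loading call in step 6 is nothing special ("Particles" is a placeholder name for the exported USDZ):

import RealityKit

// Crashes on Intel Macs when the USDZ contains a particle system.
let entity = try Entity.load(named: "Particles")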
Replies: 1 · Boosts: 0 · Views: 420 · Mar ’25
CustomMaterial disable unlit tone mapping
Hi, since iOS 18, UnlitMaterial and ShaderGraphMaterial have the option to disable tone mapping, e.g. via https://developer.apple.com/documentation/realitykit/unlitmaterial/init(applypostprocesstonemap:)

Is it possible to do the same for CustomMaterial? I tried initializing a CustomMaterial based on an UnlitMaterial with tone mapping disabled, like so:

let unlitMat = UnlitMaterial(applyPostProcessToneMap: false)
let customMaterial = try CustomMaterial(
    from: unlitMat,
    surfaceShader: surfaceShader,
    geometryModifier: geometryModifier
)

but that does not seem to work. The colors of my texture still look altered in comparison to a plain UnlitMaterial or a ShaderGraphMaterial where tone mapping is disabled. Any hints? Thank you!
Replies: 1 · Boosts: 0 · Views: 118 · Jun ’25
NSToolbar Space item rendered with Liquid Glass Background
Hi, I have an NSToolbar in my Mac Catalyst app with a space and a flexible space item in it (https://developer.apple.com/documentation/appkit/nstoolbaritem/identifier/space). On macOS Tahoe, the space item is rendered with a Liquid Glass effect and seems to be automatically grouped with the previous item. Is there a way to prevent this? It basically adds some undesired padding next to the previous item and looks odd. The flexible space is rendered normally, as before. I am talking about the space right next to the back chevron item. Thanks for any hints!
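For reference, the relevant delegate method looks roughly like this (a simplified sketch; "back" and "share" are placeholder identifiers for my custom items):

import AppKit

final class ToolbarController: NSObject, NSToolbarDelegate {
    func toolbarDefaultItemIdentifiers(_ toolbar: NSToolbar) -> [NSToolbarItem.Identifier] {
        // The .space right after the back item is the one that picks up
        // the Liquid Glass background on macOS Tahoe.
        [NSToolbarItem.Identifier("back"),
         .space,
         .flexibleSpace,
         NSToolbarItem.Identifier("share")]
    }
}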
Replies: 1 · Boosts: 0 · Views: 142 · Sep ’25