
dyld: Library not loaded: RealityFoundation – Error on iOS 14 with RealityKit 2
I've just implemented some necessary fixes to our app to ensure compatibility with iOS 15 and ARKit 2. Now when I deploy the app via Xcode 13 RC (Version 13.0 (13A233)) I get the following crash right at launch:

```
dyld: Library not loaded: /System/Library/Frameworks/RealityFoundation.framework/RealityFoundation
  Referenced from: /private/var/containers/Bundle/Application/…
  Reason: image not found
dyld: launch, loading dependent libraries
DYLD_LIBRARY_PATH=/usr/lib/system/introspection
DYLD_INSERT_LIBRARIES=/Developer/usr/lib/libBacktraceRecording.dylib:/Developer/usr/lib/libMainThreadChecker.dylib:/Developer/Library/PrivateFrameworks/DTDDISupport.framework/libViewDebuggerSupport.dylib
```

Is this a known issue? I've already tried deleting Derived Data and cleaning the project, but the problem persists. The minimum deployment target is iOS 13 for the main app and iOS 14 for the App Clip. All iOS 15 related fixes are wrapped in if #available(iOS 15.0, *) { … }.

This is a pretty major problem for us as we now can't send out TestFlight builds or upload to App Store Connect for Monday. Thanks!
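For context, a minimal sketch of the availability-gating pattern described above, using UnlitMaterial.blending (an iOS 15 / RealityKit 2 API) purely as an illustrative stand-in for whatever iOS 15-only calls the real app makes. Note that #available only guards API use at runtime; it does not change load-time linkage, so one common mitigation for this class of "Library not loaded" error is to mark the affected framework as Optional (weak-linked) in the target's Link Binary With Libraries phase.

```swift
import RealityKit
import UIKit

// Illustrative only: UnlitMaterial.blending is a RealityKit 2 / iOS 15 API,
// standing in for the real app's iOS 15-only code paths.
func applyHighlightMaterial(to model: ModelEntity) {
    if #available(iOS 15.0, *) {
        var material = UnlitMaterial(color: .red)
        material.blending = .transparent(opacity: .init(floatLiteral: 0.5))
        model.model?.materials = [material]
    } else {
        // iOS 13/14 fallback that only touches RealityKit 1 API.
        model.model?.materials = [UnlitMaterial(color: .red)]
    }
}
```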
Replies: 1 · Boosts: 0 · Views: 4.1k · Jan ’22
Loading of older .reality files is broken on iOS 15 (works on iOS 14)
Hello, in our app we are downloading some user generated content (.reality files and USDZs) and displaying it within the app. This worked without issues on iOS 14, but with iOS 15 (release version) there have been a lot of issues with certain .reality files. As far as I can see, USDZ files still work. I've created a little test project and the error message log is not really helpful:

```
2021-10-01 19:42:30.207645+0100 RealityKitAssetTest-iOS15[3239:827718] [Assets] Failed to load asset of type 'RealityFileAsset', error:Could not find archive entry named assets/Scéna17_9dfa3d0.compiledscene.
2021-10-01 19:42:30.208097+0100 RealityKitAssetTest-iOS15[3239:827598] [Assets] Failed to load asset path '#18094855536753608259'
2021-10-01 19:42:30.208117+0100 RealityKitAssetTest-iOS15[3239:827598] [Assets] AssetLoadRequest failed because asset failed to load '#18094855536753608259'
2021-10-01 19:42:30.307040+0100 RealityKitAssetTest-iOS15[3239:827598] throwing -10878
2021-10-01 19:42:30.307608+0100 RealityKitAssetTest-iOS15[3239:827598] throwing -10878
2021-10-01 19:42:30.307712+0100 RealityKitAssetTest-iOS15[3239:827598] throwing -10878
2021-10-01 19:42:30.307753+0100 RealityKitAssetTest-iOS15[3239:827598] throwing -10878
2021-10-01 19:42:30.307790+0100 RealityKitAssetTest-iOS15[3239:827598] throwing -10878
2021-10-01 19:42:30.307907+0100 RealityKitAssetTest-iOS15[3239:827598] throwing -10878
2021-10-01 19:42:30.307955+0100 RealityKitAssetTest-iOS15[3239:827598] throwing -10878
2021-10-01 19:42:30.308155+0100 RealityKitAssetTest-iOS15[3239:827598] throwing -10878
2021-10-01 19:42:30.308194+0100 RealityKitAssetTest-iOS15[3239:827598] throwing -10878
▿ Failed to load loadRequest.
  - generic: "Failed to load loadRequest."
```

Basic code structure that is used for loading:

```swift
cancellable = Entity.loadAsync(named: entityName, in: .main)
    .sink { completion in
        switch completion {
        case .failure(let error):
            dump(error)
            print("Done")
        case .finished:
            print("Finished loading")
        }
    } receiveValue: { entity in
        print("Entity: \(entity)")
    }
```

Is there any way to force it to load in a mode that enforces compatibility? As mentioned, this only happens on iOS 15. Even AR Quick Look can't display the files anymore (no issues on iOS 14). Thanks for any help!
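Since the assets are downloaded at runtime, here is a minimal sketch of the equivalent loading flow from a local file URL, in case it helps with reproducing. The file name and storage location are hypothetical; only Entity.loadAsync(contentsOf:) and the Combine plumbing come from RealityKit itself.

```swift
import Combine
import RealityKit

var cancellables = Set<AnyCancellable>()

func loadDownloadedRealityFile(at fileURL: URL) {
    // fileURL points at a .reality file previously downloaded to the
    // app's caches or documents directory (hypothetical location).
    Entity.loadAsync(contentsOf: fileURL)
        .sink { completion in
            if case .failure(let error) = completion {
                dump(error)
            }
        } receiveValue: { entity in
            print("Loaded entity: \(entity)")
        }
        .store(in: &cancellables)
}
```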
Replies: 8 · Boosts: 0 · Views: 3.6k · May ’22
Render Menu Items with Icon (e.g. SF Symbol) on Catalyst
Hello, is there a recommended way to render menu items, e.g. in a SwiftUI ContextMenu, with an icon (SF Symbols)? Let's say I have the following setup: two buttons whose labels combine a title and an SF Symbol. Both buttons render fine on native macOS (e.g. Sonoma), but Catalyst refuses to render the symbol at all. I tried every possible combination I could think of. The only way I found was to directly copy and paste a symbol from the SF Symbols app and inline it with the label string as Unicode. Unfortunately I have a couple of custom SF symbols, so this isn't really an option for me. I feel like this is a perfectly valid use case, as it makes the menu visually a lot easier to scan. With UIKit and Ventura this at least worked for menu bar items, but now it also seems broken on Sonoma. I would greatly appreciate any hints. Thanks!
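A minimal reconstruction of the kind of setup described above (view, button titles, and symbols are placeholders, not the original snippet):

```swift
import SwiftUI

struct ItemView: View {
    var body: some View {
        Text("Item")
            .contextMenu {
                // Renders with its SF Symbol on native macOS,
                // but Catalyst drops the image from the menu item.
                Button {
                    print("Duplicate tapped")
                } label: {
                    Label("Duplicate", systemImage: "plus.square.on.square")
                }
                Button {
                    print("Delete tapped")
                } label: {
                    Label("Delete", systemImage: "trash")
                }
            }
    }
}
```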
Replies: 2 · Boosts: 0 · Views: 1.8k · Nov ’23
Xcode 16 – Symbol not found: ShapeResource generateConvex
Hi, I have a RealityKit app that I am building with Xcode 16. The app has a minimum deployment target of iOS 17. If I run it on an iOS 17 device the app crashes:

```
dyld[15716]: Symbol not found: _$s10RealityKit13ShapeResourceC14generateConvex4fromAcA04MeshD0C_tYaKFZ
Referenced from: …
Expected in: …/System/Library/Frameworks/RealityFoundation.framework/RealityFoundation
```

My code looks something like this:

```swift
@available(iOS, introduced: 13.0, obsoleted: 18.0)
@MainActor @preconcurrency
func generateNonAsyncConvexShapeResource(from meshResource: MeshResource) throws -> ShapeResource {
    ShapeResource.generateConvex(from: meshResource)
}

@available(iOS 18.0, *)
func generateConvexShapeAsync(from meshResource: MeshResource) async throws -> ShapeResource {
    // This will only be available for iOS 18 and above
    return try await ShapeResource.generateConvex(from: meshResource)
}

if let meshResource = try? modelEntity.model?.mesh.applying(transform: transform.matrix) {
    if #available(visionOS 1.0, iOS 18.0, *) {
        try? await generateConvexShapeAsync(from: meshResource) // await shapeResources.append(.generateConvex(from: meshResource))
    } else {
        try? generateNonAsyncConvexShapeResource(from: meshResource)
    }
}
```

So I actually do check for the system version and only call the async variant on iOS 18. Any hints on how to fix that? Thanks!
Replies: 2 · Boosts: 0 · Views: 1.4k · Sep ’24
USDZ with Blend Shapes Workflow Recommendations
Hi, since RealityKit 4 now supports blend shapes, I was wondering if there are any workflow or tooling recommendations for baking/exporting them into a USDZ. Are Blender or Cinema 4D capable of doing that out of the box? Should we look into NVIDIA Omniverse (https://docs.omniverse.nvidia.com/connect/latest/blender/manual.htm)? So far this topic seems very sparsely documented and I would appreciate any hints. Thank you!
Replies: 3 · Boosts: 0 · Views: 1.4k · Jan ’25
Mac Catalyst SwiftUI – .focused() not working
Hello, given the following simple SwiftUI setup:

```swift
struct ContentView: View {
    var body: some View {
        CustomFocusView()
    }
}

struct CustomFocusView: View {
    @FocusState private var isFocused: Bool

    var body: some View {
        color
            .frame(width: 128, height: 128)
            .focusable(true)
            .focused($isFocused)
            .onTapGesture {
                isFocused.toggle()
            }
            .onKeyPress("a") {
                print("A pressed")
                return .handled
            }
    }

    var color: Color {
        isFocused ? .blue : .red
    }
}
```

If I run this via Mac – Designed for iPad, the CustomFocusView toggles focus as expected and cycles between red and blue. If I run the same exact code via Mac Catalyst, absolutely nothing happens, and so far I wasn't able to ever get this view to accept focused state. Is this expected? I would appreciate it if anyone could hint me on how to get this working. Thanks and best regards!
Replies: 5 · Boosts: 0 · Views: 579 · Apr ’25
Loading USDZ with particle system crashes on Intel Macs
Hello, we have a RealityKit app that also runs on macOS via Catalyst. For specific USD assets containing particle systems we have observed a reproducible crash.

Steps to reproduce:

1. Open Reality Composer Pro
2. Create a new file
3. Create a simple particle system (the default one is fine)
4. Export as USDZ
5. Create a project in Xcode
6. Call Entity.load(… and pass in your USD

Running this on an Intel iMac with macOS Sequoia 15.3 leads to a crash with the following console log (the assertion messages are interleaved as printed):

```
validateWithDevice:4704: failed assertion `Render Pipeline DescrvalidateWithDevice:4704: failed assertion `Render Pipeline Descriptor Validation depthAttachmentPixelFormat (MTLPixelFormatDepthvalidateWithDevice:4704: failed assertion `Render Pipeline Descriptor Validation depthAttachmentPixelFormat (MTLPixelFormatDepth32Float) and stencilAttachmentPixelFormat (MTLPixelFormatStencil8) must match. ' iptor Validation depthAttachmentPixelFormat (MTLPixelFormatDepth32Float) and stencilAttachmentPixelFormat (MTLPixelFormatStencil8) must match. ' 32Float) and stencilAttachmentPixelFormat (MTLPixelFormatStencil32Float) and stencilAttachmentPixelFormat (MTLPixelFormatStencil8) must match. ' 8) must match. '
```

The core assertion is that depthAttachmentPixelFormat (MTLPixelFormatDepth32Float) and stencilAttachmentPixelFormat (MTLPixelFormatStencil8) must match.

Xcode version: 16.2.0
iMac 2020, 3.8 GHz Intel Core i7
macOS Sequoia 15.3
FB16477373

It would be great if this could be fixed quickly or a workaround provided, since it affects our production app. Thank you!
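A minimal loading sketch matching the reproduction steps above; "ParticleTest" is a placeholder name for the USDZ exported from Reality Composer Pro and added to the app bundle:

```swift
import RealityKit

func loadParticleAsset() {
    do {
        // Loads the particle-system USDZ from the app bundle.
        let entity = try Entity.load(named: "ParticleTest")
        print("Loaded:", entity)
    } catch {
        print("Failed to load:", error)
    }
}
```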
Replies: 1 · Boosts: 0 · Views: 450 · Mar ’25
NSToolbar Space item rendered with Liquid Glass Background
Hi, I have an NSToolbar in my Mac Catalyst app with a space and a flexible space item in it (https://developer.apple.com/documentation/appkit/nstoolbaritem/identifier/space). On macOS Tahoe the space item is being rendered with a Liquid Glass effect and seems to be automatically grouped with the previous item. Is there a way to prevent this? It basically adds some undesired padding next to the previous item and looks odd. The flexible space is rendered normally, as before. I am talking about the space right next to the back chevron item. Thanks for any hints!
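For reference, a stripped-down sketch of the kind of Catalyst toolbar setup described. The delegate class, the custom back identifier, and the chevron image are illustrative assumptions, not the original code; only .space and .flexibleSpace are the system identifiers in question.

```swift
#if targetEnvironment(macCatalyst)
import UIKit

final class ToolbarDelegate: NSObject, NSToolbarDelegate {

    // Hypothetical identifier for the app's own back-chevron item.
    static let back = NSToolbarItem.Identifier("com.example.back")

    func toolbarDefaultItemIdentifiers(_ toolbar: NSToolbar) -> [NSToolbarItem.Identifier] {
        // The plain .space sits right after the back item; on macOS Tahoe
        // it is rendered with a Liquid Glass background and grouped with it.
        [Self.back, .space, .flexibleSpace]
    }

    func toolbarAllowedItemIdentifiers(_ toolbar: NSToolbar) -> [NSToolbarItem.Identifier] {
        toolbarDefaultItemIdentifiers(toolbar)
    }

    func toolbar(_ toolbar: NSToolbar,
                 itemForItemIdentifier itemIdentifier: NSToolbarItem.Identifier,
                 willBeInsertedIntoToolbar flag: Bool) -> NSToolbarItem? {
        // Space and flexible space are provided by the system;
        // only the custom back item needs to be created here.
        guard itemIdentifier == Self.back else { return nil }
        let item = NSToolbarItem(itemIdentifier: itemIdentifier)
        item.image = UIImage(systemName: "chevron.backward")
        return item
    }
}
#endif
```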
Replies: 1 · Boosts: 0 · Views: 156 · Sep ’25
Transparent blending with gradient texture leads to color banding (RealityKit 2)
Hello, I have a cylinder with an unlit material and a base color set. Now I want to apply a gradient as the alpha mask so I get this kind of halo, GTA-like checkpoint look. The code:

```swift
var baseMaterial = UnlitMaterial(color: UIColor.red)
baseMaterial.blending = .transparent(opacity: .init(scale: 100, texture: .init(maskTextureResource))) // maskTextureResource is the gradient mask
baseMaterial.opacityThreshold = 0
```

This works but unfortunately leads to some ugly visible gradient banding. I've also tried to play with the scale of the blending texture, but that did not help.

As an alternative approach I tried to solve this via a custom surface shader. Code below:

```metal
[[visible]]
void gradientShader(realitykit::surface_parameters params)
{
    auto surface = params.surface();
    float2 uv = params.geometry().uv0();

    float h = 0.5; // adjust position of middleColor
    half startAlpha = 0.001;
    half middleAlpha = 1;
    half endAlpha = 0.001;

    half alpha = mix(mix(startAlpha, middleAlpha, half(uv.y/h)),
                     mix(middleAlpha, endAlpha, half((uv.y - h)/(1.0 - h))),
                     half(step(h, uv.y)));

    surface.set_emissive_color(half3(params.material_constants().emissive_color()));
    surface.set_base_color(half3(params.material_constants().base_color_tint()));
    surface.set_opacity(alpha);
}
```

The result looks really nice and smooth, but unfortunately this now also culls the inner part of the cylinder, even on the semitransparent parts. What I want is to have the effect applied to both the outer and inner part of the cylinder, so the transparent part of the outside lets you see through to the inside. I got this working by using a PhysicallyBasedMaterial instead of an UnlitMaterial (which does not support blending out of the box), but then again ran into the banding issue. On my CustomMaterial, faceCulling is set to .none.

Here is how it currently looks – as you can see, in the left one the alpha mask is not smooth and has banding artifacts:

Thank you for any help!
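For completeness, a sketch of how a surface shader like the one above would typically be attached via CustomMaterial with face culling disabled, assuming the shader is compiled into the app's default Metal library; the function name and tint are illustrative.

```swift
import Metal
import RealityKit
import UIKit

func makeGradientMaterial() throws -> CustomMaterial {
    guard let device = MTLCreateSystemDefaultDevice(),
          let library = device.makeDefaultLibrary() else {
        throw NSError(domain: "GradientMaterial", code: -1)
    }

    // References the [[visible]] surface shader shown above.
    let surfaceShader = CustomMaterial.SurfaceShader(named: "gradientShader", in: library)

    var material = try CustomMaterial(surfaceShader: surfaceShader, lightingModel: .unlit)
    material.baseColor = .init(tint: .red)
    material.faceCulling = .none // keep the inside of the cylinder visible
    return material
}
```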
Replies: 5 · Boosts: 0 · Views: 2.0k · Sep ’21
Create MTLLibrary from raw String for use within RealityKit
Hello, I have a use case where I need to download and compile Metal shaders on demand, as strings or .metal files. These should then be used for CustomMaterials and/or postprocessing within RealityKit. Essentially this boils down to having raw source code that needs to be compiled at runtime. My plan was to use the method makeLibrary(source:options:completionHandler:) to accomplish this. The problem is that I get the following error during compilation:

```
RealityKitARExperienceAssetProvider: An error occured while trying to compile shader library »testShaderLibrary« - Error Domain=MTLLibraryErrorDomain Code=3 "program_source:2:10: fatal error: 'RealityKit/RealityKit.h' file not found #include <RealityKit/RealityKit.h>
```

My code for creating the library looks like this (simplified example):

```swift
let librarySourceString: String = """
#include <metal_stdlib>
#include <RealityKit/RealityKit.h>

using namespace metal;

[[visible]]
void mySurfaceShader(realitykit::surface_parameters params)
{
    params.surface().set_base_color(half3(1, 1, 1));
}
"""

mtlDevice.makeLibrary(source: librarySourceString, options: nil) { library, error in
    if let error = error {
        dump(error)
        return
    }
    // do something with library
}
```

So I'm wondering if there's a way to tell the Metal compiler how to resolve this reference to the RealityKit header file? Would I need to replace that part of the source string, maybe with an absolute path to the RealityKit framework (if so – how would I get this at runtime)? Appreciate any hints – thanks!
Replies: 1 · Boosts: 0 · Views: 1.8k · Nov ’21
Precompile/prewarm shaders to avoid jank
Hi, is there a way to force RealityKit to compile/prewarm and cache all shaders that will be used within a scene in advance – ideally in the background? This would be useful for adding complex models to the scene, which can sometimes cause quite a few dropped frames even on the newest devices (at least I assume the initial delay when displaying them is caused by the shader compilation), but also for CustomMaterials. Note this also happens with models that are loaded asynchronously. Thanks!
Replies: 1 · Boosts: 0 · Views: 1.2k · Nov ’21
ARView Frame Timestamp/Elapsed Time
Hello, I've been looking all over the place but so far I haven't found a trivial way to grab ARView's current timestamp, i.e. the elapsed time since the scene started rendering. I can access that in the surface and geometry shaders, but I would like to pass a timestamp as a parameter in order to drive shader animations. I feel that's more efficient than injecting animation progress manually on every frame, especially if there are lots of objects with that shader. So far what I've done is subscribe to the scene's Update event and use the delta time to calculate the elapsed time myself. But this is quite error-prone and tends to break when I present the scene a second time (e.g. closing and reopening the AR experience). The only other option I found was using a render callback and grabbing the time property from the PostProcessContext. That works well, but do I really have to go that route? It would be great if there were an easy way to achieve this – pretty much an equivalent to this: https://developer.apple.com/documentation/scenekit/scnscenerenderer/1522680-scenetime NOTE: I'm not looking for the timestamp of the ARSession's current frame. Thank you!
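For reference, a minimal sketch of the Update-event bookkeeping described above; the SceneClock type is a hypothetical helper, only scene.subscribe(to:) and SceneEvents.Update.deltaTime come from RealityKit.

```swift
import Combine
import RealityKit

final class SceneClock {
    private(set) var elapsedTime: TimeInterval = 0
    private var subscription: Cancellable?

    func start(on arView: ARView) {
        elapsedTime = 0
        // Accumulate delta times; this has to be reset manually whenever
        // the scene is presented again, which is where it gets fragile.
        subscription = arView.scene.subscribe(to: SceneEvents.Update.self) { [weak self] event in
            self?.elapsedTime += event.deltaTime
        }
    }

    func stop() {
        subscription?.cancel()
        subscription = nil
    }
}
```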
Replies: 0 · Boosts: 0 · Views: 1k · Jan ’22
Updating blending property to .transparent(opacity: …) removes material color texture (iOS 16 Beta)
Hello, on iOS 16, when I retrieve an existing material from a model entity and update its blending property to .transparent(opacity: …), the color or baseColor texture gets removed after reassigning the updated material. My use case is that I want to fade in a ModelEntity through a custom System and therefore need to repeatedly reassign the opacity value. I've tested this with UnlitMaterial and PhysicallyBasedMaterial – both suffer from this issue. On iOS 15 this works as expected. Please let me know if there is any workaround, as this seems to me like a major regression, and ideally I need this to work once iOS 16 gets released to the public. The radar number including a sample project is: FB11420976. Thank you!
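A minimal sketch of the fade pattern described above; the helper function is hypothetical, while the blending assignment mirrors what the post reports as regressing on iOS 16.

```swift
import RealityKit

func setOpacity(_ opacity: Float, on modelEntity: ModelEntity) {
    guard var materials = modelEntity.model?.materials else { return }

    materials = materials.map { material -> RealityKit.Material in
        if var pbr = material as? PhysicallyBasedMaterial {
            // On iOS 16 beta, reassigning the material after this update
            // drops the baseColor texture; on iOS 15 the texture survives.
            pbr.blending = .transparent(opacity: .init(floatLiteral: opacity))
            return pbr
        }
        if var unlit = material as? UnlitMaterial {
            unlit.blending = .transparent(opacity: .init(floatLiteral: opacity))
            return unlit
        }
        return material
    }

    modelEntity.model?.materials = materials
}
```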
Replies: 4 · Boosts: 0 · Views: 1.4k · Sep ’22
RealityKit Sample Project Issues; Performance and Crashes
Hi, please let me know if I should rather file feedback for this, but I figured it's worth flagging one way or another. Test Xcode version: 14.0 beta 6 (14A5294g).

1. Project »Altering RealityKit Rendering with Shader Functions«
This project crashes right away when running it on a device (iOS 15 and 16). Screenshot:

2. Project »Using object capture assets in RealityKit«
Suffers from pretty bad performance when run on a device – barely scratching 20–25 fps on an iPhone 12 Pro, and even less on an iPhone XS. Screenshot:

As these are official sample projects, I feel like they should work flawlessly out of the box.

Best
Arthur
Replies: 3 · Boosts: 0 · Views: 1.1k · Sep ’22