Post · Replies · Boosts · Views · Created

Precompile/prewarm shaders to avoid jank
Hi, is there a way to force RealityKit to compile/prewarm and cache all shaders that will be used within a Scene in advance, ideally in the background? This would be useful when adding complex models to the scene, which can sometimes cause quite a few dropped frames even on the newest devices (at least I assume the initial delay when displaying them is caused by shader compilation), but also for CustomMaterials. Note that this also happens with models that are loaded asynchronously. Thanks!
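For context, a common community workaround (not from the original post, and the 0.1-second delay and `prewarm` name are illustrative assumptions): briefly add the loaded entity to the scene at zero scale so RealityKit compiles its shaders before the entity is actually shown.

```swift
import RealityKit
import UIKit

// Hypothetical prewarm sketch: render the entity invisibly for a moment
// so its shaders are compiled before the entity is really displayed.
func prewarm(_ entity: Entity, in arView: ARView) {
    let anchor = AnchorEntity(world: .zero)
    entity.scale = .zero            // invisible, but still submitted to the renderer
    anchor.addChild(entity)
    arView.scene.addAnchor(anchor)

    // Remove the warm-up anchor after a frame or two.
    DispatchQueue.main.asyncAfter(deadline: .now() + 0.1) {
        arView.scene.removeAnchor(anchor)
        entity.scale = .one         // restore before real use
    }
}
```

Whether this reliably triggers shader compilation in the background is an assumption; it only sketches the "show it invisibly first" idea.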
Replies: 1 · Boosts: 0 · Views: 1.2k · Created: Nov ’21
What causes »ARSessionDelegate is retaining X ARFrames« console warning?
Hi, since iOS 15 I've repeatedly noticed the console warning »ARSessionDelegate is retaining X ARFrames. This can lead to future camera frames being dropped«, even in rather simple projects using RealityKit and ARKit. Could someone from the ARKit team please elaborate on what causes this warning and what can be done to avoid it? If I remember correctly, I didn't even assign an ARSessionDelegate. Thank you!
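As background (not part of the original question): the warning is generally understood to appear when `ARFrame` objects are kept alive, for example by storing one in a property or capturing it in a long-running closure, which blocks ARKit from recycling its camera buffers. A sketch of the pattern that avoids it, with hypothetical names:

```swift
import ARKit

// Sketch: copy only the value-type data you need out of the frame,
// instead of retaining the ARFrame itself.
final class SessionHandler: NSObject, ARSessionDelegate {
    private var lastCameraTransform: simd_float4x4?

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // OK: the transform is a value type, so the frame can be released.
        lastCameraTransform = frame.camera.transform

        // Avoid: self.storedFrame = frame
        // Retaining frames like this is what typically triggers the warning.
    }
}
```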
Replies: 4 · Boosts: 1 · Views: 3.4k · Created: Nov ’21
Create MTLLibrary from raw String for use within RealityKit
Hello, I have a use case where I need to download and compile Metal shaders on demand, as strings or .metal files. These should then be used for CustomMaterials and/or post-processing within RealityKit. Essentially this boils down to having raw source code that needs to be compiled at runtime. My plan was to use the method makeLibrary(source:options:completionHandler:) to accomplish this. The problem is that I get the following error during compilation:

RealityKitARExperienceAssetProvider: An error occured while trying to compile shader library »testShaderLibrary« - Error Domain=MTLLibraryErrorDomain Code=3 "program_source:2:10: fatal error: 'RealityKit/RealityKit.h' file not found
#include <RealityKit/RealityKit.h>

My code for creating the library looks like this (simplified example):

let librarySourceString: String = """
#include <metal_stdlib>
#include <RealityKit/RealityKit.h>

using namespace metal;

[[visible]]
void mySurfaceShader(realitykit::surface_parameters params)
{
    params.surface().set_base_color(half3(1, 1, 1));
}
"""

mtlDevice.makeLibrary(source: librarySourceString, options: nil) { library, error in
    if let error = error {
        dump(error)
        return
    }
    // do something with library
}

So I'm wondering: is there a way to tell the Metal compiler how to resolve this reference to the RealityKit header file? Would I need to replace that part of the source string, maybe with an absolute path to the RealityKit framework (if so, how would I get this at runtime)? Appreciate any hints - thanks!
Replies: 1 · Boosts: 0 · Views: 1.8k · Created: Nov ’21
Loading of older .reality files is broken on iOS 15 (works on iOS 14)
Hello, in our app we are downloading some user-generated content (.reality files and USDZs) and displaying it within the app. This worked without issues on iOS 14, but with iOS 15 (release version) there have been a lot of issues with certain .reality files. As far as I can see, USDZ files still work. I've created a little test project, and the error message log is not really helpful:

2021-10-01 19:42:30.207645+0100 RealityKitAssetTest-iOS15[3239:827718] [Assets] Failed to load asset of type 'RealityFileAsset', error:Could not find archive entry named assets/Scéna17_9dfa3d0.compiledscene.
2021-10-01 19:42:30.208097+0100 RealityKitAssetTest-iOS15[3239:827598] [Assets] Failed to load asset path '#18094855536753608259'
2021-10-01 19:42:30.208117+0100 RealityKitAssetTest-iOS15[3239:827598] [Assets] AssetLoadRequest failed because asset failed to load '#18094855536753608259'
2021-10-01 19:42:30.307040+0100 RealityKitAssetTest-iOS15[3239:827598] throwing -10878
2021-10-01 19:42:30.307608+0100 RealityKitAssetTest-iOS15[3239:827598] throwing -10878
2021-10-01 19:42:30.307712+0100 RealityKitAssetTest-iOS15[3239:827598] throwing -10878
2021-10-01 19:42:30.307753+0100 RealityKitAssetTest-iOS15[3239:827598] throwing -10878
2021-10-01 19:42:30.307790+0100 RealityKitAssetTest-iOS15[3239:827598] throwing -10878
2021-10-01 19:42:30.307907+0100 RealityKitAssetTest-iOS15[3239:827598] throwing -10878
2021-10-01 19:42:30.307955+0100 RealityKitAssetTest-iOS15[3239:827598] throwing -10878
2021-10-01 19:42:30.308155+0100 RealityKitAssetTest-iOS15[3239:827598] throwing -10878
2021-10-01 19:42:30.308194+0100 RealityKitAssetTest-iOS15[3239:827598] throwing -10878
▿ Failed to load loadRequest.
  - generic: "Failed to load loadRequest."
Basic code structure that is used for loading:

cancellable = Entity.loadAsync(named: entityName, in: .main)
    .sink { completion in
        switch completion {
        case .failure(let error):
            dump(error)
            print("Done")
        case .finished:
            print("Finished loading")
        }
    } receiveValue: { entity in
        print("Entity: \(entity)")
    }

Is there any way to force it to load in a mode that enforces compatibility? As mentioned, this only happens on iOS 15. Even ARQuickLook can't display the files anymore (no issues on iOS 14). Thanks for any help!
Replies: 8 · Boosts: 0 · Views: 3.6k · Created: Oct ’21
dyld: Library not loaded: RealityFoundation – Error on iOS 14 with RealityKit 2
I've just implemented some necessary fixes to our app to ensure compatibility with iOS 15 and RealityKit 2. Now when I deploy the app via Xcode 13 RC (Version 13.0 (13A233)), I get the following crash right at launch:

dyld: Library not loaded: /System/Library/Frameworks/RealityFoundation.framework/RealityFoundation
  Referenced from: /private/var/containers/Bundle/Application/…
  Reason: image not found
dyld: launch, loading dependent libraries
DYLD_LIBRARY_PATH=/usr/lib/system/introspection
DYLD_INSERT_LIBRARIES=/Developer/usr/lib/libBacktraceRecording.dylib:/Developer/usr/lib/libMainThreadChecker.dylib:/Developer/Library/PrivateFrameworks/DTDDISupport.framework/libViewDebuggerSupport.dylib

Is this a known issue? I've already tried deleting Derived Data and cleaning the project, but the problem persists. The minimum deployment target is iOS 13 for the main app and iOS 14 for the App Clip. All iOS 15 related fixes are wrapped in if #available(iOS 15.0, *) { … }. This is a pretty major problem for us, as we now can't send out TestFlight builds or upload to App Store Connect for Monday. Thanks!
Replies: 1 · Boosts: 0 · Views: 4.1k · Created: Sep ’21
Transparent blending with gradient texture leads to color banding (RealityKit 2)
Hello, I have a cylinder with an UnlitMaterial and a base color set. Now I want to apply a gradient as the alpha mask so I get this kind of halo, GTA-like checkpoint look. The code:

var baseMaterial = UnlitMaterial(color: UIColor.red)
// maskTextureResource is the gradient mask
baseMaterial.blending = .transparent(opacity: .init(scale: 100, texture: .init(maskTextureResource)))
baseMaterial.opacityThreshold = 0

This works but unfortunately leads to some ugly visible gradient banding. I've also tried playing with the scale of the blending texture, but that did not help. As an alternative approach I tried to solve this via a custom surface shader. Code below:

[[visible]]
void gradientShader(realitykit::surface_parameters params)
{
    auto surface = params.surface();
    float2 uv = params.geometry().uv0();
    float h = 0.5; // adjust position of middleColor

    half startAlpha = 0.001;
    half middleAlpha = 1;
    half endAlpha = 0.001;

    half alpha = mix(mix(startAlpha, middleAlpha, half(uv.y / h)),
                     mix(middleAlpha, endAlpha, half((uv.y - h) / (1.0 - h))),
                     half(step(h, uv.y)));

    surface.set_emissive_color(half3(params.material_constants().emissive_color()));
    surface.set_base_color(half3(params.material_constants().base_color_tint()));
    surface.set_opacity(alpha);
}

The result looks really nice and smooth, but unfortunately this now also culls the inner part of the cylinder, even on the semitransparent parts. What I want is to have the effect applied to both the outer and the inner part of the cylinder, so the transparent part of the outside lets you see through to the inside. I got this working by using a PhysicallyBasedMaterial instead of an UnlitMaterial (which does not support blending out of the box), but again ran into the issue with the banding. On my CustomMaterial, faceCulling is set to .none. Here is how it currently looks – as you can see, in the left one the alpha mask is not smooth and has banding artefacts: Thank you for any help!
Replies: 5 · Boosts: 0 · Views: 2.0k · Created: Sep ’21
Animate transparency of blending property in RealityKit 2
Hello, PhysicallyBasedMaterial in RealityKit 2 contains a blending property to adjust the transparency of a material. Is there a way to animate this over time to fade entities in and out? I've tried the new FromToByAnimation API but could not figure out whether there is a supported BindPath for the transparency. Ideally, what I would like to achieve is something similar to SceneKit's SCNAction.fadeIn(duration: …), which also worked on a whole node. I figured I could also go the route of a custom fragment shader, though that seems like overkill. As Reality Composer also supports fade actions, I would assume that this is at least supported behind the scenes. Thanks for any help!
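As a point of comparison (not from the original post, and the `fadeIn` helper and step count are illustrative assumptions): lacking a built-in bind path, a fade can be approximated by re-assigning the material with an updated blending opacity over a series of timed steps.

```swift
import RealityKit
import UIKit

// Sketch of a manual fade-in, assuming no supported BindPath exists:
// step the blending opacity from 0 to 1 and write the material back.
func fadeIn(_ model: ModelEntity, duration: TimeInterval = 1.0) {
    let steps = 60
    for step in 0...steps {
        let t = Float(step) / Float(steps)
        let delay = duration * Double(step) / Double(steps)
        DispatchQueue.main.asyncAfter(deadline: .now() + delay) {
            guard var material = model.model?.materials.first
                    as? PhysicallyBasedMaterial else { return }
            // Opacity is expressible by a float literal, so a plain
            // scalar works here; a texture-based mask would also fit.
            material.blending = .transparent(opacity: .init(floatLiteral: t))
            model.model?.materials = [material]
        }
    }
}
```

Re-creating the material every frame is heavier than a real animated bind path would be; this only sketches the brute-force route.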
Replies: 2 · Boosts: 0 · Views: 1.9k · Created: Jul ’21
Set camera feed as texture input for CustomMaterial
Hello, in this project https://developer.apple.com/documentation/arkit/content_anchors/tracking_and_visualizing_faces there is some sample code that describes how to map the camera feed to an object with SceneKit and a shader modifier. I would like to know if there is an easy way to achieve the same thing with a CustomMaterial and RealityKit 2. Specifically, I'm interested in the best way to pass the background of the RealityKit environment as a texture to the custom shader. In SceneKit this was really easy, as one could just do the following:

material.diffuse.contents = sceneView.scene.background.contents

As the texture input for CustomMaterial requires a TextureResource, I would probably need a way to create a CGImage from the background or camera feed on the fly. What I've tried so far is accessing the captured image from the camera feed and creating a CGImage from the pixel buffer like so:

guard
    let frame = arView.session.currentFrame,
    let cameraFeedTexture = CGImage.create(pixelBuffer: frame.capturedImage),
    let textureResource = try? TextureResource.generate(from: cameraFeedTexture, withName: "cameraFeedTexture", options: .init(semantic: .color))
else {
    return
}

// assign texture
customMaterial.custom.texture = .init(textureResource)

extension CGImage {
    public static func create(pixelBuffer: CVPixelBuffer) -> CGImage? {
        var cgImage: CGImage?
        VTCreateCGImageFromCVPixelBuffer(pixelBuffer, options: nil, imageOut: &cgImage)
        return cgImage
    }
}

This seems wasteful though, and is also quite slow. Is there any other way to accomplish this efficiently, or would I need to go the post-processing route? In the sample code the displayTransform for the view is also being passed as an SCNMatrix4. CustomMaterial's custom.value only accepts a SIMD4 though. Is there another way to pass in the matrix?
Another idea I had was to create a CustomMaterial from an OcclusionMaterial, which already seems to contain information about the camera feed, but so far I've had no luck with it. Thanks for the support!
Replies: 8 · Boosts: 0 · Views: 4.0k · Created: Jul ’21