Hi @radicalappdev, and thanks for this information.
I've also been trying to integrate my 3D models into the real world as cleanly as possible. The SceneReconstructionProvider works very well with an OcclusionMaterial(), but with this approach I was losing the grounding shadow (GroundingShadowComponent).
So I looked into the Shadow Receiving Occlusion Surface (RealityKit). It doesn't work with the default grounding shadow, but it seems to work with a DirectionalLightComponent. However, I wasn't able to achieve a nice rendering: I keep getting a kind of white haze over my shader, which looks bad.
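For reference, this is how I normally get the grounding shadow (the part that disappears once occlusion is involved). A minimal sketch; `myModel` is a placeholder for whatever ModelEntity you load:

```swift
import RealityKit

// Sketch: give a loaded model a grounding shadow.
// `myModel` stands in for your own ModelEntity.
func addGroundingShadow(to myModel: ModelEntity) {
    // GroundingShadowComponent makes the model cast the soft
    // contact shadow that visionOS renders onto real surfaces.
    myModel.components.set(GroundingShadowComponent(castsShadow: true))
}
```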
1) The first difficulty is setting up a correct light, because `proxy: true` is not available on visionOS:
```swift
import RealityKit

let lightEntity = Entity()
let directionalLight = DirectionalLightComponent(
    color: .white,
    intensity: 1000
    // no proxy option on visionOS
)
let shadows = DirectionalLightComponent.Shadow(
    maximumDistance: 10,
    depthBias: 0.01
)
lightEntity.components.set(directionalLight)
lightEntity.components.set(shadows) // attach the shadow settings to the light entity
lightEntity.components.set(DynamicLightShadowComponent(castsShadow: true))
// Aim the light downward at 45 degrees.
lightEntity.transform.rotation = simd_quatf(angle: -.pi / 4, axis: [1, 0, 0])
content.add(lightEntity) // `content` is the RealityView content
```
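One thing I noticed while debugging: as far as I understand, DynamicLightShadowComponent controls whether a *model* entity casts a shadow from dynamic lights, so it belongs on the model rather than on the light entity. A minimal sketch, with `myModel` again as a placeholder:

```swift
import RealityKit

// Sketch: opt a model entity in to casting shadows from dynamic lights.
// `myModel` is a placeholder for your own ModelEntity.
func enableDynamicShadowCasting(for myModel: ModelEntity) {
    myModel.components.set(DynamicLightShadowComponent(castsShadow: true))
}
```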
I made a basic shader to test it:
**And the problem is that the Z-depth of the OcclusionMaterial() blocks the grounding shadow, even when plane detection is enabled in ARKit. So it's a bit frustrating: I can have my 3D models with a grounding shadow or with occlusion, but not both at the same time.**
```swift
import ARKit
import Combine

final class ARPipeline: ObservableObject {
    let objectWillChange = ObservableObjectPublisher()

    let session = ARKitSession()
    let worldTracking = WorldTrackingProvider()
    let scene = SceneReconstructionProvider(modes: [.classification]) // provides the MeshAnchors
    let planeDetection = PlaneDetectionProvider(alignments: [.horizontal])

    func start() async throws {
        if BuildEnv.isPreview || BuildEnv.isSimulator {
            return
        }
        try await session.run([planeDetection, worldTracking, scene])
        print("ARPipeline started")
    }

    func stop() {
        session.stop() // stopping is synchronous
    }
}
```
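And this is roughly how I turn the MeshAnchor updates into occlusion geometry, a sketch under a couple of assumptions: `root` is an Entity you've already added to your RealityView content, and the `MeshResource(from:)` initializer is the visionOS 2 one, if I remember the availability right:

```swift
import ARKit
import RealityKit

// Sketch: build occlusion entities from SceneReconstructionProvider updates.
// `scene` is the SceneReconstructionProvider from the pipeline above;
// `root` (an Entity attached to the RealityView content) is an assumption.
func processReconstruction(_ scene: SceneReconstructionProvider, root: Entity) async {
    var entities: [UUID: ModelEntity] = [:]
    for await update in scene.anchorUpdates {
        let anchor = update.anchor
        switch update.event {
        case .added, .updated:
            // Rebuild the mesh and hide virtual content behind it.
            guard let mesh = try? await MeshResource(from: anchor) else { continue }
            let entity = entities[anchor.id] ?? ModelEntity()
            entity.model = ModelComponent(mesh: mesh, materials: [OcclusionMaterial()])
            entity.transform = Transform(matrix: anchor.originFromAnchorTransform)
            if entities[anchor.id] == nil {
                entities[anchor.id] = entity
                root.addChild(entity)
            }
        case .removed:
            entities[anchor.id]?.removeFromParent()
            entities[anchor.id] = nil
        }
    }
}
```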