Hi Apple Developer Forums,
I'm developing a visionOS video streaming app that uses a custom immersive cinema experience with
RealityKit. I have a question about enabling video reflections in an immersive environment.
My Current Implementation
I'm using VideoMaterial with AVPlayer to display video on a ModelEntity plane in an immersive
space:
// Create screen mesh
let screenMesh = MeshResource.generatePlane(
    width: VideoTheater.screenWidth,
    height: VideoTheater.screenHeight,
    cornerRadius: 0.0
)
let screenEntity = ModelEntity(mesh: screenMesh)

// Apply VideoMaterial with AVPlayer
screenEntity.model?.materials = [VideoMaterial(avPlayer: player)]
The video renders correctly in the immersive space, but I don't see any video reflections on
surrounding surfaces.
Apple's Documentation Approach
According to the documentation at https://developer.apple.com/documentation/visionos/enabling-video-reflections-in-an-immersive-environment, the recommended approach uses:
AVPlayerViewController for video playback
dockingRegion modifier to specify where the video should appear
The system automatically handles reflections
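For context, here is a minimal sketch of how I understand the documented playback side: wrapping AVPlayerViewController for SwiftUI and presenting it while the immersive environment (authored in Reality Composer Pro with a docking region) is open. The view name is my own placeholder; only the AVKit/UIKit calls are from the actual APIs.

```swift
import SwiftUI
import AVKit

// Sketch: wrap AVPlayerViewController so it can be presented from SwiftUI.
// Per the docs, the system then docks the video in the scene's docking
// region and renders reflections automatically.
struct DockedPlayerView: UIViewControllerRepresentable {
    let player: AVPlayer

    func makeUIViewController(context: Context) -> AVPlayerViewController {
        let controller = AVPlayerViewController()
        controller.player = player
        return controller
    }

    func updateUIViewController(_ controller: AVPlayerViewController, context: Context) {
        // No dynamic updates needed for this sketch.
    }
}
```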
My Question
Is using AVPlayerViewController with dockingRegion the only way to get video reflections in an
immersive environment? Or is it possible to enable reflections when using VideoMaterial directly
with RealityKit's ModelEntity?
My app requires a custom immersive cinema experience with:
Custom screen positioning and scaling
Danmaku (bullet comments) overlay
Custom gesture controls
HDR/Dolby Vision support
Switching to AVPlayerViewController would require significant architectural changes, so I'd prefer to keep my current VideoMaterial approach if reflections can be enabled for it somehow.
If VideoMaterial cannot produce reflections, are there any alternative approaches to achieve
diffuse video reflections with a custom RealityKit setup?
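One alternative I have considered is RealityKit's VideoPlayerComponent instead of VideoMaterial, since it keeps the entity-based setup I need, though I have not confirmed whether it participates in system reflections at all. A minimal sketch of that setup (the helper function is my own):

```swift
import RealityKit
import AVFoundation

// Sketch: attach video to an entity via VideoPlayerComponent rather than
// VideoMaterial. Whether this path gets system reflections is exactly
// what I'm unsure about.
func makeVideoEntity(player: AVPlayer) -> Entity {
    let entity = Entity()
    var component = VideoPlayerComponent(avPlayer: player)
    component.isPassthroughTintingEnabled = true // dims passthrough during playback
    entity.components.set(component)
    return entity
}
```

Would something along these lines be a viable route to reflections, or is docking the only supported mechanism?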
Environment
visionOS 2.x
RealityKit
AVPlayer with custom resource loader (for DASH streams)
Thank you for any guidance!