RealityKit fullscreen layer

Hi!

I'm currently trying to render another XR scene in front of a RealityKit one. Concretely, I'm anchoring a plane to the head, with a shader that displays the left/right halves of a side-by-side image to the corresponding eye. By default, the camera has a near plane, so I can't draw directly at z=0.
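Roughly, the setup looks like this (a minimal sketch; `stereoMaterial` stands in for a shader-graph material that samples the left or right half of the side-by-side texture per eye):

```swift
import RealityKit

// Sketch of the current setup: a plane anchored to the head.
// `stereoMaterial` is a placeholder for a material that picks the
// left or right half of a side-by-side texture depending on the eye.
func makeHeadLockedPlane(stereoMaterial: Material) -> AnchorEntity {
    let headAnchor = AnchorEntity(.head)
    let plane = ModelEntity(
        mesh: .generatePlane(width: 1.0, height: 1.0),
        materials: [stereoMaterial]
    )
    // Pushed forward so it isn't clipped by the camera's near plane;
    // ideally it would sit at z = 0 to cover the full view.
    plane.position = [0, 0, -0.5]
    headAnchor.addChild(plane)
    return headAnchor
}
```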

Is there a way to change the camera near plane? Or maybe there is a better solution to overlay image/texture for left/right eyes?

Ideally, I would layer some kind of CompositorLayer on top of RealityKit, but sadly that's not possible as far as I know.

Thanks in advance and have a good day!

Hello @ldavid , thank you for your question!

I strongly recommend against anchoring an object, such as a plane displaying a stereo image, to the wearer's head, as this goes against Apple's Human Interface Guidelines.

Unfortunately there is no API for setting the near plane in a RealityKit app. I encourage you to submit a request for this feature with more details about your use case via Feedback Assistant.

However, I think it is likely possible to create the experience you want by presenting your content in an immersive space with a full immersion style. This would allow you to place one scene or entity in front of another spatially. Alternatively, using Compositor Services would give you greater control over how your app is rendered, and should allow you to create your desired effect.
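For example, here is a minimal sketch of declaring a fully immersive space (the space ID and type names are placeholders):

```swift
import SwiftUI
import RealityKit

@main
struct MyImmersiveApp: App {
    // Restricting the style to .full makes the space fully immersive.
    @State private var immersionStyle: ImmersionStyle = .full

    var body: some Scene {
        ImmersiveSpace(id: "Immersive") {
            RealityView { content in
                // Place one scene or entity in front of another here.
            }
        }
        .immersionStyle(selection: $immersionStyle, in: .full)
    }
}
```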

See our article on Immersive experiences for more guidance on how to design your app.

One more thing that might be applicable is ModelSortGroupComponent. This RealityKit component lets you control draw order explicitly, so a model can render in front of another model even when that other model is spatially closer to the wearer.
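A minimal sketch of how you might use it (the entity names are placeholders):

```swift
import RealityKit

// Sketch: force `overlayEntity` to draw after (on top of) `backgroundEntity`,
// regardless of their spatial depth. Lower order values render first.
func applySortOrder(overlayEntity: Entity, backgroundEntity: Entity) {
    let sortGroup = ModelSortGroup(depthPass: nil)
    backgroundEntity.components.set(
        ModelSortGroupComponent(group: sortGroup, order: 0)
    )
    overlayEntity.components.set(
        ModelSortGroupComponent(group: sortGroup, order: 1)
    )
}
```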

It might help if you could share more about the kind of experience you are trying to create. I'd be happy to suggest more specific implementation details or answer any further questions you may have. Thank you!

Thanks a lot for your response.

Sorry, I didn't describe my goal clearly enough.

I work in an HCI research lab, so I will probably try to do things that are unconventional.

My full idea is to stream another XR scene on top of the current RealityKit scene. To do that, I'm trying to:

  • get a real-time stream of the other scene, rendered side-by-side, and apply it to a head-anchored plane with a material that renders the correct half for each eye
  • send inputs (head pose, hands, etc.) to the other scene to keep the rendering in sync (see the sketch after this list)
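As a rough sketch of the input-streaming part, assuming the visionOS ARKit APIs and a hypothetical `sendToRemoteScene(_:)` network call:

```swift
import ARKit
import QuartzCore

// Rough sketch: poll the device (head) pose and forward it to the remote
// renderer. `sendToRemoteScene(_:)` is a hypothetical network call.
func streamHeadPose() async throws {
    let session = ARKitSession()
    let worldTracking = WorldTrackingProvider()
    try await session.run([worldTracking])

    while !Task.isCancelled {
        // Only returns a pose while an immersive space is open.
        if let device = worldTracking.queryDeviceAnchor(atTimestamp: CACurrentMediaTime()) {
            sendToRemoteScene(device.originFromAnchorTransform)
        }
        try await Task.sleep(nanoseconds: 11_000_000) // ~90 Hz
    }
}

func sendToRemoteScene(_ headPose: simd_float4x4) {
    // Placeholder: serialize and send over the network.
}
```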

This way, even though the texture is anchored to the head, it is updated based on the head pose like a normal XR scene, and it just acts as a "scene layer".

Ideally, I would also include depth in the streamed texture so I could render at the correct depth, turning the scene layer into a kind of scene fusion. In a way, I'm trying to reproduce, at a higher level, a compositor between multiple immersive scenes.

I currently think I can only do this with Compositor Services, since I don't have access to the camera's near plane.
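Something like this minimal Compositor Services skeleton is what I have in mind (`ContentStageConfiguration` is just a placeholder name; the Metal render loop is omitted):

```swift
import SwiftUI
import CompositorServices

// Sketch of a Compositor Services entry point; with a custom Metal renderer
// you control the projection matrices (and thus the near plane) yourself.
struct CompositorApp: App {
    var body: some Scene {
        ImmersiveSpace(id: "Renderer") {
            CompositorLayer(configuration: ContentStageConfiguration()) { layerRenderer in
                // Hand the LayerRenderer to a custom Metal render loop here.
            }
        }
    }
}

struct ContentStageConfiguration: CompositorLayerConfiguration {
    func makeConfiguration(
        capabilities: LayerRenderer.Capabilities,
        configuration: inout LayerRenderer.Configuration
    ) {
        configuration.depthFormat = .depth32Float
        configuration.colorFormat = .bgra8Unorm_srgb
    }
}
```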

One other thing I can probably do inside RealityKit is to render the remote scene like a volume (by always orienting the plane toward the viewer and updating the rendering based on the head pose; more like a portal).
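For that portal variant, a rough billboarding sketch (assuming the head position is obtained elsewhere, e.g. from a DeviceAnchor query as above):

```swift
import RealityKit

// Rough sketch: orient the streaming plane toward the viewer each frame.
// `headPosition` would come from a DeviceAnchor query, as in the sketch above.
func billboard(_ plane: Entity, toward headPosition: SIMD3<Float>) {
    // look(at:) points the entity's forward (-Z) axis at the target; depending
    // on how the plane mesh is generated, it may need an extra 180° flip.
    plane.look(
        at: headPosition,
        from: plane.position(relativeTo: nil),
        relativeTo: nil
    )
}
```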

Thanks for pointing me to the ModelSortGroupComponent, always useful!

Thanks again for your time.
