Crash when Displaying RealityView on Multiple Screens, Only When Connected to Xcode

I have an iOS app that uses RealityView to display some models and interact with them. The app uses regular iOS navigation, and a challenge I'm facing is how to maintain multiple RealityViews across multiple screens.

For example, Screen A has a RealityView, and I navigate to Screen B (which also has a RealityView) using stack-based navigation. When I do so, I get this crash:
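A minimal sketch of the setup I mean (type and screen names here are illustrative, not my actual code): two screens in a NavigationStack, each hosting its own RealityView, so both views are alive on the stack at once when Screen B appears.

```swift
import SwiftUI
import RealityKit

// Screen A owns one RealityView and pushes Screen B onto the stack.
struct ScreenA: View {
    var body: some View {
        VStack {
            RealityView { content in
                // Placeholder model; the real app loads its own entities.
                content.add(ModelEntity(mesh: .generateBox(size: 0.1)))
            }
            NavigationLink("Go to Screen B") { ScreenB() }
        }
    }
}

// Screen B owns a second RealityView. Screen A's RealityView is still
// alive underneath it on the navigation stack when this one renders.
struct ScreenB: View {
    var body: some View {
        RealityView { content in
            content.add(ModelEntity(mesh: .generateSphere(radius: 0.05)))
        }
    }
}

struct ContentView: View {
    var body: some View {
        NavigationStack { ScreenA() }
    }
}
```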

-[MTLDebugRenderCommandEncoder validateCommonDrawErrors:]:5970: failed assertion `Draw Errors Validation
Fragment Function(fsRealityPbr): argument envProbeTable[0] from Buffer(7) with offset(0) and length(16) has space for 16 bytes, but argument has a length(864).
Fragment Function(fsRealityPbr): incorrect type of texture (MTLTextureType2D) bound at Texture binding at index 20 (expect MTLTextureTypeCubeArray) for envProbeDiffuseArray[0].

Interestingly, this crash only happens when debugging with Xcode; it does not happen when the app runs on its own.

I'm not sure whether what I'm doing is an anti-pattern, or whether this is a limitation of Xcode's debugging (the assertion comes from MTLDebugRenderCommandEncoder, which suggests Metal's validation layer).
