Hi Michael @Vision Pro Engineer
First of all, thank you for your quick answer. I appreciate your effort!
I think it goes in the right direction. To give you a better idea of what I aim to achieve: the goal is basically to detect a state of "distraction," which is the case whenever the user does not have the app window in front of them, i.e. within the AVP's current orientation/field of view.
Unfortunately, as far as I understand, ScenePhase only tells me about the scene's lifecycle: if the user closes the app window it goes to the background, if they reopen the app it becomes active, and so on. What I was hoping for is something I can query that tells me whether the app window is currently within the user's visible space (e.g., I just launched the app, the window is in front of me, and my AVP is also pointed towards it) or not (e.g., the window is still at its initial spawn position, but I've turned around, so it is now behind my back and no longer visible to me).
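For reference, this is roughly all I'm able to observe today via ScenePhase (just a minimal sketch of the standard lifecycle callbacks, nothing visibility-related):

```swift
import SwiftUI

struct ContentView: View {
    @Environment(\.scenePhase) private var scenePhase

    var body: some View {
        Text("Session running")
            .onChange(of: scenePhase) { _, newPhase in
                // Reports only lifecycle phases (.active / .inactive / .background);
                // it says nothing about whether the window is inside my field of view.
                print("Scene phase is now \(newPhase)")
            }
    }
}
```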
Furthermore, now that I'm able to track the spatial position of the window as you suggested, would there be a way to do either of the following? Option one: track whether the window lies within the AVP's current orientation, i.e. as long as the window (or even just a corner of it) is in my field of view. Option two: attach the AVP's originFromAnchorTransform to the window, so that when the app launches the AVP's rotation angles are effectively zeroed with the window as the center of orientation; the AVP's orientation would then be expressed relative to the window, meaning that if I moved the window and then looked perfectly centered at its new position, I would be back at the zero values.
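To make option one concrete, here is a rough sketch of how I imagine it could work, assuming I keep an immersive space open so that WorldTrackingProvider actually delivers device anchors, and that `windowPositionInWorld` is the world-space window position I'm already tracking from your suggestion (the parameter name, the 50° threshold, and the helper class are placeholders I made up):

```swift
import ARKit
import Foundation
import QuartzCore
import simd

/// Rough idea: compare the AVP's forward direction with the direction from the
/// device to the window, and treat anything outside a chosen angle as "distracted".
final class DistractionChecker {
    private let session = ARKitSession()
    private let worldTracking = WorldTrackingProvider()

    func start() async throws {
        try await session.run([worldTracking])
    }

    /// Returns true if the direction to the window is within `maxAngleDegrees`
    /// of the device's forward axis. 50° is just a guess at roughly half the
    /// usable field of view and would need tuning.
    func isWindowInView(windowPositionInWorld: SIMD3<Float>,
                        maxAngleDegrees: Float = 50) -> Bool {
        guard let device = worldTracking.queryDeviceAnchor(atTimestamp: CACurrentMediaTime()) else {
            return false
        }
        let m = device.originFromAnchorTransform
        let devicePosition = SIMD3<Float>(m.columns.3.x, m.columns.3.y, m.columns.3.z)
        // Assuming ARKit's usual convention that the device looks down its local -Z axis.
        let forward = -SIMD3<Float>(m.columns.2.x, m.columns.2.y, m.columns.2.z)

        let toWindow = simd_normalize(windowPositionInWorld - devicePosition)
        let cosAngle = max(-1, min(1, simd_dot(forward, toWindow)))
        // Inside the cone if the angle between forward and the window direction is small enough.
        return cosAngle > cos(maxAngleDegrees * .pi / 180)
    }
}
```

For option two, I imagine I could instead compute the device pose relative to the window, something like `windowTransform.inverse * deviceAnchor.originFromAnchorTransform`, and read the rotation angles from that, but I haven't tried it yet. Does one of these directions sound reasonable, or is there a more direct API for this?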