If we wanted to simulate the interaction between the Vision Pro and an iPhone companion app for something like location services, is there a plan for something like that? How do we test the accuracy of something like this without having a Vision Pro or the companion app?
I am trying to await the openImmersiveSpace action without attaching it to a Button or some other view's action closure. I currently have a switch statement on a @State variable that automatically swaps the view depending on that state. I want to open an immersive space when that switch statement hits a certain case, but I can't do so in that context. Are there any solutions for this?
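One common pattern for this is to keep the switch statement purely for rendering and trigger the side effect from an `.onChange(of:)` modifier instead, wrapping the async call in a `Task`. Below is a minimal sketch; `AppPhase`, `RootView`, and the space ID `"MyImmersiveSpace"` are all hypothetical names, and the ID is assumed to match an `ImmersiveSpace` declared in the app's scene body.

```swift
import SwiftUI

// Hypothetical phases driving the root view's switch statement.
enum AppPhase {
    case menu
    case immersive
}

struct RootView: View {
    @State private var phase: AppPhase = .menu
    @Environment(\.openImmersiveSpace) private var openImmersiveSpace

    var body: some View {
        Group {
            // The switch stays declarative: it only decides what to render.
            switch phase {
            case .menu:
                Text("Menu")
            case .immersive:
                Text("Entering immersive space…")
            }
        }
        // .onChange runs in a synchronous context, so wrap the
        // async environment action in a Task.
        .onChange(of: phase) { _, newPhase in
            if newPhase == .immersive {
                Task {
                    // "MyImmersiveSpace" is a placeholder ID assumed to
                    // match an ImmersiveSpace in the App's scene body.
                    await openImmersiveSpace(id: "MyImmersiveSpace")
                }
            }
        }
    }
}
```

A `.task(id: phase)` modifier is an alternative anchor for the same effect: it already provides an async context and is cancelled and restarted whenever the id value changes.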