Post

Replies

Boosts

Views

Activity

Comment on ManipulationComponent Not Translating using indirect input
@Vision Pro Engineer The only solution is to restart the headset. I've heard from dozens of people in my community that they are seeing the exact same behavior. It isn't always triggered by room changes, but it is the same issue: they can't move or translate an entity using ManipulationComponent. I don't think room changes are the only trigger, but they are the most reliable way for me to reproduce this.
Topic: Spatial Computing SubTopic: General Tags:
Oct ’25
Comment on ManipulationComponent Not Translating using indirect input
@Vision Pro Engineer This has happened on every version of visionOS 26 released, including the shipping version, and it is still happening. As I showed in the video: I put the headset on in room A and use manipulation in any app (not just mine). Then I go to room B and use manipulation in any app (not just mine). When I return to room A, manipulation breaks. I am unable to translate (move) the entity. Rotation and scaling still work, but the entity is stuck in place.
Topic: Spatial Computing SubTopic: General Tags:
Oct ’25
Comment on GestureComponent does not support DragGesture
Yes, I know how to combine gestures like that, but that isn't really the same thing as multiple gestures. For example: tap to toggle a value (show a popover, presentation, etc.) and drag to translate an entity. We can easily do that by attaching gestures to a RealityView, but it looks like the new GestureComponent only takes a single gesture. I'll file a feedback requesting support for passing an array of gestures.
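Roughly what I mean, as a sketch (view and entity setup are placeholders, not code from my app):

```swift
import SwiftUI
import RealityKit

// Two independent gestures on one RealityView: tap toggles a flag,
// drag translates whichever entity was targeted.
struct TwoGestureView: View {
    @State private var showPopover = false

    var body: some View {
        RealityView { content in
            // ... add entities with InputTargetComponent + collision shapes ...
        }
        .gesture(
            TapGesture()
                .targetedToAnyEntity()
                .onEnded { _ in showPopover.toggle() }
        )
        .gesture(
            DragGesture()
                .targetedToAnyEntity()
                .onChanged { value in
                    // Convert the drag location into the entity's parent space.
                    if let parent = value.entity.parent {
                        value.entity.position = value.convert(
                            value.location3D, from: .local, to: parent)
                    }
                }
        )
    }
}
```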
Topic: Graphics & Games SubTopic: RealityKit Tags:
Jul ’25
Comment on When placing a TextField within a RealityViewAttachment, the virtual keyboard does not appear in front of the user as expected.
"In the meantime, as a possible workaround, can you use ViewAttachmentComponent instead? The new Canyon Crosser sample has several examples of using this new component." I have already switched all my attachments to ViewAttachmentComponent and can confirm the behavior is the same as with regular RealityView attachments. I'll file a feedback when I have a chance.
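For anyone following along, the switch looks roughly like this (a sketch assuming visionOS 26's ViewAttachmentComponent; TextFieldPanel is a stand-in for whatever view holds the TextField):

```swift
import SwiftUI
import RealityKit

// Stand-in for the SwiftUI view that contains the TextField.
struct TextFieldPanel: View {
    @State private var text = ""
    var body: some View {
        TextField("Name", text: $text)
            .textFieldStyle(.roundedBorder)
            .frame(width: 300)
    }
}

struct AttachmentDemo: View {
    var body: some View {
        RealityView { content in
            // Instead of a RealityView attachments closure, attach the view
            // directly to an entity via ViewAttachmentComponent.
            let holder = Entity()
            holder.components.set(
                ViewAttachmentComponent(rootView: TextFieldPanel())
            )
            content.add(holder)
        }
    }
}
```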
Topic: Spatial Computing SubTopic: General Tags:
Jul ’25
Comment on How to get the floor plane with Spatial Tracking Session and Anchor Entity
"it's usually better to put the code that configures the SpatialTrackingSession into a class instead of an @State property on your view" Sure, I would do that in most apps; this is just an example where I was trying to keep everything in one file. Do you have any details on why it is better to place the SpatialTrackingSession in an observable class instead of state on a view? Several of the WWDC sessions and examples store the session in the view, and I was using them as a starting point.
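To make sure I understand the suggestion, is this the shape you mean? A sketch (the model name and tracked capabilities are just placeholders):

```swift
import RealityKit
import Observation

// The session lives in an observable model object
// rather than in @State on the view.
@Observable
@MainActor
final class TrackingModel {
    var session: SpatialTrackingSession?

    func start() async {
        let session = SpatialTrackingSession()
        let configuration = SpatialTrackingSession.Configuration(
            tracking: [.plane])
        // run(_:) reports any tracking capabilities that were not approved.
        _ = await session.run(configuration)
        self.session = session
    }
}
```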
Topic: Spatial Computing SubTopic: ARKit Tags:
Jan ’25
Comment on Do I need a privacy manifest when using UserDefaults and CloudKit in my app?
Very interesting. I didn't realize that UserDefaults contained anything other than the data I write to it; I always thought it was essentially an empty plist where I could add key-value pairs. I'll add the manifest for now. Maybe it's time to reconsider using UserDefaults at all. I suppose I could get the same behavior by creating a local JSON file and saving the settings there.
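Something like this is what I have in mind for the JSON-file route (a sketch; the settings fields and file location are arbitrary):

```swift
import Foundation

// Codable settings written to Application Support instead of UserDefaults.
struct Settings: Codable {
    var showHints = true
    var volume = 0.8
}

enum SettingsStore {
    static var fileURL: URL {
        URL.applicationSupportDirectory.appending(path: "settings.json")
    }

    static func save(_ settings: Settings) throws {
        let data = try JSONEncoder().encode(settings)
        try data.write(to: fileURL, options: .atomic)
    }

    static func load() -> Settings {
        guard let data = try? Data(contentsOf: fileURL),
              let settings = try? JSONDecoder().decode(Settings.self, from: data)
        else { return Settings() }  // fall back to defaults on first launch
        return settings
    }
}
```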
Sep ’24
Comment on Manipulation stops working when changing rooms
I did hear from Apple that they were finally able to reproduce this in early December, but I haven't heard anything since then. It is still happening on visionOS 26.2. I'm not sure about the beta versions; I'll check 26.3 when it releases to everyone.
Topic: Spatial Computing SubTopic: General Tags:
Jan ’26
Comment on ManipulationComponent Not Translating using indirect input
@jkdufair oh wow! I have photo widgets in both rooms now that you mention it. I will try removing them and see if I can still trigger this issue.
Topic: Spatial Computing SubTopic: General Tags:
Oct ’25
Comment on What's in Reality Composer Pro 26
Thanks for the update. It is good to know this tool has a future.
Aug ’25
Comment on Entities moved with Manipulation Component in visionOS Beta 4 are clipped by volume bounds
Thanks, that's good to know. Hopefully this can be resolved soon.
Topic: Spatial Computing SubTopic: General Tags:
Jul ’25
Comment on ManipulationComponent Not Translating using indirect input
I ran into this several times during Beta 1, but not yet on Beta 2. When it happened to me, it impacted everything using manipulation across visionOS, not just the app I'm working on. It also happened in Safari and Quick Look. Restarting was the only solution I found. If it happens again on Beta 2, I'll file a feedback. You might consider doing the same.
Topic: Spatial Computing SubTopic: General Tags:
Jun ’25
Comment on How to get the floor plane with Spatial Tracking Session and Anchor Entity
This has absolutely nothing to do with my question.
Topic: Spatial Computing SubTopic: ARKit Tags:
Jan ’25
Comment on How to get the floor plane with Spatial Tracking Session and Anchor Entity
@Vision Pro Engineer thanks for the response. If I understand correctly, this doesn't work because the AnchorEntity is a point on a plane, not the plane itself. Is that correct? Is it possible to use an AnchorEntity (with a SpatialTrackingSession) to get the plane/bounds/rect of the floor that visionOS detected?
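For context, the kind of plane data I'm after is what ARKit's PlaneDetectionProvider exposes. A rough sketch of that route, in case it's the intended answer:

```swift
import ARKit

// PlaneDetectionProvider delivers PlaneAnchors with geometry (extents),
// which an AnchorEntity alone does not expose.
@MainActor
func watchForFloor() async {
    let session = ARKitSession()
    let planes = PlaneDetectionProvider(alignments: [.horizontal])
    do {
        try await session.run([planes])
        for await update in planes.anchorUpdates {
            let anchor = update.anchor
            guard anchor.classification == .floor else { continue }
            // extent gives the detected plane's size in meters.
            print("Floor plane:",
                  anchor.geometry.extent.width, "x",
                  anchor.geometry.extent.height)
        }
    } catch {
        print("Plane detection failed:", error)
    }
}
```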
Topic: Spatial Computing SubTopic: ARKit Tags:
Jan ’25
Comment on PushWindowAction requires the replaced window to be a WindowGroup or DocumentGroup
"pushWindow isn't supported for volumes." This definitely worked during the visionOS 2 betas, but now it's not supported? Nothing in the pushWindow documentation indicates that it is unsupported for volumes. Apple should update that documentation to make this clear.
Topic: Spatial Computing SubTopic: General Tags:
Oct ’24