Post

Replies

Boosts

Views

Activity

Reply to Can't center entity on AnchorEntity(.plane)
I felt like this should be obvious because it's such an important use case for dropping RealityKit scenes into the world without user interaction, but I tried a few things with translation that failed. For some reason, what worked for me was calling box.setPosition(_:relativeTo:) with nil. I'm not sure why it works, given that the documentation says nil means "world space", when it appears to behave as if nil means "parent space" in this case?

class Model: ObservableObject {
    var wall: AnchorEntity?
    var child: ModelEntity?
}

struct ImmersiveView: View {
    @StateObject var model = Model()

    var body: some View {
        RealityView { content in
            let wall = AnchorEntity(
                .plane(.vertical, classification: .wall, minimumBounds: [2.0, 1.5]),
                trackingMode: .continuous
            )
            model.wall = wall

            let mesh = MeshResource.generateBox(size: 0.3)
            let box = ModelEntity(mesh: mesh, materials: [SimpleMaterial(color: .green, isMetallic: false)])
            model.child = box

            wall.addChild(box, preservingWorldTransform: false)
            content.add(wall)
            box.setPosition([0, 0, 0], relativeTo: wall)
        } update: { content in
            if let box = model.child, let wall = model.wall {
                // box.setPosition([0, 0, 0], relativeTo: wall) // <---- DOES NOT WORK
                box.setPosition([0, 0, 0], relativeTo: nil) // <---- DOES WORK even though nil means "world space"????
            }
        }
    }
}
Topic: Graphics & Games SubTopic: RealityKit Tags:
Sep ’23
Reply to Dragging coordinates issue in VisionOS
You're probably going through a moment of "What in the world? That wasn't mentioned anywhere!" And yeah, a lot of the demonstrations use an AnchorEntity of type .plane to insert a Reality Composer scene into the RealityView at a spot in the world that meets the size criteria, or just call content.add(...) when the RealityView loads for an immersive scene. It's important to note these are not "world tracked" entities, and they will not give you accurate location3D values for interactions. We can use rotate and magnify gestures in these situations, because those gestures change relative to their initial value. Tap gestures can even be used as a sort of tap boolean, but the location of the tap is not reliable. You're also probably asking, "How can I make anything interactive enough to feel immersive this way?" And yeah, I don't have a clue. Maybe if we pay $4000 or get lucky with a developer kit we can figure it out. We can't ask anyone with a developer kit, because they're banned from telling us.
Topic: Graphics & Games SubTopic: RealityKit Tags:
Sep ’23
Reply to Dragging coordinates issue in VisionOS
Please file a feedback on this to increase pressure to get it added. I've done so on an adjacent issue related to non-"world targeting" entities. The lack of the PlaneDetectionProvider and SceneReconstructionProvider support in the simulator is felt more and more as we run into these issues.
Topic: Graphics & Games SubTopic: RealityKit Tags:
Sep ’23
Reply to Is it possible to place content on the plane detected in the visionOS simulator?
I am not seeing any changes in visionOS beta 3 with respect to placing content where a user taps on a plane.

- Plane detectors are still not available
- WorldTrackingProvider still doesn't return after being asked to track an anchor
- The scene reconstruction provider is still missing
- Taps on a scene AnchorEntity of type .plane still report inaccurate positions
- Taps on a plane placed at a world AnchorEntity still report inaccurate positions

Feedbacks filed with no response: FB13034747, FB13034803, FB12952565, FB12639395

Maybe beta 4?
Topic: App & System Services SubTopic: Core OS Tags:
Aug ’23
Reply to Is it possible to place content on the plane detected in the visionOS simulator?
Here are some related topics and posts that I've made trying to get this to work as well:

https://developer.apple.com/forums/thread/735900
https://developer.apple.com/forums/thread/735558
https://developer.apple.com/forums/thread/735537
https://developer.apple.com/forums/thread/735305
Topic: App & System Services SubTopic: Core OS Tags:
Aug ’23
Reply to Is it possible to place content on the plane detected in the visionOS simulator?
It is not currently possible in the simulator. Fingers crossed for beta 3.
Topic: App & System Services SubTopic: Core OS Tags:
Aug ’23
Reply to How to reproduce MagnifyGesture on visionOS simulator
Convert the 2D magnify gesture to a 3D one by modifying it with a "target entity" modifier. There are a few. The following option enables the gesture on all entities in the scene:

MagnifyGesture().targetedToAnyEntity()
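As a rough sketch of how the targeted gesture might be wired up to an entity's scale (the view name, entity setup, and state property here are mine, not from the original post, and the entity needs collision and input-target components to receive gestures at all):

```swift
import SwiftUI
import RealityKit

struct MagnifyExample: View {
    // Remember the entity's scale at gesture start (hypothetical state name).
    @State private var initialScale: SIMD3<Float>? = nil

    var body: some View {
        RealityView { content in
            let box = ModelEntity(mesh: .generateBox(size: 0.2))
            // Gestures only land on entities with both of these components.
            box.components.set(CollisionComponent(shapes: [.generateBox(size: [0.2, 0.2, 0.2])]))
            box.components.set(InputTargetComponent())
            content.add(box)
        }
        .gesture(
            MagnifyGesture()
                .targetedToAnyEntity() // makes the 2D gesture target 3D entities
                .onChanged { value in
                    if initialScale == nil {
                        initialScale = value.entity.scale
                    }
                    // Scale relative to the value when the gesture began.
                    value.entity.scale = (initialScale ?? .one) * Float(value.magnification)
                }
                .onEnded { _ in initialScale = nil }
        )
    }
}
```

Resetting the cached scale in onEnded is what makes repeated pinches compose instead of jumping back to the original size.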
Topic: UI Frameworks SubTopic: SwiftUI Tags:
Aug ’23
Reply to RealityView is not responding to tap gesture
Try adding a CollisionComponent and an InputTargetComponent to your entity:

let collisionComponent = CollisionComponent(
    shapes: [ShapeResource.generateBox(width: 2.0, height: 2.0, depth: 0.02)]
)
interactionEntity.components.set(collisionComponent)
interactionEntity.components.set(InputTargetComponent())
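With both components in place, a spatial tap gesture can actually hit the entity. A minimal sketch (the view name and plane entity here are illustrative, not from the original thread):

```swift
import SwiftUI
import RealityKit

struct TapExample: View {
    var body: some View {
        RealityView { content in
            let panel = ModelEntity(mesh: .generatePlane(width: 2.0, height: 2.0))
            // Without these two components the tap gesture never fires.
            panel.components.set(CollisionComponent(
                shapes: [ShapeResource.generateBox(width: 2.0, height: 2.0, depth: 0.02)]
            ))
            panel.components.set(InputTargetComponent())
            content.add(panel)
        }
        .gesture(
            SpatialTapGesture()
                .targetedToAnyEntity()
                .onEnded { value in
                    print("Tapped \(value.entity.name)")
                }
        )
    }
}
```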
Topic: Graphics & Games SubTopic: RealityKit Tags:
Aug ’23
Reply to RealityView attachments do not show up in Vision Pro simulator
Attachments definitely work. You "nominate" an attachment SwiftUI view like so:

attachments: {
    Text("hello")
        .glassBackgroundEffect()
        .tag("panel") // <---------- NOTE THE TAG
}

This closure can return different results as the state of your scene changes, so if you want an attachment to disappear, just stop returning it from here. After an attachment is nominated, it needs to be added to the scene in the update method of RealityView. First see if RealityKit has synthesized an entity for the attachment you provided:

update: { content, attachments in
    let panelEntity = attachments.entity(for: "panel") // <------- NOTE THAT IT MATCHES THE NOMINATED TAG NAME
    // [...]
}

Once you have that entity you can transform it, add it onto another entity, or add it straight into the content view itself:

content.add(panelEntity)
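Pulling those pieces together, here is a minimal end-to-end sketch. Note that the .tag nomination above reflects the beta-era API; in released visionOS the nomination uses Attachment(id:) instead, which is what this sketch assumes. The "panel" identifier matches the post; the position values are illustrative:

```swift
import SwiftUI
import RealityKit

struct AttachmentExample: View {
    var body: some View {
        RealityView { content, attachments in
            // Place the attachment entity once RealityKit has synthesized it.
            if let panel = attachments.entity(for: "panel") {
                panel.position = [0, 1.2, -1] // a meter out, roughly eye height
                content.add(panel)
            }
        } update: { content, attachments in
            // Re-add here if the attachment appears after the initial make pass.
            if let panel = attachments.entity(for: "panel"), panel.parent == nil {
                content.add(panel)
            }
        } attachments: {
            Attachment(id: "panel") {
                Text("hello")
                    .glassBackgroundEffect()
            }
        }
    }
}
```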
Topic: Graphics & Games SubTopic: RealityKit Tags:
Aug ’23
Reply to Dev Kits
Looks like an AppleInsider writer got access to someone's dev kit, so I guess they went out.
Topic: App & System Services SubTopic: Core OS Tags:
Aug ’23
Reply to VisionOS SDK for custom integrations
As far as I know, there is a Unity closed beta at the moment.
Topic: App & System Services SubTopic: Core OS Tags:
Aug ’23
Reply to visionOS on Xamarin Native app
My guess is that applications built with Flutter and Xamarin will be accepted provided they function well. As usual, I don't expect any communication from Apple until they reject apps during app review.
Topic: App & System Services SubTopic: Core OS Tags:
Aug ’23
Reply to Dev Kits
No response here as well.
Topic: App & System Services SubTopic: Core OS Tags:
Aug ’23
Reply to Why doesn't the Apple Vision Pro simulator appear as a run destination?
Have you added Apple Vision as a run destination for your app's product? This is done automatically for new projects, but you'll need to add "Apple Vision - Designed for iPad" if you want to run your app in compatibility mode on a preexisting codebase.
Aug ’23
Reply to Where can I find software developers for Vision Pros software?
So, realistically, this platform is in its infancy. The things best supported at the moment are 2D windows in 3D space, and even then window management is absent. Apple's platform is not billed as VR or AR; it is spatial computing. As such, it lacks many of the VR or AR features that other MR platforms have, because it makes opinionated design decisions with spatial computing as the focus. You could explore this from a Unity game-engine angle, or seek out game developers or those familiar with Meta's VR platforms. However, starting with development is likely the wrong move. While developers could build a prototype or tell you what is and is not supported, the appropriate starting point is strategy/UX and product design. Drown in the human interface guidelines of the various VR platforms, cultivate a product strategy, and consult with engineers on technical feasibility. Once you've done that, you can start prototyping and building the thing.
Topic: Graphics & Games SubTopic: General Tags:
Aug ’23