Reply to Document Based SwiftData support in Immersive Scene?
I notice that once I open a second ModelContainer on the same URL as the document that's open in the DocumentGroup, all saves to the DocumentGroup's model container fail with:

```
Error saving the database Error Domain=NSCocoaErrorDomain Code=134020 "The model configuration used to open the store is incompatible with the one that was used to create the store." UserInfo={NSAffectedObjectsErrorKey=<NSManagedObject: 0x6000021b97c0> (entity: Blah; id: 0x60000026e0c0 <x-coredata:///Blah/tAC19CF5F-052B-4CF6-B7CD-EDA188FC54BE13>; data: { id = "91E56F61-CFE0-42E4-9EA9-EAD4256B64AB"; imageData = nil; name = "Untitled Blah"; })}
```

I will assume that this is just not a good idea right now, and conclude that document-based SwiftData apps do not work well with multiple scenes.
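For reference, the setup that triggers the error looks roughly like this. This is a sketch, not my exact code; the `Blah` model is reconstructed from the error message, and `documentURL` stands in for the URL of the document the DocumentGroup already has open.

```swift
import SwiftData
import Foundation

// Hypothetical model matching the entity named in the error message.
@Model
final class Blah {
    var id: UUID
    var name: String
    var imageData: Data?

    init(id: UUID = UUID(), name: String = "Untitled Blah", imageData: Data? = nil) {
        self.id = id
        self.name = name
        self.imageData = imageData
    }
}

// The DocumentGroup already owns a ModelContainer for this file.
// Opening a *second* container on the same URL is what makes the
// DocumentGroup's subsequent saves fail with NSCocoaErrorDomain 134020.
func openSecondContainer(documentURL: URL) throws -> ModelContainer {
    let config = ModelConfiguration(url: documentURL)
    return try ModelContainer(for: Blah.self, configurations: config)
}
```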
Topic: UI Frameworks SubTopic: SwiftUI Tags:
Nov ’23
Reply to ARKit, visionOS: Creating my own data provider
Most of the data providers give you some mesh information that you then need to place into an entity as a collision component. If you’re trying to get planes or world meshes to test interactions with, you may try adding those entities yourself instead of going through the ARKit providers. I did this with some Planes so I could test plane interactions like placing things on walls.
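A minimal sketch of what I mean by adding such an entity yourself: a hand-placed "wall" with a collision component, usable as a stand-in for an ARKit plane anchor when testing in the simulator. The sizes, color, and function name are illustrative, not from any Apple API.

```swift
import RealityKit

// A debug "wall" entity that stands in for an ARKit plane anchor,
// so plane interactions can be tested without real plane detection.
func makeDebugWall(at position: SIMD3<Float>) -> ModelEntity {
    let wall = ModelEntity(
        mesh: .generatePlane(width: 1.0, depth: 1.0),
        materials: [SimpleMaterial(color: .gray, isMetallic: false)]
    )
    // A thin box collision shape so gestures and raycasts can hit it.
    wall.collision = CollisionComponent(
        shapes: [.generateBox(width: 1.0, height: 0.01, depth: 1.0)]
    )
    // Rotate the (horizontal) plane up to vertical, like a wall.
    wall.orientation = simd_quatf(angle: .pi / 2, axis: [1, 0, 0])
    wall.position = position
    // Required for the entity to receive tap gestures on visionOS.
    wall.components.set(InputTargetComponent())
    return wall
}
```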
Topic: Spatial Computing SubTopic: ARKit Tags:
Nov ’23
Reply to Retrieve Normal Vector of Tap location? RayCast from device(head)?
Alright. Good riddance. This worked for me:

NOTE: There's a BIT of oddness with raycasting to a tap gesture's location. Sometimes it fails, which is confusing to me given the tap succeeded. Maybe I'm not converting the locations correctly? Maybe it works better on device?

In a tap gesture handler, get the tap location on a collision shape with:

```swift
let worldPosition: SIMD3<Float> = value.convert(value.location3D, from: .local, to: .scene)
```

With a running WorldTrackingProvider you can get the current device pose with:

```swift
let pose = worldTracking.queryDeviceAnchor(atTimestamp: CACurrentMediaTime())
```

Then process it like so to get it in world space:

```swift
let transform = Transform(matrix: pose.originFromAnchorTransform)
let locationOfDevice = transform.translation
```

You can then do a raycast to the tap location in world-coordinate space like so:

```swift
let raycastResult = scene.raycast(from: locationOfDevice, to: worldPosition)
```

If successful, an entry in the raycast result will have normal information. Here I grab the first one (note the guard has to exit on failure):

```swift
guard let result = raycastResult.first else {
    print("NO RAYCAST HITS?????")
    return
}
let normal = result.normal
```

Make a quaternion to rotate from identity to the normal vector's angle:

```swift
// Calculate the rotation quaternion to align the up axis (+Y) with the normal vector
let rotation = simd_quatf(from: SIMD3<Float>(0, 1, 0), to: normal)
```

Apply it to an entity:

```swift
cylinder.transform.rotation = rotation
```
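The steps above can be combined into a single tap-gesture handler. This is a sketch, assuming `worldTracking` is an already-running WorldTrackingProvider and `cylinder` is the entity being oriented; error paths just bail silently here.

```swift
import SwiftUI
import RealityKit
import ARKit
import QuartzCore

// A gesture you would attach to a RealityView with .gesture(tapGesture).
var tapGesture: some Gesture {
    SpatialTapGesture()
        .targetedToAnyEntity()
        .onEnded { value in
            // 1. Tap location in world space.
            let worldPosition: SIMD3<Float> = value.convert(value.location3D, from: .local, to: .scene)
            // 2. Device (head) position from the world-tracking provider.
            guard let pose = worldTracking.queryDeviceAnchor(atTimestamp: CACurrentMediaTime()),
                  let scene = value.entity.scene else { return }
            let locationOfDevice = Transform(matrix: pose.originFromAnchorTransform).translation
            // 3. Raycast from the device toward the tap to recover the surface normal.
            guard let result = scene.raycast(from: locationOfDevice, to: worldPosition).first else { return }
            // 4. Align the entity's up axis with that normal.
            cylinder.transform.rotation = simd_quatf(from: SIMD3<Float>(0, 1, 0), to: result.normal)
        }
}
```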
Topic: App & System Services SubTopic: Core OS Tags:
Nov ’23
Reply to visionOS with GroupActivities
Yes. See https://developer.apple.com/documentation/groupactivities for more information. There are also spatial-specific considerations: see the WWDC presentations for that, and https://developer.apple.com/documentation/groupactivities/systemcoordinator for more info.
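A minimal sketch of what that looks like, to make the pointers concrete. The activity name and identifier are hypothetical; the SystemCoordinator configuration is the visionOS-specific part mentioned above.

```swift
import GroupActivities

// A minimal GroupActivity definition (hypothetical name/identifier).
struct SharedViewing: GroupActivity {
    static let activityIdentifier = "com.example.shared-viewing"

    var metadata: GroupActivityMetadata {
        var meta = GroupActivityMetadata()
        meta.title = "Shared Viewing"
        meta.type = .generic
        return meta
    }
}

// Observe incoming sessions; on visionOS, configure spatial behavior
// through the session's SystemCoordinator before joining.
func observeSessions() async {
    for await session in SharedViewing.sessions() {
        if let coordinator = await session.systemCoordinator {
            var config = SystemCoordinator.Configuration()
            config.supportsGroupImmersiveSpace = true
            coordinator.configuration = config
        }
        session.join()
    }
}
```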
Topic: App & System Services SubTopic: General Tags:
Nov ’23
Reply to Placing Item on plane, position is great, but trouble with rotation
My Solution

You can get the location of the tap gesture on an entity with:

```swift
let worldPosition: SIMD3<Float> = value.convert(value.location3D, from: .local, to: .scene)
```

You can grab the orientation of the tapped plane with:

```swift
let rotation = value.entity.orientation(relativeTo: nil)
```

I want to track this location so it can be persisted in the WorldTrackingProvider (whenever it works??? ) with a world anchor. I do so by making a world anchor with:

```swift
let pose = Pose3D(position: worldPosition, rotation: rotation)
let worldAnchor = WorldAnchor(originFromAnchorTransform: simd_float4x4(pose))
```

When you get the anchor from your world-tracking listener, you can apply its transform (which contains position and orientation) to an entity like so:

```swift
entityForAnchor.transform = Transform(matrix: anchor.originFromAnchorTransform)
```

If you've got another way, or a way that has a more intuitive use of the API topology, please share. This was cobbled together with a ton of trial and error.
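For completeness, the "world-tracking listener" I refer to above looks something like this sketch. `entityForAnchor` is assumed to be the entity you want pinned to the anchor; a real app would map anchor IDs to entities rather than use a single one.

```swift
import ARKit
import RealityKit

// Watch the WorldTrackingProvider's anchor updates and keep an
// entity's transform in sync with its world anchor.
func observeWorldAnchors(worldTracking: WorldTrackingProvider,
                         entityForAnchor: Entity) async {
    for await update in worldTracking.anchorUpdates {
        switch update.event {
        case .added, .updated:
            entityForAnchor.transform = Transform(matrix: update.anchor.originFromAnchorTransform)
        case .removed:
            entityForAnchor.isEnabled = false
        }
    }
}
```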
Topic: Graphics & Games SubTopic: RealityKit Tags:
Oct ’23
Reply to Send Pointer to Device
Speculation: Vision Pro allows a stationary user to interact with their application windows via a keyboard and mouse/trackpad. I suspect this setting lets the cursor function as a mouse/trackpad pointer instead of as the user's eyes in the visionOS simulator.
Oct ’23