Reply to Some question about visionOS
Number one is prevented for privacy reasons. There are some ways to communicate to the system which shape should be used when a hover occurs, but there is no way to act on the event. The hover highlighting is even performed out-of-process so it can't be used maliciously, such as by ad companies. Number two, I don't know. Number three may be called "billboarding," and there is an example of that in the Splash sample code.
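For reference, a minimal sketch of the "communicate the shape to the system" part, assuming a RealityKit entity (the entity and its shape here are illustrative, not from the original post):

```swift
import RealityKit

// The entity needs an input target and a collision shape to be hoverable.
let button = ModelEntity(mesh: .generateSphere(radius: 0.05))
button.components.set(InputTargetComponent())
button.components.set(CollisionComponent(shapes: [.generateSphere(radius: 0.05)]))

// HoverEffectComponent asks the system to draw its highlight when the user
// looks at the entity. The highlight is rendered out-of-process, so the app
// never learns where the user is looking.
button.components.set(HoverEffectComponent())
```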
Topic: UI Frameworks SubTopic: SwiftUI Tags:
Nov ’23
Reply to Document Based SwiftData support in Immersive Scene?
I noticed that once I open a second ModelContainer on the same URL as the document that's open in the DocumentGroup, all saves to the DocumentGroup's model container fail with:

Error saving the database Error Domain=NSCocoaErrorDomain Code=134020 "The model configuration used to open the store is incompatible with the one that was used to create the store." UserInfo={NSAffectedObjectsErrorKey=<NSManagedObject: 0x6000021b97c0> (entity: Blah; id: 0x60000026e0c0 <x-coredata:///Blah/tAC19CF5F-052B-4CF6-B7CD-EDA188FC54BE13>; data: { id = "91E56F61-CFE0-42E4-9EA9-EAD4256B64AB"; imageData = nil; name = "Untitled Blah"; })}

I'll assume this is just not a good idea right now, and conclude that document-based SwiftData apps do not work well with multiple scenes.
Topic: UI Frameworks SubTopic: SwiftUI Tags:
Nov ’23
Reply to ARKit, visionOS: Creating my own data provider
Most of the data providers give you mesh information that you then need to place into an entity as a collision component. If you're trying to get planes or world meshes to test interactions against, you can try adding those entities yourself instead of going through the ARKit providers. I did this with some planes so I could test plane interactions, like placing things on walls.
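As a rough sketch of that workaround, a hand-placed stand-in for a detected wall (the size, position, and material here are made up):

```swift
import RealityKit

// A thin box standing in for a detected vertical plane (a "wall").
let wall = ModelEntity(
    mesh: .generateBox(width: 2.0, height: 2.0, depth: 0.02),
    materials: [SimpleMaterial(color: .gray, isMetallic: false)]
)

// Give it a collision shape so raycasts and gestures can hit it,
// just like a plane delivered by an ARKit data provider would.
wall.generateCollisionShapes(recursive: false)

// Roughly a meter in front of the user, at wall height.
wall.position = SIMD3<Float>(0, 1.5, -1.0)

// Then add it to your RealityView content: content.add(wall)
```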
Topic: Spatial Computing SubTopic: ARKit Tags:
Nov ’23
Reply to Retrieve Normal Vector of Tap location? RayCast from device(head)?
Alright. This worked for me. NOTE: There's a bit of oddness with raycasting to a tap gesture's location. Sometimes it fails, which is confusing to me given that the tap succeeded. Maybe I'm not converting the locations correctly? Maybe it works better on device?

In a tap gesture handler, get the tap location on a collision shape with:

let worldPosition: SIMD3<Float> = value.convert(value.location3D, from: .local, to: .scene)

With a running WorldTrackingProvider you can get the current device pose with:

let pose = worldTracking.queryDeviceAnchor(atTimestamp: CACurrentMediaTime())

Then process it to get the device's position in world space:

let transform = Transform(matrix: pose.originFromAnchorTransform)
let locationOfDevice = transform.translation

You can then raycast from the device to the tap location in world-coordinate space:

let raycastResult = scene.raycast(from: locationOfDevice, to: worldPosition)

If successful, each entry in the raycast result has normal information. Here I grab the first one (note the guard must return when there are no hits):

guard let result = raycastResult.first else {
    print("No raycast hits")
    return
}
let normal = result.normal

Make a quaternion that rotates from the identity orientation to the normal vector's angle:

// Rotate the +Y (up) axis to align with the surface normal
let rotation = simd_quatf(from: SIMD3<Float>(0, 1, 0), to: normal)

Apply it to an entity:

cylinder.transform.rotation = rotation
Topic: App & System Services SubTopic: Core OS Tags:
Nov ’23
Reply to visionOS with GroupActivities
Yes. See https://developer.apple.com/documentation/groupactivities for more information. There are also spatial-specific considerations; see the WWDC presentations for that, and https://developer.apple.com/documentation/groupactivities/systemcoordinator for details.
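For reference, a minimal sketch of defining a shared activity with the GroupActivities framework (the type name, identifier, and title here are illustrative):

```swift
import GroupActivities

// A hypothetical shared activity for a visionOS app.
struct WatchTogether: GroupActivity {
    static let activityIdentifier = "com.example.watch-together"

    var metadata: GroupActivityMetadata {
        var meta = GroupActivityMetadata()
        meta.title = "Watch Together"
        meta.type = .generic
        return meta
    }
}

// Activate from a user action, e.g. a button tap:
// Task { _ = try await WatchTogether().activate() }
```

On visionOS, the SystemCoordinator linked above is what you'd use on top of this to configure spatial placement of participants.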
Topic: App & System Services SubTopic: General Tags:
Nov ’23
Reply to Plane detection does not work in simulators
Correct. You can place your own plane, give it a collision shape, and use that to test for the time being.
Topic: Spatial Computing SubTopic: ARKit Tags:
Dec ’23
Reply to How does an indirect drag gesture work?
One more: What if the user is holding something far away and they rotate their torso away from it? Will it follow along or stay static?
Topic: App & System Services SubTopic: Core OS Tags:
Dec ’23
Reply to Will I ever receive a rejection notice for "Apple Vision Pro developer kit"
Yeah, it felt really misleading, but Apple's developer communication is usually about this bad.
Topic: App & System Services SubTopic: Core OS Tags:
Dec ’23
Reply to .OnHover function disabled in visionOS
It will compile, but I believe it won't fire, since apps aren't allowed to track the user's eyes.
Topic: App & System Services SubTopic: Core OS Tags:
Dec ’23
Reply to ImmersiveSpace, drag a specific view
Drag gestures can be targeted to any entity via .targetedToAnyEntity(), or to a specific one with .targetedToEntity(_:). If you use the "any entity" form, you can check the entity's identifier and components inside your drag-processing closure.
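A rough sketch of the "any entity" form, checking which entity is being dragged inside the closure (the view structure and entity name here are illustrative):

```swift
import SwiftUI
import RealityKit

struct ImmersiveView: View {
    var body: some View {
        RealityView { content in
            // Add entities here; draggable ones need an InputTargetComponent
            // and a CollisionComponent so gestures can hit them.
        }
        .gesture(
            DragGesture()
                .targetedToAnyEntity()
                .onChanged { value in
                    // Only move the specific entity we care about.
                    guard value.entity.name == "draggableCube",
                          let parent = value.entity.parent else { return }
                    value.entity.position = value.convert(
                        value.location3D, from: .local, to: parent
                    )
                }
        )
    }
}
```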
Topic: UI Frameworks SubTopic: SwiftUI Tags:
Nov ’23
Reply to RayCasting to Surface Returns Inconsistent Results?
I encourage anyone from Apple to file the report. I don't see the reports I file having much impact.
Topic: Graphics & Games SubTopic: RealityKit Tags:
Nov ’23
Reply to RealityView entity recognition in onContinuousHover and OnHover
What's your use case? "Hover" on visionOS is wherever the user's gaze lands, and since our eyes move so much, it may not feel right to key behavior on it. Curious what you had in mind.
Topic: App & System Services SubTopic: Core OS Tags:
Nov ’23
Reply to Add light to a RealityView on visionOS
I noticed the other day that in Xcode 15.1 beta 2, if you create a new visionOS app project with a progressive immersive scene, the template includes example code that adds a light.
Topic: Graphics & Games SubTopic: RealityKit Tags:
Nov ’23
Reply to Back camera for the new Vision Pro
Hello, this is a forum for software developers to learn and share information. Please use https://www.apple.com/feedback/ to provide product feedback to Apple.
Topic: App & System Services SubTopic: Core OS Tags:
Nov ’23
Reply to Unreal Engine on Apple Vision Pro
As far as I know, there is no public information about Unreal Engine support on visionOS. Given the bad blood between the two companies, that may stay true for a long time.
Topic: Spatial Computing SubTopic: General Tags:
Oct ’23