Post · Replies · Boosts · Views · Activity

[Getting Started] Selecting default navLink on startup.
So, I've got a SwiftUI app with a triple-column split view. When the app starts on an 11-inch iPad, the primary column is offscreen. The primary column has a List full of NavigationLinks, like so:

```swift
List {
    ForEach(items, id: \.itemID) { item in
        NavigationLink(tag: item, selection: $selectedItem) {
            ItemDetailsView(item: item)
            ...
```

The selection of the first column in the split view cascades through the rest of the app, so populating it is pretty important. I've tried setting selectedItem from an EnvironmentObject, and I've also tried setting it in onAppear. Everything I try only causes the selection to "pop into place" whenever I expose the primary column of the sidebar. Am I going about this the wrong way? Is it because the sidebar is hidden by default?
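One timing detail worth isolating (a plain-Swift sketch, not a confirmed fix — `Item` and `SelectionModel` are hypothetical stand-ins for the post's types): seed the selection when the model is created, before SwiftUI ever builds the split view, rather than assigning it in onAppear, which runs only once the column's view actually appears.

```swift
// Hypothetical model illustrating the timing: the selection is assigned
// in init, so by the time the split view renders its detail column the
// value already exists, instead of being set later in onAppear.
struct Item: Hashable {
    let itemID: Int
    let name: String
}

final class SelectionModel {
    var items: [Item]
    var selectedItem: Item?

    init(items: [Item]) {
        self.items = items
        self.selectedItem = items.first  // seeded before any view appears
    }
}

let model = SelectionModel(items: [Item(itemID: 1, name: "First"),
                                   Item(itemID: 2, name: "Second")])
print(model.selectedItem?.name ?? "none")
```

In a real app the `$selectedItem` binding passed to `NavigationLink(tag:selection:)` would point at this pre-seeded value.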
1 reply · 0 boosts · 824 views · Jul ’21
[iOS15] Fruta Doesn't Launch With Populated State
Asking with the WWDC-10220 tag because Fruta is the sample code for this presentation. When launching Fruta on a landscape 11" iPad Pro, the "State" of the application is completely empty until the user taps the back button. After tapping back, everything pops into place. Is this expected behavior in SwiftUI when using split screens? NavigationLinks are finicky, and I'd expect the programmatic setting of the primary column's "selection" to be resolved on launch, not when the user taps back.
0 replies · 0 boosts · 893 views · Jul ’21
File type limitations of sendResource?
Hello, I'm noticing a behavior when I try to send a package file as a resource to a peer. A file like this is basically a folder with an extension, and despite receiving a progress object from the send call, I'm not seeing it rise past 0.0%. Attaching it to a ProgressView also does not show any progress. The completion handler of the send is never called with an error, and the receiver only gets the "finished receiving" callback if the host cancels or disconnects. I didn't see anything in the sendResource documentation about bundle files not being supported, but it wouldn't surprise me. Any thoughts?
1 reply · 0 boosts · 899 views · Feb ’22
Create a World Anchor from a Spatial Tap Gesture?
At 19:56 in the video, it's mentioned that we can use a SpatialTapGesture to "identify a position in the world" to make a world anchor. Which API calls are utilized to make this happen? World anchors are created from 4x4 matrices, and a SpatialTapGesture doesn't seem to generate one of those. Any ideas?
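For what it's worth, one plausible bridge (a sketch under assumptions — on device this would be a `simd_float4x4` handed to `WorldAnchor(originFromAnchorTransform:)`; here the matrix is modeled as four stdlib `SIMD4<Float>` columns so the math stands on its own, and the tap position is made up): convert the tap to scene coordinates, then build a translation-only 4x4 whose last column is that position.

```swift
// Sketch: build a column-major, translation-only 4x4 transform from a
// tapped world position. The identity rotation occupies columns 0-2 and
// the tapped position becomes the translation column.
func translationMatrix(for position: SIMD3<Float>) -> [SIMD4<Float>] {
    [
        SIMD4<Float>(1, 0, 0, 0),                            // column 0
        SIMD4<Float>(0, 1, 0, 0),                            // column 1
        SIMD4<Float>(0, 0, 1, 0),                            // column 2
        SIMD4<Float>(position.x, position.y, position.z, 1)  // translation
    ]
}

// e.g. a tap converted to scene space via
// value.convert(value.location3D, from: .local, to: .scene)
let tapWorldPosition = SIMD3<Float>(0.5, 1.2, -2.0)
let matrix = translationMatrix(for: tapWorldPosition)
print(matrix[3])
```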
0 replies · 0 boosts · 828 views · Aug ’23
RayCasting to Surface Returns Inconsistent Results?
On Xcode 15.1.0 beta 2, when raycasting to a collision surface, there appears to be a tendency for the collisions to be inconsistent. Here are my results: green cylinders are hits, and red cylinders are raycasts that returned no collision results. NOTE: This raycast is triggered by a tap gesture recognizer registering on the cube... so it's weird to me that the tap would work, but the raycast not collide with anything. Is this something that just performs poorly in the simulator? My raycasting command is:

```swift
guard let pose = self.arSessionController.worldTracking.queryDeviceAnchor(atTimestamp: CACurrentMediaTime()) else {
    print("FAILED TO GET POSITION")
    return
}

let transform = Transform(matrix: pose.originFromAnchorTransform)
let locationOfDevice = transform.translation
let raycastResult = scene.raycast(from: locationOfDevice, to: destination, relativeTo: nil)
```

where destination is retrieved in a tap gesture handler via:

```swift
let worldPosition: SIMD3<Float> = value.convert(value.location3D, from: .local, to: .scene)
```

Any findings would be appreciated.
2 replies · 0 boosts · 733 views · Nov ’23
How does an indirect drag gesture work?
Hello, I’ve got a few questions about drag gestures in immersive scenes on visionOS. Once a user initiates a drag gesture, are their eyes involved in the gesture at all anymore? If not, and the user is dragging something farther away, how far can they move it using indirect gestures? I assume the user’s range of motion is limited because their hands are in their lap, so could they move something multiple meters along a distant wall? How can the user cancel the gesture if they don’t like the anticipated/telegraphed result? I’m trying to craft a good experience, and it’s difficult without some of these details. I have still not heard back on my devkit application. Thank you for any help.
2 replies · 0 boosts · 728 views · Dec ’23
Retrieve AnchorEntity Location relative to Scene?
I want to place a ModelEntity at an AnchorEntity's location, but not as a child of the AnchorEntity (I want to be able to raycast to it and have collisions work). I've placed an AnchorEntity in my scene like so:

```swift
AnchorEntity(.plane(.vertical, classification: .wall, minimumBounds: [2.0, 2.0]), trackingMode: .continuous)
```

In my RealityView update closure, I print out this entity's position relative to nil like so:

```swift
wallAnchor.position(relativeTo: nil)
```

Unfortunately, this position doesn't make sense: it's very close to zero, even though the anchor appears several meters away. I believe this is because AnchorEntities have their own self-contained coordinate spaces, independent of the scene's coordinate space, and the anchor is reporting its position relative to its own coordinate space. How can I bridge the gap between these two? WorldAnchor has an originFromAnchorTransform property that helps with this, but I'm not seeing something similar for AnchorEntity. Thank you
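Assuming one did have an origin-from-anchor transform (as WorldAnchor provides), mapping an anchor-local point into scene space is just a 4x4 multiply. A stdlib-only sketch, with the matrix modeled as four `SIMD4<Float>` columns and the transform values made up for illustration:

```swift
// Sketch: apply a column-major 4x4 transform to a point, i.e. what
// "position relative to the scene" means once an origin-from-anchor
// matrix is available. Columns are stdlib SIMD4<Float>, so this runs
// without the simd module.
func transformPoint(_ point: SIMD3<Float>, by columns: [SIMD4<Float>]) -> SIMD3<Float> {
    let p = SIMD4<Float>(point.x, point.y, point.z, 1)
    let out = columns[0] * p.x + columns[1] * p.y + columns[2] * p.z + columns[3] * p.w
    return SIMD3<Float>(out.x, out.y, out.z)
}

// Hypothetical origin-from-anchor transform: identity rotation,
// translated 1 m up and 3 m along -z.
let originFromAnchor: [SIMD4<Float>] = [
    SIMD4<Float>(1, 0, 0, 0),
    SIMD4<Float>(0, 1, 0, 0),
    SIMD4<Float>(0, 0, 1, 0),
    SIMD4<Float>(0, 1, -3, 1),
]

// The anchor's own origin (0,0,0) lands at the translation column,
// several meters from the scene origin, as expected.
let world = transformPoint(SIMD3<Float>(0, 0, 0), by: originFromAnchor)
print(world)
```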
0 replies · 0 boosts · 695 views · Dec ’23
SceneReconstruction alongside WorldTracking silently fails?
Hello, I've noticed that when I have my ARSession run the scene reconstruction provider and the world tracking provider at the same time, I receive no scene reconstruction mesh updates. My catch closure doesn't receive any errors; the provider just doesn't send anything to its async update sequence. If I run just the scene reconstruction provider by itself, then I do get mesh updates. Is this a bug? Is it expected that this isn't possible? Thank you
1 reply · 0 boosts · 739 views · Feb ’24
[NewbQs] Is this possible with AppIntentDomains?
As a user, when viewing a photo or image, I want to be able to tell Siri, “add this to ”, similar to the example from the WWDC presentation where a photo is added to a note in the Notes app. Is this... possible with app domains as they are documented? I see domains like open-file and open-photo, but I don't know whether those are appropriate for this kind of functionality.
1 reply · 0 boosts · 586 views · Sep ’24