Hello,
I'm curious if anyone has some useful debug tools for out-of-bounds issues with Volumes.
I am opening a volume with a size of 1m, 1m, 10cm.
I am adding a RealityView with a ModelEntity that is 0.5 m tall, and I am seeing the model clip at the top and bottom. I find this odd, because I feel like it should fit within the size of the Volume.
I was curious what SwiftUI thinks the Volume's size is, so I tried using a GeometryReader3D to tell me:
GeometryReader3D { proxy in
    VStack {
        Text("\(proxy.size.width)")
        Text("\(proxy.size.height)")
        Text("\(proxy.size.depth)")
    }
    .padding().glassBackgroundEffect()
}
Unfortunately, I get 680, 1360, and 68. I'm guessing these units are points, but that's not very helpful. The documentation says to use real-world units for Volumes, but none of the SwiftUI frame setters and getters appear to support different units.
Is there a way to convert between the two? I'm not sure whether this is a bug or a feature request.
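For context, this is roughly what I'd like to be able to write. The physicalMetrics environment value (PhysicalMetricsConverter) is my best guess at the relevant API, but I may be misreading how it's meant to be used:

import SwiftUI

// Best-guess sketch: convert the proxy's point-based size into meters using
// the physicalMetrics environment value. Unverified that this is the intended
// bridge between points and real-world units.
struct VolumeSizeReadout: View {
    @Environment(\.physicalMetrics) private var physicalMetrics

    var body: some View {
        GeometryReader3D { proxy in
            let width = physicalMetrics.convert(proxy.size.width, to: .meters)
            let height = physicalMetrics.convert(proxy.size.height, to: .meters)
            let depth = physicalMetrics.convert(proxy.size.depth, to: .meters)

            VStack {
                Text("\(width) m")
                Text("\(height) m")
                Text("\(depth) m")
            }
            .padding().glassBackgroundEffect()
        }
    }
}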
Hello,
With the advent of widget interactivity, I'd like to differentiate one widget instance from another to support state management, even if they share the same configuration.
Is this possible? Many of my search results are turning up iOS 15-era information, and I am not sure whether it's still valid.
Thank you
World Anchor from SpatialTapGesture?
At 19:56 in the video, it's mentioned that we can use a SpatialTapGesture to "identify a position in the world" to make a world anchor.
Which API calls are utilized to make this happen?
World anchors are created with 4x4 matrices, and a SpatialTapGesture doesn't seem to generate one of those.
Any ideas?
PlaneAnchors are still not generated by the PlaneDetectionProvider in the simulator, but I am brainstorming how to detect a tap on one of the planes.
In an iOS ARKit application I could use a raycastQuery on existingPlaneGeometry to make an anchor with the raycast result's world transform.
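For reference, this is roughly the iOS pattern I mean (arView and tapPoint are placeholders for the existing view and the tap's screen location):

import ARKit
import RealityKit

// The iOS approach described above: raycast against detected plane geometry
// and anchor at the hit's world transform.
func anchorAtTap(_ tapPoint: CGPoint, in arView: ARView) -> ARAnchor? {
    guard let result = arView.raycast(from: tapPoint,
                                      allowing: .existingPlaneGeometry,
                                      alignment: .any).first else { return nil }
    let anchor = ARAnchor(transform: result.worldTransform)
    arView.session.add(anchor: anchor)
    return anchor
}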
I've not yet found the visionOS replacement for this.
A possible hunch is that I need to install my own mesh-less plane ModelEntities for each PlaneAnchor that's returned by the PlaneDetectionProvider. From there I could use a tap gesture targeted to those models, and then build a WorldAnchor from the tap location on those entities.
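This is a completely unverified sketch of that hunch; the collision sizing, the conversion call, and the gesture wiring are my best guesses at the visionOS APIs:

import SwiftUI
import RealityKit
import ARKit

// Unverified sketch: give each detected plane an invisible, tappable collision
// entity. The box size is a placeholder; in practice it would match the
// plane's extent.
func makeTappablePlaneEntity(for planeAnchor: PlaneAnchor) -> Entity {
    let planeEntity = Entity()
    planeEntity.transform = Transform(matrix: planeAnchor.originFromAnchorTransform)
    planeEntity.components.set(CollisionComponent(shapes: [
        .generateBox(width: 1, height: 0.001, depth: 1)
    ]))
    planeEntity.components.set(InputTargetComponent())
    return planeEntity
}

struct PlaneTapView: View {
    var body: some View {
        RealityView { content in
            // Entities from makeTappablePlaneEntity(for:) would be added here
            // as PlaneAnchors arrive from the PlaneDetectionProvider.
        }
        .gesture(
            SpatialTapGesture()
                .targetedToAnyEntity()
                .onEnded { value in
                    // Convert the tap to scene space and wrap it in a 4x4 matrix.
                    let position = value.convert(value.location3D, from: .local, to: .scene)
                    let matrix = Transform(translation: position).matrix
                    let worldAnchor = WorldAnchor(originFromAnchorTransform: matrix)
                    // ...hand worldAnchor to the WorldTrackingProvider here.
                    _ = worldAnchor
                }
        )
    }
}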
Anyone have any ideas?
Disclaimer: I am new to all things 3D. There could be a variety of things wrong with what I'm doing that are not unique to RealityKit. Any domain info would be appreciated.
So, I'm following what I think are the recommended steps to import a shader graph material from Reality Composer Pro and apply it to another ModelEntity.
I do the following:
guard let entity = try? Entity.load(named: "Materials", in: RealityKitContent.realityKitContentBundle) else { return model }
guard let materialEntity = entity.findEntity(named: "materialModel") as? ModelEntity else { return model }
I then configure a property on it like so:
guard var material = materialEntity.model?.materials[0] as? ShaderGraphMaterial else { return model }
try material.setParameter(name: "BaseColor", value: .color(matModel.matCoreUIColor))
I then apply it.
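(The apply step is just assigning the modified material back onto the target entity, roughly like this, where model is the ModelEntity being returned from the guards above:)

// Apply the modified material back to the target entity's model component.
model.model?.materials = [material]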
This is what my texture looks like in Reality Composer Pro:
I notice that my rendered object has distortions in the actual RealityView. Note the diagonal lines that appear stretched.
What could be causing this? I thought shader graph materials were supposed to be more resilient to distortions like this. I'm not sure if I've hit a bug or if I'm using it wrong.
FWIW, this is a shader based on Apple's felt material shader. My graph looks like this:
Thanks
I've noticed that the bounds of my ModelEntity are not updated when I transform one of the mesh's joints.
I've attached an image that demonstrates this. It appears this will cause issues with the collision bounding box as well.
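For reference, the pattern is roughly this; modelEntity and the joint path are illustrative placeholders:

import RealityKit
import simd

// Rotate a skeletal joint, then read the visual bounds. The bounds come back
// the same before and after the joint moves.
func rotateJointAndCheckBounds(on modelEntity: ModelEntity) {
    if let index = modelEntity.jointNames.firstIndex(of: "root/arm") {
        var transforms = modelEntity.jointTransforms
        transforms[index].rotation = simd_quatf(angle: .pi / 2, axis: [0, 0, 1])
        modelEntity.jointTransforms = transforms
    }

    let bounds = modelEntity.visualBounds(relativeTo: nil)
    print(bounds.extents)
}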
Is this a bug or user error? Not many of this framework's methods are documented.
I'd like an Image subview of a lock screen widget to render as itself, and not with the multiply-like effect it gets today.
I've tried .widgetAccentable(true) and .widgetAccentable(false), but neither gives the appearance I'm looking for.
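For context, the view is roughly this, with "Badge" standing in for my actual asset:

import SwiftUI
import WidgetKit

// The accessory widget image in question; "Badge" is an illustrative asset name.
struct LockScreenBadgeView: View {
    var body: some View {
        Image("Badge")
            .resizable()
            .scaledToFit()
            // Neither true nor false changes the multiply-like tint on the lock screen.
            .widgetAccentable(false)
    }
}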
Is there maybe a new modifier that lets me "force" the rendering mode? Hoping there is and it's just not jumping out at me.
Thanks for your help.
When using the showAnchorGeometry debug option, I see lots of green surface anchors in my scene, and it has been really helpful for debugging the placement of objects when I tap the screen of my device.
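(For reference, I'm just enabling it on the ARView like this:)

// Enable anchor geometry visualization on the ARView.
arView.debugOptions.insert(.showAnchorGeometry)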
But I also get some blue shapes, and I'm not quite sure what those mean. Is there a document that explains what showAnchorGeometry is actually... showing?
Googling "showAnchorGeometry blue" wasn't helpful! (I promise I tried)
It is possible via the new AppIntents framework to open your app from a Shortcuts intent, but I am currently very confused about how to ensure that a particular window is opened in a SwiftUI-lifecycle app.
If the user says "Open View A" via a shortcut or Siri, I'd like to make sure it opens the window for "View A", though a duplicate window would be acceptable too.
The WWDC22 presentation has the following:
@MainActor
func perform() async throws -> some IntentResult {
    Navigator.shared.openShelf(.currentlyReading)
    return .result()
}
Where, from the perform method of the Intent structure, they tell an arbitrary Navigator (code not provided) to just open a view of the app. (How convenient!)
But for a multi-window SwiftUI app, I'm not sure how to make this work. @Environment variables are not available within the intent struct, and even if I did have a "Navigator singleton," I'm not sure how it could get to the openWindow action, since that's a view environment value. AppIntents exist outside the view environment tree, AFAIK.
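The only workaround I can think of is routing the request through shared observable state that some always-present view watches. A rough, unverified sketch, where Navigator and the window ID are illustrative:

import SwiftUI
import AppIntents

// Shared state the intent can write to from outside the view tree.
final class Navigator: ObservableObject {
    static let shared = Navigator()
    @Published var requestedWindowID: String?
}

struct OpenViewAIntent: AppIntent {
    static var title: LocalizedStringResource = "Open View A"
    static var openAppWhenRun: Bool = true

    @MainActor
    func perform() async throws -> some IntentResult {
        Navigator.shared.requestedWindowID = "view-a"
        return .result()
    }
}

// An always-present view observes the request and calls openWindow, since the
// environment action is only reachable from inside a view.
struct RootView: View {
    @ObservedObject private var navigator = Navigator.shared
    @Environment(\.openWindow) private var openWindow

    var body: some View {
        Text("Home")
            .onChange(of: navigator.requestedWindowID) { id in
                guard let id else { return }
                openWindow(id: id)
                navigator.requestedWindowID = nil
            }
    }
}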
Any ideas? I'd be a little shocked if this is a UIKit-only sort of thing, but at the same time... ya never know.
I have the following parameter:
@Parameter(title: "Image",
           description: "Image to copy",
           supportedTypeIdentifiers: ["com.image"],
           inputConnectionBehavior: .connectToPreviousIntentResult)
var imageFile: IntentFile?
When I drop my AppIntent into a shortcut, though, I am unable to connect this parameter to the output of the previous step.
Given the documentation, I have no idea how to achieve this if the above is not the correct way to do so.
Anything new this year to support reordering OutlineGroup items, or items across sections in a multi-section List?
I really want to code my sidebar in SwiftUI, but user-driven ordering is a must for me.
Hello, I'm trying to make a grid of container-relative shapes where the outside gutters match the gutters between the items.
The stickiest part of this problem is that calling .inset on a ContainerRelativeShape doubles the gutter between the items.
I've tried a LazyVGrid and an HStack of VStacks, and both have this double gutter in between.
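A minimal version of the HStack-of-VStacks attempt, with illustrative names and a hard-coded gutter, looks like this; each cell insets its own shape, so interior gutters end up twice as wide as the outer ones:

import SwiftUI

// Each cell insets its ContainerRelativeShape independently, so two adjacent
// cells contribute two insets to the interior gutter while the outer edges
// only get one.
struct InsetGridDemo: View {
    let gutter: CGFloat = 8

    var body: some View {
        HStack(spacing: 0) {
            ForEach(0..<3) { _ in
                VStack(spacing: 0) {
                    ForEach(0..<3) { _ in
                        ContainerRelativeShape()
                            .inset(by: gutter)
                            .fill(.blue)
                    }
                }
            }
        }
    }
}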
I think I could move forward with some gnarly frame math, but I was curious if I'm missing some SwiftUI layout feature that could make this easier and more maintainable.
Hello,
I’m noticing that, during a collaborative session, anchors created on the host device appear in a different location on client devices, and it’s making it challenging to test other collaboration logic.
For example, when the client places a textured plane mesh at an anchor created by the host, the placement can sometimes end up behind a real-world surface, and it gets clipped by RealityKit's mesh occlusion.
I’d prefer to see it floating in space when testing, so I can tell that something is happening.
I'm drawing a blank on whether there are any debugging options to help me out. Nothing in the render or debug options jumped out at me.
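The closest thing I've considered is just turning off scene-understanding occlusion while testing, something like the lines below (from memory, unverified), but I'm not sure that's the intended debugging approach:

// Stop the reconstructed mesh from occluding virtual content so misplaced
// anchors stay visible, and draw anchor origins for comparison.
arView.environment.sceneUnderstanding.options.remove(.occlusion)
arView.debugOptions.insert(.showAnchorOrigins)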
Thoughts?
Hello,
I'm noticing a behavior when I try to send a package file as a resource to a peer.
A file like this is basically a folder with an extension, and despite receiving a Progress object from the send call, I’m not seeing it rise past 0.0%. Attaching it to a ProgressView also does not show any progress.
The completion handler of the send is never called with an error, and the receiver only gets the “finished receiving” callback if the host cancels or disconnects.
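For reference, the call is roughly this; session, peer, and packageURL stand in for my actual objects:

import MultipeerConnectivity

// Send a document package (a directory with a file extension) as a resource.
// The returned Progress never advances past 0.0 for the package in question.
func sendPackage(_ packageURL: URL, over session: MCSession, to peer: MCPeerID) -> Progress? {
    return session.sendResource(at: packageURL,
                                withName: packageURL.lastPathComponent,
                                toPeer: peer) { error in
        // This never fires with an error in my testing.
        if let error {
            print("sendResource failed: \(error)")
        }
    }
}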
I didn’t see anything in the sendResource documentation about not supporting bundle files, but it would not surprise me.
Any thoughts?