
Tab view display bug on visionOS
I noticed a weird tab view display bug on visionOS that occurs when the tab labels are changed at runtime, e.g. when switching from one locale to another. If the new longest tab label is shorter than the previous longest one, the tab ornament's width shrinks, and the texts and icons can become barely visible, even while the labels themselves are not being displayed. If the longest tab label gets longer, however, additional padding is added. It seems as if the tab width calculation does not take dynamic changes into account. Is there a workaround for this behavior?
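So far, the only workaround I can think of is to force SwiftUI to rebuild the whole TabView (and with it the ornament) whenever the labels change, e.g. by tying the view's identity to the current locale. A minimal, untested sketch with placeholder tabs:

```swift
import SwiftUI

struct LocalizedTabs: View {
    @Environment(\.locale) private var locale

    var body: some View {
        TabView {
            Text("Home")
                .tabItem { Label("Home", systemImage: "house") }
            Text("Settings")
                .tabItem { Label("Settings", systemImage: "gearshape") }
        }
        // A new identity discards the cached layout, so the ornament
        // width should be remeasured from the current labels.
        .id(locale.identifier)
    }
}
```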
Replies: 2 · Boosts: 0 · Views: 662 · Apr ’24
What is the best way to lighten up a scene via ambient lighting?
By default, RealityKit scenes are way too dark. The only solution I have found for this so far is to add an image-based light (IBL). But a single central IBL looks pretty weird: the shadows are way too strong, and if I increase the intensity, the highlights become way too bright. Attaching a light to each object kind of works, but it feels wrong to do so. I could not find any option to simply set up an ambient light. Isn't there a way to do this, or am I just missing something?
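For reference, my IBL setup looks roughly like this ("Sunlight" stands in for whatever EnvironmentResource ships in the app bundle):

```swift
import RealityKit

func applyImageBasedLight(to root: Entity) async throws {
    // The environment texture acts as the only light source in the scene.
    let environment = try await EnvironmentResource(named: "Sunlight")
    root.components.set(ImageBasedLightComponent(source: .single(environment),
                                                 intensityExponent: 1))
    // Every entity that should be lit needs a receiver that points at the
    // entity carrying the light component.
    root.components.set(ImageBasedLightReceiverComponent(imageBasedLight: root))
}
```

Raising the intensity exponent is exactly what blows out the highlights, so that knob alone doesn't solve it.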
Replies: 0 · Boosts: 0 · Views: 415 · Apr ’24
How to rotate a SwiftUI view?
I just tried out the app "Blue Moon" (a solitaire game) from the App Store. They managed to add a secondary SwiftUI tutorial view that sits to the left of the main window and is rotated towards the user. How can this be achieved? I tried using ornaments, but couldn't find a tilting/rotating option.
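The closest thing I found is rotation3DEffect, though I'm not sure it's meant to be used inside an ornament. An untested sketch (the angle and attachment anchor are guesses):

```swift
import SwiftUI

struct MainView: View {
    var body: some View {
        Text("Main content")
            .ornament(attachmentAnchor: .scene(.leading)) {
                TutorialPanel()
                    // Tilt the panel around the vertical axis, towards the user.
                    .rotation3DEffect(.degrees(25), axis: (x: 0, y: 1, z: 0))
            }
    }
}

struct TutorialPanel: View {
    var body: some View {
        Text("Tutorial")
            .padding()
            .glassBackgroundEffect()
    }
}
```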
Replies: 0 · Boosts: 0 · Views: 554 · Apr ’24
How to create an immersive panorama like the environments?
I would like to create an immersive panorama similar to the built-in environments, where the user can look around 360° but can still interact with entities placed on the panorama. My current approach is to create a sphere around the user and invert its normals, so the texture faces inwards, towards the user. This works, but open SwiftUI windows then behave pretty strangely, as described in https://developer.apple.com/forums/thread/749956: windows no longer show their handles, and the glass effects do not take my sphere into account but instead show the world "outside" of it. This is not the case for Apple's environments. Is there a better way to create a fully immersive sphere around the user?
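Roughly what the current approach looks like (the helper name is made up; the radius just needs to be large enough to enclose the scene):

```swift
import RealityKit

func makePanoramaSphere(texture: TextureResource) -> ModelEntity {
    // Unlit, so the panorama is not affected by scene lighting.
    var material = UnlitMaterial()
    material.color = .init(texture: .init(texture))
    let sphere = ModelEntity(mesh: .generateSphere(radius: 1000),
                             materials: [material])
    // A negative scale on one axis turns the sphere inside out, so the
    // textured surface is visible from the center where the user stands.
    sphere.scale = SIMD3<Float>(-1, 1, 1)
    return sphere
}
```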
Replies: 0 · Boosts: 0 · Views: 494 · Apr ’24
QR codes on visionOS
Our app needs to scan QR codes (or use a similar mechanism) to populate itself with the content the user wants to see. Is there any update on QR code scanning availability on this platform? I asked this before but never got any feedback. I know that there is no way to access the camera (which is an issue in itself), but the system could at least provide an API to scan codes. (It would also be cool if we could use the same codes the Vision Pro uses for detecting the Zeiss glasses, as long as we could create them via server-side JavaScript code.)
Replies: 2 · Boosts: 0 · Views: 1.3k · Jun ’24
SwiftUI previews don't work in multi-platform app
I created a native visionOS app which I am now trying to convert into a multi-platform app, so that iOS is supported as well. I also have Swift packages that differ from platform to platform to handle platform-specific code. My SwiftUI previews work fine as long as visionOS is the only target. But as soon as I add iOS 17 (with a minimum deployment target of 17), they stop working: if I try to display them in the canvas, compilation fails with errors saying that my packages require iOS 17 but the device supports iOS 12, which I never defined anywhere. This even happens if I set the preview to visionOS. If I run the same setup on a real device or in a simulator, everything works just fine; only the previews are affected. How does the preview device decide which minimum deployment version to use, and how can I change it?

Update: This only happens if the app depends on a Swift package that itself includes a RealityKitContent package as a sub-dependency. I configured this package to be included only in visionOS builds, and the packages themselves also declare the platform as .visionOS(.v1). If I remove the package completely from "Frameworks, Libraries, and Embedded Content", the previews work again. Re-adding it brings back the weird behavior where the preview canvas thinks it is building for iOS 12.
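One direction that might be worth trying: declare an explicit iOS deployment target in each package manifest instead of relying on the implicit default. A sketch with a made-up package name:

```swift
// swift-tools-version:5.9
import PackageDescription

let package = Package(
    name: "MyFeature",
    platforms: [
        // Without an explicit iOS entry, SwiftPM falls back to its implicit
        // minimum for that platform, which might explain the iOS 12 errors.
        .visionOS(.v1),
        .iOS(.v17)
    ],
    targets: [
        .target(name: "MyFeature")
    ]
)
```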
Replies: 0 · Boosts: 0 · Views: 557 · May ’24
RealityKit on iOS: New anchor entity takes ages to show up
I'm implementing an AR app with image tracking capabilities. I noticed that it takes a very long time for the entities I want to overlay on a detected image to show up in the video feed. When debugging with debugOptions.insert(.showAnchorOrigins), I realized that the image itself is detected very quickly: the anchor origins show up almost immediately, and I can see that my code reacts by adding new anchors for my ModelEntities there. However, it takes ages for these ModelEntities to actually show up; they only appear after a while if I move the camera around a lot. What might be the reason for this behavior? I also noticed that for the first image target, a huge number of anchors is created, starting at the image and stacking up towards the user. This does not happen with subsequent (other) image targets.
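My anchoring code is essentially the standard delegate flow; a condensed sketch (with a placeholder box instead of the real content):

```swift
import ARKit
import RealityKit

final class ImageAnchorHandler: NSObject, ARSessionDelegate {
    let arView: ARView

    init(arView: ARView) {
        self.arView = arView
        super.init()
    }

    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for case let imageAnchor as ARImageAnchor in anchors {
            // This runs almost immediately after detection...
            let anchorEntity = AnchorEntity(anchor: imageAnchor)
            let marker = ModelEntity(mesh: .generateBox(size: 0.05),
                                     materials: [SimpleMaterial()])
            anchorEntity.addChild(marker)
            // ...yet the entity takes ages to become visible.
            arView.scene.addAnchor(anchorEntity)
        }
    }
}
```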
Replies: 1 · Boosts: 0 · Views: 563 · Jun ’24
Tracking of moving images is super slow on visionOS
I noticed that tracking moving images is super slow on visionOS. Although the anchor update closure is called multiple times per second, the anchor's transform only seems to be updated once in a while. It could also be that SwiftUI simply isn't updating often enough. On iOS, image tracking is pretty smooth. Is there a way to speed this up on visionOS, too?
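For context, my update loop is essentially the standard ImageTrackingProvider pattern; a condensed sketch ("TargetImages" is a made-up reference image group name, and entities maps anchor IDs to the entities that should follow the tracked images):

```swift
import ARKit
import RealityKit

func trackImages(entities: [UUID: Entity]) async throws {
    let session = ARKitSession()
    let imageTracking = ImageTrackingProvider(
        referenceImages: ReferenceImage.loadReferenceImages(inGroupNamed: "TargetImages")
    )
    try await session.run([imageTracking])

    for await update in imageTracking.anchorUpdates {
        // This fires several times per second, yet the entity's visible
        // pose only seems to change once in a while.
        entities[update.anchor.id]?.setTransformMatrix(
            update.anchor.originFromAnchorTransform, relativeTo: nil)
    }
}
```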
Replies: 1 · Boosts: 0 · Views: 708 · Jun ’24