App Store Connect: Publish visionOS-specific app for the same app store entry?
We have an existing AR app for Android and iOS, built with Unity. We now want to add a visionOS version of this app. However, this version is built natively in Xcode, with no Unity involved. I saw that I can add a new platform to my app in App Store Connect. But can I upload two different builds, and how will App Store Connect tell which uploaded bundle belongs to which platform?
Replies: 1 · Boosts: 0 · Views: 525 · Activity: Mar ’24
How to visualize collision components in Reality Composer Pro?
I set up an entity with a collision component on it, but it was hard to target the object with a tap gesture until I increased the radius quite a bit. Now I am unsure if it is too large. Is there a way to visualize these components somehow, maybe even in a running scene? Also, I find it pretty confusing that the size is given in cm. This made me wonder whether this cm value is affected by the entity's scale at all. In Unity, it's just (local) "units".
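One idea I had is to mirror the collision shape with a translucent model at runtime, so I can at least see the tap target in the running scene. A minimal sketch, assuming a simple sphere shape (the helper name and the radius are placeholders, not my actual values):

```swift
import RealityKit

// Hypothetical helper: attach a collision sphere for tap targeting and,
// for debugging, a translucent model with the same radius so the hit
// area becomes visible in the running scene.
func addDebugCollision(to entity: Entity, radius: Float) {
    // The collision shape used for tap targeting.
    entity.components.set(CollisionComponent(
        shapes: [.generateSphere(radius: radius)]
    ))
    entity.components.set(InputTargetComponent())

    // Visual stand-in for the collision shape (remove in release builds).
    let debugSphere = ModelEntity(
        mesh: .generateSphere(radius: radius),
        materials: [SimpleMaterial(color: .red.withAlphaComponent(0.3), isMetallic: false)]
    )
    entity.addChild(debugSphere)
}
```

That feels hacky, though, so I'd prefer a built-in visualization if there is one.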
Replies: 1 · Boosts: 0 · Views: 1.2k · Activity: Apr ’24
Sign in with Apple: No e-mail address provided on visionOS when access had been removed
On iOS, Sign in with Apple will provide an e-mail address if the user is logging in for the first time. On all subsequent logins, the e-mail address will be missing. However, this can be reset by removing the app from your Apple ID. If you then log in again, the e-mail dialog will pop up again, and the app will receive the e-mail. On visionOS, however, the latter does not happen. Even after I have removed the app from my Apple ID, the e-mail dialog won't show up again. The only way to resolve this is to reset the visionOS simulator (I haven't tried it on a real device).
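For context, I'm using the standard SwiftUI button from AuthenticationServices, roughly like this (a simplified sketch, not my exact code):

```swift
import AuthenticationServices
import SwiftUI

struct SignInView: View {
    var body: some View {
        // Standard Sign in with Apple button; the e-mail is only delivered
        // on the first authorization for this Apple ID + app combination.
        SignInWithAppleButton(.signIn) { request in
            request.requestedScopes = [.email, .fullName]
        } onCompletion: { result in
            if case .success(let auth) = result,
               let credential = auth.credential as? ASAuthorizationAppleIDCredential {
                // nil on every login after the first one, unless the user
                // removes the app from their Apple ID and signs in again.
                print("e-mail:", credential.email ?? "not provided")
            }
        }
    }
}
```

On iOS, credential.email is populated again after I remove the app from my Apple ID; on visionOS it stays nil.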
Replies: 1 · Boosts: 0 · Views: 868 · Activity: Oct ’24
How to exclude RealityKitContent from Swift package for iOS?
I've created an app for visionOS that uses a custom package which includes RealityKitContent as a sub-package. I now want to turn this app into a multi-platform app that also supports iOS. When I try to compile the app for that platform, I get this error message:

Building for 'iphoneos', but realitytool only supports [xros, xrsimulator]

Thus, I want to exclude the RealityKitContent from my package for iOS, but I don't really know how. The Apple docs are pretty complicated, and ChatGPT only gave me solutions that did not work at all. I also tried posting this on the Swift forum, but no one could help me there either, so I am trying my luck here. Here is my Package.swift file:

```swift
// swift-tools-version: 5.10
import PackageDescription

let package = Package(
    name: "Overlays",
    platforms: [
        .iOS(.v17), .visionOS(.v1)
    ],
    products: [
        .library(
            name: "Overlays",
            targets: ["Overlays"]),
    ],
    dependencies: [
        .package(path: "../BackendServices"),
        .package(path: "../MeteorDDP"),
        .package(path: "Packages/OverlaysRealityKitContent"),
    ],
    targets: [
        .target(
            name: "Overlays",
            dependencies: ["BackendServices", "MeteorDDP", "OverlaysRealityKitContent"]
        ),
        .testTarget(
            name: "OverlaysTests",
            dependencies: ["Overlays"]),
    ]
)
```

Based on a recommendation in the Swift forum, I also tried this:

```swift
dependencies: [
    ...
    .package(
        name: "OverlaysRealityKitContent",
        path: "Packages/OverlaysRealityKitContent"
    ),
],
targets: [
    .target(
        name: "Overlays",
        dependencies: [
            "BackendServices", "MeteorDDP",
            .product(
                name: "OverlaysRealityKitContent",
                package: "OverlaysRealityKitContent",
                condition: .when(platforms: [.visionOS])
            )
        ]
    ),
    ...
]
```

but this won't work either. The problem seems to be that the package is listed under dependencies, which makes realitytool kick in. Is there a way to avoid this? I definitely need the RealityKitContent package to be part of the Overlays package, since the latter depends on the content (on visionOS). And I would not want to split the package up into two parts (one for iOS and one for visionOS), if possible.
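For completeness, I could wrap the visionOS-only usage inside the Overlays target in a platform check, roughly like this (a sketch; `loadOverlayScene`, the "Scene" asset name, and the `overlaysRealityKitContentBundle` accessor are just my guesses at what the RealityKitContent template generates), but that alone does not stop realitytool from running for the iOS build:

```swift
#if os(visionOS)
import RealityKit
import OverlaysRealityKitContent

// visionOS-only entry point into the RealityKit content.
// "Scene" and the bundle accessor name are assumed placeholders.
public func loadOverlayScene() async throws -> Entity {
    try await Entity(named: "Scene", in: overlaysRealityKitContentBundle)
}
#endif
```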
Replies: 1 · Boosts: 0 · Views: 1.1k · Activity: Jun ’24
Multi-platform app for visionOS and iOS: How to include 3D models for both?
I created an app for visionOS, using Reality Composer Pro. Now I want to turn this app into a multi-platform app that also runs on iOS. However, Reality Composer Pro projects are not supported on iOS. So I tried to use the "old" Reality Composer instead, but that doesn't seem to work either: Xcode 15 does not include it anymore, and I read online that files created with Xcode 14's Reality Composer cannot be used in Xcode 15 projects. Also, Xcode 14 does not run on my M3 Mac with Sonoma. That's a bummer. What is the recommended way to include 3D content in apps that support visionOS AND iOS?! (I also read that a solution might be to use USDZ for both. But what would that workflow look like? Are there samples out there that support both platforms? Please note that I want to set up the anchors myself, in code. I just need the composing tool to create the 3D content that will be placed on these anchors.)
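For reference, this is the kind of loading code I have in mind; a minimal sketch that assumes a bundled file named Model.usdz (a placeholder) and an anchor I create myself:

```swift
import RealityKit

// Loads a bundled USDZ and parents it to an anchor created in code.
// "Model" is a placeholder; it resolves Model.usdz (or .reality) in the app bundle.
func attachModel(to anchor: AnchorEntity) async throws {
    let model = try await Entity(named: "Model")
    anchor.addChild(model)
}
```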
Replies: 1 · Boosts: 0 · Views: 980 · Activity: Jun ’24
RealityKit on iOS: New anchor entity takes ages to show up
I'm implementing an AR app with image tracking capabilities. I noticed that it takes a very long time for the entities I want to overlay on a detected image to show up in the video feed. When debugging with debugOptions.insert(.showAnchorOrigins), I realized that the image is actually detected very quickly; the anchor origins show up almost immediately. I can also see that my code reacts by adding new anchors for my ModelEntities there. However, it takes ages for these ModelEntities to actually show up. Only if I move the camera a lot do they appear after a while. What might be the reason for this behaviour? I also noticed that for the first image target, a huge number of anchors is created. They start from the image and go all the way up towards the user. This does not happen with subsequent (other) image targets.
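For context, this is roughly how I react to newly detected images (simplified; the session setup and the loading of `overlayModel` are omitted, and the names are placeholders):

```swift
import ARKit
import RealityKit

final class ImageAnchorHandler: NSObject, ARSessionDelegate {
    let arView: ARView
    let overlayModel: ModelEntity   // loaded elsewhere; placeholder

    init(arView: ARView, overlayModel: ModelEntity) {
        self.arView = arView
        self.overlayModel = overlayModel
    }

    // ARKit reports the detected image almost immediately...
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for case let imageAnchor as ARImageAnchor in anchors {
            let anchorEntity = AnchorEntity(anchor: imageAnchor)
            anchorEntity.addChild(overlayModel.clone(recursive: true))
            // ...but the cloned model only becomes visible much later.
            arView.scene.addAnchor(anchorEntity)
        }
    }
}
```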
Replies: 1 · Boosts: 0 · Views: 562 · Activity: Jun ’24
Apple: please unlock "enterprise features" for all visionOS devs!
We are developing apps for visionOS and need the following capabilities for a consumer app:

- access to the main camera, to let users shoot photos and videos
- reading QR codes, to trigger the download of additional content

So I was really happy when I noticed that visionOS 2.0 has these features. However, I was shocked when I also realized that these capabilities are restricted to enterprise customers only: https://developer.apple.com/videos/play/wwdc2024/10139/

I think that Apple is shooting itself in the foot with these restrictions. I can understand that privacy is important, but these limitations drastically restrict the potential use cases for this platform, even in the consumer space. IMHO Apple should decide whether they want to target consumers in the first place, or whether they want to go the HoloLens / Magic Leap route and mainly satisfy enterprise customers and their respective devs. With the current setup, Apple risks pushing devs away to other platforms where they have more freedom to create great apps.
Replies: 1 · Boosts: 3 · Views: 740 · Activity: Jun ’24
Tracking of moving images is super slow on visionOS
I noticed that tracking moving images is super slow on visionOS. Although the anchor update closure is called multiple times per second, the anchor's transform seems to be updated only once in a while. Another issue might be that SwiftUI isn't updating any more often. On iOS, image tracking is pretty smooth. Is there a way to speed this up somehow on visionOS, too?
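For context, a simplified sketch of my tracking loop (the "AR Resources" group name and `overlayEntity` are placeholders, not my actual setup):

```swift
import ARKit
import RealityKit

// Minimal sketch: run image tracking and mirror the anchor transform
// onto an entity that is displayed in a RealityView elsewhere.
func runImageTracking(overlayEntity: Entity) async throws {
    let session = ARKitSession()
    let provider = ImageTrackingProvider(
        referenceImages: ReferenceImage.loadReferenceImages(inGroupNamed: "AR Resources")
    )
    try await session.run([provider])

    for await update in provider.anchorUpdates {
        guard update.anchor.isTracked else { continue }
        // This fires several times per second, but the transform itself
        // only seems to change once in a while.
        overlayEntity.transform = Transform(matrix: update.anchor.originFromAnchorTransform)
    }
}
```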
Replies: 1 · Boosts: 0 · Views: 708 · Activity: Jun ’24
No object detection on visionOS?
I recently had a chat with a company in the manufacturing business. They were asking if Vision Pro could be used to guide maintenance workers through maintenance processes, a use-case that is already established on other platforms. I thought Vision Pro would be perfect for this as well, until I read in this article from Apple that object detection is not supported: https://developer.apple.com/documentation/visionos/bringing-your-arkit-app-to-visionos#Update-your-interface-to-support-visionOS To me, this sounds like sacrificing a lot of potential for business scenarios, just for the sake of data privacy. Is this really the case, i.e. is there no way to detect real-world objects and place content on top of them? Image recognition would not be enough in this use-case.
Replies: 2 · Boosts: 0 · Views: 1.4k · Activity: Dec ’23
Using WKWebView and Safari on visionOS
I have implemented a custom view that shows a page in a WKWebView:

```swift
import SwiftUI
import WebKit

struct WebView: UIViewRepresentable {
    let urlString: String

    func makeUIView(context: Context) -> WKWebView {
        let webView = WKWebView()
        webView.navigationDelegate = context.coordinator
        return webView
    }

    func updateUIView(_ uiView: WKWebView, context: Context) {
        if let url = URL(string: urlString) {
            let request = URLRequest(url: url)
            uiView.load(request)
        }
    }

    func makeCoordinator() -> Coordinator {
        Coordinator(self)
    }

    class Coordinator: NSObject, WKNavigationDelegate {
        var parent: WebView

        init(_ parent: WebView) {
            self.parent = parent
        }
    }
}
```

It works, but it shows a grey button in the upper left with no icon. If I click on that button, nothing happens, but I can see this error message in the Xcode logs:

Trying to convert coordinates between views that are in different UIWindows, which isn't supported. Use convertPoint:fromCoordinateSpace: instead.

What is this button, and how can I get rid of it?

As a second question: I also tried to spawn Safari in a separate window, using this view:

```swift
import SafariServices
import SwiftUI

struct SafariView: UIViewControllerRepresentable {
    let url: URL

    func makeUIViewController(context: Context) -> SFSafariViewController {
        return SFSafariViewController(url: url)
    }

    func updateUIViewController(_ uiViewController: SFSafariViewController, context: Context) {
        // No update logic needed for a simple web view
    }
}
```

This works, but Safari shows up behind the view that includes the Safari view. Instead, I would want Safari to show up in front, or even better: next to my main view (either left or right). Is there a way to do this?
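Regarding the second question: the direction I'm considering is a dedicated WindowGroup for the Safari view, opened via the openWindow action. A minimal sketch, with placeholder names and a placeholder URL (and I still don't know how to control where the new window appears relative to the main one):

```swift
import SwiftUI

@main
struct MyApp: App {   // hypothetical app structure, names are placeholders
    var body: some Scene {
        WindowGroup {
            ContentView()
        }

        // A dedicated window for the browser content, keyed by the URL to show.
        WindowGroup(id: "safari", for: URL.self) { urlBinding in
            if let url = urlBinding.wrappedValue {
                SafariView(url: url)
            }
        }
    }
}

struct OpenBrowserButton: View {
    @Environment(\.openWindow) private var openWindow

    var body: some View {
        Button("Open in Safari view") {
            // Opens the "safari" window; the system decides where it appears.
            openWindow(id: "safari", value: URL(string: "https://www.example.com")!)
        }
    }
}
```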
Replies: 2 · Boosts: 0 · Views: 1.1k · Activity: Feb ’24