
Apple: please unlock "enterprise features" for all visionOS devs!
We are developing apps for visionOS and need the following capabilities for a consumer app:

- access to the main camera, to let users shoot photos and videos
- reading QR codes, to trigger the download of additional content

So I was really happy when I noticed that visionOS 2.0 has these features. However, I was shocked when I realized that these capabilities are restricted to enterprise customers only: https://developer.apple.com/videos/play/wwdc2024/10139/

I think Apple is shooting itself in the foot with these restrictions. I understand that privacy is important, but these limitations drastically restrict the potential use cases for this platform, even in the consumer space. IMHO, Apple should decide whether it wants to target consumers in the first place, or go the HoloLens / Magic Leap route and mainly serve enterprise customers and their devs. With the current setup, Apple risks pushing devs away to other platforms where they have more freedom to create great apps.
Replies: 1 · Boosts: 3 · Views: 740 · Jun ’24
visionOS simulator broken on Intel MacBook since upgrade to Sonoma
On my MacBook Pro 2019 with an Intel processor, I could run apps in the visionOS simulator without any problems while I was on macOS Ventura. But since I upgraded the Mac to Sonoma, the visionOS simulator seems to be broken. Xcode is stuck at "Loading visionOS 1.0", and the simulator page under "Devices and Simulators" says "No runtime". This is independent of the Xcode version: I used Xcode 15 beta 2, but also tried more recent versions. Could it be that support for developing on Intel Macs was dropped with macOS Sonoma without any notice? I can see that the Xcode 15.1 specs state you need an Apple Silicon Mac, but the Xcode 15 specs don't. And it worked for me, at least on Ventura. The "only" change I made since then was upgrading the OS to Sonoma.
Replies: 3 · Boosts: 2 · Views: 1.9k · Jan ’24
Lots of "garbage" in the Xcode logs, like "Decoding completed without errors"
Hi, if I run an app on the visionOS simulator, I get tons of "garbage" messages in the Xcode logs; please find some samples below. Because of these messages, I can hardly see the really relevant logs. Is there any way to get rid of them?

```
[0x109015000] Decoding completed without errors
[0x1028c0000] Decoding: C0 0x01000100 0x00003048 0x22111100 0x00000000 11496
[0x1028c0000] Options: 1x-1 [FFFFFFFF,FFFFFFFF] 00054060
[0x1021f3200] Releasing session
[0x1031dfe00] Options: 1x-1 [FFFFFFFF,FFFFFFFF] 00054060
[0x1058eae00] Releasing session
[0x10609c200] Decoding: C0 0x01000100 0x00003048 0x22111100 0x00000000 10901
[0x1058bde00] Decoding: C0 0x01000100 0x0000304A 0x22111100 0x00000000 20910
[0x1028d5200] Releasing session
[0x1060b3600] Releasing session
[0x10881f400] Decoding completed without errors
[0x1058e2e00] Decoding: C0 0x01000100 0x0000304A 0x22111100 0x00000000 9124
[0x1028d1e00] Decoding: C0 0x01000100 0x0000304A 0x22111100 0x00000000 20778
[0x1031dfe00] Decoding completed without errors
[0x1031fe000] Decoding completed without errors
[0x1058e2e00] Options: 256x256 [FFFFFFFF,FFFFFFFF] 00025060
```
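A partial workaround I am using in the meantime: tag our own messages via os.Logger with a dedicated subsystem (the identifier below is a placeholder), then filter on that subsystem in Xcode's console. It doesn't silence the decoder chatter at the source, but at least our own logs become findable:

```swift
import os

// Placeholder subsystem/category: tagging our own messages lets the Xcode 15
// structured console filter on "subsystem:com.example.myapp" and hide the rest.
let appLogger = Logger(subsystem: "com.example.myapp", category: "general")

func reportSceneLoaded() {
    appLogger.debug("Scene loaded, attaching RealityView content")
}
```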
Replies: 0 · Boosts: 1 · Views: 426 · Jan ’24
Map view in SwiftUI on visionOS: Bad performance with many markers
I am trying to build a visionOS app that uses a map as the central user interface. This works fine at high zoom levels, when only a couple of markers are present. But as soon as I zoom out and the number of markers reaches hundreds or even thousands, performance gets really bad: it takes seconds for the map to render, and panning is laggy too. What makes things worse is that the SwiftUI Map does not support clustering yet. Has anyone found a solution to this? I found this example by Apple on how to implement clustering: https://developer.apple.com/documentation/mapkit/mkannotationview/decluttering_a_map_with_mapkit_annotation_clustering It works, but it uses UIKit and storyboards, and I could not transform it into SwiftUI-compatible code. I also found this blog post with a neat SwiftUI integration for a clusterable map: https://www.linkedin.com/pulse/map-clustering-swiftui-dmitry-%D0%B2%D0%B5l%D0%BEv-j3x7f/ However, I wasn't able to adapt it so the map updates itself reactively. I want to retrieve new data from our server when the user changes the visible region of the map or zooms in or out, but I have no clue how to transfer my .onChange(of:) and .onMapCameraChange() modifiers to the UIKit world (see the sketch below for where I currently stand).
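What I have so far is a UIViewRepresentable wrapper, roughly like this (the type and callback names are mine, and it assumes MKMapView behaves on visionOS as it does on iOS): clustering comes from setting clusteringIdentifier, and the regionDidChangeAnimated delegate callback plays the role of .onMapCameraChange():

```swift
import SwiftUI
import MapKit

// Sketch: MKMapView wrapped for SwiftUI, with marker clustering and a
// region-change callback to drive server fetches from SwiftUI state.
struct ClusteredMap: UIViewRepresentable {
    var annotations: [MKPointAnnotation]
    var onRegionChange: (MKCoordinateRegion) -> Void

    func makeUIView(context: Context) -> MKMapView {
        let map = MKMapView()
        map.delegate = context.coordinator
        // Register a marker view that participates in clustering.
        map.register(MKMarkerAnnotationView.self,
                     forAnnotationViewWithReuseIdentifier:
                        MKMapViewDefaultAnnotationViewReuseIdentifier)
        return map
    }

    func updateUIView(_ map: MKMapView, context: Context) {
        // Naive full refresh; diffing would be nicer, but this keeps it short.
        map.removeAnnotations(map.annotations)
        map.addAnnotations(annotations)
    }

    func makeCoordinator() -> Coordinator { Coordinator(self) }

    final class Coordinator: NSObject, MKMapViewDelegate {
        let parent: ClusteredMap
        init(_ parent: ClusteredMap) { self.parent = parent }

        func mapView(_ mapView: MKMapView,
                     viewFor annotation: MKAnnotation) -> MKAnnotationView? {
            // Let MapKit render cluster bubbles with its default view.
            guard !(annotation is MKClusterAnnotation) else { return nil }
            let view = mapView.dequeueReusableAnnotationView(
                withIdentifier: MKMapViewDefaultAnnotationViewReuseIdentifier,
                for: annotation)
            // Same identifier => MapKit merges nearby markers into clusters.
            view.clusteringIdentifier = "marker"
            return view
        }

        // UIKit counterpart of .onMapCameraChange(): fires after pans/zooms.
        func mapView(_ mapView: MKMapView, regionDidChangeAnimated animated: Bool) {
            parent.onRegionChange(mapView.region)
        }
    }
}
```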
Replies: 2 · Boosts: 1 · Views: 851 · Mar ’24
DSA compliance in the EU: How to update address from DUNS?
Apple asked me today to add the compliance information for the Digital Services Act in the EU. I tried to do so, but ran into a major issue. When I created the developer account many years ago, it was a personal account that I used as a natural person / freelancer in Germany. When I later founded my US company, I converted the existing developer account into a business account for that company. In the process, I obtained a DUNS number, which is linked to the business address in the States (California). However, it seems this US address never made it into App Store Connect: it still shows my personal address in Germany, which is not correct. I cannot modify it either; the address page says I have to update it at DUNS. But in their system, everything is correct. The problem seems to lie in the transfer of the address data between DUNS and App Store Connect. I opened a ticket in the DUNS system, but I need to publish a new version of our app soon, so I am wondering: is there a faster way to get this resolved?
Replies: 3 · Boosts: 1 · Views: 2.2k · Oct ’24
Can the Vision Pro scan QR codes?
We want to use QR codes to open and activate certain features in our app. We don't want these QR codes to be hard-coded into the app (i.e. image tracking). Instead, we want to use them the way you typically would with your smartphone camera: just detect them when the user looks at them, and if they encode a certain URL pointing to our app, start the app via a URL handler and hand over the link, like it is possible on iOS. I tried this in the normal Shared Space and also with the camera app, but neither recognized a QR code. Is this feasible with the Vision Pro?
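For comparison, this is roughly what we do on iOS (a minimal sketch; the function and view names are placeholders). The detection and the URL handoff would presumably work the same way on visionOS; the missing piece is getting camera frames at all:

```swift
import SwiftUI
import Vision

// Detect QR codes in a frame we already have, via Vision's barcode request.
func qrPayloads(in image: CGImage) throws -> [String] {
    let request = VNDetectBarcodesRequest()
    request.symbologies = [.qr]  // only QR codes, not other barcode types
    try VNImageRequestHandler(cgImage: image).perform([request])
    return request.results?.compactMap(\.payloadStringValue) ?? []
}

// Receive the URL when the system launches the app via our URL scheme.
struct RootView: View {
    var body: some View {
        Text("Hello")
            .onOpenURL { url in
                print("Opened via:", url)
            }
    }
}
```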
Replies: 0 · Boosts: 1 · Views: 1.2k · Mar ’24
Set anchor point for SwiftUI attachment
We want to overlay a SwiftUI attachment on a RealityView, like it is done in the Diorama sample. By default, attachments seem to be placed centered at their position. For our use case, however, we need a different anchor point, so that the attachment is always aligned to one of its corners; e.g. the lower-left corner should sit at the attachment's position. Is this possible?
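The workaround I am experimenting with is to shift the attachment entity by half of its visual bounds after creation, roughly like this untested sketch (view and attachment names are made up):

```swift
import SwiftUI
import RealityKit

// Sketch: since attachments are centered on their entity's position, offsetting
// the entity by half its visual bounds moves the effective anchor point to a
// corner (here: the lower-left corner lands at the original position).
struct AnchoredAttachmentView: View {
    var body: some View {
        RealityView { content, attachments in
            if let label = attachments.entity(for: "label") {
                let extents = label.visualBounds(relativeTo: nil).extents
                // Shift up/right by half the size so the lower-left corner,
                // not the center, sits at the original position.
                label.position += SIMD3(extents.x / 2, extents.y / 2, 0)
                content.add(label)
            }
        } attachments: {
            Attachment(id: "label") {
                Text("Anchored at the lower left")
                    .padding()
                    .glassBackgroundEffect()
            }
        }
    }
}
```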
Replies: 0 · Boosts: 1 · Views: 601 · May ’24
API for turning regular photos into spatial photos?
With quite some excitement, I read about visionOS 2's new feature that automatically turns regular 2D photos into spatial photos using machine learning. It is briefly mentioned in this WWDC video: https://developer.apple.com/wwdc24/10166 My question: can developers use this feature via an API, so we can turn any image into a spatial image, even if it is not in the device's photo library? We would like to download an image from our server, convert it on the Vision Pro on the fly, and display it as a spatial photo.
Replies: 3 · Boosts: 1 · Views: 1.1k · Jun ’24